Miscellaneous

New Warehouse, who dis?

Moving to a new home is always a challenge. Here’s the current progress on getting set up, with notes about what’s left to do.

Image

Welcome! This garage is too small for my car, but plenty for making stuff.

Image

Welcome to my office, which apparently is turning into some kind of man cave. As I set out these chairs I couldn’t help but imagine running D&D campaigns out here.

Image

Half of this is exclusively Makelangelo parts. The bottom half is stuff I want to get rid of.

Image

My 2013 G Weike 60W CO2 laser cutter, and some wood. The plan is to hook this up to a portable air filter. Until then I have to rely on third-party suppliers for my laser cutting.

Image

The tools I use most are within easy reach, and the less-used tools go below. I’ve managed to get rid of that grinding wheel already. The big fan is part of the laser cutter.

Image

None of this is staying here. The entire shelf has to be torn out. The right hand side will be for the filter system, on wheels, so I can push it out the door to the right when I’m using it.

Image

This is where the laser cutter will live when these shelves come out, too. The chiller for the laser cutter is on the bottom shelf.

Image

Almost all of this is packing material I don’t need any more (wrong box size), and two SIXI 2 robots in the back. I plan to disassemble at least one of them for reuse. SIXI 2 was my fourth attempt at an arm. In the back on top is SIXI 3, my *fifth* attempt at an arm.

Image

Bars, tubes, screw thread, rods (all in bucket), continuous servos, various SIXI 2 parts – at least enough to build a new robot arm. Bearings, more robots, and some misc boxes I don’t even know how to start sorting. A lot of parts built up from… well let’s call them tests that didn’t give me the results I wanted. The blue arm is my second attempt.

Image

Even less sorted stuff: a hog drive robot, line followers, and three Stewart platforms of two types.

Image

I was thinking about having this shelf be the trophy case where I put all the robots I want to keep. The only problem there is that PLA, in the sun, will probably get soft. So I might go one shelf over. The red arm is my third attempt.

You’ll notice there’s no workbench but there is a pile of bench parts. The plan is that when the shelves come out there should be just enough room to set up one bench here. In the meantime I’m doing all my assembly in the home office. I am getting a lot of steps in a day traveling between the two…

Final thoughts

My next step is to sell enough that I can empty the white shelves; take down the shelves and sell them; install the laser cutter; and finally get back to producing my own parts. It’s a process that will take some time so I tackle a bit more every day and try to find satisfaction in the incremental improvements.

Send me your kindest thoughts. I have so many NEMA17 stepper motors and servos for sale. Don’t be shy, hmu.

Robot Arm

Record & Playback 4

I have been building a robot arm. You may have seen it on my Instagram. I also have an open source Java app called Robot Overlord, which can simulate the arm in 3D. Robot Overlord can drive the simulation AND the real arm by moving the joystick. All the moves in this video were done with the PlayStation controller:

In a previous post on Hackaday.io, I briefly covered a system I wrote into Robot Overlord that would capture all the joystick data and play it back on command. Technically, that worked. Qualified success.


However! Driving this way is very inefficient. New instructions are sent to the arm 30 times a second. The arm can’t see where it is going, so it can’t plan ahead to reach high speeds. It’s like a very fast game of Marco Polo. Also, if you’re a novice driver like me, it’s really easy to go the wrong way. It would be great if I could move the simulated arm to a destination pose BUT only update the real robot when I’m sure I like that destination pose. The arm would then move in a straight line from start pose to end pose at top speed.

First I needed a way to save any pose to a file on disk and load it back. With that I could save and load single poses, then play those poses back to the real robot, same as I did with the joystick model. It also let me repeat tests, which helps me confirm things work correctly.
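For illustration, here’s a minimal sketch of what saving and loading a single pose might look like, assuming a pose is a 4×4 homogeneous transform stored row-major as 16 doubles. The `PoseFile` class and its plain-text format are hypothetical, not the actual Robot Overlord code:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Locale;

// Hypothetical sketch: one pose = a 4x4 homogeneous transform,
// stored row-major as 16 doubles on a single line of plain text.
public class PoseFile {
    public static void save(double[] pose, File f) throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter(f))) {
            StringBuilder sb = new StringBuilder();
            for (double v : pose) sb.append(String.format(Locale.US, "%.8f ", v));
            out.println(sb.toString().trim());
        }
    }

    public static double[] load(File f) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader(f))) {
            String[] tok = in.readLine().trim().split("\\s+");
            double[] pose = new double[16];
            for (int i = 0; i < 16; ++i) pose[i] = Double.parseDouble(tok[i]);
            return pose;
        }
    }
}
```

Saving to text rather than binary makes recordings easy to inspect by hand when something goes wrong.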

If I have a start and an end pose then I can find a way to interpolate between the two poses – I can split that line into sub-poses as needed. I can already send poses to the robot. So what I can do is find the happy trade-off between too many poses (no acceleration) and too few (less accurate movement).
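The splitting idea can be sketched as follows, interpolating only the translation part of each pose (a real version would also need orientation interpolation, e.g. quaternion slerp). `PoseInterpolator` is a hypothetical name, not Robot Overlord’s API:

```java
// Sketch: split the straight line between two positions into n sub-poses.
// Only translation (x, y, z) is interpolated here; orientation is omitted.
public class PoseInterpolator {
    public static double[][] split(double[] start, double[] end, int steps) {
        double[][] out = new double[steps + 1][3];
        for (int i = 0; i <= steps; ++i) {
            double t = (double) i / steps;  // 0.0 at start, 1.0 at end
            for (int j = 0; j < 3; ++j)
                out[i][j] = start[j] + t * (end[j] - start[j]);
        }
        return out;
    }
}
```

Tuning `steps` is exactly the trade-off described above: more sub-poses means tighter tracking of the line but less room to accelerate between them.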

Looking through my daily notes I see I started on the new system some time before 2019-8-13, because that was when the weirdness started: I found cases where a recording saved to disk and loaded back was out of sync. Not identically 1:1. Discombobulated. When I tried to play back a recording, the hand of the robot (J5) was always turned 90 degrees from the original recording. As I began to dig into why, I opened a whole can of worms. Bigguns.

Worm: The robot sim in Robot Overlord was wrong.

When Jin Han and I built the computer model of the robot arm in Fusion360 back in November 2018, we started it facing the wrong direction.

Arm designed pointing at -Z

When I say it was built facing the wrong direction, I mean that I imagined that both Fusion360 and Robot Overlord would have the hand pointing at +X with up as +Z. In fact, in Fusion360 the hand points at -Z with up as +Y, and in Robot Overlord I reassembled the arm with the hand facing -Y and up as +Z. Copying the model over was stupid hard, and I didn’t realize that was partly because I was doing it the wrong way, turned 90 degrees on two axes. It would have been easier if it was upside down and backwards!

My method to solve it was to load one joint at a time starting at the base, get it turned facing upwards, then add another link, and so on. Once all the bones were in their relative positions, I built D-H parameters that matched.

Worm: The D-H model of the arm was wrong.

The Sixi was the first robot arm I ever coded that used Denavit–Hartenberg parameters. One of the reasons I used D-H parameters is that they’re well documented and well supported by other people in robotics. I can easily use D-H to calculate Forward Kinematics (FK), where I know the angle of every joint in the arm and I want to get the pose of the hand. (A pose is a position in space plus an orientation. One common way to describe this combo is with a 4×4 matrix.) I could also use YouTube videos that explained how to calculate Inverse Kinematics for a robot arm with D-H parameters. Especially tricky is the spherical wrist:

I found the videos on spherical wrists were incomplete and it wasn’t until I stumbled on these notes from York University in Canada that I found the missing piece.
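The FK side described above is mechanical enough to sketch: each D-H table row (θ, d, r, α) becomes one 4×4 link transform, and chaining the transforms gives the hand pose. This is the textbook D-H construction, not Robot Overlord’s actual code:

```java
// Textbook D-H forward kinematics: one 4x4 transform per link, chained.
public class DHKinematics {
    // Standard D-H link transform from (theta, d, r, alpha), angles in radians:
    // RotZ(theta) * TransZ(d) * TransX(r) * RotX(alpha).
    public static double[][] linkTransform(double theta, double d, double r, double alpha) {
        double ct = Math.cos(theta), st = Math.sin(theta);
        double ca = Math.cos(alpha), sa = Math.sin(alpha);
        return new double[][] {
            { ct, -st * ca,  st * sa, r * ct },
            { st,  ct * ca, -ct * sa, r * st },
            {  0,       sa,       ca,      d },
            {  0,        0,        0,      1 }
        };
    }

    public static double[][] multiply(double[][] a, double[][] b) {
        double[][] c = new double[4][4];
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                for (int k = 0; k < 4; ++k)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    // FK: chain every link transform in order; the result is the hand pose.
    public static double[][] forward(double[][] dhTable) {
        double[][] t = { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
        for (double[] row : dhTable)
            t = multiply(t, linkTransform(row[0], row[1], row[2], row[3]));
        return t;
    }
}
```

The last column of the result is the hand position; the upper-left 3×3 block is its orientation.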

Worm: Inverse Kinematics calculations were wrong.

Of course my code didn’t quite match the stuff I’d been taught, because my model was facing -Y instead of +X – a 90 degree turn. Every time the tutorials said to use atan2(y, x) I had to write atan2(-x, y).
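A tiny illustration of that remap (the method names here are made up for demonstration): `atan2(-x, y)` measures the angle of the same point in a frame rotated 90 degrees from the textbook one, so every angle comes out a quarter-turn different:

```java
// Demonstration of the 90-degree axis remap between the textbook frame
// and a model frame rotated a quarter turn about Z.
public class WristAngles {
    // What the tutorials say: angle of point (x, y) in the textbook frame.
    public static double textbookAngle(double x, double y) {
        return Math.atan2(y, x);
    }

    // What the rotated model needs: same point, axes swapped and one negated.
    public static double remappedAngle(double x, double y) {
        return Math.atan2(-x, y);
    }
}
```

Missing this kind of frame mismatch is exactly how a playback ends up consistently 90 degrees off at one joint.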

Not yet knowing that I’d done all this stuff wrong, I had to diagnose the problem. I built a JUnit test in src.test.java.com.marginallyclever.robotOverlord.MiscTests.java:TestFK2IK(). This test sweeps the arm through the set of all angles keyframe0. Every instance of keyframe0 creates some possible pose m0. Some m0 can be solved with Inverse Kinematics to make some other keyframe1. keyframe1 can create a pose m1. m1 should ALWAYS match m0. I got reams of data, all of which told me yes, there’s definitely something wrong. It took about a week of nail-biting research until I figured out and unscrambled each of those worms.
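The same round-trip pattern can be shown on a toy 2-link planar arm, where the IK has a closed form. This is a self-contained stand-in for the real 6-DOF test, not the Sixi code: sweep the joint angles, compute pose m0 by FK, solve IK for a new keyframe, compute its pose m1, and demand m1 matches m0.

```java
// FK -> IK -> FK round-trip test pattern, on a 2-link planar arm.
public class FKIKRoundTrip {
    static final double L1 = 10, L2 = 7;  // arbitrary link lengths

    // Forward kinematics: joint angles -> hand position (x, y).
    public static double[] fk(double t1, double t2) {
        return new double[] {
            L1 * Math.cos(t1) + L2 * Math.cos(t1 + t2),
            L1 * Math.sin(t1) + L2 * Math.sin(t1 + t2)
        };
    }

    // Closed-form inverse kinematics, elbow-up branch.
    public static double[] ik(double x, double y) {
        double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2);
        double t2 = Math.acos(Math.max(-1, Math.min(1, c2)));  // clamp for safety
        double t1 = Math.atan2(y, x)
                  - Math.atan2(L2 * Math.sin(t2), L1 + L2 * Math.cos(t2));
        return new double[] { t1, t2 };
    }

    // Sweep all keyframe0 angles; every pose m0 solved back through IK
    // must produce a keyframe1 whose pose m1 matches m0.
    public static boolean sweep() {
        for (double t1 = -Math.PI; t1 < Math.PI; t1 += 0.1) {
            for (double t2 = 0.1; t2 < Math.PI; t2 += 0.1) {
                double[] m0 = fk(t1, t2);
                double[] k1 = ik(m0[0], m0[1]);
                double[] m1 = fk(k1[0], k1[1]);
                if (Math.abs(m0[0] - m1[0]) > 1e-6
                 || Math.abs(m0[1] - m1[1]) > 1e-6) return false;
            }
        }
        return true;
    }
}
```

When a frame or sign error sneaks in, a sweep like this fails loudly on whole regions of the workspace rather than on one lucky pose, which is what makes it so useful for diagnosis.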

So what does all that mean? It means I can now build meaningful recordings, and now I can start to search for the right happy trade-off.