Live motion tracking

It is a bank holiday, and we are all quite…cheerful, post-company-barbecue, so I will keep this brief. Here’s a motion tracking demo from Erik Haberup. He says:

In case the Raspberry Pi team would like another example of the versatility of their product.

This is my capstone project for the University of Nebraska – Lincoln (Computer Electronics Engineering), which uses a Raspberry Pi to wirelessly transmit live motion tracking data from a set of 13 inertial measurement units.

I’ll ask Erik for some more information, but in the meantime, I thought you might like to enjoy this *outstanding* video.

6 comments


I am surprised by how well this appears to work, based on the video. Is this using *only* inertial sensors (e.g. accelerometers and turn-rate sensors) and no other kind of sensor? So to get displacement you have to integrate twice with respect to time, and any DC zero-offset gives you a quadratically increasing displacement error? I did a broadly similar experiment a year ago using a single cell-phone-type 3-axis accelerometer and decided it was hopeless. Are his sensors really that much more sensitive and lower-drift?
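To put a number on how quickly that quadratic error grows, here is a tiny simulation; the sample rate and bias are assumed values chosen for illustration, not figures from the project:

```python
# Illustration only: how a constant accelerometer zero-offset turns into
# displacement error after double integration. All values are assumptions.
dt = 0.01      # 100 Hz sample rate (assumed)
bias = 0.02    # constant zero-offset of 0.02 m/s^2, roughly 2 milli-g (assumed)

velocity = 0.0
position = 0.0
for step in range(1, 1001):            # simulate 10 seconds
    velocity += bias * dt              # first integration: error grows linearly
    position += velocity * dt          # second integration: error grows quadratically
    if step % 200 == 0:
        t = step * dt
        print(f"t = {t:4.1f} s  position error = {position:.3f} m"
              f"  (closed form 0.5*b*t^2 = {0.5 * bias * t * t:.3f} m)")
```

Even that tiny offset puts the estimate about a metre off after ten seconds, which is why displacement from accelerometers alone drifts away so fast.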


Well, the video was pretty short. I performed a similar experiment and used Kalman filtering to help reduce drift. I was able to go several minutes (10 or so) before the drift got me off into the weeds.
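For anyone curious what that kind of drift correction looks like in practice, below is a minimal one-axis Kalman filter that fuses a gyro rate with an accelerometer-derived angle and estimates the gyro bias along the way. It is a generic textbook-style sketch with assumed tuning values, not the commenter’s actual filter:

```python
# Minimal one-axis Kalman filter: fuse a drifting gyro rate with a noisy but
# drift-free accelerometer angle. Tuning values below are assumptions.

class AngleKalman:
    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.q_angle = q_angle      # process noise for the angle
        self.q_bias = q_bias        # process noise for the gyro bias
        self.r_measure = r_measure  # accelerometer measurement noise
        self.angle = 0.0            # filtered angle estimate (degrees)
        self.bias = 0.0             # estimated gyro bias (deg/s)
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # error covariance

    def update(self, accel_angle, gyro_rate, dt):
        # Predict: integrate the bias-corrected gyro rate.
        rate = gyro_rate - self.bias
        self.angle += dt * rate

        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt

        # Correct: blend in the accelerometer angle and update the bias estimate.
        S = P[0][0] + self.r_measure
        K0, K1 = P[0][0] / S, P[1][0] / S
        y = accel_angle - self.angle
        self.angle += K0 * y
        self.bias += K1 * y

        P00, P01 = P[0][0], P[0][1]
        P[0][0] -= K0 * P00
        P[0][1] -= K0 * P01
        P[1][0] -= K1 * P00
        P[1][1] -= K1 * P01
        return self.angle
```

Feed it the accelerometer-derived angle, the raw gyro rate, and the timestep each sample; the continually re-estimated bias is what keeps the integration from wandering off over those several minutes.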


Only inertial sensors are employed. Each IMU is an MPU6050, containing a 3-axis accelerometer and a 3-axis gyroscope. The calculations are done via relative orientations between the IMUs, not via displacement, so drift is only an issue for rotation about the Z axis. The IMUs have an internal sample rate of up to 8 kHz, so quantization errors are minimal. The Z-axis drift could also be removed if an accurate magnetometer were employed to compensate, but this was not done due to cost constraints (and to simplify the calculations a little).
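For readers who want to poke at the same part, reading an MPU6050 from a Raspberry Pi over I2C looks roughly like the sketch below (Python with the smbus module). This is not Erik’s code; the register addresses are the chip’s standard ones, and the scale factors assume the default ±2 g and ±250 °/s ranges:

```python
# Generic sketch: read raw accelerometer and gyroscope data from one MPU6050
# on Raspberry Pi I2C bus 1, default address 0x68. Not taken from the project.
import smbus

MPU_ADDR = 0x68
bus = smbus.SMBus(1)
bus.write_byte_data(MPU_ADDR, 0x6B, 0)   # PWR_MGMT_1: wake the device from sleep

def read_word(reg):
    """Read a signed 16-bit big-endian value from two consecutive registers."""
    high = bus.read_byte_data(MPU_ADDR, reg)
    low = bus.read_byte_data(MPU_ADDR, reg + 1)
    value = (high << 8) | low
    return value - 65536 if value & 0x8000 else value

def read_imu():
    accel = [read_word(0x3B + 2 * i) / 16384.0 for i in range(3)]   # in g
    gyro = [read_word(0x43 + 2 * i) / 131.0 for i in range(3)]      # in deg/s
    return accel, gyro

if __name__ == "__main__":
    a, g = read_imu()
    print("accel (g):", a, "gyro (deg/s):", g)
```

From readings like these you can build a per-sensor orientation and then take the relative rotation between neighbouring IMUs, which is why only the yaw (Z) axis, unobservable from gravity, is left to drift without a magnetometer.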


Thirteen IMUs seems like overkill for the number of degrees of freedom involved. The human shoulder-arm-hand system has a very large number of degrees of freedom, while the virtual manipulator shown has relatively few. I’m questioning, not quibbling, because I have some experience with 6-DOF and 7-DOF manipulators, and I know them to be computationally difficult. Overall, I congratulate Erik on an interesting project and a nice presentation. Good job!


Thank you, Tom. Only two IMUs are used to track the arm; one more sits on the wrist, and ten are on the fingers (two per digit), like so:

https://www.dropbox.com/s/06jcb98r6lh8dkx/IMU%20Locations.png

The model used is from a second part of the project that there was not sufficient time to complete:

https://www.dropbox.com/s/71lenmjwyyq2kt1/_1018004.JPG

While the set of inertial sensors can be mapped to even more degrees of freedom, they were only mapped to the range of motion that could theoretically be produced by the robotic arm.


Erik,
Thanks for the clarification. I review student engineering projects at a couple of local universities, and yours is among the best I have seen. Best of luck in your engineering career!
