After a gentle nudge from Jeremy and his pointy stick yesterday, I had a quick play with Kitty++. Follow the link for what she does and how, but in summary, she uses the Raspicam video to churn out the ‘motion vectors’ between video frames as part of the h.264 video compression algorithm. Because the video frames are at a fixed rate (10fps in my case), how ‘far’ the frames have moved is actually how ‘fast’ – i.e. a velocity vector that could be merged / blended / fused with the accelerometer velocity as a way to constrain drift.
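To make the displacement-to-velocity conversion concrete, here's a minimal numpy sketch. It's not Kitty++'s actual code – the function name, the `scale` parameter, and the synthetic input are all my assumptions; the real motion vectors come out of the h.264 encoder as one (x, y) pair per macro-block.

```python
import numpy as np

FPS = 10.0  # fixed frame rate from the post: displacement per frame * FPS = velocity


def frame_velocity(vectors, scale=1.0):
    """Average one frame's macro-block motion vectors and convert the
    per-frame displacement into a velocity.

    vectors: (N, 2) array of (x, y) shifts in pixels, one per macro-block.
    scale:   hypothetical pixels -> metres factor; in reality it depends on
             the lens, resolution and height above ground (the 'units'
             still to be sorted out).
    """
    mean_shift = vectors.mean(axis=0)   # average shift in pixels per frame
    return mean_shift * FPS * scale     # velocity in (scaled) pixels per second


# Synthetic frame: every macro-block reports a shift of (2, -1) pixels.
vectors = np.tile([2.0, -1.0], (64, 1))
vx, vy = frame_velocity(vectors)
print(vx, vy)  # 20.0 -10.0
```

The averaging also gives some robustness: a few macro-blocks tracking the wrong feature get diluted by the majority that track the ground.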
The sub-project stalled last year due to lack of believable data, but yesterday, after a couple of bug fixes, here’s a chart showing the movement over a 10s ‘flight’ of my prototyping Pi. I was moving the Pi forwards and backwards and left and right in a cross shape – you can see the movement, but the cross shape gets corrupted as periodically the video compression algorithm resets itself to maintain accuracy – a problem for this integrated distance chart, but not for the velocity vector I need.
If you follow the line from the 0,0 point, you can see the line travelling left / right first, then up / down. This is 100 frames over 10 seconds.
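The chart above is just the cumulative sum of those per-frame shifts. A toy sketch (entirely made-up numbers, tracing the cross shape) shows why an encoder reset corrupts the integrated distance but not the velocity: a dropped frame's vectors give one wrong velocity sample for 0.1s, but every position after it is offset permanently.

```python
import numpy as np

# Hypothetical per-frame mean shifts (pixels per frame) over a 10s, 100-frame
# 'flight': right, back left, up, back down - the cross shape described above.
shifts = np.concatenate([
    np.tile([ 1.0,  0.0], (25, 1)),
    np.tile([-1.0,  0.0], (25, 1)),
    np.tile([ 0.0,  1.0], (25, 1)),
    np.tile([ 0.0, -1.0], (25, 1)),
])

# Integrated position: cumulative sum of the per-frame shifts.
position = np.cumsum(shifts, axis=0)

# Simulate a compression reset: frame 30's vectors come out as zeros.
shifts_reset = shifts.copy()
shifts_reset[30] = 0.0
position_reset = np.cumsum(shifts_reset, axis=0)

print(position[-1])        # [0. 0.] - the clean path returns to the origin
print(position_reset[-1])  # [1. 0.] - one bad frame offsets the whole path
```

The velocity consumer only ever sees one bad 0.1s sample, which the accelerometer fusion can absorb; it's only this integrated plot that shows the corruption.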
There are a lot of details to tweak to make this work properly, especially getting the units correct, but it’s now another viable solution, alongside the laser dot tracking, for motion control. The advantage of this over the laser tracking is that I can get many more frames a second, so the motion fusion can happen much faster than the 1s limit I’m seeing with the laser tracking.
P.S. The only real downside of this is that without a camera, Zoe can’t use it, and Zoe will continue to be my preferred show-and-tell quad as she’s easier to transport. So I will continue to work out why Zoe drifts in a way she shouldn’t according to the 0g offsets. I do have one idea to test later today if I get the chance.