I’m doing some very careful testing before I set Hermione loose to fly a circle live. This morning, I’ve confirmed that the video’s lateral motion-block tracking is working well.
For this first unpowered flight, I walked her forwards about 3m and then left by about the same. Note that she always pointed in the same direction; I walked sideways to get the left movement:
Forward – Left
For this second unpowered flight, again, I walked her forwards about 3m, but then rotated her by 90° CCW before walking another 3m. Because of the yaw, from her point of view she only moved forwards, so the yaw doesn’t show up on the graph at all. This is exactly how it should be:
Forward – Yaw 90° CCW – Forward
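That yaw-invisibility makes sense once you write out the frame rotation. Here’s a minimal sketch — the axis conventions (x forwards, y left, yaw CCW-positive) are my assumptions for illustration, not necessarily Hermione’s actual code:

```python
import math

def earth_to_quad(x_e, y_e, yaw):
    """Rotate an earth-frame displacement (x_e, y_e) into the quad frame.

    Assumed conventions: x is forwards, y is left; yaw is the quad's
    heading in radians, CCW positive.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return c * x_e + s * y_e, -s * x_e + c * y_e

# Leg 1: 3m forwards with zero yaw - she sees 3m forwards.
leg1 = earth_to_quad(3.0, 0.0, 0.0)

# Leg 2: after a 90 degree CCW yaw, the next 3m is "left" in the earth
# frame (+y), but still dead ahead from her point of view.
leg2 = earth_to_quad(0.0, 3.0, math.radians(90))

print(leg1)  # (3.0, 0.0)
print(leg2)  # approximately (3.0, 0.0) as well
```

Both legs come out as pure “forwards” in the quad frame, which is exactly the flat line the graph shows.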
So I’m happy the lateral motion tracking is working perfectly. Next I need to look at the target, and I can do that with the same stats.
The only problem I had was that the sun needs to be shining brightly for the video tracking to ‘fly’ above the lawn; clearly it needs the high contrast of sunlit grass.
The problem: the camera’s point of view is in the quad frame; the Garmin’s point of view is in the earth frame. They both need to be in the same frame to produce a meaningful vector. That prompted a pretty radical rewrite of this area last night. Sadly, a test flight this morning was pretty much the same as yesterday: a very stable hover, but shooting off right when she should have gone left. More stats:
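For the record, the transform I’d expect that rewrite to need is the inverse rotation: take each camera increment in the quad frame and rotate it by the current yaw into the earth frame. A hedged sketch — `quad_to_earth` and the conventions (x forwards, y left, yaw CCW-positive) are my assumptions, not Hermione’s real code:

```python
import math

def quad_to_earth(x_q, y_q, yaw):
    """Rotate a quad-frame camera increment into the earth frame.

    Assumed conventions: x forwards, y left, yaw in radians, CCW positive.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return c * x_q - s * y_q, s * x_q + c * y_q

# After a 90 degree CCW yaw, 1m of quad-frame "forwards" from the camera
# should come out as 1m "left" in the earth frame.
step = quad_to_earth(1.0, 0.0, math.radians(90))
print(step)  # approximately (0.0, 1.0)
```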
The top pair (accelerometer vs camera) show pretty good alignment, right up to the final point 0.4m to the right. I believe this is correct, but I wouldn’t put money on it yet!
The middle pair are accelerometer vs LiDAR height over time, which is excellent.
The bottom pair are the flight plans in earth and quad frames (the quad one is simply the earth one rotated from my POV to hers) – this is where there’s clearly a problem: they should match, but they diverge as soon as the flight rotates. I can’t see an obvious bug in the code, which makes me suspect there’s an obvious bug in my understanding instead.
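One classic “bug in understanding” that produces exactly this mirror-image symptom: rotating the vector by the yaw when you meant to rotate the frame (or vice versa). The two differ only in the sign of the angle, and getting it wrong flips left and right. A sketch, again under my assumed conventions (x forwards, y left, yaw CCW-positive):

```python
import math

def rotate(x, y, angle):
    """Rotate the vector (x, y) CCW by angle radians."""
    c, s = math.cos(angle), math.sin(angle)
    return c * x - s * y, s * x + c * y

yaw = math.radians(90)  # she has yawed 90 degrees CCW

# An earth-frame flight-plan step of 3m forwards (+x, my POV)...
x_e, y_e = 3.0, 0.0

# ...needs the INVERSE rotation (-yaw) to express it from her POV:
good = rotate(x_e, y_e, -yaw)  # approximately (0, -3): 3m to her right

# Using +yaw by mistake mirrors it to her left instead:
bad = rotate(x_e, y_e, yaw)    # approximately (0, +3): 3m to her left
```

If the earth-to-quad step in the code has that sign flipped, a plan saying “go left” would command “go right” while leaving the hover stability completely untouched — which would fit what these flights show. This is only a guess at the bug, though.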