“To mow, or not to mow, that is the question:

Whether ’tis easier for the macro-blocks to track
The clovers and daisies of contrasting colour…”

The answer is no, I shouldn’t have mown the lawn.  With the kids’ toys moved out of the way, and any ungreen contrasting features slain, there was nothing to distinguish one blade of shorn grass from another, and Hermione drifted wildly.  Reinstating the kids’ chaos restored high-quality tracking over 5m.

The point of the flight was to compare GPS versus macro-block lateral tracking.  Certainly over this 5m flight, down-facing video beat GPS hands down:

Lateral tracking

My best guess interpretation of the GPS graph is that the flight actually ran from 2m to 7m diagonally, heading north-west.  The camera POV doesn’t include compass data, so it’s correctly showing her flying forwards by 5m.  The compass code is not working accurately yet – it needs more investigation to find out why – it was showing ~90° (i.e. East) rather than the true 45° (i.e. North East) shown by the GPS and a handheld compass.
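The heading maths itself is trivial – a minimal sketch in Python, assuming a level sensor with X pointing north and Y east, and with hypothetical hard-iron offsets standing in for proper calibration:

```python
import math

def compass_heading(mx, my, hard_iron_x=0.0, hard_iron_y=0.0):
    # Subtract hard-iron offsets first; uncalibrated offsets can easily
    # skew the result by the ~45 degrees seen in this flight.
    mx -= hard_iron_x
    my -= hard_iron_y

    # Heading from magnetic north, assuming the sensor is level; tilt
    # compensation would need the accelerometer too.
    return math.degrees(math.atan2(my, mx)) % 360

print(compass_heading(0.2, 0.2))   # 45.0, i.e. north-east
```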

I’ve done some more refinements to the scheduling of the sensor reads, and also to the accuracy of the GPS data streamed from the GPS process.  It’s worth viewing this graph full screen – each spike shows the time in seconds between motion processing loops, i.e. the time spent processing other sensors – 10ms indicates just IMU data was processed.  The fact no loop takes more than 0.042s* even with full diagnostics running means I could put the sampling rate back up to 1kHz – it’s at 500Hz at the moment.  More importantly, it shows processing is nicely spread out: each sensor is getting its fair share of the processing and nobody is hogging the limelight.

Motion processing

As a result, I’ve updated the code on GitHub.


*42ms is the point where the IMU FIFO overflows at 1kHz sampling: 512 byte FIFO size / 12 bytes per sample / 1kHz sampling rate ≈ 42ms.
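As a sanity check, the same sum in Python (the 12 bytes being three 16-bit accelerometer axes plus three 16-bit gyro axes):

```python
FIFO_BYTES = 512.0      # IMU FIFO size in bytes
BYTES_PER_SAMPLE = 12   # 3 x 16-bit accel + 3 x 16-bit gyro
SAMPLING_RATE = 1000    # Hz

# How long motion processing can stall before the FIFO overflows and
# IMU samples are lost.
overflow_s = FIFO_BYTES / BYTES_PER_SAMPLE / SAMPLING_RATE
print(overflow_s)       # ~0.0427s, hence the 42ms ceiling above
```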

GPS + compass + yaw control code

Coding this combination of features has taken me several days, covering the complex interactions depending on whether each is installed and how it is configured.  This has required a rework of the GPS processing and Flight Plan objects to ensure they interact cleanly, along with enhanced configuration parameters and use of the compass.  Here’s the overview.

  • yaw control alone: defines always-forward flight, yawing to turn.
  • compass alone: defines N, S, E & W during the flight.
  • gps control alone: defines the flight plan waypoints; tracks locations during flights.
  • yaw control + compass: defines flight plan X, Y as N, S, E & W; fused to prevent integrated IMU yaw rate drift.
  • compass + gps control: defines the direction of flight between waypoints.
  • yaw control + gps control: no interaction without the compass.
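Here’s a minimal sketch of how those dependencies might be gated at start-up – the flag names are illustrative, not the real configuration parameters:

```python
def check_config(yaw_control, compass, gps_control):
    # GPS waypoints are defined as N, S, E & W, so without a compass
    # there's no way to turn them into a direction of flight.
    if gps_control and not compass:
        raise ValueError("gps control requires the compass")

    # With a compass, flight plan X, Y mean north and east, and compass
    # readings are fused with integrated gyro yaw to kill drift; without
    # one, X, Y stay in the take-off body frame.
    frame = "earth NSEW" if compass else "take-off body frame"

    # Yaw control is orthogonal: she yaws to face the direction of
    # travel, whichever frame that direction is defined in.
    return frame
```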

I _think_ I’ve got all bases covered now in the code, but there are a lot of new pieces to be tested individually, and in all possible combinations, passively first, and then in real flights.  For the first time in this project, I’m going to need to plan my testing so I don’t miss a combination until it’s too late.
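Mechanically enumerating the combinations should stop me missing one – a throwaway sketch:

```python
from itertools import product

# Every install / configuration combination, with the invalid one from
# the table above filtered out.
for yaw_control, compass, gps_control in product((False, True), repeat=3):
    if gps_control and not compass:
        continue   # gps control requires the compass
    print("test passively then live: yaw control=%s, compass=%s, gps=%s"
          % (yaw_control, compass, gps_control))
```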

Meanwhile, after her last video, Hermione’s shoulder joint somehow broke while I was testing higher video frame sizes and rates, and the camera stopped working.  On the plus side, until the replacement parts arrive, there’s a lot of spare time for thinking and coding, and also occasionally for tickling my fancy – more on the latter anon.

Lateral motion tracking with yaw

I’m doing some very careful testing before I set Hermione loose live to fly in a circle.  This morning, I’ve confirmed the video lateral motion block tracking is working well.

For this first unpowered flight, I walked her forwards about 3m and then left by about the same.  Note that she always pointed in the same direction; I walked sideways to get the left movement:

Forward – Left

For this second unpowered flight, again, I walked her forwards about 3m, but then rotated her by 90° CCW before walking forwards another 3m.  Because of the yaw, from her point of view, she only flew forwards, and the yaw doesn’t show up on the graph.  This is exactly how it should be:

Forward – Yaw 90° CCW – Forward

So I’m happy the lateral motion tracking is working perfectly.  Next I need to look at the target.  I can do that with the same stats.

The only problem I had was that the sun needs to be shining brightly for the video tracking to ‘fly’ above the lawn; clearly it needs the high contrast of sunlit grass.

A difference of opinion.

By lowering the video frame rate to 10Hz, I’ve been able to increase the video resolution to 720² pixels.  In addition I’ve increased the contrast on the video to 100%.  Together these now provide enough detail to track lateral motion on the lawn.  Drift with hover is non-existent, so the next step was to try a flight around a 2m square.  That’s where the disagreement showed itself:

Difference of opinion

  • Top left is the flight plan up to the point I killed the flight: 2 meters forwards and left by 0.35m
  • Top right shows the 90° anticlockwise yaw so she points the way she’s going
  • Bottom left is the track picked up by the PiCamera macro-blocks
  • Bottom right is the track derived by double integrating the accelerometer.

Both agree on the forward motion of about 2 metres, but the disagreement arises at the point she turns left.  The right of the pair – the accelerometer track – is correct based on my independent third-party view of the flight: although she was pointing left, she flew right from my point of view, i.e. backwards from her point of view.  I’ve clearly got the maths back-to-front in the lateral motion tracking.  I’m pretty sure of the offending line of code, and the fix is trivial, but I’m really struggling to convince myself why what’s there is wrong.
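To illustrate the kind of back-to-front maths involved – the axis conventions here are mine for the illustration (x forwards, y left, yaw positive anti-clockwise), not necessarily Hermione’s:

```python
import math

def body_to_earth(bx, by, yaw):
    # Rotate a body-frame (camera) vector into the earth frame.  Using
    # -yaw here instead of yaw is exactly the sort of trivial sign error
    # that turns "flying left" into "flying right".
    c, s = math.cos(yaw), math.sin(yaw)
    return bx * c - by * s, bx * s + by * c

yaw = math.radians(90)                # yawed 90 degrees anti-clockwise
print(body_to_earth(1.0, 0.0, yaw))   # body "forwards" -> earth "left"
print(body_to_earth(1.0, 0.0, -yaw))  # sign flipped -> earth "right"
```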

Luckily, during the flights, there were a number of high-torque landings which ultimately broke the bracket for one of Hermione’s legs.  Until the replacement arrives from Poland, I have plenty of time to kill convincing myself why the existing code is wrong.

Buttercups and daisies…

are lacking yet this spring, and with the lawn mown yesterday, features are hard to find for the video lateral tracking.  So I think this is a pretty good 37s hover.  In fact, I think it’s as good as it can be until the daisies start sprouting:

This is with a frame size of 640² pixels.  There’s a check in the code which reports whether the code keeps up with the video frame rate.  At 640² it does; I tried 800² and 720², but the code failed to keep up with the video frame rate of 20fps.
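The check itself is nothing clever – something along these lines, with hypothetical names rather than the exact code on GitHub:

```python
import time

def video_keeping_up(frames_processed, start_time, fps=20):
    # Compare frames actually processed against what the camera has
    # produced since the start; falling behind means motion estimates
    # arrive too late to be useful.
    expected = (time.time() - start_time) * fps
    return frames_processed >= expected - 1
```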

As a result, I’ve uploaded the changes to GitHub.  There’s work-in-progress code there for calibrating the compass “calibrateCompass()”, although that’s turning out to be a right PITA.  I’ll explain more another time.

As a side note, my Mavic uses two forward-facing cameras to track horizontal movement stereoscopically, combined with GPS, a corresponding ground-facing pair of cameras, and IMU accelerometer integration; yet if you watch the frisbee / baseball bat to the left, even the Mavic drifts.

Chicken poo tracking

If you look at yesterday’s video full screen, from top left to right, you can see a muddy patch and two chicken poos, the second of which is close to Hermione’s front left prop on take-off.  I was back out in the dark last night, tracking them down.  Here’s why:

Lateral tracking

Although the graph of camera lateral tracking and the Mavic video are almost facsimiles in direction, the scale is out; the graph shows the distance from take-off to landing to be about 1.7m whereas a tape measure from chicken poo #2 to the cotoneaster shrubbery landing point measures about 4.2m.  Given how accurate the direction is, I don’t think there’s any improvement needed for the macro-block processing – simply a scale factor change of ≈ 2.5.  I wish I knew more about the video compression method for generating macro-blocks to understand what this 2.5 represents – I don’t like the idea of adding an arbitrary scale of 2.5.
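The factor itself is just the ratio of tape measure to graph:

```python
tracked = 1.7    # metres from take-off to landing per the camera graph
measured = 4.2   # metres from chicken poo #2 to the cotoneaster

scale = measured / tracked
print(round(scale, 1))   # 2.5 - accurate direction, arbitrary-looking scale
```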

One further point from yesterday’s video, you can see she yaws clockwise by a few degrees on takeoff – Hermione’s always done this, and I think the problem is with her yaw rate PID needing more P and less I gain.  Something else for me to try next.


I had tried Zoe first as she’s more indestructible.  However, her Pi0W can only cope with 400 x 400 pixel video, whereas Hermione’s Pi 2 B can cope with 680 x 680 pixel video (and perhaps higher with the 5 phase motion processing), which seems to work well with the chicken-trashed lawn.
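The frame size is a one-line picamera setting – a minimal sketch using the standard picamera API, with the sizes from the text:

```python
import picamera

with picamera.PiCamera() as camera:
    # Hermione's Pi 2 copes with 680 x 680; Zoe's Pi0W has to drop to
    # 400 x 400 before the macro-block processing keeps up.
    camera.resolution = (680, 680)
    camera.framerate = 20
```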

Weird yaw behaviour

I’ve implemented the yaw code such that Hermione points in the direction that she should be travelling based upon the flight plan velocity vector.  She should take off, then move left at 0.25 m/s for 6 seconds, while also rotating anti-clockwise by 90° to face the way she’s supposed to be travelling.
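The yaw target falls straight out of the flight plan velocity vector – a minimal sketch, again with my own convention of x forwards, y left and yaw positive anti-clockwise:

```python
import math

def yaw_target(vx, vy):
    # Face the flight-plan velocity vector.
    return math.degrees(math.atan2(vy, vx))

# Moving left at 0.25 m/s means vx = 0, vy = 0.25, giving the 90 degree
# anti-clockwise turn described above.
print(yaw_target(0.0, 0.25))   # 90.0
```

However, here’s what my Mavic & I saw: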

My best guess is the camera lateral tracking, which simply looks for peaks in the macro-block vectors after stashing them all in a dictionary indexed by vector.  This ignores yaw, which was fine up to now as I’d set the yaw target to zero.  I think I need to add an extra stage which un-yaws each macro-block vector before adding it to the dictionary and looking for peaks.  That’s relatively easy code, involving tracking yaw between video frames, but costly as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks.  Time will tell.
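That extra phase might look something like this – a sketch only, assuming yaw positive anti-clockwise and macro-block vectors as (x, y) tuples:

```python
import math
from collections import Counter

def unyaw_and_find_peak(mb_vectors, yaw):
    # Rotate each macro-block vector back through the current yaw so the
    # dictionary peaks live in a yaw-independent frame.
    c, s = math.cos(-yaw), math.sin(-yaw)
    counts = Counter()
    for x, y in mb_vectors:
        counts[(round(x * c - y * s), round(x * s + y * c))] += 1

    # The commonest un-yawed vector is the best estimate of the
    # frame-to-frame lateral motion.
    return counts.most_common(1)[0][0]
```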

One last sanity check

Before moving on to compass and GPS usage, there’s one last step I want to ensure works: lateral movement.

The flight plan is defined thus:

  • take-off in the centre of a square flight plan to about 1m height
  • move left by 50cm
  • move forwards by 50cm – this places her in the top left corner of the square
  • move right by 1m
  • move back by 1m
  • move left by 1m
  • move forwards by 50cm
  • move right by 50cm
  • land back at the take-off point.
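Expressed as velocity legs – purely illustrative, not the real flight plan format – the square looks like this, and the legs sum back to the take-off point:

```python
# (vx, vy, seconds): x forwards, y left, speeds of 0.5 m/s assumed.
flight_plan = [
    ( 0.0,  0.5, 1.0),   # left 50cm
    ( 0.5,  0.0, 1.0),   # forwards 50cm - top left corner
    ( 0.0, -0.5, 2.0),   # right 1m
    (-0.5,  0.0, 2.0),   # back 1m
    ( 0.0,  0.5, 2.0),   # left 1m
    ( 0.5,  0.0, 1.0),   # forwards 50cm
    ( 0.0, -0.5, 1.0),   # right 50cm - back over the take-off point
]

x = sum(vx * t for vx, vy, t in flight_plan)
y = sum(vy * t for vx, vy, t in flight_plan)
print(x, y)   # 0.0 0.0 - she should land where she took off
```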

The result’s not perfect despite running the ground-facing camera at 640 x 640 pixels; to be honest, with lawn underneath her, I still think she did pretty well.  She’s still a little lurchy, but I think some pitch / roll rotation PID tuning over the IKEA mat should resolve this quickly.  Once again, you can judge whether she achieved this 34 second flight well enough.

Cluster tracking

I took Zoe into the lounge and flew her over her IKEA mat.  She started at the bottom right of the mat. Then I moved her

  • forwards over a meter
  • right by about a meter
  • back by over a meter
  • left by about a meter to the starting point
  • diagonally to the centre of the mat
  • right to the edge of the mat
  • in an anti-clockwise circular loop around the mat.

She finished at the start of the circle on the right hand side of the mat about half way up.

Here’s what my cluster tracking thinks happened:

Cluster tracking

The track around the square, and the move to the centre of the mat are good, both in direction and distance, but when she then circled anticlockwise around the mat, there’s clearly drift to the right: on the vertical axis of the graph, the start and end points of the circle are both about 0.5m which is a close match to what happened; however horizontally, there’s a 1.4m difference between the start and end of the circle.  Not sure why yet.

A couple of other points worth noting:

  • I had to halve the video frame rate to 10fps to avoid FIFO overflows; I’ll need to check whether the original 20fps works with diagnostics off.
  • The test ‘flight’ lasted 30s so even with the 1.4m false drift to the right, that’s still hugely better than just the double integrated accelerometer.

Zoe the videographer

Zoe the videographer

You can just see the brown video cable looping to the underside of the frame where the camera is attached.

She’s standing on a perspex box as part of some experimentation into why this happens:

Lazy video feed

It’s taking at least 4.5 seconds before the video feed kicks in (if at all).  Here the video data is only logged.  What’s plotted is the cumulative distance; what’s shown is accurate in time and space, but I need to investigate further why there’s a delay.  It’s definitely not to do with the starting up of the camera video process – I already have prints showing when it starts and stops, and those happen at the right time; it’s either related to the camera processing itself, or to how the FIFO works.  More anon as I test my ideas.