Lateral motion tracking with yaw

I’m doing some very careful testing before I set Hermione loose live to fly in a circle.  This morning, I’ve confirmed the video lateral motion block tracking is working well.

For this first unpowered flight, I walked her forwards about 3m and then left by about the same.  Note that she always pointed in the same direction; I walked sideways to get the left movement:

Forward – Left

For this second unpowered flight, again I walked her forwards about 3m, but then rotated her by 90° CCW before walking another 3m.  Because of the yaw, from her point of view she only flew forwards, and the yaw is not exposed on the graph.  This is exactly how it should be:

Forward – Yaw 90° CCW – Forward

So I’m happy the lateral motion tracking is working perfectly.  Next I need to look at the target.  I can do that with the same stats.

The only problem I had was that the sun needs to be shining brightly for the video tracking to ‘fly’ above the lawn; clearly it needs the high contrast of sunlit grass.

A difference of opinion.

By lowering the video frame rate to 10Hz, I’ve been able to increase the video resolution to 720² pixels.  In addition, I’ve increased the contrast on the video to 100%.  Together these now provide enough detail to track lateral motion on the lawn.  Drift in a hover is non-existent, so the next step was to try a flight around a 2m square.  That’s where the disagreement showed itself:
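As a back-of-the-envelope check on that trade-off, the pixel throughput the motion processing has to sustain actually drops when halving the frame rate, even at the higher resolution.  This is just illustrative arithmetic, not flight code:

```python
# Pixel throughput for square frames; figures from the text, the helper
# itself is just a sanity-check sketch.

def throughput(width_px, fps):
    """Pixels per second the motion processing must handle."""
    return width_px * width_px * fps

print(throughput(640, 20))  # 8192000: 640 squared at 20fps
print(throughput(720, 10))  # 5184000: 720 squared at 10fps - roughly a third less work
```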

Difference of opinion

  • Top left is the flight plan up to the point I killed the flight: 2 meters forwards and left by 0.35m
  • Top right shows the 90° anticlockwise yaw so she points the way she’s going
  • Bottom left is the track picked up by the PiCamera macro-blocks
  • Bottom right is the track derived by double integrating the accelerometer.

Both agree on the forward motion of about 2 meters, but the disagreement arises at the point she turns left.  The right of the pair is correct based on my independent third-party view of the flight; although she was pointing left, she flew right from my point of view i.e. backwards from her point of view.  I’ve clearly got the maths back-to-front in the lateral motion tracking.  I’m pretty sure of the offending line of code, and the fix is trivial, but I’m really struggling to convince myself why what’s there is wrong.
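For what it’s worth, a sign error in the camera-to-earth rotation produces exactly this symptom.  A minimal sketch under my own assumed conventions (x forwards, y left, yaw CCW positive) – not Hermione’s actual code:

```python
import math

def rotate_to_earth(vx, vy, yaw):
    """Standard 2D rotation of a body/camera-frame vector into the earth frame."""
    c, s = math.cos(yaw), math.sin(yaw)
    return (vx * c - vy * s, vx * s + vy * c)

def rotate_to_earth_flipped(vx, vy, yaw):
    """The same rotation with the sin terms' signs swapped, i.e. rotating by
    -yaw: the kind of back-to-front maths that swaps left and right."""
    c, s = math.cos(yaw), math.sin(yaw)
    return (vx * c + vy * s, -vx * s + vy * c)

yaw = math.radians(90)                         # pointing left after the CCW turn
print(rotate_to_earth(1.0, 0.0, yaw))          # ~(0, 1): earth-frame left - correct
print(rotate_to_earth_flipped(1.0, 0.0, yaw))  # ~(0, -1): earth-frame right - the observed bug
```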

Luckily, during the flights, there were a number of high-torque landings which ultimately broke the bracket for one of Hermione’s legs.  Until the replacement arrives from Poland, I have plenty of time to kill convincing myself why the existing code is wrong.

Buttercups and daisies…

are lacking yet this spring, and having mown the lawn yesterday, features are hard to find for the video lateral tracking.  So I think this is a pretty good 37s hover.  In fact, I think it’s as good as it can be until the daisies start sprouting:

This is with a frame size of 640² pixels.  There’s a check in the code which reports whether the processing keeps up with the video frame rate.  At 640² it does; I tried 800² and 720², but the code failed to keep up with the 20fps frame rate.
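The keep-up check amounts to comparing per-frame processing time against the frame budget; something along these lines (a sketch of the idea, not the actual check):

```python
import time

FPS = 20
FRAME_BUDGET = 1.0 / FPS  # seconds available per frame at 20fps

def keeps_up(process_frame, frames):
    """Return False as soon as one frame's processing overruns the budget."""
    for frame in frames:
        start = time.time()
        process_frame(frame)
        elapsed = time.time() - start
        if elapsed > FRAME_BUDGET:
            return False
    return True

print(keeps_up(lambda f: None, range(100)))  # trivial processing keeps up
```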

As a result, I’ve uploaded the changes to GitHub.  There’s work-in-progress code there for calibrating the compass “calibrateCompass()”, although that’s turning out to be a right PITA.  I’ll explain more another time.

As a side note, my Mavic uses two forward-facing cameras to stereoscopically track horizontal movement, combined with GPS, a corresponding ground-facing pair of cameras and the IMU accelerometer integration, yet if you watch the frisbee / baseball bat to the left, even the Mavic drifts.

Chicken poo tracking

If you look at yesterday’s video full screen, from top left to right, you can see a muddy patch and two chicken poos, the second poo of which is close to Hermione’s front left prop on take-off.  I was back out in the dark last night, tracking them down.  Here’s why:

Lateral tracking

Although the graph of camera lateral tracking and the Mavic video are almost facsimiles in direction, the scale is out; the graph shows the distance from take-off to landing to be about 1.7m whereas a tape measure from chicken poo #2 to the cotoneaster shrubbery landing point measures about 4.2m.  Given how accurate the direction is, I don’t think there’s any improvement needed for the macro-block processing – simply a scale factor change of ≈ 2.5.  I wish I knew more about the video compression method for generating macro-blocks to understand what this 2.5 represents – I don’t like the idea of adding an arbitrary scale of 2.5.
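One place a constant factor like that can hide is the pixels-to-metres conversion for a downward-facing camera, which depends on height above ground and the lens field of view; a mis-estimate of either scales the whole track linearly, which would look exactly like a fixed factor of 2.5.  A sketch with made-up numbers (the height and FOV here are assumptions, not measurements):

```python
import math

def metres_per_pixel(height_m, fov_deg, frame_width_px):
    """Ground distance covered by one pixel for a downward-facing camera.
    Ground footprint width = 2 * height * tan(fov / 2)."""
    footprint = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    return footprint / frame_width_px

# Illustrative numbers only: 1m hover, ~49 degree lens, 640px frame.
scale = metres_per_pixel(1.0, 48.8, 640)
print(scale)  # roughly 0.0014 m/pixel; a macro-block shift of n pixels
              # then corresponds to n * scale metres on the ground
```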

One further point from yesterday’s video, you can see she yaws clockwise by a few degrees on takeoff – Hermione’s always done this, and I think the problem is with her yaw rate PID needing more P and less I gain.  Something else for me to try next.


I had tried Zoe first as she’s more indestructible.  However, her Pi0W can only cope with 400 x 400 pixel video, whereas Hermione’s Pi B 2+ can cope with 680 x 680 pixel video (and perhaps higher with the 5-phase motion processing), which seems to work well with the chicken-trashed lawn.

Weird yaw behaviour

I’ve implemented the yaw code such that Hermione points in the direction she should be travelling, based upon the flight plan velocity vector.  She should take off, then move left at 0.25 m/s for 6 seconds, while also rotating anticlockwise by 90° to face the way she’s supposed to be travelling.  However, here’s what my Mavic & I saw:

My best guess is the camera lateral tracking, which simply looks for peaks in the macro-block vectors after stashing them all in a dictionary indexed by the vectors.  This ignores yaw, which was fine up to now as I’d set the yaw target to zero.  I think I need to add an extra stage which un-yaws each macro-block vector before adding it to the dictionary and looking for peaks.  That’s relatively easy code, involving tracking the yaw between video frames, but costly as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks.  Time will tell.
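The un-yaw stage could look something like this: rotate each macro-block vector by -yaw, then stash and pick the peak as before.  A sketch under my own conventions (CCW-positive yaw, integer pixel vectors), not the actual code:

```python
import math
from collections import Counter

def unyaw(dx, dy, yaw):
    """Rotate a macro-block vector by -yaw so vectors from differently-yawed
    frames share one common frame before peak-picking."""
    c, s = math.cos(-yaw), math.sin(-yaw)
    return (round(dx * c - dy * s), round(dx * s + dy * c))

def dominant_motion(vectors, yaw):
    """Stash the un-yawed vectors in a dict keyed by vector and pick the peak."""
    counts = Counter(unyaw(dx, dy, yaw) for dx, dy in vectors)
    return counts.most_common(1)[0][0]

# After a 90 degree CCW yaw, a swarm of (5, 0) block vectors plus some noise:
vectors = [(5, 0)] * 8 + [(4, 1), (6, 0)]
print(dominant_motion(vectors, math.radians(90)))  # (0, -5): same motion, re-framed
```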

One last sanity check

Before moving on to compass and GPS usage, there’s one last step I want to ensure works: lateral movement.

The flight plan is defined thus:

  • take-off in the center of a square flight plan to about 1m height
  • move left by 50cm
  • move forward by 50cm – this places her in the top left corner of the square
  • move right by 1m
  • move back by 1m
  • move left by 1m
  • move forwards by 50cm
  • move right by 50cm
  • land back at the take-off point.
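The steps above can be sketched as velocity legs; this assumes a hypothetical (vx, vy, duration) format with x forwards, y left and 0.5 m/s legs – not the actual flight-plan file:

```python
# Square flight plan as (vx m/s, vy m/s, duration s) legs; illustrative only.
FLIGHT_PLAN = [
    ( 0.0,  0.5, 1.0),  # left 50cm
    ( 0.5,  0.0, 1.0),  # forwards 50cm -> top left corner
    ( 0.0, -0.5, 2.0),  # right 1m
    (-0.5,  0.0, 2.0),  # back 1m
    ( 0.0,  0.5, 2.0),  # left 1m
    ( 0.5,  0.0, 1.0),  # forwards 50cm
    ( 0.0, -0.5, 1.0),  # right 50cm -> back over the take-off point
]

# Integrating velocity * time over the legs confirms the plan closes on itself:
x = sum(vx * t for vx, _, t in FLIGHT_PLAN)
y = sum(vy * t for _, vy, t in FLIGHT_PLAN)
print(x, y)  # both 0: landing back at the take-off point
```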

The result’s not perfect despite running the ground-facing camera at 640 x 640 pixels; to be honest, with lawn underneath her, I still think she did pretty well.  She’s still a little lurchy, but I think some pitch / roll rotation PID tuning over the IKEA mat should resolve that quickly.  Once again, you can judge whether she flew this 34 second flight well enough.

Cluster tracking

I took Zoe into the lounge and flew her over her IKEA mat.  She started at the bottom right of the mat. Then I moved her

  • forwards over a meter
  • right by about a meter
  • back by over a meter
  • left by about a meter to the starting point
  • diagonally to the centre of the mat
  • right to the edge of the mat
  • in an anti-clockwise circular loop around the mat.

She finished at the start of the circle on the right hand side of the mat about half way up.

Here’s what my cluster tracking thinks happened:

Cluster tracking

The track around the square, and the move to the centre of the mat are good, both in direction and distance, but when she then circled anticlockwise around the mat, there’s clearly drift to the right: on the vertical axis of the graph, the start and end points of the circle are both about 0.5m which is a close match to what happened; however horizontally, there’s a 1.4m difference between the start and end of the circle.  Not sure why yet.
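For scale, the 1.4m of false drift over a roughly 30s flight needs only a tiny per-frame bias at 10fps – pure arithmetic, assuming the figures above:

```python
# How small a per-frame bias accounts for the observed drift.
fps = 10
flight_time = 30             # seconds
frames = fps * flight_time   # 300 video frames
drift = 1.4                  # metres of unexplained rightward drift

bias_per_frame = drift / frames
print(bias_per_frame * 1000)  # under 5mm per frame - easy to miss per-frame
```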

A couple of other points worth noting:

  • I had to halve the video frame rate to 10fps to avoid FIFO overflows; I’ll need to check whether the original 20fps works with diagnostics off.
  • The test ‘flight’ lasted 30s so even with the 1.4m false drift to the right, that’s still hugely better than just the double integrated accelerometer.

Zoe the videographer

You can just see the brown video cable looping to the underside of the frame where the camera is attached.

She’s standing on a perspex box as part of some experimenting as to why this happens:

Lazy video feed

It’s taking at least 4.5 seconds before the video feed kicks in (if at all).  Here the video data is only logged.  What’s plotted is the cumulative distance; what’s shown is accurate in time and space, but I need to investigate further why there’s a delay.  It’s definitely not to do with starting up the camera video process – I already have prints showing when it starts and stops, and those happen at the right times; it’s either related to the camera processing itself, or to how the FIFO works.  More anon as I test my ideas.

Tyrannosaurus

Another walk up the side of the house, but then walking a square as best I could, finishing where I started, and as you can see, the camera tracked this amazingly well – I’m particularly delighted the start and end points of the square are so close.  Units are pretty accurate too.

I’m now very keen for Hermione’s parts to arrive, as I suspect this is going to work like a dream, both stabilising long term hover, and also allowing accurate traced flight plans with horizontal movement.  Very, very excited!

Shame about the trip to DisneyLand Paris next week – I’m not going to get everything done before then, which means Disney is going to be more of a frustrating, annoying waste of my time than usual!

Phoebe, where on earth do you think you’re going?

The YouTube motion control video got me thinking: in their setup, their sponge ball tracking system gives them centimeter accuracy of the speed, position and orientation of the quad(s).

Phoebe uses the accelerometer with coordinates reorientated to the earth’s axes using the various angle calculations to do the same for speed and position.  She currently doesn’t track yaw ( => orientation) – I just try to minimize it using the z-axis gyro – I’ll need a compass to track it accurately.
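The reorientation itself is a standard Euler rotation; here’s a textbook sketch for the yaw-free case (z up, roll about X then pitch about Y) – not Phoebe’s actual code:

```python
import math

def body_to_earth(ax, ay, az, pitch, roll):
    """Reorient body-frame accelerometer readings onto earth axes (z up),
    ignoring yaw: plain ZYX Euler rotation with yaw = 0."""
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    ex = cp * ax + sp * sr * ay + sp * cr * az
    ey = cr * ay - sr * az
    ez = -sp * ax + cp * sr * ay + cp * cr * az
    return ex, ey, ez

# Stationary and pitched 30 degrees, the accelerometer reads gravity tilted
# into the body frame; reoriented, it comes back out as roughly (0, 0, 1g):
pitch = math.radians(30)
print(body_to_earth(-math.sin(pitch), 0.0, math.cos(pitch), pitch, 0.0))
```

Subtracting gravity from the reoriented z axis then leaves the net acceleration to double-integrate.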

And despite all my best attempts, she drifts and I’m really struggling to track down why, primarily due to a lack of a plan.  I’d just assumed the drift was due to bad PID tuning, or poor angles reorientating the quad accelerometer axis to earth coordinates; my approach has just been to tweak PIDs and other config parameters in search of the holy grail combination, but so far to no avail.

So in today’s tests, I’ve converted her sensor plots into a map of where Phoebe thinks she went, compared to where I saw her go.


Phoebe’s Flight Map

And actually, despite having to double integrate acceleration to distance, it’s a reasonable match of the flight.
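The double integration in question is simple to sketch (rectangle rule, made-up constant acceleration), and also shows why small sensor errors accumulate into position drift:

```python
# Double integration of earth-frame acceleration to distance; a sketch of
# the technique, not Phoebe's code. Any constant bias in the samples grows
# quadratically in the distance, which is why drift creeps in.

def double_integrate(samples, dt):
    """Integrate acceleration -> velocity -> distance (rectangle rule)."""
    velocity = 0.0
    distance = 0.0
    for a in samples:
        velocity += a * dt
        distance += velocity * dt
    return distance

# Constant 1 m/s^2 for 1s at 100Hz: expect ~0.5m (d = 0.5 * a * t^2),
# plus a small O(dt) integration error.
print(double_integrate([1.0] * 100, 0.01))  # ~0.505
```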

She moves forward by roughly 0.9 meters, and moves to the right (negative is reverse / right direction) by about 0.5 meter.

However…

  1. it’s clear she’s trying to correct left / right drift for a bit before she just goes off on one
  2. it’s also clear she doesn’t make a similar effort to correct the forward drift
  3. the left / right scale is out by a factor of 5 – she eventually drifted right by about 2.5 meters

The first problem suggests that once the complementary filter starts to favour the accelerometer, drift protection starts to fail.

The second problem suggests I may have my forward / backward drift compensation the wrong way round, so she accelerates forward in compensation for forward acceleration – though I’m not convinced here.

The cause of the third problem I can’t explain, but the difference in resolution between the accelerometer axes makes consistent drift calculations hard – not good.

For me, I’m delighted, as by mapping her path, I’ve found lots of useful facts that previously I’ve just been speculating on.

Further thoughts a few hours later…

The consistent forward acceleration (rather than drift compensation) is due to a bug I fixed a while back that seems to have crept back in  – my bad.

The short term left / right drift compensation seems to just need PID tuning – I think oscillations like this are usually due to an excessive P gain, so I need to reduce that, and consider adding some I (for “return to where you started”) and D (for “get a shift on now!”).
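For reference, those P / I / D roles in a minimal controller sketch – the gains here are placeholders, not Phoebe’s tuning:

```python
class PID:
    """Minimal PID controller: P reacts to the current error, I returns you
    to where you started, D gets a shift on when the error is changing."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def update(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        derivative = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.1, kd=0.05)
# Drifted 0.2m right of target: the output pushes back towards the target.
print(pid.update(target=0.0, actual=0.2, dt=0.01))  # negative correction
```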

The longer term left / right drift’s (lack of) compensation is most interesting: the equidistant dots strongly suggest Phoebe believes she is moving right at constant speed while horizontal, which matches what I saw.  Bearing in mind this is sensor data used as PID feedback, the Y-axis horizontal speed PID should do something about it, yet it doesn’t appear to.  For the moment, I have no idea how this can come about, which is frustrating – interesting too, but primarily frustrating.

I have no idea where to start about the factor of five scaling difference between the axes; I’ll just have to keep my eye on it during Monday’s testing.  Currently, the only solution is a dirty hack to multiply the left-right earth coordinate acceleration by 5 before integrating.

Finally, I’m still delighted that producing this map works, and provides a new kind of insight into flight problems.

P.S. Anyone else think this looks like an eagle soaring?