GPS + compass + yaw control code

Coding this combination of features has taken me several days, covering the complex interactions that depend on whether each is installed and how they are configured.  This has required a rework of the GPS processing and Flight Plan objects to ensure they interact cleanly, along with enhanced configuration parameters and use of the compass.  Here’s the overview.

  • yaw control alone: defines always-forward flight, using yaw to turn.
  • compass alone: defines N, S, E & W during flight.
  • gps control alone: defines the flight plan waypoints; tracks locations during flights.
  • yaw control + compass: defines flight plan X, Y as N, S, E & W; fused to prevent integrated IMU yaw rate drift.
  • yaw control + gps control: no interaction without the compass.
  • compass + gps control: defines the direction of flight between waypoints.

I _think_ I’ve got all bases covered now in the code, but there are a lot of new pieces to be tested individually, and in all possible combinations, passively first and then in real flights.  For the first time in this project, I’m going to need to plan my testing so I don’t miss a combination until it’s too late.

On the plus side: after her last video, Hermione’s shoulder joint somehow broke while I was testing higher video frame sizes and rates, and the camera stopped working.  Until the replacement parts arrive, there’s a lot of spare time for thinking and coding, and also occasionally tickling my fancy – more on the latter anon.

Still stuck

Hermione is still causing trouble with yaw control flights despite lots of refinements.  Here’s the latest.

Hermione’s troubles

@3s she’s climbed to about a meter high and then hovered for a second.  All the X, Y, and Z flight plan targets and sensor inputs are nicely aligned.

The ‘fun’ starts at 4 seconds.  The flight plan, written from my point of view, says move left by 1m over 4 seconds.  From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds.  The yaw graph from the sensors shows the ACW rotation is happening correctly.  The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero.  Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s.  All’s good as far as the targets are concerned, from both her POV and mine.

But there’s a severe discrepancy on the sensor inputs’ side.  From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me instead of left.  The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this; it shows she moves right by about 0.8m from her POV.  But the rusty terracotta line in the Y graph (the double-integrated accelerometer-minus-gravity readings) shows exactly the opposite.  The grey fusion of the blue and terracotta lines cancels them out, thus following the target perfectly, but for completely the wrong reasons.

There are similar discrepancies in the X graph, where the LiDAR + Video blue line is the best match to what I saw: virtually no forward movement from H’s POV except for some slight forward movement after 8s when she should be hovering.

So the net of this?  The LiDAR / video processing is working perfectly.  The double-integrated IMU accelerometer results are wrong, and I need to work out why.  The results shown are taken directly from the accelerometer and double integrated in Excel (much like what the code does), and I’m pretty convinced I’ve got this right.  Yet more digging to be done.
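For anyone unfamiliar, here’s a minimal sketch of the double integration I mean, assuming earth-frame acceleration with gravity already subtracted and a fixed sample interval; the names are mine, not from the flight code:

    # Minimal double-integration sketch: acceleration (gravity already removed)
    # sampled at a fixed interval dt, integrated once to velocity and again to
    # distance.  Names are illustrative, not from the flight code.
    def double_integrate(accelerations, dt):
        velocity = 0.0
        distance = 0.0
        distances = []
        for a in accelerations:
            velocity += a * dt         # first integration: acceleration -> velocity
            distance += velocity * dt  # second integration: velocity -> distance
            distances.append(distance)
        return distances

    # 1 m/s^2 held for one second at 100Hz ends up at roughly 0.5m travelled.
    print(double_integrate([1.0] * 100, 0.01)[-1])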

In other news…

  • Ö has ground-facing lights much like Zoe had.  Currently they are always on, but ultimately I intend to use them in various ways, such as flashing during calibration; however, this requires a new PCB to hook a MOSFET gate up to a GPIO pin.
  • piNet has changed direction somewhat: I’m testing, within the bounds of my garden, whether I can define a target destination with GPS and have enough accuracy for the subsequent flight from elsewhere to reach that target.  This is step one in taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.

That’s all for now, folks.  Thanks for sticking with me during these quiet times.


P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3s after the yaw to move left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.

Lateral motion tracking with yaw

I’m doing some very careful testing before I set Hermione loose live to fly in a circle.  This morning, I’ve confirmed the video lateral motion block tracking is working well.

For this first unpowered flight, I walked her forwards about 3m and then left by about the same.  Note that she always pointed in the same direction; I walked sideways to get the left movement:

Forward – Left

For this second unpowered flight, again I walked her forwards about 3m, but then rotated her by 90° CCW before walking forwards again.  Because of the yaw, from her point of view she only moved forwards, and the yaw is not exposed on the graph.  This is exactly how it should be:

Forward – Yaw 90° CCW – Forward

So I’m happy the lateral motion tracking is working perfectly.  Next I need to look at the target.  I can do that with the same stats.

The only problem I had was that the sun needs to be shining brightly for the video tracking to ‘fly’ above the lawn; clearly it needs the high contrast in the grass when sunlit.

Stats analysis

From the same run as the last two posts, I’ve been analysing the stats.

First the integrated gyro showing the yaw it detected:

yaw

This shows the unintentional clockwise (negative) yaw of about -10° shortly after take-off followed by the intentional anticlockwise yaw of +90° as Hermione headed off left.  I still need to have a play with the yaw rate PID to try to kill that initial -10° yaw.

More interesting, to me at least, is the motion processing:

FIFO stats

The graph shows how often the FIFO was emptied.  In an unhindered world, the FIFO is emptied every 10ms (0.01s) and the PWM to the ESCs updated.  Peaks above that mean something else was taking up the spare time.  The FIFO overflows at just above 84ms (512 bytes total FIFO size / 12 bytes per IMU sample / 500Hz IMU sampling rate = 85.333ms), and the highest shown here is 38ms, well within safety limits.  I’m particularly delighted that the majority of the spikes are within the 10 to 20ms range – that strongly suggests the split phases of macro-block processing are working like a dream.
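For completeness, the same sums in a couple of lines (the constants are as quoted above):

    # FIFO overflow margin: 512-byte FIFO, 12 bytes per IMU sample, 500Hz sampling.
    fifo_capacity_samples = 512.0 / 12         # ~42.7 samples fit in the FIFO
    overflow_s = fifo_capacity_samples / 500   # ~0.0853s (85.3ms) until overflow
    print(overflow_s)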

The 20ms norm means the PWM is updated at 50Hz.  Were the PWM consistently updated at less than 50Hz, it would really start to show in the stability of the flight.  But luckily it isn’t, meaning there’s probably just enough room to finally squeeze in compass and GPS processing.

In passing, it’s worth saying that such levels of stats would be impossible if I were using a microcontroller (Arduino etc.) – this 11s flight logged 1.46MB of data to shared memory, and ultimately to SD card.  It logs both the initial hard-coded constants and every dynamic variable for every cycle of motion processing – that means nothing is missing, and it’s possible to diagnose any problem as long as the reader knows the code intimately.  Without these logs, it would have been nigh on impossible for the ignorant me of 4+ years ago to achieve what I have now.


* I rate myself as experienced having spent over 4 years on this.

That’s better…

not perfect, but dramatically better.  The flight plan was:

  • take off to 1m
  • move left over 6s at 0.25m/s while simultaneously rotating ACW 90° to point in the direction of travel
  • land.

I’d spent yesterday’s wet weather rewriting the macro-block processing code, breaking it up into 5 phases:

  1. Upload the macro block vectors into a list
  2. For each macro-block vector in the list, undo yaw that had happened between this frame and the previous one
  3. Fill up a dictionary indexed with the un-yawed macro-block vectors
  4. Scan the dictionary, identifying clusters of vectors, assigning scores, and building a list of the highest-scoring vector clusters
  5. Average the few highest-scoring clusters, re-apply the yaw removed in step 2 to the result, and return the resultant vector

Although this is quite a lot more processing, splitting it into five phases compared to yesterday’s two means that between each phase the IMU FIFO can be checked and, if it’s filling up, processed, thus avoiding a FIFO overflow.
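For the record, here’s a rough sketch of those five phases; the function and the clustering details (integer-resolution buckets, neighbour scoring, top three clusters) are illustrative guesses at the shape of it rather than the flight code itself:

    import math

    # Illustrative five-phase sketch; in the real code the IMU FIFO is drained
    # between phases to stop it overflowing.
    def process_macro_blocks(vectors, yaw_delta):
        # Phase 1: the macro-block vectors arrive as a list of (x, y) tuples.

        # Phase 2: undo the yaw accrued between this frame and the previous one
        # by rotating each vector by -yaw_delta.
        c, s = math.cos(-yaw_delta), math.sin(-yaw_delta)
        unyawed = [(x * c - y * s, x * s + y * c) for x, y in vectors]

        # Phase 3: fill a dictionary keyed by the (rounded) un-yawed vectors.
        buckets = {}
        for v in unyawed:
            buckets.setdefault((round(v[0]), round(v[1])), []).append(v)

        # Phase 4: score each bucket by its own population plus its neighbours',
        # building a list of the highest-scoring clusters.
        def score(key):
            kx, ky = key
            return sum(len(buckets.get((kx + i, ky + j), []))
                       for i in (-1, 0, 1) for j in (-1, 0, 1))
        best = sorted(buckets, key=score, reverse=True)[:3]

        # Phase 5: average the members of the top clusters, then re-apply the
        # yaw removed in phase 2.
        members = [v for key in best for v in buckets[key]]
        mx = sum(v[0] for v in members) / len(members)
        my = sum(v[1] for v in members) / len(members)
        c, s = math.cos(yaw_delta), math.sin(yaw_delta)
        return (mx * c - my * s, mx * s + my * c)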

Two more subtle problems remain:

  1. She should have stayed in frame
  2. She didn’t quite rotate the 90° to head left.

Nonetheless, I once more have a beaming success smile.

Weird yaw behaviour

I’ve implemented the yaw code such that Hermione points in the direction that she should be travelling based upon the flight plan velocity vector.  She should take-off, then move left at 0.25 m/s for 6 seconds, while also rotating anti-clockwise by 90° to face the way she’s supposed to be travelling.  However, here’s what my Mavic & I saw:

My best guess is the camera lateral tracking, which simply looks for peaks in the macro-block vectors after stashing them all in a dictionary indexed by the vectors.  This ignores yaw, which was fine up to now as I’d set the yaw target to zero.  I think I need to add an extra stage which un-yaws each macro-block vector before adding them to the dictionary and looking for peaks.  That’s relatively easy code, involving tracking yaw between video frames, but costly, as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks.  Time will tell.

yaw control.

Currently, the yaw angle PID target stays set at zero throughout the flight; the piDrone always faces the same direction regardless of the way it’s travelling.  The next step on the way to ultimately tracking between GPS points is to have her point in the direction she’s travelling.  It’s not actually necessary for my girls; the Mavic Pro does it because the forward-facing camera video is streamed back to its remote control, so I get an FPV (first-person view – i.e. as though I were a pilot sitting on it).

My intention is that the yaw angle tracks the earth frame flight plan velocity vector, so the piDrone points the way it _should_ be going.  This is roughly what the Mavic does.

I thought it would be trivial and added the one new line of code, and then I realised the gotchas, which led to me blogging the details.  There are three problems (there’s a short worked example pulling them together at the end of this post):

  • Conversion of the lateral velocity vector to an angle.  arctan only covers ±90°; this means the 1/1 and -1/-1 vectors both come out as 45° angles.  Also 1/0 would throw an exception rather than give 90°.  Luckily math.atan2(x, y) resolves this, spitting out angles across the full ±180° range.  That’s my yaw angle PID target resolved.
  • Alignment of the integrated gyro input to the same range as above.  If the piDrone flies 2.25 clockwise loops around the circle, the integrated gyro will read 810° when it needs to read 90° – wrapping the angle back into the ±180° range (e.g. yaw = (yaw + 180°) % 360° - 180°) should sort this out.  That’s the yaw angle PID input sorted.
  • Finally, if the yaw PID input is 179° and the yaw PID target is -179°, the PID error (target – input) needs to come out at +2°, not -358°, i.e. the error magnitude must always be ≤ 180°.  I’ve sorted this out in the code by adding a custom subclass overriding the default error = (target – input):
    ####################################################################################################
    #
    # PID algorithm subclass to cope with the yaw angles error calculations.
    #
    ####################################################################################################
    class YAW_PID(PID):
    
        def Error(self, input, target):
            #-------------------------------------------------------------------------------------------
            # An example in degrees is the best way to explain this.  If the target is -179 degrees 
            # and the input is +179 degrees, the standard PID output would be -358 degrees leading to 
            # a very high yaw rotation rate to correct the -358 degrees error.  However, +2 degrees
            # achieves the same result, with a much lower rotation rate to fix the error.
            #-------------------------------------------------------------------------------------------
            error = target - input
            return (error if abs(error) < math.pi else (error - 2 * math.pi * error / abs(error)))
    

Now I just need the spring wind and torrential rain showers to ease up for an hour.
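In the meantime, here’s a worked example of the three gotchas above, in radians to match the code; the numbers are made up, and this just mirrors the logic rather than quoting the flight code:

    import math

    # 1. Velocity vector -> yaw target: atan2 covers all four quadrants, so
    #    (1, 1) and (-1, -1) come out half a circle apart rather than both at 45.
    target = math.atan2(-1.0, -1.0)                    # -135 degrees, in radians

    # 2. Integrated gyro -> the same +/-180 degree range: wrap 2.25 loops
    #    (810 degrees) back to 90 degrees.
    yaw = math.radians(810.0)
    yaw = (yaw + math.pi) % (2.0 * math.pi) - math.pi  # 90 degrees, in radians

    # 3. Error across the +/-180 degree boundary: -179 target vs +179 input
    #    should give +2 degrees, not -358.
    error = math.radians(-179.0) - math.radians(179.0)
    if abs(error) > math.pi:
        error -= 2.0 * math.pi * error / abs(error)

    print(math.degrees(target), math.degrees(yaw), math.degrees(error))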

yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in favour of yaw.  There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  But it should be trivial to change this yaw angle target so that the quadcopter faces the direction of travel; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying.  It’s a little bit trickier than it sounds for two reasons:
    • The tan of the velocity vector gives two possible angles, and only consideration of the signs of both components actually defines the absolute direction, e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way, and when flight transitions from movement to hover, the quad doesn’t rotate back to the takeoff orientation due to the velocity being 0,0 – the angle needs to be derived from what it was doing before it stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quadcopter frame and partially in the earth frame.  I need to rotate the X and Y values fully into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this, and I have two further concerns:
    • the first is that the compass needs calibration each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I spun Zoe on my vinyl record player – see how the circle is not centered on 0, 0 (a sketch of the offset calculation follows this list).
    • I’m not sure how much iron in the flight zone will affect the calibrations based on the distance of the compass from the iron; iron in my test area may be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
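For the boot-time calibration, what I have in mind is the classic hard-iron offset estimate: spin her through a full circle, take the mid-point of each axis as the offset, and subtract it before deriving yaw.  A sketch with illustrative names; the axis signs and ordering will depend on how the compass is mounted:

    import math

    # Hard-iron calibration sketch: given magnetometer (x, y) samples collected
    # while spinning through a full circle, the offset is the mid-point of each
    # axis; subtract it before converting to a yaw angle.  Names and the axis
    # convention are illustrative only.
    def hard_iron_offsets(samples):
        xs = [x for x, y in samples]
        ys = [y for x, y in samples]
        return (max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0

    def compass_yaw(x, y, offsets):
        ox, oy = offsets
        return math.atan2(y - oy, x - ox)   # radians; sign depends on mounting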

First step is to include yaw into the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the rotation matrix from earth to quadcopter frame is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
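And a one-function sketch of applying that matrix, as written, to the camera-derived X and Y values (the names are mine):

    import math

    # Apply the yaw rotation matrix above, as written, to a lateral (x, y) pair;
    # psi is the integrated yaw angle in radians.  Names are illustrative.
    def rotate_by_yaw(x, y, psi):
        return (x * math.cos(psi) + y * math.sin(psi),
                -x * math.sin(psi) + y * math.cos(psi))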

Everything but the kitchen sink from macro blocks?

Currently the macro-blocks are just averaged to extract lateral distance increment vectors between frames.  Because the frame rate is fixed, these also yield velocity increments.  Both can then be integrated to produce absolutes.  But I suspect there’s even more information available.

Imagine videoing an image like this multi-coloured circle:

Newton Disc

It’s pretty simple to see that if the camera moved laterally between two frames, it’d be straightforward for the video compression algorithm to break the change down into a set of identical macro-block vectors, each showing the direction and distance the camera had shifted between frames.  This is what I’m doing now by simply averaging the macro-blocks.

But imagine the camera doesn’t move laterally, but instead it rotates and the distance between camera and circle increases.  By rotating each macro-block vector according to its position in the frame relative to the centre point and averaging the results, what comes out is a circumferential component representing yaw change, and a radial component representing height change.

I think the way to approach this is first to get the lateral movement by simply averaging the macro-block vectors as now; the yaw and height components will cancel themselves out.

Then by shifting the contents of the macro-block frame by the averaged lateral movement, the axis is brought to the center of the frame – some macro-blocks will be discarded to ensure the revised macro-block frame is square around the new center point.

Each of the macro-block vectors is then rotated according to its position in the new square frame.  The angle of each macro-block in the frame is pretty easy to work out (e.g. a 4×4 square has rotation angles of 45, 135, 225 and 315 degrees; a 9×9 square has blocks to be rotated by 0, 45, 90, 135, 180, 225, 270 and 315 degrees), so now averaging the X and Y axes of these rotated macro-block vectors gives a measure of yaw and size change (height).  I’d need to include distance from the center when averaging out these rotated blocks.
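Roughly what I mean, as a sketch with made-up names; block_vectors maps each block’s position relative to the frame centre to its motion vector, and the units would still need the calibration mentioned below:

    import math

    # Decompose centred macro-block vectors into a tangential component (yaw
    # change) and a radial component (height / zoom change).  block_vectors maps
    # each block's position relative to the frame centre to its (vx, vy) motion
    # vector.  Names and units are illustrative only.
    def yaw_and_height_change(block_vectors):
        tangential, radial, n = 0.0, 0.0, 0
        for (px, py), (vx, vy) in block_vectors.items():
            r = math.hypot(px, py)
            if r == 0.0:
                continue                     # the centre block carries no signal
            ux, uy = px / r, py / r          # unit vector from centre to block
            radial += vx * ux + vy * uy      # outward component -> zoom / height
            tangential += ux * vy - uy * vx  # perpendicular component -> yaw
            n += 1
        if n == 0:
            return 0.0, 0.0
        return tangential / n, radial / n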

At a push, even pitch and roll could be obtained because they would distort the circle into an oval.

Yes, there’s calibration to do, there’s a dependency on textured multicoloured surfaces, and the accuracy will be very dependent on frame size and rate.  Nevertheless, in a perfect world, it should all just work™.  How cool would it be to have the Raspberry Pi camera providing this level of information!  No other sensors would be required except for a compass for orientation, GPS for location, and LiDAR for obstacle detection and avoidance.  How cool would that be!

Anyway, enough dreaming for now, back to the real world!

I ψ with my little eye…

something beginning with Y.  See Beard for why I’ve referred to psi.

Sometimes I amaze myself with how stupid I can be.  I’ve been ignoring yaw for a long time because:

  • a quad shouldn’t yaw due to its prop rotations
  • if it does, there’s nowt can be done to stop it.

But from another long chat on the Raspberry Pi forum, the completely obvious point finally dawned on me: although yaw can’t be prevented, it still needs to be accounted for in drift correction.  The fix for forwards drift plus yaw is almost certainly not reverse acceleration; in the extreme case of, say, 180° of yaw, it may be forward acceleration that’s needed, probably along with some left or right correction too.

I’ve wasted months on drift control which couldn’t possibly work without including yaw.

For the moment, I’ll just cross my fingers and use the integrated gyro z axis, but I now definitely need a compass for longer flights.