Stats analysis

From the same run as the last two posts, I’ve been analysing the stats.

First the integrated gyro showing the yaw it detected:



This shows the unintentional clockwise (negative) yaw of about -10° shortly after take-off followed by the intentional anticlockwise yaw of +90° as Hermione headed off left.  I still need to have a play with the yaw rate PID to try to kill that initial -10° yaw.

More interesting, to me at least, is the motion processing:

FIFO stats

The graph shows how often the FIFO was emptied.  In an unhindered world, the FIFO is emptied every 10ms (0.01s) and the PWM to the ESCs updated.  Peaks above that mean something else was taking up the spare time.  The FIFO overflows at just over 85ms (512 bytes total FIFO size ÷ 12 bytes per IMU sample ÷ 500Hz IMU sampling rate = 85.3ms), and the highest shown here is 38ms, well within the safety limit.  I’m particularly delighted that the majority of the spikes are within the 10 to 20ms range – that strongly suggests the split phases of macro-block processing are working like a dream.
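
For the record, the overflow arithmetic quoted above works out like this (a quick sketch, nothing more):

```python
# 512-byte FIFO, 12 bytes per IMU sample, 500Hz sampling rate.
FIFO_BYTES = 512.0
BYTES_PER_SAMPLE = 12
SAMPLING_RATE_HZ = 500

samples_to_full = FIFO_BYTES / BYTES_PER_SAMPLE         # ~42.7 samples
overflow_ms = samples_to_full / SAMPLING_RATE_HZ * 1000
print("FIFO overflows after %.1fms" % overflow_ms)      # ~85.3ms
```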

The 20ms norm means the PWM is updated at 50Hz.  Were the PWM consistently updated at less than 50Hz, it would really start to show in the stability of the flight.  But luckily it isn’t, meaning there’s probably just enough room to finally squeeze in compass and GPS processing.

In passing, it’s worth saying that such levels of stats would be impossible if I were using a microcontroller (Arduino etc) – this 11s flight logged 1.46MB of data to shared memory, and ultimately to SD card.  It logs both the initial hard-coded constants and every dynamic variable for every cycle of motion processing – that means nothing is missing and it’s possible to diagnose any problem, as long as the reader knows the code intimately.  Without these logs, it would have been nigh on impossible for the ignorant me of 4+ years ago to achieve what I have now.

* I rate myself as experienced having spent over 4 years on this.

That’s better…

not perfect, but dramatically better.  The flight plan was:

  • take off to 1m
  • move left over 6s at 0.25m/s while simultaneously rotating ACW 90° to point in the direction of travel
  • land.

I’d spent yesterday’s wet weather rewriting the macro-block processing code, breaking it up into 5 phases:

  1. Upload the macro block vectors into a list
  2. For each macro-block vector in the list, undo yaw that had happened between this frame and the previous one
  3. Fill up a dictionary indexed with the un-yawed macro-block vectors
  4. Scan the dictionary, identifying clusters of vectors, assigning scores, and building a list of the highest scoring vector clusters
  5. Average the few highest scoring clusters, re-apply the yaw removed in step 2, and return the resultant vector
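
The split phases above can be sketched roughly as follows (a sketch only – `rotate`, `check_fifo` and the cluster scoring are illustrative stand-ins, not the real flight code):

```python
import math

def rotate(x, y, angle):
    # Rotate the vector (x, y) anticlockwise by 'angle' radians.
    c, s = math.cos(angle), math.sin(angle)
    return (x * c - y * s, x * s + y * c)

def process_macro_blocks(vectors, yaw_increment, check_fifo):
    # Phase 1: the macro-block vectors arrive as a list of (x, y) tuples.
    check_fifo()  # between phases, drain the IMU FIFO if it's filling up

    # Phase 2: undo the yaw that happened since the previous frame.
    unyawed = [rotate(x, y, -yaw_increment) for x, y in vectors]
    check_fifo()

    # Phase 3: fill a dictionary indexed by the un-yawed vectors.
    counts = {}
    for v in unyawed:
        counts[v] = counts.get(v, 0) + 1
    check_fifo()

    # Phase 4: score each vector by its own count plus its immediate
    # neighbours', keeping the highest scoring cluster centres.
    def score(v):
        x, y = v
        return sum(counts.get((x + dx, y + dy), 0)
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1))
    best = sorted(counts, key=score, reverse=True)[:3]
    check_fifo()

    # Phase 5: average the best clusters and re-apply the yaw from phase 2.
    ax = sum(x for x, y in best) / len(best)
    ay = sum(y for x, y in best) / len(best)
    return rotate(ax, ay, yaw_increment)
```

Checking the FIFO between phases is what keeps the worst-case gap between PWM updates bounded.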

Although this is quite a lot more processing, splitting it into five phases instead of yesterday’s two means the IMU FIFO can be checked between each phase, and drained if it’s filling up, thus avoiding a FIFO overflow.

Two more subtle problems remain:

  1. She should have stayed in frame
  2. She didn’t quite rotate the 90° to head left.

Nonetheless, I once more have a beaming success smile.

Weird yaw behaviour

I’ve implemented the yaw code such that Hermione points in the direction that she should be travelling based upon the flight plan velocity vector.  She should take-off, then move left at 0.25 m/s for 6 seconds, while also rotating anti-clockwise by 90° to face the way she’s supposed to be travelling.  However, here’s what my Mavic & I saw:

My best guess is the camera lateral tracking, which simply looks for peaks in the macro-blocks after stashing them all in a dictionary indexed by the vectors.  This ignores yaw, which was fine up to now as I’d set the yaw target to zero.  I think I need to add an extra stage which un-yaws each macro-block vector before adding them to the dictionary and looking for peaks.  That’s relatively easy code, involving tracking yaw between video frames, but costly as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks.  Time will tell.

Yaw control

Currently, the yaw angle PID target stays set at zero throughout the flight; the piDrone always faces the same direction regardless of the way it’s travelling.  The next step on the way to ultimately tracking between GPS points is to have her point in the direction she’s travelling.  It’s not actually necessary for my girls; the Mavic Pro does it because its forward-facing camera video is streamed back to its remote control, so I get an FPV (first-person view – i.e. as though I were a pilot sitting on it).

My intention is that the yaw angle tracks the earth frame flight plan velocity vector, so the piDrone points the way it _should_ be going.  This is roughly what the Mavic does.

I thought it would be trivial – just one new line of code – and then I realised the gotchas, which led to me blogging the details.  There are 3 problems.

  • Conversion of the lateral velocity vector to an angle.  The arctangent only covers ±90°; this means the 1/1 and -1/-1 vectors both come out as 45° angles.  Also 1/0 would throw an exception rather than returning 90°.  Luckily math.atan2(x, y) resolves this, spitting out angles covering ±180°.  That’s my yaw angle PID target resolved.
  • Alignment of the integrated gyro input to the same scale as above.  If the piDrone flies 2.25 clockwise loops around the circle, the integrated gyro will read 810° when it needs to read 90° – doing yaw = (yaw + 180°) % 360° - 180° should sort this out.  That’s the yaw angle PID input sorted.
  • Finally, if the yaw PID input is 179° and the yaw PID target is -179°, the PID error (target – input) needs to come out at +2°, not -358°, i.e. the error angle must always satisfy |error| ≤ 180°.  I’ve sorted this out in the code by adding a custom subclass overriding the default error = (target – input):
    # PID algorithm subclass to cope with the yaw angles error calculations.
    class YAW_PID(PID):
        def Error(self, input, target):
            # An example in degrees is the best way to explain this.  If the target is -179 degrees 
            # and the input is +179 degrees, the standard PID output would be -358 degrees leading to 
            # a very high yaw rotation rate to correct the -358 degrees error.  However, +2 degrees
            # achieves the same result, with a much lower rotation rate to fix the error.
            error = target - input
            return (error if abs(error) < math.pi else (error - 2 * math.pi * error / abs(error)))
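
To see the wrap in action, here’s a minimal sketch (the PID base class here is a stub standing in for the real one, and note the code works in radians even though the comments talk in degrees):

```python
import math

class PID(object):
    # Stub base class: the real PID also tracks P, I and D terms.
    def Error(self, input, target):
        return target - input

class YAW_PID(PID):
    def Error(self, input, target):
        error = target - input
        # Wrap into (-pi, pi] so the correction takes the short way round.
        return (error if abs(error) < math.pi
                else error - 2 * math.pi * error / abs(error))

pid = YAW_PID()
# Input +179 degrees, target -179 degrees: the wrapped error is +2 degrees.
error = pid.Error(math.radians(179), math.radians(-179))
print(math.degrees(error))  # ~2.0
```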

Now I just need the spring wind and torrential rain showers to ease up for an hour.


I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw.  There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  But it should be trivial to change this yaw angle target so that the quadcopter faces its direction of travel; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID i.e. the way the quadcopter is or should be flying.  It’s a little trickier than it sounds for two reasons:
    • The arctangent of the velocity vector gives 2 possible angles, and only consideration of the signs of both vector components actually defines the absolute direction e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way, and that when the flight transitions from movement to hover, the quad doesn’t rotate back to the take-off orientation just because the velocity is (0,0) – the angle needs to be derived from what she was doing before she stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quadframe and partially in the earth frame.  I need to rotate the X and Y values totally into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, but I’ll need to test this, and I have 2 further concerns:
    • the first is that the compass needs calibration each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – the circle is not centred on (0, 0).
    • I’m not sure how much iron in the flight zone will affect the calibrations based on the distance of the compass from the iron; iron in my test area may be the structural beams inside the walls of a building indoors, or garden railings outside, for example.

First step is to include yaw into the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the rotation matrix between the Earth and quadcopter frames for yaw ψ is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
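
As a sketch, applying that matrix in code (the function name is mine, not the flight code’s; z passes through unchanged since yaw is a rotation about the z axis):

```python
import math

def yaw_rotate(x, y, psi):
    # Apply the yaw rotation matrix above to the vector (x, y);
    # z is unchanged by a rotation about the z axis.
    xr =  math.cos(psi) * x + math.sin(psi) * y
    yr = -math.sin(psi) * x + math.cos(psi) * y
    return xr, yr
```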

Everything but the kitchen sink from macro blocks?

Currently the macro-blocks are just averaged to extract lateral distance increment vectors between frames.  Because the frame rate is fixed, these also yield velocity increments.  Both can then be integrated to produce absolutes.  But I suspect there’s even more information available.
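
As a sketch of that bookkeeping (the 20fps frame interval is illustrative, not necessarily the real camera setting):

```python
FRAME_INTERVAL = 1.0 / 20  # illustrative: seconds between video frames

def track(increments):
    # Convert per-frame distance increments into velocities, and integrate
    # the increments into an absolute distance.
    distance = 0.0
    samples = []
    for d in increments:
        velocity = d / FRAME_INTERVAL
        distance += d
        samples.append((distance, velocity))
    return samples
```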

Imagine videoing an image like this multi-coloured circle:

Newton Disc

It’s pretty simple to see that if the camera moved laterally between two frames, it’d be straightforward for the video compression algorithm to break the change down into a set of identical macro-block vectors, each showing the direction and distance the camera had shifted between frames.  This is what I’m doing now by simply averaging the macro-blocks.

But imagine the camera doesn’t move laterally, but instead rotates while the distance between camera and circle increases.  By rotating each macro-block vector by the angle of its position in the frame relative to the centre point and averaging the results, what results is a circumferential component representing yaw change, and a radial component representing height change (movement along the camera’s axis).

I think the way to approach this is first to get the lateral movement by simply averaging the macro-block vectors as now; the yaw and height components will cancel themselves out.

Then by shifting the contents of the macro-block frame by the averaged lateral movement, the axis is brought to the center of the frame – some macro-blocks will be discarded to ensure the revised macro-block frame is square around the new center point.

Each of the macro-block vectors is then rotated according to its position in the new square frame.  The angle of each macro-block in the frame is pretty easy to work out (e.g. a 2×2 square has rotation angles of 45, 135, 225 and 315 degrees; a 3×3 square has blocks rotated by 0, 45, 90, 135, 180, 225, 270 and 315 degrees), so averaging the X and Y axes of these rotated macro-block vectors gives a measure of yaw and size change (height).  I’d need to include distance from the centre when averaging out these rotated blocks.
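
A sketch of the idea (entirely speculative – none of this is in the flight code yet): rotate each vector by the angle of its grid position about the frame centre; the averaged radial component then measures height change and the averaged circumferential component measures yaw.

```python
import math

def yaw_and_height_components(frame):
    # frame: dict mapping grid offset (i, j) from the frame centre to the
    # macro-block vector (vx, vy) found there.
    radial = circum = n = 0
    for (i, j), (vx, vy) in frame.items():
        if i == 0 and j == 0:
            continue  # no defined direction at the centre block
        a = math.atan2(j, i)
        c, s = math.cos(a), math.sin(a)
        # Rotate the vector by -a: X becomes the radial (height) component,
        # Y the circumferential (yaw) component.
        radial += vx * c + vy * s
        circum += -vx * s + vy * c
        n += 1
    return radial / n, circum / n
```

With purely outward-pointing vectors the radial average dominates (height change); with purely tangential vectors the circumferential average dominates (yaw).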

At a push, even pitch and roll could be obtained because they would distort the circle into an oval.

Yes, there’s calibration to do, there’s a dependency on textured multicoloured surfaces, and the accuracy will be very dependent on frame size and rate.  Nevertheless, in a perfect world, it should all just work(TM).  How cool would it be to have the Raspberry Pi camera providing this level of information!  No other sensors would be required except for a compass for orientation, GPS for location, and LiDAR for obstacle detection and avoidance.  How cool would that be!

Anyway, enough dreaming for now, back to the real world!

I ψ with my little eye…

something beginning with Y.  See Beard for why I’ve referred to psi.

Sometimes I amaze myself with how stupid I can be.  I’ve been ignoring yaw for a long time because:

  • a quad shouldn’t yaw due to its prop rotations
  • if it does, there’s nowt can be done to stop it.

But from another long chat on the Raspberry Pi forum, the completely obvious point finally became clear to me: although yaw can’t be prevented, it still needs to be accounted for in drift correction.  The fix for forward drift plus yaw is almost certainly not reverse acceleration; in the extreme case of, say, 180° yaw, it may be forward acceleration that’s needed, probably along with some left or right too.
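
The 180° case can be sketched in a few lines (illustrative only – `correction_in_quad_frame` is my name for it, not the flight code’s):

```python
import math

def correction_in_quad_frame(ex, ey, psi):
    # Rotate the earth-frame drift correction (ex, ey) by the yaw angle psi
    # to get the correction in the quadcopter's own frame.
    cx =  math.cos(psi) * ex + math.sin(psi) * ey
    cy = -math.sin(psi) * ex + math.cos(psi) * ey
    return cx, cy

# With 180 degrees of yaw, a "reverse" correction in the earth frame
# becomes a forward one in the quad's frame, exactly as described above.
print(correction_in_quad_frame(-1.0, 0.0, math.pi))  # ~(1.0, 0.0)
```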

I’ve wasted months on drift control which couldn’t possibly work without including yaw.

For the moment, I’ll just cross my fingers and use the integrated gyro z axis, but I now definitely need a compass for longer flights.

Back on track

Finally, with the new motors, blades, arm and legs, things are back on track.

3 test flights today, aiming to find an average ‘hover’ speed.  Only the innermost angular rate PID was engaged, so there was drift due to the drift of the integrated gyro.  My first flight was once more scary as she also shot up into the sky because the vertical speed PID tried to undo the excessive vertical power applied by my guesstimate of the default hover speed (but note, not the manic rise that “Scary, very scary” revealed).

The average hover speed is now about a 1550µs pulse width, suggesting these new 1045 carbon props are much more powerful than the 1150s I had – they are broader and therefore have more aerofoil surface area, so it doesn’t surprise me.

The even better news though was no yaw – the stats show no common sharing of power between pairs (front left + back right) and (front right + back left) which is always the sign of the quad trying to compensate for the yaw.  Whether that’s due to motors, props, or frame fixes, I frankly don’t care!  It now means my future flight diagnostics won’t be swamped by yaw.

The only negative was that the integrated gyro drift caused a diagonal drifting landing, and once more another leg died as it stuck a couple of inches into the mud while the quad drifted on!  I’m more than happy to pay that price!

Chinook Helicopter

Chinook dual-rotor ‘copter

The Chinook helicopter has two rotors which rotate in opposite directions, so their torques cancel, leading to 0 net yaw.  A quadcopter takes the same approach – diagonally opposite blades rotate in the same direction, and the two pairs counter-rotate, so the torques cancel and the result is 0 yaw.

So how does my bloomin’ quadcopter have yaw problems?  I’m bringing this up again as it was a problem in the PID testing; during the first few tests, although the 2 motors started at full hover speed, they slowed down dramatically during the test.

The cause was the yaw PID which I’d forgotten to take out, and once I did, the tests ran consistently at hover speed.

Yet physically there was no yaw to correct – only one diagonal pair of motors was operating, and due to the step-ladder, there was no way the quad could rotate around its Z axis.  And that suggests the yaw sensor is seriously duff.

Next time she’s flying outside, I think I’ll set the yaw rate PID gains to 0 and see what happens – something must have suggested to me I needed them in the first place?

Back to basics

First step on tracking down the yaw problems is to check that all the blades are spinning in the right direction and at the same speed for a given PWM pulse width.  So I added a little test to the code to step through each motor in turn, spinning them up to well under take-off speed.  I started off with 10% and one motor did sound a little underpowered, but once I’d upped this to 20%, all the motors were humming the same tune.  They were also all spinning in the correct direction (CW or CCW) for their given location, all using the same props, and all blowing air downwards, so I really struggle to see how the yaw can be caused by any of these parts.  Hmmm – I think I’ll have to put this on the back-burner until I come up with another great idea.