yaw control.

Currently, the yaw angle PID target stays set at zero throughout the flight; the piDrone always faces the same direction regardless of the way she’s travelling.  The next step on the way to ultimately tracking between GPS points is to have her point in the direction she’s travelling.  It’s not actually necessary for my girls; the Mavic Pro does it because its forward-facing camera video is streamed back to its remote control, giving an FPV (first-person view – i.e. as though I were a pilot sitting on it).

My intention is that the yaw angle tracks the earth frame flight plan velocity vector, so the piDrone points the way it _should_ be going.  This is roughly what the Mavic does.

I thought it would be trivial, and added the one new line of code, and then I realised the gotchas, which led to me blogging the details.  There are three problems.

  • Conversion of the lateral velocity vector to an angle.  arctangent only covers ±90°; this means the 1/1 and -1/-1 vectors both come out as 45° angles.  Also 1/0 would throw an exception rather than return 90°.  Luckily math.atan2(x, y) resolves this, spitting out angles across the full ±180° range.  That’s my yaw angle PID target resolved.
  • Alignment of the integrated gyro input with the same scale as above.  If the piDrone flies 2.25 clockwise loops around the circle, the integrated gyro will read 810° when it needs to read 90° – wrapping it with yaw = (yaw + 180°) % 360° - 180° should sort this out (see the sketch after the code below).  That’s the yaw angle PID input sorted.
  • Finally, if the yaw PID input is 179° and the yaw PID target is -179°, the PID error (target – input) needs to come out at +2°, not -358°, i.e. the error magnitude must always be ≤ 180°.  I’ve sorted this out in the code by adding a custom subclass overriding the default error = (target – input):
    ####################################################################################################
    #
    # PID algorithm subclass to cope with the yaw angles error calculations.
    #
    ####################################################################################################
    class YAW_PID(PID):
    
        def Error(self, input, target):
            #-------------------------------------------------------------------------------------------
            # An example in degrees is the best way to explain this.  If the target is -179 degrees 
            # and the input is +179 degrees, the standard PID output would be -358 degrees leading to 
            # a very high yaw rotation rate to correct the -358 degrees error.  However, +2 degrees
            # achieves the same result, with a much lower rotation rate to fix the error.
            #-------------------------------------------------------------------------------------------
            error = target - input
            return (error if abs(error) < math.pi else (error - 2 * math.pi * error / abs(error)))
    

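Putting the first two bullets together, here’s a minimal sketch of how the yaw angle PID target and the wrapped gyro yaw might be produced; the names (evx, evy, wrap_yaw) are my own, and whether the velocity components go into math.atan2 as (x, y) or (y, x) depends on which axis is treated as straight ahead:

    import math

    def yaw_target(evx, evy):
        # Earth-frame velocity components (hypothetical names); atan2 resolves
        # the quadrant ambiguity that plain atan cannot, covering +/-180 degrees.
        return math.atan2(evy, evx)

    def wrap_yaw(yaw):
        # Wrap the integrated gyro yaw, which grows without bound over multiple
        # loops, back into the +/-pi radians range the yaw angle PID uses.
        return (yaw + math.pi) % (2 * math.pi) - math.pi

    # 2.25 clockwise loops = 810 degrees integrated; wrapped back to 90 degrees.
    print(math.degrees(wrap_yaw(math.radians(810))))   # 90.0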
Now I just need the spring wind and torrential rain showers to ease up for an hour.

Better macro-block interpretation

Currently, getting lateral motion from a frame full of macro-blocks is very simplistic: find the average SAD value for the frame, and then only include those vectors whose SAD is lower.

I’m quite surprised this works as well as it does, but I’m fairly sure it can be improved.  There are four factors contributing to the content of a frame of macro-blocks.

  • yaw change: all macro-block vectors will circle around the centre of the frame.
  • height change: all macro-block vectors will point towards or away from the centre of the frame.
  • lateral motion change: all macro-block vectors point in the same direction in the frame.
  • noise: the whole purpose of macro-blocks is simply to find the best matching blocks between two frames; doing this with a chess set (for example) could well have any block from the first frame matching any one of the 50% of like-coloured blocks in the second frame.

Given a frame of macro-blocks, the yaw increment between frames can be found from the gyro, and thus its contribution removed easily.

The same goes for the height change, derived from the LiDAR.

That leaves either noise or a lateral vector.  By averaging what remains, we can pick the vectors that are similar in distance / direction to the average vector; SAD doesn’t come into the matter.  A rough sketch of this filtering follows below.
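As a sketch of that filtering – assuming the yaw and height contributions have already been predicted per block from the gyro and the LiDAR, and using numpy purely for brevity (the names are mine, not the real code):

    import numpy as np

    def lateral_motion(vectors, yaw_contribution, height_contribution):
        # vectors: (N, 2) array of macro-block (dx, dy) values; the two
        # contribution arrays are the per-block vectors predicted from the
        # gyro yaw increment and the LiDAR height change.
        residual = vectors - yaw_contribution - height_contribution

        # Average what's left, then keep only the vectors close to that
        # average; the rest are treated as mismatch noise.  SAD isn't used.
        mean = residual.mean(axis=0)
        distance = np.linalg.norm(residual - mean, axis=1)
        keep = residual[distance < distance.mean()]
        return keep.mean(axis=0) if len(keep) else mean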

This won’t be my first step, however: that’s to work out why the height of the flight wasn’t anything like as stable as I’d been expecting.

Pirouette

Zoe now has her new PCB, a Garmin LiDAR-Lite and camera.  Initial tests aren’t showing any of the I2C, FIFO or power black-outs.  The first test with the motor-power disengaged is to check the combination of video vectors, height and yaw.

So, standing in the centre of the lounge, I held Zoe at arm’s length and rotated her around in a few circles where she was always facing the direction she was going; the video output processing averages out yaw and thus only produces linear movement; the yaw is reinstated from the gyro:

Pirouette

The lighting in the lounge wasn’t bright, and the rug she pirouetted around was low contrast but heavily textured.  She ‘landed’ at the same spot she took off from.  Overall I call this a success – 0.75 meters drift error over a 36 second flight is amazing!

Next step: power up the motors and take her for a flight outside, probably tomorrow when the torrential rain and 42mph winds subside.  The question here is whether the errors come back once she’s powered by the LiPo via a 5V 1.5A switching regulator.


yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw.  There are a few aspects to this:

  1. During flight, currently the quadcopter stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  But it should be trivial to change this yaw angle target so that the quadcopter faces the direction it’s travelling; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying.  It’s a little bit trickier than it sounds for two reasons:
    • The arctan of the velocity vector gives two possible angles, and only consideration of the signs of both components actually defines the absolute direction, e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way, and that when flight transitions from movement to hover, the quad doesn’t rotate back to the takeoff orientation due to the velocity being (0,0) – the angle needs to be derived from what it was doing before it stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quadframe and partially in the earth frame.  I need to rotate the X and Y values totally into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this, and I have two further concerns:
    • the first is that the compass needs calibration each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – see how the circle is not centred on 0, 0.
    • I’m not sure how much iron in the flight zone will affect the calibration, depending on how far the compass is from the iron; iron in my test area might be the structural beams inside the walls of a building indoors, or garden railings outside, for example.

The first step is to include yaw in the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the rotation matrix relating the Earth and quadcopter frames is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
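As a minimal sketch, applying that matrix to the camera’s per-frame X and Y increments might look like this (the function and variable names are my own assumptions; psi is the integrated gyro yaw in radians):

    import math

    def quad_to_earth(xq, yq, psi):
        # Apply the yaw rotation above to the quad-frame X/Y increments,
        # giving the earth-frame values; psi is the integrated gyro yaw.
        xe = xq * math.cos(psi) + yq * math.sin(psi)
        ye = -xq * math.sin(psi) + yq * math.cos(psi)
        return xe, ye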

A walk in the park

I wired up the PX4FLOW to my test rig, knocked together some test code, and set the I2C baud rate to 400kbps to make sure it worked at the same rate as the IMU needs.

PX4FLOW test rig

I took it for a walk around the garden: from the office to the garden via the hallway, then an anticlockwise 6m diameter circle around the garden before returning to the office.  The code was sampling at about 20Hz, with the test rig about 60cm from the ground and the Y axis always pointing away from me.  The walk took about 80s.

Here’s the X, Y distance graph based upon integrating the velocities the PX4FLOW gives.

Garden plot

A quick walk through:

  • 0,0 is in the office
  • throughout the test the Y axis pointed away from me
  • beyond the 4m point, I walked in an anti-clockwise circle
  • once complete I doubled back and headed back to the office.

I’m delighted with the garden section; because the Y axis was always facing forwards, from the PX4FLOW’s point of view it always went forwards, but when transformed to the earth frame using the gyro readings, it shows a really accurate plot shape.  Given this was just a green grass surface, I’m amazed the down-facing camera did such a good job.
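The plot comes from nothing cleverer than rotating each body-frame velocity sample into the earth frame by the integrated gyro yaw and accumulating velocity × dt; a minimal sketch of that loop, with assumed names for the sample fields, looks like this:

    import math

    def integrate_track(samples):
        # samples: list of (dt, vx, vy, yaw) tuples - body-frame velocities
        # from the PX4FLOW plus the integrated gyro yaw (assumed names).
        x = y = 0.0
        track = [(x, y)]
        for dt, vx, vy, yaw in samples:
            # Rotate the body-frame velocity into the earth frame; the exact
            # signs depend on the chosen yaw convention - this follows the
            # matrix used elsewhere in these posts.
            evx = vx * math.cos(yaw) + vy * math.sin(yaw)
            evy = -vx * math.sin(yaw) + vy * math.cos(yaw)
            x += evx * dt
            y += evy * dt
            track.append((x, y))
        return track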

Here’s the height graph from the inbuilt URF:

Height graph

It’s good except for the spikes – I may need LEDDAR to make this better.  On the plus side, the height is not integrated, so the spikes do not have a persistent effect.

There were a few problems or inaccuracies:

  • the sensor should timestamp each read, but the register value did not change, so I had to do this myself with time.time() – I have a second sensor on the way to see if it’s the sensor’s fault (search ebay for PX4FLOW to find them)
  • the scale of the circle is wrong:  the graph shows the circle to be about 3m diameter, but it should be more like 6m – this may just be a problem in my maths
  • the end of the walk should return to the start, yet it’s 6m out – the shape of the walk out of and back to the office match, but there’s a 30° error by the end of the flight.  I suspect only a compass will fix this.

One unexpected positive was that although I’ve heard the I2C SMBus only supports 32 byte buffers, it seemed fine reading the 77 byte registers in a single sequence.

Overall then, as a first trial, I’m pretty delighted – enough that it’s now worth getting the new PCB for Chloe / Hermione.

Σ some sensors

So far, I’ve been looking at LiDAR sensors:

  • the LEDDAR One gives me quadframe z-axis height and, with differentiation, velocity
  • the Scanse Sweep gives a 360° quadframe x- and y-axis plot of the borders of the flight area; to turn that into something useful is going to require more processing, something along the lines of using the known pitch, roll and yaw from the gyro to rotate the previous scan into the current scan frame, and then comparing the misalignment between the two to guesstimate movement direction and speed (a rough sketch of the idea follows below).  It’s not clear yet how much of this will be provided by the Scanse processing and how much I’ll have to do, but a 360° scan feels like a lot of data to process.
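Purely as a rough sketch of that idea (not working code: the names are mine, and a crude centroid comparison stands in for proper point matching):

    import numpy as np

    def estimate_motion(previous_scan, current_scan, delta_yaw, dt):
        # Scans are (N, 2) arrays of x, y points in the quad frame; delta_yaw
        # is the gyro yaw change between the two scans in radians.  First
        # rotate the previous scan into the current scan's frame.
        c, s = np.cos(delta_yaw), np.sin(delta_yaw)
        rotation = np.array([[c, s], [-s, c]])
        aligned = previous_scan @ rotation.T

        # Very crudely, take the shift between the two scan centroids as the
        # misalignment; the scanned borders appear to move the opposite way
        # to the quad, so the quad's velocity is minus that shift over dt.
        shift = current_scan.mean(axis=0) - aligned.mean(axis=0)
        return -shift / dt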

My friend in China is using the equivalent of a PX4FLOW optical flow sensor: a downward-facing low-resolution camera with a gyro and URF.  As I described above, with the height and the incremental angles from the gyro, they process picture frame changes to come up with horizontal velocities – critically, the PX4FLOW does this itself, spitting out velocities over I2C.  Follow the link and page down to the “Accuracy” section to see how well the tracking works for a manually controlled flight; the integrated-velocities distance / direction plot of the flight path overlaid on the satellite image is a very convincing match.

A long time ago, in a galaxy far far away, I’d seen the PX4FLOW, seen its price and moved on.  But now, I’m starting to wonder whether I should think again.

Having said that, perhaps with the Scanse, I could do the same, but much simplified by only matching the outline of the flight space rather than two photos of the same space.  And perhaps I can break this down into small enough pieces that a whole outline can be processed in pieces during 100 motion periods (i.e. 1s).  This is starting to feel viable, and it’s a lot more aligned with my DIY desire than buying a PX4FLOW that does it all for me.


Hmm, that’s interesting, but…

not in the way I’d expected…

Scrot screen capture

What’s interesting:

  • using the finest level of data error checking, the numbers are huge – 36 errors during warm-up rising to 50 after the flight – some of these may actually not be a problem because Phoebe was stationary throughout; an error is spotted when the sensors partially read as 0xFFFF starting from the Z gyro, but for the Z gyro, 0xFFFF = -1 = 0.008°/s (see the arithmetic after this list)
  • the yaw angle read prior to takeoff is ridiculous – Phoebe doesn’t move throughout this flight, yet yaw says she span clockwise by 70° during the warm-up period!
  • The IMU core temperature only varied by 0.02°C between boot-up and the end of the flight – this is a good thing and again suggests any temperature drift during a flight is due to breeze
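For the record, the arithmetic behind that 0.008°/s figure looks like this, assuming the gyro is configured for a ±250°/s full-scale range (my assumption, chosen because it matches the quoted figure; the sensitivity scales with the range):

    # A 16-bit two's-complement reading of 0xFFFF is -1.  At a +/-250 deg/s
    # full-scale range the sensitivity is 131 LSB per deg/s, so -1 LSB is:
    def to_signed16(raw):
        return raw - 0x10000 if raw > 0x7FFF else raw

    print(to_signed16(0xFFFF) / 131.0)   # -0.0076 deg/s, i.e. roughly 0.008 deg/s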

The second is the only real problem, as it will skew the initial angle calculations made during warm-up; it also adds this during flight, but there isn’t the corresponding yaw it should introduce.  I have to assume the yaw PID is doing its job well?

Anyway, putting the protective code back in place led to this.

Scrot screen capture

All’s looking a lot happier here.

The whole point of this is to try to compensate in software for the temperature drift during the warm-up time, and afterwards when the props start spinning and the breeze cools the IMU back down; this was my speculation for why indoor flights don’t drift but outdoor flights do.  I think it worked, but it was hard to judge: comparing flights with and without boot-up prediction, the predictive code did seem to drift less, but both flights failed to reach their intended altitude, and both were killed after a few seconds to stop Phoebe tripping over her toes.  I’ll try again later in the day once the ambient temperature has risen by about 10°C.

P.S. I don’t think there is a way to fix the height changes in different ambient temperatures – I think only an altimeter can sort that.

Blind, deaf, disorientated, lost and scared of heights…

and yet, amazing.

So in that last video of a ~10s flight, Phoebe drifted about 1m when she shouldn’t have.  To me, that’s amazing; to you, that may be underwhelming.  So here’s why you should be amazed.  Basic school math(s):

distance = ½ x acceleration x time² ⇒ plugging in the numbers above says she was accelerating at 0.02m/s² (2cm/s²) instead of zero to travel 1 meter in 10 seconds.

Gravity is roughly 10m/s².  The accelerometer is set to read up to ±4g.  So that’s a range of 8g or roughly 80m/s²

So 0.02/80 * 100 = 0.025% error or a ±8 error value in the sensor range of ±32768.

Now the critical part – the sensors are only rated at 2% (±655) accuracy, and yet I’m getting 0.025% – 80 times better than that rating.
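Purely to put the arithmetic above in one place:

    # 1m of drift in 10s from a standing start => constant acceleration of
    # 2 * distance / time^2.
    acceleration = 2 * 1.0 / 10.0 ** 2           # 0.02 m/s^2

    full_range = 8 * 10.0                        # the +/-4g range spans 8g ~= 80 m/s^2
    error_pct = acceleration / full_range * 100  # 0.025%

    print(error_pct)                             # 0.025
    print(error_pct / 100 * 32768)               # ~8 LSB in the +/-32768 range
    print(2.0 / error_pct)                       # ~80x better than the 2% rating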

And that’s why I don’t think I can get things very much better, whatever I do.

There is a slight chance that when the A2 is released (sounds like that’s shifted to next year now), I may be able to run the sampling rate at 1kHz without missing samples (I’ve had to drop to 500Hz to ensure I don’t miss samples at the moment).

Other than that though, she needs more sensors:

  • camera for motion tracking (!blind)
  • ultrasonic sensors for range finding (!deaf)
  • compass (!disorientated)
  • GPS (!lost)
  • altimeter (!scared of heights).

but they’ll also need an A2 for the extra processing.  So this is most definitely it for now.


My aversion to more sensors

The PC World mention, along with some recent comments, got me thinking about why I lack interest in adding an altimeter / magnetometer / GPS / remote control to HoG.  After all, it’s the obvious next step.

I have the sensors already, and none of the above would add much in the way of overheads to the code processing – perhaps only 1Hz polls of the GPS, compass and altimeter fused with the existing data from the accelerometer and gyro about orientation, direction and speed feeding into the existing flight targets.  All relatively straightforward.

Autonomy vs. remote control was purely a personal decision based upon my ineptness at controlling machines with joysticks.  I stopped computer gaming with Half-Life 2 when I was within seconds of the end and yet lacked the hand-eye coordination to win that final battle to stop the launch of the missile / rocket / bomb.

It’s a combination of time and testing that is the biggest problem.  Up to now, all testing happens in the back garden – it now takes me less than 5 minutes to run a flight and get back indoors to analyze the diagnostics.  Even when the weather is terrible, those 5 minute slots exist virtually every day.  But with GPS movement tracking resolution of 10m, the garden is nowhere near big enough to test autonomous path tracking – just too many high stone walls to crash into.  I could move testing to the village play park a hundred meters down the road, but it’s called a kids’ play park for a good reason.  I could move into the fields a couple of hundred meters away, but that’s just far enough away that time steps in – to-ing and fro-ing between the fields and the “engineering lab” (home office) would just be too inefficient for the limited time available due to a full-time job and two kids under 7.  Oh, and the farmers probably wouldn’t appreciate HoG scything down their crops or shearing their sheep!

So I settled on short term autonomous flights within the bounds of the back garden.

Having said all that, I’m off to the kids’ play park tomorrow during school time just to see how long HoG can maintain a stable hover with limited drift, and perhaps add some horizontal movement to her flight plan if all goes well.  Weather forecast is good, sunshine and only a gentle breeze so hopefully I’ll be able to get a longer video of what she can do!

Circles

Another day, another type of test to track down where these strange readings are from.  This time, Phoebe sits on a flat platform which sits on a set of sponge balls on the floor.

I then roll her round in circles on the floor, which should produce sine waves of acceleration from the X and Y sensors 90° out of sync.  And ignoring the noise (I’d turned the dlpf down to 40Hz), that’s what’s shown:

Circular acceleration

The scale seems plausible at up to 0.4g acceleration.  qax has a slight negative offset suggesting she’s leaning slightly – she’s picking up -0.025g, or as a first-order estimate of the angle, -0.025 radians or about -1.5° – again plausible.
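That first-order estimate is just the small-angle approximation sin θ ≈ θ; for the record:

    import math

    offset_g = -0.025                     # measured qax offset in g
    angle = math.asin(offset_g)           # -0.025003 radians - barely differs from offset_g
    print(math.degrees(angle))            # ~ -1.43 degrees, i.e. roughly -1.5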

So the deadzone theory is dead!  But there is something interesting here: above is the complete acceleration including gravity; once gravity is subtracted, then this is what you get:

net acceleration

And this is where it gets interesting – the -0.025g offset has gone, which is good, but so has the bulk of the sine wave shape leaving only spikes – compare the scales of the two graphs.

The measurement for gravity in the earth reference frame is fixed throughout the flight – it’s measured as part of the pre-flight checks.  But critically, the measured acceleration is used to calculate the angles used to redistribute that fixed gravity to the new quadcopter reference frame – and I think it’s these angles and their reorientation of gravity between reference frames that are also removing the sine waves.  And sure enough, the angles graph below shows Phoebe pitching and rolling by ±20°, whereas I know full well she is not, because she’s sitting on a flat platform that can’t pitch and roll.

Pitch and Roll

So what I need is a way to calculate these rotation matrix angles ignoring the net acceleration; again we’re back to complementary or Kalman filters and using the gyro angles.  Except there’s another problem: the gyro angles are all in the quadcopter reference frame; what’s needed is a way to convert the gyro angles to the various intermediate Euler angles of a rotation matrix, which lie in three different reference frames.  I vaguely remember something about this in a paper I read once – time to go on a hunt again.
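For what it’s worth, the usual complementary filter shape (a sketch under my own assumptions about names and time constant, not necessarily the code I’ll end up with) blends the short-term gyro integration with the long-term accelerometer angle like this:

    def complementary_filter(angle, gyro_rate, accel_angle, dt, tau=0.5):
        # angle:       current angle estimate in radians
        # gyro_rate:   gyro rotation rate in radians / second
        # accel_angle: angle derived from the accelerometer - noisy but drift-free
        # tau:         time constant - how long the gyro dominates before the
        #              accelerometer pulls the estimate back (assumed value)
        k = tau / (tau + dt)
        return k * (angle + gyro_rate * dt) + (1 - k) * accel_angle

The catch described above still stands though: the gyro rates are measured in the quadcopter frame, while the Euler angles of the rotation matrix each live in their own intermediate frame, so the rates need converting before they can be fed into a filter like this.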