A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and so was descending; she’s not far wrong – the start and end points are the two round stones placed a measured 10m apart to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow.  The aim here is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid the IMU drift over temperature.  Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield – you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR-Lite V3 so their lasers don’t interfere with each other; finally, the camera has been moved well away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera was struggling to spot motion in dim light, the LiDAR height wasn’t fused either, and on some flights Hermione would climb during hover.
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fusing with the gyro yaw rate and interacting with the GPS item below.
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now.  The reason both run in their own processes is that reading each sensor blocks – one second for GPS and 0.1 seconds for video – and that degree of blocking must be completely isolated from the core motion processing (see the sketch below).  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from that target end-point, the combination of compass and GPS will provide the sensor inputs to ensure the flight heads in the right direction and recognises when it’s reached its goal.
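
For what it’s worth, here’s a minimal sketch of that pattern – not the GitHub code – with the GPS reader in a child process streaming fixes down an OS FIFO, so that its one-second blocking reads never stall the motion loop; read_gps_fix(), the FIFO path and the record format are all illustrative assumptions:

    import os, struct, time

    FIFO_NAME = "/dev/shm/gps_fifo"        # shared-memory backed FIFO (assumed path)
    FIX_FORMAT = "=ddd"                    # lat, lon, alt as doubles - 24 bytes, well under PIPE_BUF

    def read_gps_fix():
        # Stand-in for the real (blocking, roughly one second) GPS read.
        time.sleep(1.0)
        return 51.5, -1.8, 120.0

    def gps_process():
        # Child process: block on the GPS for as long as it likes, then push each fix down the FIFO.
        with open(FIFO_NAME, "wb", buffering=0) as fifo:
            while True:
                fifo.write(struct.pack(FIX_FORMAT, *read_gps_fix()))

    if not os.path.exists(FIFO_NAME):
        os.mkfifo(FIFO_NAME)

    if os.fork() == 0:
        gps_process()                      # child never returns

    # Parent == motion processing: non-blocking reads return either a whole fix or nothing at all,
    # because FIFO writes smaller than PIPE_BUF (4096 bytes) are atomic.
    gps_fifo = os.open(FIFO_NAME, os.O_RDONLY | os.O_NONBLOCK)
    while True:
        try:
            raw = os.read(gps_fifo, struct.calcsize(FIX_FORMAT))
        except BlockingIOError:
            raw = b""
        if raw:
            lat, lon, alt = struct.unpack(FIX_FORMAT, raw)
            print("GPS fix: %f, %f, %.1fm" % (lat, lon, alt))
        # ... motion processing carries on at full rate here ...
        time.sleep(0.01)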

As a result of all the above, I’ve updated GitHub.

Yaw control

Currently, the yaw angle PID target stays set at zero throughout the flight; the piDrone always faces the same direction regardless of the way it’s travelling.  The next step on the way to ultimately tracking between GPS points is to have her point in the direction she’s travelling.  It’s not actually necessary for my girls; the Mavic Pro does it because the forward-facing camera video is streamed back to its remote control, giving an FPV (first-person view – i.e. as though I was a pilot sitting on it).

My intention is that the yaw angle tracks the earth frame flight plan velocity vector, so the piDrone points the way it _should_ be going.  This is roughly what the Mavic does.

I thought it would be trivial, and added the one new line of code, and then realised the gotchas which led to me blogging the details.  There are three problems.

  • Conversion of the lateral velocity vector to an angle.  The arctangent only covers ±90°; this means the 1/1 and -1/-1 vectors both come out as 45° angles.  Also, 1/0 would throw an exception rather than give 90°.  Luckily math.atan2(x, y) resolves this, spitting out angles covering ±180°.  That’s my yaw angle PID target resolved.
  • Alignment of the integrated gyro input to the same scale as above.  If the piDrone flies 2.25 clockwise loops around the circle, the integrated gyro will read 810° when it needs to read 90° – doing yaw %= 360° (and letting the error calculation below take the short way round) should sort this out.  That’s the yaw angle PID input sorted.
  • Finally, if the yaw PID input is 179° and the yaw PID target is -179°, the PID error (target - input) needs to come out at +2°, not -358°, i.e. |error| must always be ≤ 180°.  I’ve sorted this out in the code by adding a custom subclass overriding the default error = (target - input); a quick standalone check of the wrap-around follows the code below:
    ####################################################################################################
    #
    # PID algorithm subclass to cope with the yaw angles error calculations.
    #
    ####################################################################################################
    class YAW_PID(PID):
    
        def Error(self, input, target):
            #-------------------------------------------------------------------------------------------
            # An example in degrees is the best way to explain this.  If the target is -179 degrees 
            # and the input is +179 degrees, the standard PID output would be -358 degrees leading to 
            # a very high yaw rotation rate to correct the -358 degrees error.  However, +2 degrees
            # achieves the same result, with a much lower rotation rate to fix the error.
            #-------------------------------------------------------------------------------------------
            error = target - input
            return (error if abs(error) < math.pi else (error - 2 * math.pi * error / abs(error)))
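
As a quick standalone check of the two pieces above (this just repeats the error calculation outside the PID class, in radians):

    import math

    def yaw_error(target, input):
        # Wrap the angular difference into +/-180 degrees so the correction takes the short way round.
        error = target - input
        return error if abs(error) < math.pi else error - 2 * math.pi * error / abs(error)

    # (1, 1) and (-1, -1) velocity vectors now give distinct yaw targets...
    print(math.degrees(math.atan2(1, 1)), math.degrees(math.atan2(-1, -1)))   # 45.0  -135.0
    # ...and a -179 degree target against a +179 degree input corrects by +2 degrees, not -358.
    print(math.degrees(yaw_error(math.radians(-179), math.radians(179))))     # ~2.0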
    

Now I just need the spring wind and torrential rain showers to ease up for an hour.

Better macro-block interpretation

Currently, getting lateral motion from a frame full of macro-blocks is very simplistic:  find the average SAD value for the frame, and then only include those vectors whose SAD is lower.

I’m quite surprised this works as well as it does, but I’m fairly sure it can be improved.  There are four contributions to the content of a frame of macro-blocks:

  • yaw change: all macro-block vectors will circle around the centre of the frame
  • height change: all macro-block vectors will point towards or away from the centre of the frame.
  • lateral motion: all macro-block vectors point in the same direction across the frame.
  • noise: the whole purpose of macro-blocks is simply to find the best-matching blocks between two frames; doing this with a chess board (for example) could well have any block from the first frame matching any of the 50% of identical-looking blocks in the second frame.

Given a frame of macro-blocks, the yaw increment between frames can be found from the gyro, and thus be removed easily.

The same goes for the height change, derived from the LiDAR.

That leaves either noise or a lateral vector.  By averaging what’s left, we can pick out the vectors that are similar in distance / direction to that average vector; SAD doesn’t come into it.
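
As a rough sketch of that idea – the small-angle yaw correction, the height scaling and the keep_fraction threshold are all illustrative assumptions, not the flight code:

    import numpy as np

    def lateral_from_macro_blocks(vectors, positions, yaw_increment, scale_change, keep_fraction=0.5):
        # vectors:       (N, 2) macro-block motion vectors in pixels
        # positions:     (N, 2) block positions relative to the frame centre, in pixels
        # yaw_increment: frame-to-frame yaw from the gyro, radians
        # scale_change:  radial expansion / contraction implied by the LiDAR height change

        # A small yaw rotation moves each block roughly perpendicular to its radius vector.
        yaw_component = yaw_increment * np.stack([-positions[:, 1], positions[:, 0]], axis=1)

        # A height change moves each block along its radius vector (sign and scaling glossed over).
        height_component = scale_change * positions

        lateral = vectors - yaw_component - height_component

        # What's left should be a common lateral shift plus noise: keep the vectors closest
        # to the mean and average them - SAD doesn't come into it.
        mean = lateral.mean(axis=0)
        distance = np.linalg.norm(lateral - mean, axis=1)
        keep = distance <= np.percentile(distance, 100 * keep_fraction)
        return lateral[keep].mean(axis=0)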

This won’t be my first step however: that’s to work out why the height of the flight wasn’t anything like as stable as I’d been expecting.

Pirouette

Zoe now has her new PCB, a Garmin LiDAR-Lite and a camera.  Initial tests aren’t showing any of the I2C, FIFO or power blackouts.  The first test, with the motor power disengaged, is to check the combination of video vectors, height and yaw.

So, standing in the centre of the lounge, I held Zoe at arm’s length and rotated her around in a few circles, with her always facing the direction she was going; the video output processing averages out yaw and thus only produces linear movement; the yaw is reinstated from the gyro:

Pirouette

The lighting in the lounge wasn’t bright, and the rug she pirouetted around was low contrast but heavily textured.  She ‘landed’ at the same spot she took off from.  Overall I call this a success – 0.75 meters drift error over a 36 second flight is amazing!

Next step: power up the motors and take her for a flight outside, probably tomorrow when the torrential rain and 42mph winds subside.  The question here is whether the errors come back once she’s powered by the LiPo via a 5V 1.5A switching regulator.

 

Yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw.  There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  But it should be trivial to change this yaw angle target so that the quadcopter faces the direction it’s travelling; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying.  It’s a little bit trickier than it sounds, for two reasons:
    • The tan of the velocity vectors gives two possible angles, and only consideration of the signs of both vectors actually defines the absolute direction, e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way, and that when the flight transitions from movement to hover, the quad doesn’t rotate back to the takeoff orientation just because the velocity is (0,0) – the angle needs to be derived from what it was doing before it stopped (see the sketch after this list).

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quadframe and partially in the earth frame.  I need to rotate the X and Y values totally into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this; I have two further concerns:
    • the first is that the compass needs calibrating each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – the circle is not centred on 0, 0.
    • I’m not sure how much iron in the flight zone will affect the calibration, depending on how close the compass gets to it; iron in my test area might be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
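
To make the second bullet of item 1 concrete, here’s a minimal sketch of the target calculation I have in mind; the function name, the 0.2m/s threshold and the atan2 argument order are illustrative assumptions:

    import math

    def yaw_target(evx, evy, previous_target, min_speed=0.2):
        # Below a (hypothetical) minimum speed - hover, or the end of a leg - hold the previous
        # heading rather than snapping back to atan2(0, 0) = 0, i.e. the take-off orientation.
        if math.hypot(evx, evy) < min_speed:
            return previous_target
        # Otherwise point the nose along the earth-frame velocity; swap the arguments to suit
        # whichever axis convention the rest of the code uses.
        return math.atan2(evy, evx)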

First step is to include yaw into the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the yaw rotation matrix relating the earth and quadcopter frames is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
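
In code, applying that matrix to an X, Y pair is just (a sketch):

    import math

    def rotate_by_yaw(x, y, psi):
        # Apply the yaw rotation matrix above; the z axis passes through untouched.
        return (x * math.cos(psi) + y * math.sin(psi),
                -x * math.sin(psi) + y * math.cos(psi))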

A walk in the park

I wired up the PX4FLOW to my test rig, knocked together some test code, and set the I2C baud rate to 400kbps to make sure it works at the rate the IMU needs.

PX4FLOW test rig

I took it for a walk around the garden: from the office to the garden via the hallway, then an anticlockwise 6m diameter circle around the garden before returning back to the office.  The code was sampling at about 20Hz, with the test rig about 60cm from the ground and the Y axis always pointing away from me.  The walk took about 80s.

Here’s the X, Y distance graph, based upon integrating the velocities the PX4FLOW gives.

Garden plot

A quick walk through:

  • 0,0 is in the office
  • throughout the test the Y axis pointed away from me
  • beyond the 4m point, I walked in an anti-clockwise circle
  • once complete I doubled back and headed back to the office.

I’m delighted with the garden section: because the Y axis was always facing forwards, from the PX4FLOW’s point of view it always went forwards, but when transformed to the earth frame using the gyro readings, it shows a really accurate plot shape.  Given this was just a green grass surface, I’m amazed the down-facing camera did such a good job.
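
The plot itself is nothing fancier than dead reckoning – each velocity sample is rotated by the integrated gyro yaw and accumulated.  A sketch of that loop, reusing the rotation shown in the previous post (the 20Hz sample interval and the tuple layout are assumptions):

    import math

    def integrate_track(samples, dt=0.05):
        # samples: (vx, vy, yaw) tuples - PX4FLOW body-frame velocities (m/s) and the
        # integrated gyro yaw (radians), read at roughly 20Hz (dt = 0.05s).
        x = y = 0.0
        track = [(x, y)]
        for vx, vy, yaw in samples:
            # Rotate each velocity sample into the earth frame, then accumulate the position.
            x += (vx * math.cos(yaw) + vy * math.sin(yaw)) * dt
            y += (-vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
            track.append((x, y))
        return track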

Here’s the height graph from the inbuilt URF (ultrasonic range finder):

Height graph

It’s good except for the spikes – I may need LEDDAR to make this better.  On the plus side, the height is not integrated, so the spikes do not have a persistent effect.

There were a few problems or inaccuracies:

  • the sensor should timestamp each read, but the register value never changed, so I had to do it myself with time.time() – I have a second sensor on the way to see whether it’s the sensor’s fault (search eBay for PX4FLOW to find them)
  • the scale of the circle is wrong:  the graph shows the circle to be about 3m in diameter, but it should be more like 6m – this may just be a problem in my maths
  • the end of the walk should return to the start, yet it’s 6m out – the shapes of the walk out of and back to the office match, but there’s a 30° error by the end of the walk.  I suspect only a compass will fix this.

One unexpected positive: although I’ve heard the I2C SMBus only supports 32-byte buffers, it seemed fine reading the 77 bytes of registers in a single sequence.

Overall then, as a first trial, I’m pretty delighted – to the extent that it’s now worth getting the new PCB for Chloe / Hermione.

Σ some sensors

So far, I’ve been looking at LiDAR sensors:

  • the LEDDAR One gives me quadframe z-axis height and, with differentiation, velocity
  • the Scanse Sweep gives a 360° quadframe x- and y-axis plot of the borders of the flight area; turning that into something useful is going to require more processing – something along the lines of using the known pitch, roll and yaw from the gyros to rotate the previous scan into the current scan’s frame, and then comparing the misalignment between the two to guesstimate movement direction and speed.  It’s not clear yet how much of this will be provided by the Scanse processing and how much I’ll have to do, but a 360° scan feels like a lot of data to process.

My friend in China is using the equivalent of a PX4FLOW optical-flow sensor: a downward-facing low-resolution camera with a gyro and URF.  Like I described above, with the height and the incremental angles from the gyro, it processes picture-frame changes to come up with horizontal velocities – critically, the PX4FLOW does all this itself, spitting out velocities over I2C.  Follow the link and page down to the “Accuracy” section to see how well the tracking works for a manually controlled flight; the integrated-velocities distance / direction plot of the flight path overlaid on the satellite image is a very convincing match.

A long time ago, in a galaxy far far away, I’d seen the PX4FLOW, seen its price and moved on.  But now, I’m starting to wonder whether I should think again.

Having said that, perhaps with the Scanse I could do the same, but much simplified by only matching the outline of the flight space rather than two photos of the same space.  And perhaps I can break this down into small enough pieces that a whole outline can be processed in chunks across 100 motion periods (i.e. 1s).  This is starting to feel viable, and it’s a lot more aligned with my DIY desire than buying a PX4FLOW that does it all for me.
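
For what it’s worth, a very crude sketch of the outline-matching idea – just a centroid-shift estimate after removing the known yaw, not anything the Sweep or the flight code actually provides:

    import math

    def estimate_shift(prev_scan, curr_scan, yaw_increment):
        # prev_scan / curr_scan: lists of (angle, range) pairs from consecutive 360 degree sweeps.
        # Rotate the previous scan by the gyro-derived yaw increment, convert both to cartesian,
        # and treat the shift of the outline's centroid as a crude estimate of lateral movement.
        def centroid(scan, extra_angle=0.0):
            xs = [r * math.cos(a + extra_angle) for a, r in scan]
            ys = [r * math.sin(a + extra_angle) for a, r in scan]
            return sum(xs) / len(xs), sum(ys) / len(ys)

        px, py = centroid(prev_scan, yaw_increment)
        cx, cy = centroid(curr_scan)
        # The room stays put, so if its outline appears to shift by (dx, dy) in the quad's frame,
        # the quad itself has moved by (-dx, -dy).
        return px - cx, py - cy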

 

Hmm, that’s interesting, but…

not in the way I’d expected…

Scrot screen capture

What’s interesting:

  • using the finest level of data error checking, the numbers are huge – 36 errors during warm-up, rising to 50 after the flight.  Some of these may actually not be a problem because Phoebe was stationary throughout; an error is flagged when the sensor registers partially read as 0xFFFF starting from the Z gyro, but for the Z gyro, 0xFFFF = -1 ≈ 0.008°/s (see the sketch after this list)
  • the yaw angle read prior to takeoff is ridiculous – Phoebe doesn’t move throughout this flight, yet yaw says she span clockwise by 70° during the warm-up period!
  • The IMU core temperature only varied by 0.02°C between boot-up and the end of the flight – this is a good thing, and again suggests any temperature drift during a flight is due to breeze.
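
A sketch of the kind of check being described – the seven-word register order is the MPU’s accelerometer / temperature / gyro layout; treat the rest as illustrative:

    def suspect_read(raw_words):
        # raw_words: the sensor data registers as seven unsigned 16-bit words in the order
        # ax, ay, az, temperature, gx, gy, gz.  A partially failed read shows up as the tail of
        # the block - starting from the Z gyro - coming back as 0xFFFF, but a genuine reading of
        # -1 (about 0.008 degrees/s on the Z gyro) looks identical, hence the false alarms.
        return raw_words[-1] == 0xFFFF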

The second is the only real problem, as it will skew the initial angle calculations made during warm-up; it also adds this error during flight, but without the corresponding yaw it should introduce, so I have to assume the yaw PID is doing its job well.

Anyway, putting the protection code back in place led to this:

Scrot screen capture

All’s looking a lot happier here.

The whole point of this is to try to compensate in software for the temperature drift during warm-up time, and afterwards when the props start spinning and the breeze cools the IMU back down; this was my speculation for why indoor flights don’t drift, but outdoor flights do.  I think it worked, but it was hard to judge: comparing flights with and without boot-up prediction, the predictive code did seem to drift less, but both flights failed to reach their intended altitude, and both were killed after a few seconds to stop Phoebe tripping over her toes.  I’ll try again later in the day once the ambient temperature has risen by about 10°C.

P.S. I don’t think there is a way to fix the height changes in different ambient temperatures – I think only an altimeter can sort that.

Blind, deaf, disorientated, lost and scared of heights…

and yet, amazing.

So in that last video of a ~10s flight, Phoebe drifted about 1m when she shouldn’t have.  To me, that’s amazing; to you, that may be underwhelming.  So here’s why you should be amazed.  Basic school math(s):

distance = ½ × acceleration × time² ⇒ plugging in the numbers above says she was accelerating at 0.02m/s² (2cm/s²), instead of zero, to travel 1 meter in 10 seconds.

Gravity is roughly 10m/s².  The accelerometer is set to read up to ±4g, so that’s a range of 8g, or roughly 80m/s².

So 0.02/80 * 100 = 0.025% error or a ±8 error value in the sensor range of ±32768.

Now the critical part – the sensors are only rated at 2% (±655) accuracy, and yet I’m getting 0.025%, i.e. 80 times better.
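
The same sums in a few lines of Python, for anyone who wants to check them:

    g = 10.0                                         # m/s², rounded as above
    distance, duration = 1.0, 10.0                   # 1m of drift over a 10s flight

    acceleration = 2 * distance / duration ** 2      # d = ½ a t²  =>  0.02 m/s²
    full_range = 8 * g                               # ±4g accelerometer range ≈ 80 m/s²
    error_percent = 100 * acceleration / full_range  # 0.025%
    error_counts = error_percent / 100 * 32768       # ≈ ±8 in a range of ±32768

    print(acceleration, error_percent, error_counts) # 0.02 0.025 8.192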

And that’s why I don’t think I can get things very much better, whatever I do.

There is a slight chance that when the A2 is released (sounds like that’s shifted to next year now), I may be able to run the sampling rate at 1kHz without missing samples (I’ve had to drop to 500Hz to ensure I don’t miss samples at the moment).

Other than that though, she needs more sensors:

  • camera for motion tracking (!blind)
  • ultrasonic sensors for range finding (!deaf)
  • compass (!disorientated)
  • GPS (!lost)
  • altimeter (!scared of heights).

but they’ll also need an A2 for the extra processing.  So this is most definitely it for now.


My aversion to more sensors

The PC World mention, along with some recent comments, got me thinking about why I lack interest in adding an altimeter / magnetometer / GPS / remote control to HoG.  After all, it’s the obvious next step.

I have the sensors already, and none of the above would add much in the way of overheads to the code processing – perhaps only 1Hz polls of the GPS, compass and altimeter, fused with the existing accelerometer and gyro data about orientation, direction and speed, feeding into the existing flight targets.  All relatively straightforward.

Autonomy vs. remote control was purely a personal decision, based upon my ineptness at controlling machines with joysticks.  I stopped computer gaming with Half-Life 2 when I was within seconds of the end and yet lacked the hand-eye coordination to win that final battle to stop the launch of the missile / rocket / bomb.

It’s the combination of time and testing that is the biggest problem.  Up to now, all testing has happened in the back garden – it now takes me less than 5 minutes to run a flight and get back indoors to analyze the diagnostics.  Even when the weather is terrible, those 5-minute slots exist virtually every day.  But with a GPS movement-tracking resolution of 10m, the garden is nowhere near big enough to test autonomous path tracking – just too many high stone walls to crash into.  I could move testing to the village play park a hundred meters down the road, but it’s called a kids’ play park for a good reason.  I could move into the fields a couple of hundred meters away, but that’s just far enough away that time steps in – to-ing and fro-ing between the fields and the “engineering lab” (home office) would just be too inefficient for the limited time available due to a full-time job and two kids under 7.  Oh, and the farmers probably wouldn’t appreciate HoG scything down their crops or shearing their sheep!

So I settled on short term autonomous flights within the bounds of the back garden.

Having said all that, I’m off to the kids’ play park tomorrow during school time just to see how long HoG can maintain a stable hover with limited drift, and perhaps add some horizontal movement to her flight plan if all goes well.  Weather forecast is good, sunshine and only a gentle breeze so hopefully I’ll be able to get a longer video of what she can do!