A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine. At the point chaos broke loose, she believed she had reached her 10m target and thus was descending; she’s not far wrong – the start and end points are the two round stones placed a measured 10m apart to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow. The aim here is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid IMU drift over temperature. Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield – you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR-Lite V3 so their lasers don’t interfere with each other, and the camera has been moved well away from both to make sure neither features in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera struggled to spot motion in dim light, the LiDAR height wasn’t fused either, and on some flights Hermione would climb during hover.
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing. The code just logs the output currently, but soon it will be fused with the gyro yaw rate and will interact with the GPS below.
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now. Both run in their own process because each blocks while reading its sensor – one second for GPS and 0.1 seconds for video – and that degree of blocking must be completely isolated from the core motion processing. As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from that end-point, the combination of compass and GPS together will provide the sensor inputs to ensure the flight heads in the right direction and recognises when it’s reached its goal.
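As a rough sketch of that blocking-isolation pattern (the FIFO path, fix values and timings here are all invented for illustration), a child process can block on its slow sensor read and stream results down an OS FIFO, which the motion-processing loop polls without ever stalling:

```python
import os
import select
import time

FIFO = "/tmp/gps_fifo"  # hypothetical path

def gps_process(fifo_path):
    # Child: blocks on the (slow) sensor read, then streams fixes down the FIFO.
    with open(fifo_path, "w") as fifo:
        for fix in [(51.50, -0.12), (51.51, -0.12)]:  # stand-ins for real 1Hz GPS fixes
            fifo.write("%f,%f\n" % fix)
            fifo.flush()
            time.sleep(0.01)

if os.path.exists(FIFO):
    os.unlink(FIFO)
os.mkfifo(FIFO)

if os.fork() == 0:          # child: the blocking GPS reader
    gps_process(FIFO)
    os._exit(0)

# Parent: the motion-processing loop polls the FIFO, never blocking on it.
fifo = os.open(FIFO, os.O_RDONLY | os.O_NONBLOCK)
fixes = []
deadline = time.time() + 2.0
while len(fixes) < 2 and time.time() < deadline:
    readable, _, _ = select.select([fifo], [], [], 0.0)  # zero timeout: never stalls
    if readable:
        data = os.read(fifo, 4096).decode()
        fixes += [line for line in data.splitlines() if line]
    time.sleep(0.01)        # stand-in for one motion-processing iteration
os.close(fifo)
os.wait()
os.unlink(FIFO)
print(fixes)
```

The zero-timeout `select.select` is the key point: the core loop only ever reads data that is already waiting, so a one-second GPS read in the child can never delay motion processing.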

As a result of all of the above, I’ve updated GitHub.
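A minimal sketch of the decoupled fusion change described above (the function name, gain and first-order blend are mine, not the real fusion code): the lateral camera fusion is gated on image quality, while the vertical LiDAR fusion always runs:

```python
def fuse_velocities(v_fused, v_camera_xy, v_lidar_z, camera_ok, k=0.1):
    # Lateral and vertical fusion decoupled: a camera dropout in dim light
    # no longer blocks the height fusion, so hover height stays put.
    vx, vy, vz = v_fused
    if camera_ok:                         # enough contrast for macro-blocks
        vx += k * (v_camera_xy[0] - vx)
        vy += k * (v_camera_xy[1] - vy)
    vz += k * (v_lidar_z - vz)            # height is always fused
    return vx, vy, vz

# Dim light: camera unusable, but a spurious +0.2 m/s climb estimate is
# still pulled back towards the LiDAR's 0 m/s over successive iterations.
v = (0.0, 0.0, 0.2)
for _ in range(50):
    v = fuse_velocities(v, (0.0, 0.0), 0.0, camera_ok=False)
print(v)
```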

Hermione’s progress

Here’s Hermione with her new PCB. It’s passed the passive tests; the next step is to make sure each of the 8 motor ESCs is connected to the right PWM output on the PCB, and finally, I’ll do a quick flight with only the MPU-9250 as sensor input to tune the X8 PID gains. Then she’s getting shelved.

Hermione’s progress

Zoe’s getting a new PCB so I can run the camera and Garmin LiDAR-Lite V3 on her too.  Hermione is huge compared to Zoe, and with the winter weather setting in, I’m going to need a system that’s small enough to test indoors.

Hermione will still be built – I need her extra size to incorporate the Scanse Sweep and GPS – but I suspect only when an A3 arrives on the market: Hermione’s processing with her new 512MB A+ overclocked to 1GHz is nearly maxed out with the camera and diagnostics. She’s probably just about got CPU space for the compass and Garmin LiDAR-Lite over I2C, but I think that’s it until the A3 comes to market. My hope for the A3 is that it uses the same 4-core CPU as the B2, with built-in Bluetooth and WiFi as per the B3, but no USB / ethernet hub to save power. Fingers crossed.



I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in favour of yaw. There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0. But it should be trivial to change this yaw angle target so that the quadcopter faces the direction it’s travelling; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying. It’s a little trickier than it sounds for two reasons:
    • The arctangent of the velocity vector gives two possible angles, and only consideration of the signs of both components defines the absolute direction, e.g. (1, 1) and (-1, -1) need to resolve to 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way round, and that when flight transitions from movement to hover, the quad doesn’t rotate back to its takeoff orientation just because the velocity is (0, 0) – the angle needs to be held at whatever it was before she stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values currently operate partially in the quad frame and partially in the earth frame. I need to rotate them fully into the earth frame by accounting for yaw.
  3. Yaw tracking by integrating the gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability. I think I can get away with using just the compass X and Y values to determine the yaw angle, but I’ll need to test that; I have two further concerns:
    • the first is that the compass needs calibration each time it boots up, just as your phone does. You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – the circle isn’t centred on (0, 0).
    • I’m not sure how much iron in the flight zone will affect the calibration, depending on the distance of the compass from the iron; indoors that may be the structural beams inside the walls of a building, outdoors the garden railings, for example.
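Item 1 above can be sketched as follows (a minimal illustration; the function name and hover threshold are mine):

```python
import math

def yaw_target(vx, vy, prev_yaw):
    """Yaw-angle target so the quad faces its direction of travel.

    atan2() (rather than atan(vy / vx)) uses the signs of both velocity
    components, so (1, 1) -> 45 degrees while (-1, -1) -> -135 degrees,
    rather than both collapsing to 45 degrees.
    """
    if abs(vx) < 0.01 and abs(vy) < 0.01:
        # Hovering: hold the last heading rather than snapping back to 0.
        return prev_yaw
    yaw = math.atan2(vy, vx)
    # Take the shortest way round from the previous target, so e.g.
    # +179 -> -179 degrees is a 2 degree step, not a 358 degree rotation.
    delta = (yaw - prev_yaw + math.pi) % (2 * math.pi) - math.pi
    return prev_yaw + delta

print(math.degrees(yaw_target(1, 1, 0.0)))   # 45.0
```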

The first step is to include yaw in the camera processing as a matter of urgency. The magnetometer stuff can once more wait until it becomes absolutely necessary.
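For when the magnetometer does become necessary: the record-player spin from the previous post provides exactly the data needed for the per-boot hard-iron offsets – the centre of the traced circle. A sketch with synthetic data (the bias and radius values are invented):

```python
import math

def hard_iron_offsets(samples):
    # Centre of the circle traced during a full spin = per-boot offset.
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    return ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)

# Synthetic record-player spin: a 30uT circle centred on a (12, -7) bias.
spin = [(12 + 30 * math.cos(2 * math.pi * n / 50),
         -7 + 30 * math.sin(2 * math.pi * n / 50)) for n in range(50)]
print(hard_iron_offsets(spin))  # ~(12.0, -7.0)
```

Subtracting those offsets from the raw readings re-centres the circle on (0, 0), which is what the yaw calculation needs.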

FYI the rotation matrix from the earth frame to the quadcopter frame is as follows:

|xq|   | cos ψ,  sin ψ,   0|  |xe|
|yq| = |-sin ψ,  cos ψ,   0|  |ye|
|zq|   |   0,      0,     1|  |ze|
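As a sketch in Python (my own function names; ψ is the yaw angle, and the inverse rotation is simply the transposed matrix), rotating camera increments between the two frames looks like:

```python
import math

def earth_to_quad(xe, ye, psi):
    # Yaw-only rotation of an earth-frame vector into the quad frame.
    xq =  math.cos(psi) * xe + math.sin(psi) * ye
    yq = -math.sin(psi) * xe + math.cos(psi) * ye
    return xq, yq

def quad_to_earth(xq, yq, psi):
    # Inverse (transposed) rotation: quad frame back to earth frame.
    xe = math.cos(psi) * xq - math.sin(psi) * yq
    ye = math.sin(psi) * xq + math.cos(psi) * yq
    return xe, ye

# A camera increment of (1, 0) in the quad frame, with the quad yawed 90
# degrees, is motion along the earth-frame Y axis.
print(quad_to_earth(1.0, 0.0, math.radians(90)))
```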

The future’s bright; the future’s orange!

Well, tangerine actually!

With the IMU FIFO code taking the pressure off critical timings for reading the IMU, a huge range of options has opened up for what to do next with the spare time the FIFO has created:

  • A simple keyboard remote control is tempting where the QC code polls stdin periodically during a hover phase and amends the flight plan dynamically; ultimately, I’d like to do this via a touch screen app on my Raspberry Pi providing joystick buttons.  However for this to work well, I really need to add further sensor input providing feedback on longer-term horizontal and vertical motion…
  • A downward facing Ultrasonic Range Finder (URF) would provide the vertical motion tracking when combined with angles from the IMU.  I’d looked at this a long time ago but it stalled as I’d read the sensors could only run at up to 100kbps I2C baudrate which would prevent use of the higher 400kbps required for reading the FIFO.  However a quick test just now shows the URF working perfectly at 400kbps.
  • A downward facing RPi camera, combined with the URF, would provide horizontal motion tracking. Again I’d written this off because of the URF, but now it’s worth progressing with. This is the Kitty++ code I started looking at during the summer and abandoned almost immediately, due both to the lack of spare time in the code and to the need for extra CPU cores to do the camera motion processing; Chloe with her Raspberry Pi B2 and her tangerine PiBow case more than satisfies that requirement now.
  • The magnetometer / compass on the IMU can provide longer term yaw stability which currently relies on just the integrated Z-axis gyro.

I do also have barometer and GPS sensors ‘in-stock’ but their application is primarily for long distance flights over variable terrain at heights above the range of the URF.  This is well out of scope for my current aims, so for the moment then, I’ll leave the parts shelved.

I have a lot of work to do, so I’d better get on with it.



The MPU-9250 has a built-in magnetometer which up to now I’ve been ignoring. However, while I’m stuck waiting for stuff in the post, I’ve given it more than a second’s thought and it could be useful.

The magnetometer is reading the strength and direction of the local magnetic field, primarily from the Earth’s magnetic north pole.  So it can be used for knowing the direction a flight is taking which is useful if the flight is tracking between a series of GPS positions.  That aspect is of no interest to me at the moment, but there’s another trick.

Because the output of the magnetometer is a vector of the local magnetic field, it can be used to calculate the absolute quad-frame pitch, roll and yaw angles since take-off – exactly what the integrated gyro is doing short term.  But because the magnetometer gives absolute angles based upon an absolute fixed source, the magnetometer angles can be fused with short-term integrated gyro readings to keep them very tightly on the straight and narrow.
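One common way to do such fusion is a complementary filter; here’s a one-axis sketch (the gains, rates and drift values are invented) showing a drifting integrated-gyro angle being held in place by an absolute reference:

```python
def fuse(gyro_rate, abs_angle, fused, dt, tau=5.0):
    """One complementary-filter step: believe the gyro short-term, and let
    the absolute (e.g. magnetometer-derived) angle pull the result back
    long-term.  tau is the crossover time constant: changes faster than
    ~tau seconds come from the gyro, slower drifts are corrected away.
    """
    alpha = tau / (tau + dt)
    return alpha * (fused + gyro_rate * dt) + (1 - alpha) * abs_angle

# A gyro with a constant 0.01 rad/s drift and an absolute reference
# steady at 0: pure integration would reach 1.0 rad after 100 s...
fused = 0.0
for _ in range(10000):            # 100 s at 100 Hz
    fused = fuse(0.01, 0.0, fused, dt=0.01)
print(fused)                      # ...but the fused angle settles near 0.05
```

The same structure applies per axis for pitch, roll and yaw, with the caveat from below that rapid magnetometer changes (local iron) should be rejected rather than fused.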

Use of a magnetometer as a backup yaw sensor is common, but I’ve not come across it being used as an absolute pitch, roll and yaw sensor.

There is only one problem I can see, and that’s that metals in the flight area distort the earth’s magnetic field, so some care is needed fusing the readings to ignore any rapid changes that don’t match the gyro integration.

For now though, I’ll just be adding the magnetometer readings into the diagnostics, polling it perhaps once a second just to check the validity of the readings compared to the integrated gyro readings.  If that holds out, then this could lead to much longer zero drift flights.

Blind, deaf, disorientated, lost and scared of heights…

and yet, amazing.

So in that last video of a ~10s flight, Phoebe drifted about 1m when she shouldn’t have.  To me, that’s amazing; to you, that may be underwhelming.  So here’s why you should be amazed.  Basic school math(s):

distance = ½ × acceleration × time² ⇒ plugging in the numbers above says she was accelerating at 0.02m/s² (2cm/s²) instead of zero, to travel 1 meter in 10 seconds.

Gravity is roughly 10m/s². The accelerometer is set to read up to ±4g, so that’s a range of 8g, or roughly 80m/s².

So 0.02/80 × 100 = 0.025% error, or a ±8 error value in the sensor range of ±32768.

Now the critical part – the sensors are only rated at 2% accuracy (±655), and yet I’m getting 0.025% – 80 times better than their rating.
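The arithmetic above, spelled out:

```python
# d = 1/2 a t^2, solved for the acceleration that drifts 1m in 10s.
distance, t = 1.0, 10.0
accel = 2 * distance / t ** 2
print(accel)                       # 0.02 m/s^2

full_scale = 8 * 10.0              # +/-4g span ~= 80 m/s^2
error_pct = accel / full_scale * 100
print(error_pct)                   # 0.025 (%)

counts = 32768 * error_pct / 100   # as raw sensor counts
print(round(counts))               # ~8 counts of +/-32768

print(2.0 / error_pct)             # 80x better than the 2% rating
```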

And that’s why I don’t think I can get things very much better, whatever I do.

There is a slight chance that when the A2 is released (sounds like that’s shifted to next year now), I may be able to run the sampling rate at 1kHz without missing samples (I’ve had to drop to 500Hz to ensure I don’t miss samples at the moment).

Other than that though, she needs more sensors:

  • camera for motion tracking (!blind)
  • ultrasonic sensors for range finding (!deaf)
  • compass (!disorientated)
  • GPS (!lost)
  • altimeter (!scared of heights).

but they’ll also need an A2 for the extra processing.  So this is most definitely it for now.

My aversion to more sensors

The PC World mention, along with some recent comments got me thinking about why I lack interest in adding an altimeter / magnetometer / GPS / remote control to HoG.  After all, it’s the obvious next step.

I have the sensors already, and none of the above would add much in the way of overheads to the code processing – perhaps only 1Hz polls of the GPS, compass and altimeter fused with the existing data from the accelerometer and gyro about orientation, direction and speed feeding into the existing flight targets.  All relatively straightforward.

Autonomy vs. remote control was purely a personal decision based upon my ineptness at controlling machines with joysticks. I stopped computer gaming with Half-Life 2 when I was within seconds of the end and yet lacked the hand-eye coordination to win that final battle to stop the launch of the missile / rocket / bomb.

It’s a combination of time and testing that is the biggest problem. Up to now, all testing happens in the back garden – it now takes me less than 5 minutes to run a flight and get back indoors to analyze the diagnostics. Even when the weather is terrible, those 5-minute slots exist virtually every day. But with GPS movement-tracking resolution of 10m, the garden is nowhere near big enough to test autonomous path tracking – just too many high stone walls to crash into. I could move testing to the village play park a hundred meters down the road, but it’s called a kids’ play park for a good reason. I could move into the fields a couple of hundred meters away, but that’s just far enough away that time steps in – to-ing and fro-ing between the fields and the “engineering lab” (home office) would be too inefficient for the limited time available due to a full-time job and two kids under 7. Oh, and the farmers probably wouldn’t appreciate HoG scything down their crops or shearing their sheep!

So I settled on short term autonomous flights within the bounds of the back garden.

Having said all that, I’m off to the kids’ play park tomorrow during school time just to see how long HoG can maintain a stable hover with limited drift, and perhaps add some horizontal movement to her flight plan if all goes well.  Weather forecast is good, sunshine and only a gentle breeze so hopefully I’ll be able to get a longer video of what she can do!


Every time I threaten to retire Phoebe, she throws a spanner in the works.

I took her out for some horizontal velocity PID tuning this morning. It’s below freezing outside and my expectations weren’t high, but her thermostat code still brought her sensors up to the 40° calibrated temperature. I did probably 5 flights, each consisting of 3 seconds of take-off at 0.5m/s, 5 seconds of hover, and 3 seconds of descent at 0.5m/s.

Of those 5 flights, one was, well, there’s no other word for it, perfect. A perfect pass of the Ronseal test. Don’t get me wrong, the others did well with regard to drift management and getting to an approximate hover over the take-off point, but this particular flight just worked: take-off from sloping ground stabilizing at 1.5m, a completely static hover for 5s, and then an elegant descent back to the original take-off point.

The cause: gravity calibration in the Z axis. Every flight starts with Phoebe peacefully sitting on the ground, measuring gravity on her X, Y and Z axes. That produces a set of Euler angles used to rotate gravity back to the earth axes, where it should read 0, 0, 1 in X, Y and Z respectively. X and Y always measure zero to 8 decimal places. Z, though, still seems to have a variability of 0.2% despite the thermostat – gravity could read anywhere between 0.998 and 1.002g, and it seems entirely random.
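A simplified, yaw-free sketch of that take-off step (the slope angle, variable names and sign conventions are mine): measure gravity at rest, derive the Euler angles, then rotate the measured vector back to the earth frame and check the Z component.

```python
import math

# Phoebe at rest on a 10 degree slope: gravity leaks into the body X axis.
ax, ay, az = -math.sin(math.radians(10)), 0.0, math.cos(math.radians(10))

# Euler angles from the at-rest gravity vector...
pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
roll = math.atan2(ay, az)

# ...then rotate the measured vector back to the earth frame: the Z
# component should read exactly 1g if the calibration is right.
earth_z = (-math.sin(pitch) * ax
           + math.cos(pitch) * math.sin(roll) * ay
           + math.cos(pitch) * math.cos(roll) * az)
print(earth_z)   # 1.0 to within floating-point rounding
```

With perfect sensor readings the result is exactly 1g whatever the slope; the 0.2% variability therefore has to come from the Z-axis reading itself, not from the rotation.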

But on this particular flight, the sensors said earth gravity was 0, 0, 1.000009g! That’s about 0.001% error in the Z axis, or 0.1mm/s² of acceleration error. That corresponds to a maximum drift velocity over the course of the 11s flight of 0.55mm/s!

That has multiple effects, the key two being the angles and the velocity calculation: with only 0.001% error in the Z axis, all the angles and velocities were essentially right for the duration of the flight, hence she rose to the right height and hovered over her takeoff position.

I’d love to know the cause of the random variance of the Z-axis accelerometer values – is it sensor tolerance, ground vibrations, or something else I’ve missed (the motors aren’t running at this stage of initialization, before you ask)?

A barometer / altimeter + magnetometer / compass may fix it entirely:

  • the barometer / altimeter will provide an alternative measure of vertical velocity at ±10cm resolution
  • the magnetometer will provide orientation – critically, pitch, roll and yaw compared to the Earth’s magnetic core.

A fusion of these sensor readings with their current accelerometer source could well improve matters – “fusion” just being the merging of the sources through a complementary or Kalman filter.

So why don’t I just get on with it? Because I still cannot find a sensor that meets a critical physical requirement – none of them provides a double row of pins at 0.1″ pitch in both directions to allow connection to a breadboard or prototype PCB.

I think I need to put my politest head on and see whether DroTek have fixed their pin pitch spacing for their 10 DOF IMU yet.  Wish me luck!