Back on the wagon

I’ve flown my Mavic for probably 20 minutes over the course of 5 short flights, simply to get familiar with the controls while dodging the rain showers of the last couple of days.  Now I’m back inside, trying to track down why Hermione has started throwing her I²C wobblies again.

Motion processing is working well, staying close to the minimum 100Hz regardless of other sensor inputs – here 156 samples were processed in 1.724s.

Processing rate

Garmin’s height is running stably at the intended 20Hz, and it’s well within the accuracy possible for distances of less than 1m.

Garmin LiDAR v3

Here’s the problem though: the IMU is fine for the 862 samples averaged into the first 155 motion processing blocks, showing just gravity as Hermione sits on the ground, but then the IMU values suddenly spike for no reason in the 156th averaged block.  Note that this happens only when the Garmin is plugged in.  There are in fact two spikes: the first is shown; the second causes an I/O exception and the diagnostics are dumped:

IMU stats

I’ve tried power supplies up to 3.4A, both battery and mains powered; I’ve resoldered various critical PCB joints; I’ve added the 680μF capacitor the Garmin spec suggests, despite Zoe being fine without it; and I’ve used a newly flashed SD card – all to no avail.

I have two things left to try:

  • currently the Garmin is read every motion processing loop, despite it only updating at 20Hz; the spec says there’s an interrupt, but as yet I’ve not got it to work.  Must try harder!
  • Failing that, I’ll have to replace the MPU-9250 with another, and see if the current one is faulty.
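
As an interim workaround for the first option, the reads can at least be rate-limited in software so the I²C bus isn’t polled every motion loop for data that only changes at 20Hz.  A minimal sketch – the `read_fn` callback here is a hypothetical stand-in for the real I²C transfer, not the flight code:

```python
import time

GARMIN_PERIOD = 1 / 20  # the Garmin LiDAR updates at 20Hz

class RateLimitedReader:
    """Return a cached value until the sensor has had time to update.

    read_fn stands in for the real I2C transfer; between sensor updates
    the last good value is returned instead, so the bus isn't hammered
    at the full ~100Hz motion processing rate for 20Hz data.
    """
    def __init__(self, read_fn, period=GARMIN_PERIOD, clock=time.time):
        self.read_fn = read_fn
        self.period = period
        self.clock = clock
        self.last_read = None
        self.last_value = None

    def read(self):
        now = self.clock()
        if self.last_read is None or now - self.last_read >= self.period:
            self.last_value = self.read_fn()
            self.last_read = now
        return self.last_value
```

It doesn’t fix whatever is upsetting the I²C bus, but it cuts the number of Garmin transactions per second by roughly five-fold, which at least narrows down whether bus traffic is the trigger.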

Beyond these two, I’m out of ideas.


Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt, and she’s running the video at 400 x 400 px at 10fps.

The results?

And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong, I can (and probably will) add minor tweaks to process compass data – the code is already collecting it – and adding intentional lateral motion to the flight plan costs absolutely nothing: hovering stably in a steady headwind is identical processing to intentional forward movement in no wind.  But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep.  I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.

* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler.  Apparently, it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums.  Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed.  Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.

30s with, <10s without

A few test runs.  In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight.  Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (a kill ensued).  I think that’s pretty conclusive proof the fusion code works!
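
The fusion in the real code is more involved, but the gist can be shown with a one-line complementary filter (a simplified stand-in, not the actual implementation): trust the fast-but-drifting IMU for short-term changes, and let the slow-but-absolute LiDAR pull the estimate back over the long term.

```python
def fuse_height(prev_fused, imu_delta, lidar_height, alpha=0.9):
    """One complementary-filter step: alpha weights the IMU-predicted
    height (previous estimate plus IMU-integrated change) against the
    absolute LiDAR reading, which is noisy and slow but doesn't drift."""
    return alpha * (prev_fused + imu_delta) + (1 - alpha) * lidar_height
```

Without the LiDAR term (alpha = 1) any accelerometer bias integrates into runaway height error – which is exactly the drift-off-the-mat behaviour seen in the unfused flights.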

With Fusion:

Without Fusion:

Babbage takes to the air

It’s Charles Babbage’s 224th birthday today, so how better to celebrate than to take his namesake, the Raspberry Pi Babbage Bear, for a flight!

I’m so pleased with the (Quadcopter Inertial Motion Unit First In First Out) code that I’ve decided to make it the primary development source, renaming to (Quadcopter Data Ready Interrupt).  They are both on GitHub along with a version of which makes it easier to select which to run.

FIFO food for thought

OK, so the FIFO is good, and definitely better than using the hardware interrupt, but far from perfect.  It does capture every sample regardless of what else is going on, which is great, but due to two factors it doesn’t actually create free time in which to read other inputs.  That doesn’t mean other inputs can’t be read, but reading them delays the next ESC update, meaning the flight might be jittery, perhaps to the extent of being unstable.

The two factors are that

  • reading the FIFO register is a byte-by-byte operation, rather than the single 14-byte read possible when reading the sensor registers directly – this is slower
  • to ensure the ESCs are updated at a reasonable frequency (100Hz is a good value), it’s now necessary to call time.time() a couple of times per loop, which, as I’ve mentioned before, ironically wastes time.
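
The pacing in that second point looks something like this sketch – the names are mine, not the real flight code, but it shows where the two time.time() calls per loop come from:

```python
import time

ESC_PERIOD = 1 / 100  # target 100Hz ESC update rate

def paced_updates(n_updates, update_escs, clock=time.time, sleep=time.sleep):
    """Pace update_escs() calls to roughly ESC_PERIOD apart.

    Note the two clock() calls per iteration -- the very overhead
    complained about above: one to timestamp the start of the loop,
    one to see how much of the time budget remains.
    """
    for _ in range(n_updates):
        start = clock()              # first time.time() call
        update_escs()
        elapsed = clock() - start    # second time.time() call
        if elapsed < ESC_PERIOD:
            sleep(ESC_PERIOD - elapsed)
```

The sleep itself is fine – that’s free time for other sensors in principle – it’s the bookkeeping around it that eats into the 10ms budget.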

There are a couple of plus sides too:

  • because it doesn’t use the hardware interrupt, it doesn’t need the custom GPIO performance tweaks I had to make – this satisfies my desire to use standard Python libraries if at all possible
  • not using the GPIO library (mine or the standard one) partially opens up the possibility of using PyPy, although that still needs testing, as the RPIO library is still required for the hardware PWM.

Anyway, the new props for Zoe arrived today, so the next step is to check both the interrupt and FIFO code to see how they perform in a real flight.

FIFO Phoenix

The Tangerine Dream train of thought puffed through the IMU FIFO station, and got me thinking I needn’t wait for the B2 – I can test it with Zoe.

Cutting to the chase, the results of my FIFO code test suggest the flight controller is a time traveller!  The HoG has got the Infinite Improbability Drive working, and there’s a lovely cuppa tea steaming away!  She can spend time doing slow things like reading lots of other non-time-critical inputs (altimeter, compass, GPS, remote control…) and then go back in time to pick up all the readings from the time-critical accelerometer and gyro, knowing exactly when they happened, and process them as though each had only just happened.

To be autonomous, my code has to catch every sample from the accelerometer and gyro; any missed readings result in drift.  Until today, the code did this by waiting for a hardware interrupt configured to fire every 4ms (250Hz sampling).  So every 4ms, it catches the interrupt, reads the sensors and processes them.  That doesn’t leave much time to do anything else before the next interrupt.  This has been a psychological block for me: I have camera motion trackers, ultrasonic range finders, GPS, altimeter and compass sensors all waiting to be used, but I never got very far with any of them, as ultimately I knew I only had a fraction of a millisecond spare.  By using the IMU FIFO, the code can take control of time, reading the cache of sensor readings when it wants to (within reason), and thus make the space to process input from other sensors.
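
The key insight is that each FIFO entry is a fixed, known interval after the previous one, so a batch read loses no timing information.  A toy sketch of the idea, simplified to one axis – the real FIFO entries are raw multi-byte register values, not tidy floats:

```python
SAMPLE_DT = 1 / 250  # 250Hz sampling -> one FIFO entry every 4ms

def drain_fifo(fifo_samples, velocity=0.0):
    """Integrate a batch of buffered Z accelerations (m/s^2, gravity
    already removed).  Because every entry is exactly SAMPLE_DT after
    the one before, processing the whole batch late gives the same
    result as processing each sample the moment it arrived.
    """
    for az in fifo_samples:
        velocity += az * SAMPLE_DT
    return velocity
```

Since the result is independent of when the batch is drained, the slow sensors can be read between drains without any accelerometer or gyro data being lost.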

I’ve known for a while that the IMU FIFO could take back control over time from the hardware interrupt, but my previous investigation hit a very hard brick wall: I2C errors were corrupting the data I was reading from the FIFO.  For example, here’s gravity from 6 months ago while Phoebe was sitting passively on the floor:

FIFO stats

But I’ve not seen I2C errors since the move to Jessie, so I gave it another go with Zoe:

IMU FIFO Accelerometer

Obviously, this is just a single test run, but if this proves to be reliable, it is truly liberating.  It opens up a whole new world of sensors, the hard part only being which to add first!

Maiden voyage

HoG lost her flight virginity today, and she lost it with enthusiasm – a little too much, to be honest.  Three-second flight plan: one second takeoff to 0.5m, one second hover and one second descent.  All maiden flights are a huge gamble: in HoG’s case, she had

  • new arms
  • new props
  • new motors
  • new frame
  • new calibration method
  • new Butterworth filter parameters.

Given that, I’d say her performance was surprisingly good!

She took off vertically from sloping ground.  That alone is nearly an unqualified success.  For it to have been a complete success though, she would have stopped at 0.5m off the ground and hovered.  Instead, she whizzed up to 3m, and then I hit the kill switch even before she had the chance to try to hover.

A few lessons learnt even from such a short flight though:

  • zero-g calibration seems to work, but it needs doing for the X, Y and Z axes
  • having the dlpf set to 180Hz rather than the normal 20Hz probably wasn’t a smart move, regardless of how good the Butterworth might be
  • aluminium arms bend, and don’t straighten, when they hit the ground at over 7.5ms⁻¹!

New arms are on the way and will arrive tomorrow, allowing me to do the zero-g calibration of the Z axis too!

But what’s Zero-G calibration, and how do you do it without going into space?

Historically, I’ve jumped through hoops trying to get sensor calibration stable: controlling the temperature to 40°C while rotating her in the calibration cube to measure ±g in all three axes to get gains and offsets.  Yet despite all that effort, the sensors – and hence Zoë – still drifted; only modestly over time, but still enough that she couldn’t fly in the back garden for more than a few seconds without hitting a wall.

The move to the MPU-9250 for HoG from Zoë’s MPU-6050 IMU initially seemed a retrograde step – it didn’t seem to be able to measure absolute temperature, only the difference from when the chip was powered up.  And that meant the 40°C calibration could no longer work.  Lots and lots of reading the specs yielded nothing initially.

But in passing, I’d spotted some new registers for storing accelerometer offsets so they could be included in the IMU’s own motion processing.  That suggested there was a way to get valid offsets.  Also in passing, I’d spotted a couple of Zero-G specifications – critically, that the Zero-G level change against temperature is only ±1.5mg/°C.  That means an offset measured in a Zero-G environment hardly drifts with temperature.  And a Zero-G environment doesn’t mean going up to space – it simply means reading the X and Y axis values while the Z axis is aligned with gravity.  So with HoG sat on the floor, the X and Y offsets are read, and then holding her against a wall gives the Z offset.  Calibration and updating the code takes only 5 minutes and requires no special equipment.
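
In code, that five-minute calibration boils down to averaging static readings in the two poses.  A sketch in units of g – the function and sample names are mine, not the flight code’s:

```python
def zero_g_offsets(flat_samples, wall_samples):
    """Derive accelerometer offsets from two static poses.

    flat_samples: (ax, ay, az) tuples with the quad flat on the floor,
    Z aligned with gravity -- X and Y should read zero, so their
    averages are the X and Y offsets.
    wall_samples: readings held against a wall, Z now horizontal and
    reading zero -- its average is the Z offset.
    """
    ox = sum(s[0] for s in flat_samples) / len(flat_samples)
    oy = sum(s[1] for s in flat_samples) / len(flat_samples)
    oz = sum(s[2] for s in wall_samples) / len(wall_samples)
    return ox, oy, oz
```

The offsets can then be subtracted in software, or written to the MPU-9250’s accelerometer offset registers so the IMU applies them itself.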

Delight and despair at the same time: delight that I now had a way forwards with the MPU-9250 (and it would work with the MPU-6050 too), but despair at the time and money I’d spent trying to sort out calibration against temperature.


My initial tests of HoG* have revealed an unwelcome surprise: the MPU-9250 is not backwards compatible with the MPU-6050 / MPU-9150 as far as registers are concerned.  I should have read the specs beforehand!

Don’t get me wrong, it’s working in the briefest of tests, but there have been radical changes to the temperature scale (now configurable), the dlpf (settable separately for gyro and accelerometer) and the sampling frequencies (way up to 8kHz!), plus new registers to do really useful things like storing calibration data so the IMU can do the work.  And that’s only from the briefest of scans of the spec.

So the BYOAQ-BAT articles have stalled again until I’ve found all these changes and added a new class to the code for the MPU-9250, and probably a generic IMU wrapper that uses the common WHO_AM_I register to determine which MPU class should be used.
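
That wrapper need only be a few lines.  A sketch – the class names are hypothetical placeholders, though the WHO_AM_I register address (0x75) and its values (0x68 for the MPU-6050/MPU-9150, 0x71 for the MPU-9250) come from the datasheets:

```python
WHO_AM_I = 0x75          # same register address on both parts
MPU6050_ID = 0x68        # also returned by the MPU-9150
MPU9250_ID = 0x71

class MPU6050:
    pass                 # placeholder for the existing MPU-6050 class

class MPU9250:
    pass                 # placeholder for the new MPU-9250 class

def make_imu(read_byte):
    """Instantiate the right IMU class for the attached chip.

    read_byte(reg) stands in for the actual I2C register read.
    """
    who = read_byte(WHO_AM_I)
    if who == MPU6050_ID:
        return MPU6050()
    if who == MPU9250_ID:
        return MPU9250()
    raise ValueError("unrecognised IMU WHO_AM_I: 0x%02x" % who)
```

The rest of the flight code then talks to whichever class comes back, with the register differences hidden behind a common interface.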

To make matters worse, my kids have used all the printer paper in the house for drawing on, and I need printouts of these data sheets to stand a chance of finding the changes.  Off to buy some printer paper tomorrow!

*HoG = Heart of Gold – the superclone of Phoebe, Chloë and Zoë upon which the BYOAQ-BAT articles are based.