5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; what remains below is minor, mostly optional extras, barring the first:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyro 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the same direction she took off at.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating system to Raspbian Stretch, with the corresponding requirements for the I2C fix and the WAP / udhcpd / dnsmasq networking; currently this means the OS is stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz, the camera resolution from 320² pixels and the IMU sampling from 500Hz to 20Hz, 480² pixels and 1kHz respectively.  However, every previous attempt to update one of these leaves the scheduling unable to process the others – I suspect I’ll need to wait for a Raspberry Pi B 4 or 5 for the increased performance.

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ clearance, and the AI required to remember explored areas and react by exploring only unexplored ones in the search for the center.
  • Fuse GPS latitude, longitude and altitude with the down-facing LiDAR + video and the double-integrated (ΣΣ acceleration δt δt) vertical + horizontal distances – this requires further connections between the various processes so that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic; see the sketch after this list.
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video, which currently isn’t possible with the RPi B3.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!
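
Purely to illustrate the kind of GPS / LiDAR swap-over described in the second bullet above – this is not the real flight code, and the function, its name and the 1m / 5m crossfade heights are all made up for the example – a simple height-based blend might look like this:

```python
def fused_distance(lidar_video_m, gps_m, height_m, low=1.0, high=5.0):
    """Blend the LiDAR / video distance estimate with the GPS one.

    Below 'low' metres trust LiDAR / video entirely; above 'high' metres,
    where they can no longer 'see' the ground, trust GPS entirely;
    crossfade linearly in between.
    """
    w = min(max((height_m - low) / (high - low), 0.0), 1.0)  # 0 = LiDAR / video, 1 = GPS
    return (1.0 - w) * lidar_video_m + w * gps_m
```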

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Speedy Gonzales

Currently, all speeds, both horizontal and vertical, are set to 0.3m/s for the sake of safety in enclosed arenas like indoors and the walled back garden.  The downside is that in the park, with the GPS waypoints perhaps 20m apart, it takes a very long time, often over a minute between waypoints, wearing out the batteries within a few flights.

The limitation, other than safety, is ensuring the down-facing video can track the difference between frames, which means there needs to be significant overlap between consecutive frames.

The video runs at 10Hz*.  The RPi camera angle of view (AOV) is 48.8°.  With the camera 1m off the ground (the standard set throughout all flights)**, 48.8° corresponds to roughly 90cm of horizontal ground coverage (2 × 1m × tan(AOV / 2) ≈ 0.91m). Assuming there needs to be a 90% overlap between frames to get accurate video macro-block vectors, every 0.1s Hermione can move up to about 9cm (10%), or roughly 0.9m/s compared to the current 0.3m/s.  I’ll be trying this out on the GPS tracking flights in the park tomorrow.
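
As a sanity check on that arithmetic, here’s a minimal sketch – not flight code; the function and the 90% overlap figure are just the assumptions above – deriving the speed limit from the AOV, height and frame rate:

```python
import math

def max_speed(aov_deg=48.8, height_m=1.0, fps=10.0, overlap=0.9):
    """Maximum horizontal speed that keeps the required overlap between
    consecutive down-facing video frames."""
    footprint = 2 * height_m * math.tan(math.radians(aov_deg / 2))  # ground covered per frame
    max_shift = (1.0 - overlap) * footprint                         # allowed movement per frame
    return max_shift * fps                                          # metres per second

print(max_speed())              # ~0.9 m/s at 1m height and 10Hz
print(max_speed(height_m=2.0))  # the footprint doubles with height, and so does the limit
```

Doubling the height doubles the per-frame footprint, which is also why the second footnote below needs 640² pixels at 2m to keep the same ground resolution.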


*10Hz seems to be about the highest frequency for the video that the macro-block processing can handle without causing other sensor processing to overflow – specifically the IMU FIFO.

**1 meter height is for the sake of safety, and because the 320² pixel video macro-blocks can resolve distance accurately on grass and gravel at that height.  Doubling the height requires quadrupling the video frame size to 640² to get the same resolution on grass / gravel, and once again the processing time required would cause IMU FIFO overflows.


P.S. The weather isn’t yet as good as I’d hoped for the GPS tracking flights in the park, but I did take Hermione into the back garden this morning to test the increased horizontal velocity; she happily ran at 1m/s over the grass, so that will be the new speed for the much longer distance GPS flights, reducing Hermione’s flight time and hence both her and the DJI Mavic’s battery drain.

MPU-9250 vs. Garmin LiDAR Lite

I had hoped yesterday to get going with Sweep integration, with a sanity-check flight beforehand just to ensure all was running well – I can’t afford to have crashes with Sweep installed.

And sure enough, Hermione crashed.  In the middle of the climbing phase of the flight, she suddenly leapt into the air, and the protection code killed her at the point her height exceeded the flight-plan height by 50cm.  At the speed she was climbing, she continued to rise a couple more meters before crashing down into a shrub bed, luckily limiting the damage to components I had spares for.

A second mandatory flight to collect diagnostics (and more crash damage) revealed a conflict over I2C between the IMU and the ground-facing LiDAR.  The LiDAR won, and the IMU started seeing gravity as just about 0g.  This isn’t the first time this has happened, and I’ve tried various guessed solutions to fix it.

Accelerometer vs. Garmin LiDAR Lite

The left graph is height: blue is the Garmin reading, which is correct; orange is the target – what should be happening; and grey is the double-integrated acceleration, which is a very close match to the Garmin right up to the point it all goes very wrong.  Looking in more detail at the right graph shows the accelerometer readings dropped just before 3.5s, about 0.5s before hover would have started.

This ain’t my code; best guess is an interaction over I2C between the LiDAR and IMU, and the IMU loses.  I’ve seen similar IMU damage before, and without more detail, my only option is to fit a new one and try again.

Back on the wagon

I’ve flown my Mavic probably for 20 minutes over the course of 5 short flights simply to get familiar with the controls while dodging the rain showers of the last couple of days.  I’m back inside again trying to track down why Hermione has started throwing her I²C wobbly again.

Motion processing is working well, keeping processing close to the minimum 100Hz regardless of other sensor inputs – here 156 samples were processed in 1.724s.

Processing rate

Garmin’s height is running stably at the intended 20Hz, and it’s well within the accuracy possible for distances less than 1m.

Garmin LiDAR v3

Here’s the problem though: the IMU is fine for 862 samples averaged into the 155 motion processing blocks, showing just gravity as Hermione sits on the ground, but suddenly the IMU values spike for no reason for the 156th sample average.  Note that this happens only when the Garmin is plugged in.  There are in fact two spikes: the first is shown, the second causes an I/O exception and the diagnostics are dumped:

IMU stats

I’ve tried power supplies up to 3.4A, both battery- and mains-powered; I’ve resoldered various critical PCB joints; I’ve added the 680µF capacitor the Garmin spec suggests, despite Zoe being fine without it; and I’ve used a newly flashed SD card, all to no avail.

I have two things left to try:

  • currently the Garmin is read every motion processing loop, despite it only updating at 20Hz; the spec says there’s an interrupt, but as yet I’ve not got it to work.  Must try harder!  (See the sketch below.)
  • failing that, I’ll have to replace the MPU-9250 with another, and see if the current one is faulty.

Beyond these two, I’m out of ideas.
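
On the first of those, and pending getting the hardware interrupt working, here’s a hedged sketch of what ‘only read the Garmin when it has new data’ could look like: poll its status register’s busy bit and skip the read if a measurement is still in progress.  The device address and registers (0x62, STATUS at 0x01, two-byte distance at 0x8f) are from the LiDAR-Lite v3 manual; the surrounding code is illustrative only and assumes the Garmin has already been configured elsewhere for its free-running 20Hz measurements.

```python
import smbus

LIDAR_ADDR = 0x62   # LiDAR-Lite v3 default I2C address
STATUS     = 0x01   # bit 0 = busy (measurement in progress)
DISTANCE   = 0x8f   # two-byte distance in cm, high byte first

bus = smbus.SMBus(1)

def read_distance_cm():
    """Return the latest distance, or None if the Garmin is mid-measurement."""
    if bus.read_byte_data(LIDAR_ADDR, STATUS) & 0x01:
        return None                                       # busy - leave it alone this loop
    hi, lo = bus.read_i2c_block_data(LIDAR_ADDR, DISTANCE, 2)
    return (hi << 8) | lo
```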

Zoe++

Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt and she’s running the video at 400 x 400 px at 10fps.

The results?:

And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong, I can (and probably will) add minor tweaks to process compass data – the code is already collecting it; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forwards movement in no wind.  But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep. I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.


* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler. Apparently it’s the dog’s bollocks, the mutt’s nuts, the puppy’s plums. Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed. Integrating those with PyPy requires a lot of work which, up until now, has simply not been necessary.

30s with, <10s without

A few test runs.  In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight.  Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (a kill ensued).  I think that’s pretty conclusive proof the fusion works!

With Fusion:

Without Fusion:

Babbage takes to the air

It’s Charles Babbage’s 224th birthday today, so how better to celebrate than to take his namesake, the Raspberry Pi Babbage Bear, for a flight!

I’m so pleased with the QCIMUFIFO.py (Quadcopter Inertial Motion Unit First In First Out) code that I’ve decided to make it the primary development source, renaming Quadcopter.py to QCDRI.py (Quadcopter Data Ready Interrupt).  They are both on GitHub along with a version of qc.py which makes it easier to select which to run.

FIFO food for thought

OK, so the FIFO is good, and definitely better than using the hardware interrupt, but far from perfect.  It does capture every sample regardless of what else is going on, which is great, but due to two factors it doesn’t actually create free time for other inputs to be read.  This doesn’t mean other inputs can’t be read, but reading them delays the next ESC update, meaning the flight might be jittery, perhaps to the extent of being unstable.

The two factors are:

  • reading the FIFO register is a byte-by-byte operation rather than the single 14-byte read possible when reading the sensor registers directly – this is slower
  • to ensure the ESCs are updated at a reasonable frequency (100Hz is a good value), it’s now necessary to call time.time() a couple of times which, as I’ve mentioned before, ironically wastes time – see the sketch after this list.
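
To make those two factors concrete, here’s a stripped-down sketch of the sort of loop I mean – not the actual Quadcopter code.  The 0x68 device address and the FIFO_COUNT / FIFO_R_W registers (0x72–0x74) are the MPU-9250’s documented ones; the 12-byte batch assumes only the accelerometer and gyro are routed to the FIFO, and plain smbus stands in for my own I2C class:

```python
import time
import smbus

MPU_ADDR   = 0x68    # MPU-9250 default I2C address
FIFO_COUNT = 0x72    # FIFO_COUNTH / FIFO_COUNTL
FIFO_R_W   = 0x74    # FIFO read register
BATCH      = 12      # 6 bytes accel + 6 bytes gyro per sample

bus = smbus.SMBus(1)

ESC_PERIOD = 1 / 100.0          # aim to update the ESCs at roughly 100Hz
next_esc_update = time.time()   # first of the time.time() calls

while True:
    # how many bytes are waiting in the FIFO?
    hi, lo = bus.read_i2c_block_data(MPU_ADDR, FIFO_COUNT, 2)
    fifo_bytes = (hi << 8) | lo

    # drain whole samples only - one byte per transaction, which is the slow part
    for _ in range(fifo_bytes // BATCH):
        sample = [bus.read_byte_data(MPU_ADDR, FIFO_R_W) for _ in range(BATCH)]
        # ... convert and integrate the accel / gyro readings here ...

    if time.time() >= next_esc_update:   # the second time.time() call
        next_esc_update += ESC_PERIOD
        # ... update the ESC PWM pulse widths here ...
```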

There are a couple of plus sides too:

  • because it doesn’t use the hardware interrupt, it doesn’t need the custom GPIO performance tweaks I had to make – this satisfies my desire to use standard Python libraries if at all possible
  • not using the GPIO library (mine or the standard one) partially opens up the possibility of using PyPy, although that still needs testing as the RPIO library is still required for the hardware PWM.

Anyway, the new props for Zoe arrived today, so the next step is to check both the interrupt and FIFO code to see how they perform in a real flight.

FIFO Phoenix

The Tangerine Dream train of thought puffed through the IMU FIFO station, and got me thinking I needn’t wait for the B2 – I can test it with Zoe.

Cutting to the chase, the results of my FIFO code test suggest the flight controller is a time traveller!  The HoG has got the Infinite Improbability Drive working, and there’s a lovely cuppa tea steaming away!  It can spend time doing slow things like reading lots of other non-time-critical inputs (altimeter, compass, GPS, remote control…) and then go back in time to pick up all the readings from the time-critical accelerometer and gyro, knowing exactly when they happened, and process them as though each had only just happened.

To be autonomous, my code has to catch every sample from the accelerometer and gyro.  Any missed readings result in drift.  The code until today did this by waiting for a hardware interrupt configured to fire every 4ms (250Hz sampling).  So every 4ms it catches the interrupt, reads the sensors and processes them.  That doesn’t leave very much time to do anything else before the next interrupt.  This has been a psychological block for me: I have camera motion trackers, ultrasonic range finders, GPS, altimeter and compass sensors all waiting to be used.  But I never got very far with any of them, as ultimately I knew I only had a fraction of a millisecond spare.  By using the IMU FIFO, the code can take control of time by reading the cache of sensor readings when it wants to (within reason), and thus make the space to process input from other sensors.
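
The ‘within reason’ boils down to how quickly the FIFO fills.  A back-of-envelope check, assuming the MPU-9250’s 512-byte FIFO and 12-byte accel + gyro samples (the exact figures depend on what’s routed to the FIFO):

```python
FIFO_BYTES       = 512      # MPU-9250 FIFO size
BYTES_PER_SAMPLE = 12       # 6 bytes accel + 6 bytes gyro
SAMPLE_RATE      = 250.0    # Hz - the 4ms data-ready rate mentioned above

samples_held = FIFO_BYTES // BYTES_PER_SAMPLE   # 42 samples
spare_time   = samples_held / SAMPLE_RATE       # ~0.17s before the FIFO overflows

print(samples_held, spare_time)
```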

I’ve known for a while that the IMU FIFO could take back control over time from the hardware interrupt, but my previous investigation hit a very hard brick wall: I2C errors were corrupting the data I was reading from the FIFO.  For example, here’s gravity from 6 months ago while Phoebe was sitting passively on the floor:

FIFO stats

But I’ve not seen I2C errors since the move to Jessie, so I gave it another go with Zoe:

IMU FIFO Accelerometer

Obviously, this is just a single test run, but if this proves to be reliable, it is truly liberating.  It opens up a whole new world of sensors, the hard part only being which to add first!

Maiden voyage

HoG lost her flight virginity today, and she lost it with enthusiasm – a little too much, to be honest.  Three-second flight plan – one second takeoff to 0.5m, one second hover and one second descent.  All maiden flights are a huge gamble: in HoG’s case, she had

  • new arms
  • new props
  • new motors
  • new frame
  • new calibration method
  • new Butterworth filter parameters.

Given that, I’d say her performance was surprisingly good!

She took off vertically from sloping ground.  That alone is nearly an unqualified success.  For it to have been a complete success though, she would have stopped at 0.5m off the ground and hovered.  Instead, she whizzed up to 3m and then I hit the kill switch even before she had the chance to try to hover.

A few lessons learnt even from such a short flight though:

  • zero-g calibration seems to work, but it needs doing for the X, Y and Z axes
  • having the DLPF set to 180Hz rather than the normal 20Hz probably wasn’t a smart move, regardless of how good the Butterworth might be
  • aluminium arms bend and don’t straighten when they hit the ground at over 7.5m/s!

New arms are on the way and will arrive tomorrow, allowing me to do the zero g calibration of the Z axis also!

But what’s Zero-G calibration, and how do you do it without going into space?

Historically, I’ve been jumping through hoops trying to get sensor calibration stable: controlling the temperature to 40°C while rotating her in the calibration cube to measure ±g in all three axes to get gains and offsets.  Yet despite all that effort, the sensors, and hence Zoë, still drifted – only modestly over time, but still enough that she couldn’t fly in the back garden for more than a few seconds without hitting a wall.

The move from Zoë’s MPU-6050 IMU to the MPU-9250 for HoG initially seemed a retrograde step – it didn’t seem able to measure absolute temperature, only the difference from when the chip was powered up, and that meant the 40°C calibration could no longer work.  Lots and lots of reading of the specs yielded nothing initially.

But in passing I’d spotted some new registers for storing accelerometer offsets to allow them to be included in the IMU motion processing.  That suggested there was a way to get valid offsets.  Additionally, again in passing, I’d spotted a couple of Zero-G specifications: critically, that the Zero-G level change against temperature is only ±1.5mg/ºC.  That means an offset measured in a Zero-G environment hardly drifts with temperature.  And a Zero-G environment doesn’t mean going up to space – it simply means reading the X and Y axis values when the Z axis is aligned with gravity.  So with HoG sat on the floor, the X and Y offsets are read, and then holding her against a wall gives the Z offset.  Calibration and updating the code takes only 5 minutes and requires no special equipment.
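
A minimal sketch of the measurement itself – just averaging raw readings; read_accel() below is a dummy stand-in for however the IMU is actually read, not the real code:

```python
import random

def read_accel():
    # dummy stand-in: raw X, Y, Z counts (~16384 counts = 1g at the ±2g full-scale range)
    return (random.gauss(0, 20), random.gauss(0, 20), random.gauss(16384, 20))

def average_offsets(samples=1000):
    """Average raw accelerometer readings; any axis that should read zero
    gives its zero-G offset directly."""
    sums = [0.0, 0.0, 0.0]
    for _ in range(samples):
        for axis, value in enumerate(read_accel()):
            sums[axis] += value
    return [s / samples for s in sums]

# Flat on the floor (Z aligned with gravity): the X and Y averages are the X / Y offsets.
x_offset, y_offset, _ = average_offsets()

# Held against a wall (Z horizontal): the Z average is the Z offset.
_, _, z_offset = average_offsets()
```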

Delight and despair at the same time: delight that I now had a way forwards with the MPU-9250 (and it would work with the MPU-6050 also), but despair at the time and money I’d spent trying to sort out calibration against temperature.