IMU + Servo

The IMU gyro now controls the servo motion. The aim is to compensate for vibration and drift in flight so the attached camera always points stably at the target.  Ultimately, the RPi0W will be attached to the servo, with the servo movement reversed to achieve this.
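
As a sketch of the idea (illustrative pin, calibration and IMU read – not the real code): integrate the gyro's pitch rate into an angle, then drive the servo by the negative of that angle so the camera rotation cancels the frame rotation:

```python
import time
import pigpio

SERVO_GPIO = 18        # illustrative GPIO for the servo signal
CENTRE_US = 1500       # servo centre pulse width, microseconds
US_PER_DEGREE = 10.0   # assumed servo calibration

def read_gyro_pitch_dps():
    """Stand-in for the real IMU read: pitch rate in degrees/second."""
    return 0.0

pi = pigpio.pi()
angle = 0.0            # integrated pitch angle, degrees
prev = time.time()

while True:
    now = time.time()
    angle += read_gyro_pitch_dps() * (now - prev)   # integrate rate into angle
    prev = now
    # drive the servo by the *negative* of the measured angle so the
    # camera counter-rotates against the frame
    pulse = CENTRE_US - angle * US_PER_DEGREE
    pi.set_servo_pulsewidth(SERVO_GPIO, max(500.0, min(2500.0, pulse)))
    time.sleep(0.01)
```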

The problem at the moment is I can’t find a video player app for a second RPi to display the stability of the video output during an unstable piDrone flight.

Penelope, Percy and Pat.

When I was at the latest Cotswold Jam, one of the regulars suggested adding a camera to one of my piDrones to video its flight firsthand; that planted a seed which blossomed overnight:

  • Set up a live video stream from an RPi0W attached to one of my piDrones, sending the output over WiFi to an RPi3+ RC touch-screen and displaying the video in a screen app there
  • Add on-screen pseudo-buttons to the RPi3+ RC touch-screen and use those to record the video to disk when requested
  • Add two on-screen pseudo-joysticks to the RPi3+ touch-screen RC, sending their input to the piDrone much like the physical joysticks do now
  • Finally, add IMU / servos hardware / software to keep the camera stable when it’s attached to a flying piDrone – trivial compared to the items above.

I’m completely ignorant of how to implement all but the last item, much like the challenge of building the piDrones 6 years ago, and hence it’s a fab challenge!  And in comparison to the piDrone itself, it’ll be cheap: the parts I either already own or are cheap to buy.  And I like the fact it gives a unique role for Penelope – currently she’s just Hermione without the object avoidance.

First job though is to name the Raspberry Pis:

Why six years?

It probably took a year at the beginning to get the system working at a basic level, and a couple of years at the end adding cool features like GPS location tracking, object avoidance, human remote control, custom cool lids etc.  So what happened in the intermediate three years?

The biggest time-killer was drift in a hover: the accelerometer measures gravity and real acceleration along three perpendicular axes.  At the start of a flight, it reads a value for gravity before takeoff; during the flight, integration of new readings against that initial gravity reading provides velocity.  On a calm summer’s day, all worked well, but in other conditions there were often collisions with brick walls etc.  Essentially, the sensors said she wasn’t moving whereas reality said she was.  It took years to recognise the correlation between winds, weather and crashes: lots of iterative speculation spanning the seasons was required to recognise the link to temperature variations*.
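
To make the failure mode concrete, here’s the integration in miniature – the reads are stand-ins, but the point is that any offset between the take-off gravity estimate and the in-flight samples integrates into ever-growing velocity error:

```python
def read_accelerometer():
    """Stand-in for the real IMU read: (ax, ay, az) in m/s²."""
    return (0.0, 0.0, 9.81)

# take-off: sample the accelerometer at rest to estimate gravity
gravity = read_accelerometer()

velocity = [0.0, 0.0, 0.0]
dt = 1.0 / 500.0           # 500Hz IMU loop

for _ in range(500 * 60):  # one minute of flight
    accel = read_accelerometer()
    for i in range(3):
        # subtract the take-off gravity estimate and integrate; any
        # temperature-induced offset between the two accumulates here
        velocity[i] += (accel[i] - gravity[i]) * dt
```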

There are two factors: first, the IMU has been optimised to operate above 20°C – below that, small temperature changes lead to significant drift in values; secondly, once in flight, weather- and prop-breeze cool the sensor compared to the ground measurement of gravity.  I tried lots of ways to handle the temperature changes; at one point, I even bought a beer-fridge for mapping accelerometer gravity values against temperature!  It’s back to its intended purpose now.

Digging back, it’s clear how others have coped:

  • DIY RC drones synchronise the IMU and RC before each flight, along with sheltering the IMU from the wind.  The pilot isn’t interested in a static hover and is part of the feedback loop directing where it goes.
  • At the bleeding edge, the DJI Mavic has a dynamic cooling system embedded in the depths of the frame keeping the IMU at a fixed temperature, along with two ground-facing cameras and GPS for long-term fine tuning.
  • All the videos I saw were from the California coastline!!!

But I did it my way, the DIY experimentation way, resulting ultimately in passive temperature stability by wrapping the IMU in a case to suppress wind cooling, combined with a Butterworth low-pass filter to extract gravity long-term, and the LiDAR / RPi camera to track the middle ground.  I perfected reinvention of the wheel!
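
For the curious, the gravity extraction can be sketched with SciPy’s Butterworth support – the order, cutoff and stand-in samples here are illustrative, not the values from the flight code:

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 500.0          # IMU sample rate, Hz
cutoff = 0.1        # Hz: gravity drifts far more slowly than real motion

# stand-in for real accelerometer Z samples: gravity plus sensor noise
accel_z = 9.80665 + 0.05 * np.random.randn(int(fs) * 60)

# 2nd-order Butterworth low-pass: its output tracks the slowly drifting
# gravity estimate, leaving real accelerations in the residual
b, a = butter(2, cutoff / (fs / 2.0))
gravity_z = lfilter(b, a, accel_z)     # long-term gravity component
motion_z = accel_z - gravity_z         # what's left is real acceleration
```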

Hindsight is wonderful, isn’t it!


*My apologies for the complexity of this sentence; it reflects the frustration and complexity I encountered working this out over the years!

That’s all, folks…*

I think I’ve done all that can be done with my Raspberry Pi, Python piDrones.  The code is updated on GitHub as a result.  Here are the Vimeo links to the proof-of-the-pudding, as it were:

The hardest by far though was the simplest in concept: a stable autonomous hover beyond a few seconds; each of the cool functions listed above probably took a few weeks on average; in contrast, the one-minute-plus hover took years.

There are lots more videos on Vimeo, linked to the blog via pidrone.io/video.

I’ve achieved my personal target and then some: taking a Raspberry Pi and Python to create a piDrone, starting from absolute ignorance, and doing it my way without using others’ code, ideas or suggestions.

What else could be done?  My only idea is long distance / time flights requiring:

  1. GPS fused with existing distances sensors
  2. Compass fused with existing orientation sensors
  3. Long range wireless connectivity
  4. Nuclear-fusion batteries.

Lack of #4 renders 1-3 pointless.

Also pointless, sadly, is Penelope; Hermione, my big 3B, is the queen of autonomous control and Zoe, my Pi0W, the queen of human control.  Two’s company, three’s a crowd.  The only unique thing I can do with ‘P’ is to get her RPi 3B+ and Stretch O/S completed, and my motivation is lacking; that makes Penelope the queen of spare parts 😥

Time for me to find another hobby to hold back my terminal-boredom-syndrome.  On my bike, methinks.

So long, and thanks for all the fish!


* …nearly.  I’m doing some refinement for Zoe, primarily so I can take her to the Cotswold Raspberry Jams, and to try out anything new and exciting the next RPi releases bring.

What Zoe saw.

I’ve finally found and almost resolved the instability problem with Zoe.  Zoe is a Pi0W compared to Hermione’s 3B.  As a result, she has a single CPU at a lower frequency, and she’s struggling to keep up.  In particular, the LiDAR / video distances increasingly lag behind the IMU:

Distance

As a result, once fused, Zoe was reacting to historic movements and so drifting.  The IMU data processing takes priority, and slowing it down to 333Hz (from 500Hz) frees enough CPU time to process the LiDAR / video distances in sync with the IMU (the rate change itself is just a register tweak – see the sketch below).  Here’s the result for a simple 10 second hover.

There is still lagging drift, but much less than seen previously; the drift is still cumulative, though, hence my next step is to reduce the video frame size a little more.

While this might not be the reason behind the RC drift, it cannot have been helping.
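
The register tweak mentioned above: on the MPU-9250, the output rate is the internal 1kHz divided by (1 + SMPLRT_DIV), so a divider of 1 gives 500Hz and 2 gives 333Hz.  Something along these lines (address and register per the datasheet; error handling omitted):

```python
import smbus

MPU9250_ADDR = 0x68    # default I2C address
SMPLRT_DIV = 0x19      # sample rate divider register

bus = smbus.SMBus(1)
# output rate = 1kHz / (1 + divider): 1 -> 500Hz, 2 -> 333Hz
bus.write_byte_data(MPU9250_ADDR, SMPLRT_DIV, 2)
```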


By the way, the fact the incrementally lagging drift is consistently left / right strongly suggests I need to reduce the port / starboard PID gain to account for the weight balance, primarily the LiPo battery aligned fore / aft in the frame.  On the plus side, without this flaw, I’d never have been able to diagnose the drift problem so clearly and quickly!

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; of the items below, all but the first are minor, mostly optional extras:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may be beyond her single-CPU Pi0W.
  • Fuse the magnetometer / gyroscope 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro (see the sketch after this list).
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she took off facing.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating system to Raspbian Stretch, with the corresponding requirements for the I2C fix and the WAP / udhcpd / dnsmasq networking; these currently keep the OS stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz, 320² pixels and IMU 500Hz to 20Hz, 480² pixels and 1kHz respectively.  However, every previous attempt to upgrade one leaves the scheduling unable to process the others – I suspect I’ll need to wait for the Raspberry Pi 4 or 5 for the increased performance.
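
For what the magnetometer / gyro yaw fusion might look like, here’s a minimal complementary-filter sketch – trust the gyro short-term, let the compass heading correct long-term drift.  The weight and the loop rate are illustrative, not values from the flight code:

```python
import math

ALPHA = 0.98          # trust the gyro short-term, the compass long-term
yaw = 0.0             # fused yaw estimate, radians
dt = 1.0 / 500.0      # 500Hz loop

def fuse_yaw(gyro_z, mag_x, mag_y):
    """gyro_z: yaw rate in rad/s; mag_x/y: horizontal magnetometer reads."""
    global yaw
    compass = math.atan2(mag_y, mag_x)     # long-term absolute heading
    gyro_yaw = yaw + gyro_z * dt           # short-term integrated rate
    # wrap the compass correction into ±π of the gyro estimate first
    error = math.atan2(math.sin(compass - gyro_yaw),
                       math.cos(compass - gyro_yaw))
    yaw = gyro_yaw + (1.0 - ALPHA) * error
    return yaw
```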

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’’s clearance, and the AI required to remember and explore only unexplored areas in the search for the centre.
  • Fuse GPS latitude, longitude and altitude, the down-facing LiDAR + video, and double-integrated acceleration (ΣΣ a δt δt) for vertical + horizontal distance – this requires further connections between the various processes such that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic (a sketch of one possible hand-over follows this list).
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video which currently isn’t possible with the RPi 3B.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France, landing in Madrid, Spain or Barcelona, Catalonia!
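
As a sketch of how that fuzzy hand-over might look – entirely illustrative thresholds, not flight code:

```python
def blend_distance(gps_d, lidar_d, height):
    """Fuzzy hand-over between LiDAR/video and GPS distance estimates.

    Below 1m the LiDAR/video pair sees the ground well; above 5m it
    can't, so GPS takes over; in between, blend linearly.  The
    thresholds are illustrative guesses, not tuned values.
    """
    if height < 1.0:
        weight = 0.0                      # all LiDAR / video
    elif height > 5.0:
        weight = 1.0                      # all GPS
    else:
        weight = (height - 1.0) / 4.0     # linear ramp between the two
    return weight * gps_d + (1.0 - weight) * lidar_d
```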

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Speedy Gonzales

Currently, all speeds, both horizontal and vertical, are set to 0.3m/s for the sake of safety in enclosed arenas like indoors and the walled back garden.  The downside is that in the park, with the GPS waypoints perhaps 20m apart, it takes a very long time, often over a minute, between waypoints, wearing out the batteries in a few flights.

The limitation other than safety is to ensure the down-facing video can track the difference between frames, which means there needs to be a significant overlap between consecutive frames.

The video runs at 10Hz*.  The RPi camera angle of view (AOV) is 48.8°.  With the camera 1m off the ground (the standard set throughout all flights)**, 48.8° corresponds to roughly 90cm of horizontal ground coverage (2 × 1m × tan(AOV / 2)).  Assuming there needs to be a 90% overlap between frames to get accurate video macro-block vectors, every 0.1s Hermione can move up to 9cm (10%), or about 0.9m/s, compared to the current 0.3m/s.  I’ll be trying this out on the GPS tracking flights in the park tomorrow.
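
For the sceptical, the same arithmetic as a runnable check:

```python
import math

aov = math.radians(48.8)   # camera angle of view
height = 1.0               # metres above ground
fps = 10.0                 # video frame rate
overlap = 0.9              # fraction shared between consecutive frames

ground_width = 2.0 * height * math.tan(aov / 2.0)   # ≈ 0.91m at 1m height
max_speed = ground_width * (1.0 - overlap) * fps    # ≈ 0.91m/s
print("%.2fm frame width -> %.2fm/s max" % (ground_width, max_speed))
```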


*10Hz seems to be about the highest video frequency that the macro-block processing can handle without causing other sensor processing to overflow – specifically the IMU FIFO.

**1 metre height is for the sake of safety, and because at that height the video’s 320² pixel macro-blocks can resolve distance accurately on grass and gravel.  Doubling the height requires quadrupling the video frame size to 640² to get the same resolution on grass / gravel, and once again, the extra processing time would cause IMU FIFO overflows.
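
The footnote’s scaling is easy to sanity-check with the same AOV assumption as above – the ground resolution per pixel stays constant when both height and frame size double:

```python
import math

aov = math.radians(48.8)

def mm_per_pixel(height_m, frame_px):
    # ground coverage of the frame, divided across its pixels
    return 2.0 * height_m * math.tan(aov / 2.0) / frame_px * 1000.0

print(mm_per_pixel(1.0, 320))   # ≈ 2.8mm per pixel at 1m, 320²
print(mm_per_pixel(2.0, 640))   # same ≈ 2.8mm per pixel at 2m, 640²
```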


P.S. The weather isn’t yet good enough for the GPS tracking flights in the park, but I did take Hermione into the back garden this morning to test her increased horizontal velocity; she happily ran at 1m/s over the grass, so that will be the new speed for the much longer distance GPS flights, reducing Hermione’s flight time and hence both her and the DJI Mavic’s battery drain.

MPU-9250 vs. Garmin LiDAR Lite

I had hoped yesterday to get going with Sweep integration, with a sanity-check flight beforehand just to ensure all was running well – I can’t afford to have crashes with Sweep installed.

And sure enough, Hermione crashed.  In the middle of the climbing phase of the flight, she suddenly leapt into the air, and the protection code killed her at the point her height exceeded the flight plan height by 50cm.  At the speed she was climbing, she continued to rise a couple more metres before crashing down into a shrub bed, luckily limiting the damage to components I had spares for.
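
For context, the protection check itself is conceptually trivial – something along these lines (names illustrative, not the real flight code):

```python
HEIGHT_MARGIN = 0.5   # metres above the flight-plan height before aborting

class FlightAbort(Exception):
    pass

def height_check(actual_height, target_height):
    # kill the flight if the drone climbs 50cm beyond what the flight
    # plan demands at this point
    if actual_height - target_height > HEIGHT_MARGIN:
        raise FlightAbort("height exceeds flight plan by %.2fm"
                          % (actual_height - target_height))
```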

A second mandatory flight to collect diagnostics (and more crash damage) revealed a conflict over I2C between the IMU and the ground-facing LiDAR.  The LiDAR won, and the IMU started seeing gravity as just about 0g.  This isn’t the first time this has happened, and I’ve tried various guessed solutions to fix it.

Accelerometer vs. Garmin LiDAR Lite

The left graph is height: blue is the Garmin reading and is correct; orange is the target – what should be happening; and grey is double-integrated acceleration, which is a very close match to the Garmin right up to the point it all goes very wrong.  Looking in more detail at the right graph shows the accelerometer results dropped just before 3.5s, about 0.5s before hover would have started.

This isn’t my code at fault; best guess is an interaction over I2C between the LiDAR and IMU, and the IMU loses.  I’ve seen similar IMU damage before, and without more detail, my only option is to add a new one and try again.

Back on the wagon

I’ve flown my Mavic probably for 20 minutes over the course of 5 short flights simply to get familiar with the controls while dodging the rain showers of the last couple of days.  I’m back inside again trying to track down why Hermione has started throwing her I²C wobbly again.

Motion processing is working well, keeping processing close to the minimum 100Hz regardless of other sensor inputs – here 156 samples were processed in 1.724s.

Processing rate

Garmin’s height is running stably at the intended 20Hz, and it’s well within the accuracy possible for distances less than 1m.

Garmin LiDAR v3

Here’s the problem though: the IMU is fine for the 862 samples averaged into the first 155 motion-processing blocks, showing just gravity as Hermione sits on the ground, but then the IMU values spike for no reason in the 156th block’s average.  Note that this happens only when the Garmin is plugged in.  There are in fact two spikes: the first is shown; the second causes an I/O exception and the diagnostics are dumped:

IMU stats

I’ve tried power supplies up to 3.4A, both battery- and mains-powered; I’ve resoldered various critical PCB joints; I’ve added the 680µF capacitor the Garmin spec suggests, despite Zoe being fine without it; and I’ve used a newly flashed SD card, all to no avail.

I have two things left to try:

  • Currently the Garmin is read every motion-processing loop, despite it only updating at 20Hz; the spec says there’s an interrupt, but as yet, I’ve not got it to work.  Must try harder!  (A sketch of the intended shape follows this list.)
  • Failing that, I’ll have to replace the MPU-9250 with another and see if the current one is faulty.
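
The interrupt-driven shape I’m aiming for would look something like this – the pin number and helpers are illustrative stand-ins, assuming the Garmin’s status line is wired to a GPIO and drops when a reading is ready:

```python
import RPi.GPIO as GPIO

GARMIN_READY_GPIO = 25   # illustrative pin wired to the Garmin's status line

def read_garmin_distance():
    """Stand-in for the existing I2C distance read."""
    return 0.0

def garmin_ready(channel):
    # touch the I2C bus only when the Garmin signals a reading is waiting,
    # rather than polling it every motion-processing loop
    distance = read_garmin_distance()

GPIO.setmode(GPIO.BCM)
GPIO.setup(GARMIN_READY_GPIO, GPIO.IN, pull_up_down=GPIO.PUD_UP)
# fire on the falling edge when the Garmin flags new data
GPIO.add_event_detect(GARMIN_READY_GPIO, GPIO.FALLING, callback=garmin_ready)
```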

Beyond these two, I’m out of ideas.

Zoe++

Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt, and she’s running the video at 400 × 400 pixels at 10fps.

The results?

And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong, I can (will?) probably add minor tweaks to process compass data – the code is already collecting it; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forward movement in no wind.  But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep.  I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.


* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler.  Apparently, it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums.  Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed.  Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.