I think I’ve done all that can be done with my Raspberry Pi / Python piDrones. The code is updated on GitHub as a result. Here are the Vimeo links to the proof of the pudding, as it were:
The hardest by far though was the simplest in concept: a stable autonomous hover beyond a few seconds; each of the cool functions listed above probably took a few weeks on average; in contrast, the one-minute-plus hover took years.
There are lots more videos on Vimeo linked from the blog via pidrone.io/video.
I’ve achieved my personal target and then some: taking a Raspberry Pi and Python to create a piDrone, starting from absolute ignorance, and doing it my way without using others’ code, ideas or suggestions.
What else could be done? My only idea is long distance / time flights, requiring:
1. GPS fused with the existing distance sensors
2. Compass fused with the existing orientation sensors
3. Long-range wireless connectivity
4. Nuclear-fusion batteries.
Lack of #4 renders 1–3 pointless.
Also pointless, sadly, is Penelope; Hermione, my big 3B, is the queen of autonomous control, and Zoe, my Pi0W, the queen of human control. Two’s company, three’s a crowd. The only unique thing I could do with ‘P’ is to get her RPi 3B+ and Stretch O/S build completed, and my motivation is lacking; that makes Penelope the queen of spare parts 😥
Time for me to find another hobby to hold back my terminal-boredom syndrome. On my bike, methinks.
So long, and thanks for all the fish!
* …nearly. I’m doing some refinement for Zoe, primarily so I can take her to the Cotswold Raspberry Jams, plus anything new and exciting the RPi Foundation releases next.
I’ve finally found and almost resolved the instability problem with Zoe. Zoe is a Pi0W compared to Hermione’s 3B; as a result she has a single CPU core at a lower frequency and she’s struggling to keep up. In particular, the LiDAR / video distance processing increasingly lags behind the IMU:
As a result, once fused, Zoe is reacting to historic movements and so drifting. The IMU data processing takes priority, and slowing it down to 333Hz (from 500Hz) leaves enough spare time to process the LiDAR / video distance so it stays in sync with the IMU. Here’s the result for a simple 10 second hover.
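To put rough numbers on why dropping the IMU rate helps, here’s a back-of-the-envelope sketch; the 12-byte sample size and 100Hz motion-processing rate are my assumptions for illustration, not figures lifted from the flight code:

```python
SAMPLE_BYTES = 12   # assumed FIFO payload per sample: accel + gyro, 6 x int16
MOTION_HZ = 100     # assumed motion-processing / ESC update rate

def fifo_bytes_per_block(imu_hz, motion_hz=MOTION_HZ, sample_bytes=SAMPLE_BYTES):
    """Bytes to drain from the IMU FIFO per motion-processing block."""
    return sample_bytes * imu_hz // motion_hz

print(fifo_bytes_per_block(500))  # 60 bytes every 10ms at 500Hz
print(fifo_bytes_per_block(333))  # 39 bytes - roughly a third less I2C time,
                                  # freeing the Pi0W to keep the LiDAR / video
                                  # fusion in step with the IMU
```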
There is still lagging drift, but much less than seen previously; the drift is still cumulative, hence my next step is to reduce the video frame size a little more.
While this might not be the reason behind the RC drift, it cannot have been helping.
By the way, the fact that the incrementally lagging drift is consistently left / right strongly suggests I need to reduce the port / starboard PID gains to account for the weight balance – primarily the LiPo battery aligned fore / aft in the frame. On the plus side, without this flaw I’d never have been able to diagnose the drift problem so clearly and quickly!
It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.
Five years later, phase ‘one’ is all but done; what remains, barring the first item, is minor, mostly optional extras:
- Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero-temperature conditions.
- Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
- Fuse the magnetometer / gyroscope 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro.
- Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she was facing at take-off. I’ve tried this several times, and it’s always had a problem I couldn’t solve. Third time lucky.
- Upgrade the operating system to Raspbian Stretch, with the corresponding requirements for the I2C fix and the network WAP / udhcpd / dnsmasq setup, which currently mean the OS is stuck on Jessie from the end of February 2017.
- Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera frames from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz. However, every previous attempt to upgrade one leaves the scheduling unable to process the others – I suspect I’ll need to wait for the Raspberry Pi 4 or 5 for the increased performance.
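On the Stretch networking bullet: udhcpd gives way to dnsmasq there, and a minimal DHCP configuration for the piDrone’s access point looks something like the fragment below – the interface name and address range are examples for illustration, not my actual setup:

```
# /etc/dnsmasq.conf - minimal DHCP for the piDrone WAP
interface=wlan0
dhcp-range=192.168.42.2,192.168.42.10,255.255.255.0,24h
```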
Looking into the future…
- Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ clearance, and the AI required to remember explored areas and react by exploring only unexplored ones in the search for the centre.
- Fuse GPS latitude, longitude and altitude with the down-facing LiDAR + video and the ΣΣ acceleration δt δt integration for vertical + horizontal distance – this requires further connections between the various processes so that GPS talks to the motion process, which does the fusion. It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic.
- Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video which currently isn’t possible with the RPi 3B.
- Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!
These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach. Although a solution to the WiFi range may be possible now, the others need future technology, at least one of which may not be available within my lifetime :-).
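As a sketch of the fuzzy GPS / LiDAR swap mentioned above – all thresholds, names and the linear blend are mine, invented for illustration, not anything from the flight code:

```python
def height_fusion_weight(height_m, lidar_max=2.0, blend=1.0):
    """Blend factor: 1.0 = trust LiDAR / video fully, 0.0 = GPS only.
    Ramps linearly from LiDAR to GPS as height approaches the LiDAR's
    useful range (lidar_max and the blend band are made-up examples)."""
    if height_m <= lidar_max - blend:
        return 1.0
    if height_m >= lidar_max:
        return 0.0
    return (lidar_max - height_m) / blend

def fused_height(lidar_h, gps_h, height_estimate):
    # Weighted mix of the two height sources based on the current estimate.
    w = height_fusion_weight(height_estimate)
    return w * lidar_h + (1.0 - w) * gps_h
```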
Wishing you all a happy New Year and a great 2018!
Currently, all speeds, both horizontal and vertical, are set to 0.3m/s for the sake of safety in enclosed arenas like indoors and the walled back garden. The downside is that in the park, with the GPS waypoints perhaps 20m apart, it takes a very long time – often over a minute between waypoints – wearing out the batteries in just a few flights.
The other limitation, besides safety, is to ensure the down-facing video can track the difference between frames, which means there needs to be significant overlap between consecutive frames.
The video runs at 10Hz*. The RPi camera angle of view (AOV) is 48.8°. With the camera 1m off the ground (the standard set throughout all flights)**, 48.8° corresponds to roughly 90cm horizontal distance (2 × 1m × tan(AOV / 2)). Assuming there needs to be a 90% overlap between frames to get accurate video macro-block vectors, every 0.1s Hermione can move up to 9cm (10%) or 0.9m/s, compared to the current 0.3m/s. I’ll be trying this out on the GPS tracking flights in the park tomorrow.
*10Hz seems to be about the highest video frequency that the macro-block processing can handle without causing other sensor processing to overflow – specifically the IMU FIFO.
**1 metre height is for the sake of safety, and because the video’s 320² pixel macro-blocks can resolve distance accurately on grass and gravel at that height. Doubling the height requires quadrupling the video frame size to 640² to get the same resolution on grass / gravel, and once again the processing time required would cause the IMU FIFO to overflow.
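The frame-overlap arithmetic can be bundled into a helper; this is just a sketch of the calculation described here (48.8° AOV, 1m height, 90% overlap, 10Hz), not code from the flight software:

```python
import math

def max_speed(aov_deg=48.8, height_m=1.0, overlap=0.9, fps=10):
    """Maximum horizontal speed that still leaves `overlap` between
    consecutive frames of the down-facing video."""
    # Ground width covered by one frame at the given height.
    frame_width = 2 * height_m * math.tan(math.radians(aov_deg / 2))
    # The drone may move the non-overlapping fraction per frame period.
    return (1 - overlap) * frame_width * fps

print(round(max_speed(), 2))  # ~0.91 m/s with these figures
```

Halving the overlap requirement or doubling the frame rate would double the permitted speed, at the cost of macro-block accuracy or CPU load respectively.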
P.S. The weather isn’t yet as good as I’d hoped for the GPS tracking flights in the park, but I did take Hermione into the back garden this morning to test her increased horizontal velocity; she happily ran at 1m/s over the grass, so that will be the new speed for the much longer distance GPS flights, reducing Hermione’s flight time and hence both her and the DJI Mavic’s battery drain.
I had hoped yesterday to get going with Sweep integration, with a sanity-check flight beforehand just to ensure all was running well – I can’t afford to have crashes with Sweep installed.
And sure enough, Hermione crashed. In the middle of the climbing phase of the flight she suddenly leapt into the air, and the protection code killed her at the point her height exceeded the flight-plan height by 50cm. At the speed she was climbing, she continued to rise a couple more metres before crashing down into a shrub bed, luckily limiting the damage to components I had spares for.
A second, mandatory flight to collect diagnostics (and more crash damage) revealed a conflict over I2C between the IMU and the ground-facing LiDAR. The LiDAR won, and the IMU started seeing gravity as just about 0g. This isn’t the first time this has happened, and I’ve tried various guessed solutions to fix it.
Accelerometer vs. Garmin LiDAR Lite
The left graph is height: blue is the Garmin and is correct; orange is the target – what should be happening; and grey is the double-integrated acceleration, which is a very close match to the Garmin right up to the point it all goes very wrong. Looking in more detail at the right graph shows the accelerometer results dropped just before 3.5s, about 0.5s before the hover would have started.
This ain’t my code; best guess is an interaction over I2C between the LiDAR and the IMU, where the IMU loses. I’ve seen similar IMU damage before, and without more detail my only option is to fit a new one and try again.
I’ve flown my Mavic probably for 20 minutes over the course of 5 short flights simply to get familiar with the controls while dodging the rain showers of the last couple of days. I’m back inside again trying to track down why Hermione has started throwing her I²C wobbly again.
Motion processing is working well, keeping processing close to the minimum 100Hz regardless of other sensor inputs – here 156 samples were processed in 1.724s.
Garmin’s height is running stably at the intended 20Hz, and it’s well within the accuracy possible for distances less than 1m.
Garmin LiDAR v3
Here’s the problem though: the IMU is fine for the 862 samples averaged into the first 155 motion-processing blocks, showing just gravity as Hermione sits on the ground, but then the IMU values spike for no reason in the 156th sample average. Note that this happens only when the Garmin is plugged in. There are in fact two spikes: the first is shown; the second causes an I/O exception and the diagnostics are dumped:
I’ve tried power supplies up to 3.4A, both battery- and mains-powered; I’ve resoldered various critical PCB joints; I’ve added the 680µF capacitor the Garmin spec suggests, despite Zoe being fine without it; and I’ve used a newly flashed SD card – all to no avail.
I have two things left to try:
- Currently the Garmin is read on every motion-processing loop despite it only updating at 20Hz; the spec says there’s an interrupt, but as yet I’ve not got it to work. Must try harder!
- Failing that, I’ll have to replace the MPU-9250 with another, and see if the current one is faulty.
Beyond these two, I’m out for ideas.
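While the interrupt remains unsolved, a halfway house for the first option is simply not to touch the Garmin’s I2C bus on motion loops where no new 20Hz sample can exist yet. A minimal sketch – the class, parameter names and structure are mine, not the real flight code:

```python
import time

class GarminPoller:
    """Rate-limit Garmin reads to its 20Hz update rate, instead of
    polling the I2C bus on every 100Hz motion-processing loop."""

    def __init__(self, read_fn, hz=20, clock=time.time):
        self.read_fn = read_fn        # the existing I2C distance read
        self.interval = 1.0 / hz
        self.clock = clock
        self.next_due = clock()
        self.distance = None          # most recent LiDAR distance

    def update(self):
        """Call once per motion loop; reads the Garmin only when due."""
        now = self.clock()
        if now >= self.next_due:
            self.distance = self.read_fn()
            self.next_due = now + self.interval
            return True               # fresh sample this loop
        return False                  # skipped - no I2C traffic
```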
Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks. She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt, and she’s running the video at 400 × 400 pixels at 10fps.
And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*. Don’t get me wrong, I can (will?) probably add minor tweaks to process the compass data – the code is already collecting it; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forwards movement in no wind. But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep. I hope that’s what the A3 eventually brings.
I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.
* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler. Apparently it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums. Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed. Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.
A few test runs. In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight. Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (a kill ensued). I think that’s pretty conclusive the fusion code works!
It’s Charles Babbage’s 224th birthday today, so how better to celebrate than to take his namesake, the Raspberry Pi Babbage Bear, for a flight!
I’m so pleased with the QCIMUFIFO.py (Quadcopter Inertial Motion Unit First In First Out) code that I’ve decided to make it the primary development source, renaming Quadcopter.py to QCDRI.py (Quadcopter Data Ready Interrupt). They are both on GitHub along with a version of qc.py which makes it easier to select which to run.
OK, so the FIFO is good, and definitely better than using the hardware interrupt, but far from perfect. It does capture every sample regardless of what else is going on, which is great, but due to two factors it doesn’t actually free up time for other inputs to be read. This doesn’t mean other inputs can’t be read, but reading those inputs delays the next ESC update, meaning the flight might be jittery, perhaps to the extent of being unstable.
The two factors are:
- Reading the FIFO register is a byte-by-byte operation rather than the single 14-byte read possible when reading the sensor registers directly – this is slower.
- To ensure the ESCs are updated at a reasonable frequency (100Hz is a good value), it’s now necessary to call time.time() a couple of times per loop, which, as I’ve mentioned before, ironically wastes time.
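One way to tackle the second factor is to use the FIFO itself as the clock: at 500Hz each sample marks 2ms, so a batch of five marks 10ms and can trigger an ESC update without calling time.time() at all. A sketch under those assumptions – function names are mine, and this isn’t the actual Quadcopter.py code:

```python
IMU_HZ = 500
ESC_HZ = 100
SAMPLES_PER_ESC = IMU_HZ // ESC_HZ   # 5 x 2ms samples = one 10ms ESC period

def average(samples):
    return sum(samples) / len(samples)

def motion_batches(fifo_samples):
    """Yield one averaged value per ESC update period, using the FIFO
    sample count rather than time.time() to mark the passage of time."""
    batch = []
    for sample in fifo_samples:
        batch.append(sample)
        if len(batch) == SAMPLES_PER_ESC:
            yield average(batch)     # feeds the PIDs -> next ESC pulse widths
            batch = []
```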
There are a couple of plus sides too:
- Because it doesn’t use the hardware interrupt, it doesn’t need the custom GPIO performance tweaks I had to make – this satisfies my desire to use standard Python libraries if at all possible.
- Not using the GPIO library (mine or the standard one) partially opens up the possibility of using PyPy, although that still needs testing as the RPIO library is still required for the hardware PWM.
Anyway, the new props for Zoe arrived today, so the next step is to check both the interrupt and FIFO code to see how they perform in a real flight.