It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.
5 years later, phase ‘one’ is all but done; the remaining items are minor, mostly optional extras, barring the first:
- Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
- Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may not be possible because she’s just a single-core Pi0W.
- Fuse the magnetometer / gyro 3D readings for long-term angle stability, particularly yaw, which has no backup long-term sensor beyond the gyro.
- Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the same direction she took off at. I’ve tried this several times, and it’s always had a problem I couldn’t solve. Third time lucky.
- Upgrade the operating systems to Raspbian Stretch with corresponding requirements for the I2C fix and network WAP / udhcpd / dnsmasq which currently means the OS is stuck with Jessie from the end of February 2017.
- Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera resolution from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz. However, every previous attempt to upgrade one leaves the scheduling no longer able to process the others – I suspect I’ll need to wait for the Raspberry Pi B 4 or 5 for the increased performance.
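On the magnetometer / gyro fusion above, the usual starting point is a complementary filter: integrate the gyro for short-term accuracy, and lean gently on the magnetometer heading for long-term stability. A minimal sketch – the function name, the alpha value and the radians / ACW-positive conventions are my assumptions, not anything from the real code:

```python
import math

def fuse_yaw(yaw, gyro_z, mag_yaw, dt, alpha=0.98):
    """One complementary-filter step: gyro for the short term,
    magnetometer for the long term.  alpha near 1 trusts the gyro."""
    # Integrate the gyro yaw rate (radians / second) over this sample.
    gyro_yaw = yaw + gyro_z * dt

    # Blend on the unit circle to avoid the +/- pi wrap-around problem.
    error = math.atan2(math.sin(mag_yaw - gyro_yaw),
                       math.cos(mag_yaw - gyro_yaw))
    return gyro_yaw + (1.0 - alpha) * error
```

Run per IMU sample, this drifts with the gyro over milliseconds but converges onto the magnetometer heading over seconds, which is exactly the long-term backup yaw currently lacks.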
Looking into the future…
- Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ clearance, and the AI required to remember where she’s been and explore only the unexplored areas in the search for the centre.
- Fuse GPS latitude, longitude and altitude / down-facing LiDAR + video / ΣΣ acceleration δt δt fusion for vertical + horizontal distance – this requires further connections between the various processes such that GPS talks to the motion process which does the fusion. It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic.
- Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, constrained by the need for much higher-resolution video which currently isn’t possible with the RPi B3.
- Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!
These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach. Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).
Wishing you all a happy New Year and a great 2018!
For the last few days, the outdoor temperature dropped to the teens from the twenties. As a result, she didn’t get off the ground; she just jittered around. The graph of double-integrated (accelerometer – gravity) shows she believed she was climbing right at the point the IMU temperature dropped.
Not again 🙁
This looks awfully similar to this.
The problem is that in these cooler temperatures, once the props start spinning, the IMU cools, and the temperature sensitive accelerometer output shifts as a result. Because I take a snapshot of gravity prior to the props spinning, net acceleration (accelerometer – gravity) is wrong, so integrating it twice to get distances is very very wrong.
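The scale of the error is easy to see: a constant residual bias b, left over because the gravity snapshot no longer matches the drifted accelerometer, double-integrates into ½·b·t² of phantom distance. A minimal sketch with a made-up bias value (simple Euler integration, much cruder than the real code):

```python
def integrate_twice(samples, dt):
    """Double-integrate net acceleration samples (m/s^2) into distance (m)."""
    velocity = 0.0
    distance = 0.0
    for a in samples:
        velocity += a * dt         # first integration: acceleration -> velocity
        distance += velocity * dt  # second integration: velocity -> distance
    return distance

# A mere 0.05 m/s^2 of temperature shift looks like roughly
# 0.5 * 0.05 * 5**2 = 0.625m of phantom climb after 5 seconds.
bias = 0.05
phantom = integrate_twice([bias] * 500, 0.01)
```

The t² growth is why even a tiny temperature-induced offset is fine for a second or two and then very, very wrong.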
The problem showed up this time because, to add the salad bowl lid, I had to remove her Pimoroni Tangerine Dream PiBow, thus exposing her to the elements. For now, I’ve just moved the initial read of gravity to immediately before take-off, but I’m also working out how to reintroduce the PiBow case or similar in a lower-profile version so that the salad bowl still fits neatly on top.
Hermione is still causing trouble with yaw control flights despite lots of refinements. Here’s the latest.
@3s she’s climbed to about a meter high and then hovered for a second. All the X, Y, and Z flight plan targets and sensor inputs are nicely aligned.
The ‘fun’ starts at 4 seconds. The flight plan, written from my point of view, says move left by 1m over 4 seconds. From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds. The yaw graph from the sensors shows the ACW rotation is happening correctly. The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero. Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s. All’s good as far as the targets are concerned, from her POV and mine.
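That earth-to-body translation of the flight-plan targets is just a rotation by the current yaw. A sketch with my own conventions (X forwards, Y left, yaw in radians, ACW positive – not necessarily the axes the real code uses):

```python
import math

def earth_to_body(dx_e, dy_e, yaw):
    """Rotate an earth-frame distance target (dx_e forwards-at-takeoff,
    dy_e left) into the body frame at the current yaw angle."""
    c, s = math.cos(yaw), math.sin(yaw)
    dx_b = dx_e * c + dy_e * s
    dy_b = -dx_e * s + dy_e * c
    return dx_b, dy_b
```

With these conventions, the earth-frame target “left 1m” = (0, 1) at 90° ACW yaw comes out as (1, 0): forwards 1m from her POV, matching the amber target lines described above.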
But there’s a severe discrepancy from the sensor inputs’ POV. From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me, instead of left. The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this; it shows she moves right by about 0.8m from her POV. But the rusty terracotta line in the Y graph (the double-integrated accelerometer – gravity readings) shows exactly the opposite. The grey fusion of the blue and terracotta lines cancels out, thus following the target perfectly but for completely the wrong reasons.
There are similar discrepancies in the X graph, where the LiDAR + Video blue line is the best match to what I saw: virtually no forward movement from H’s POV except for some slight forward movement after 8s when she should be hovering.
So the net of this? The LiDAR / video processing is working perfectly. The double-integrated IMU accelerometer results are wrong, and I need to work out why. The results shown are taken directly from the accelerometer and double-integrated in Excel (much like the code does), and I’m pretty convinced I’ve got this right. Yet more digging to be done.
In other news…
- Ö has ground facing lights much like Zoe had. Currently they are always on, but ultimately I intend to use them in various ways such as flashing during calibration etc – this requires a new PCB however to plug a MOSFET gate into a GPIO pin.
- piNet has changed direction somewhat: I’m testing within the bounds of my garden whether I can define a target destination with GPS, and have enough accuracy for the subsequent flight from elsewhere to get to that target accurately. This is step one in taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.
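For garden-scale GPS targeting like this, a flat-earth approximation is plenty to turn two coordinate pairs into north / east metres to fly. A sketch – the function and constant names are mine, not from piNet:

```python
import math

EARTH_RADIUS = 6371000.0  # mean earth radius in metres

def gps_vector(lat1, lon1, lat2, lon2):
    """Approximate north / east distances (metres) from the current
    position (lat1, lon1) to the target (lat2, lon2), in degrees, using
    an equirectangular (flat-earth) approximation - fine over tens of
    metres, where earth curvature is negligible."""
    d_lat = math.radians(lat2 - lat1)
    d_lon = math.radians(lon2 - lon1)
    north = d_lat * EARTH_RADIUS
    # Longitude lines converge towards the poles, hence the cos() scaling.
    east = d_lon * EARTH_RADIUS * math.cos(math.radians((lat1 + lat2) / 2))
    return north, east
```

The catch for maze-centre accuracy is that consumer GPS error (typically a few metres) is the same order as a garden, which is presumably exactly what the garden testing is meant to quantify.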
That’s all for now, folks. Thanks for sticking with me during these quiet times.
P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3s after the yaw to move left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.
Hermione’s “reach for the stars” was due to I²C errors; I suspected power brown-outs. Her regulator for the LiPo provided only 1.5A, so I tried her passively with a mains PSU of 5V at 1A and 2.5A – the error was the same: shifted outputs from the IMU FIFO without any FIFO overflow. That suggested an interaction between the I²C bus and the Garmin instead. I rebuilt the cable with two UTPs (unshielded twisted pairs): SCL with Vss / GND and SDA with Vdd / 5V, as per the PX4FLOW spec for long I²C wiring. I was stunned – it just worked: regardless of whether the 1A or 2.5A power supply was used, I no longer got any I²C corruption. The next step clearly is to test her live outdoors and check she no longer reaches for the stars.
I also had the bottle to let Zoe loose in the play room. She still hardly got off the ground on the first flight, so it’s not temperature drift. However, her second run was perfect, which reminded me that her first run was always cr@p for some reason. Here are the stats for both, logging both the accelerometer and the Garmin / camera distances. There’s such a tight correlation between the very different sensors that I’m very tempted to turn the fusion on. Just a tad more bottle needed. The key graph for each is the bottom left: how high was she according to the two sensor sources.
I think that’s my courage bottle empty for the day. When it’s charged up tomorrow, I’ll take the sisters outside to test the above next steps.