That’s all, folks…*


I think I’ve done all that can be done with my Raspberry Pi, Python piDrones.  The code is updated on GitHub as a result.  Here are the Vimeo links to the proof of the pudding, as it were:

The hardest by far, though, was the simplest in concept: a stable autonomous hover beyond a few seconds.  Each of the cool functions listed above probably took a few weeks on average; in contrast, the one-minute-plus hover took years.

There are lots more videos on Vimeo, linked to the blog via pidrone.io/video.

I’ve achieved my personal target and then some: taking a Raspberry Pi and Python to create a piDrone, starting from absolute ignorance, and doing it my way without using others’ code, ideas or suggestions.

What else could be done?  My only idea is long-distance / long-duration flights, requiring:

  1. GPS fused with the existing distance sensors
  2. Compass fused with existing orientation sensors
  3. Long range wireless connectivity
  4. Nuclear-fusion batteries.

Lack of #4 renders 1-3 pointless.

Also pointless, sadly, is Penelope; Hermione, my big 3B, is the queen of autonomous control, and Zoe, my Pi0W, the queen of human control.  Two’s company, three’s a crowd.  The only unique thing I can do with ‘P’ is to complete her RPi 3B+ and Stretch O/S support, and my motivation is lacking; that makes Penelope the queen of spare parts 😥

Time for me to find another hobby to hold back my terminal-boredom-syndrome.  On my bike, methinks.

So long, and thanks for all the fish!


* …nearly.  I’m doing some refinement for Zoe, primarily so I can take her to the Cotswold Raspberry Jams, and to cover anything new and exciting the RPi Foundation releases next.

What Zoe saw.

I’ve finally found, and almost resolved, the instability problem with Zoe.  Zoe is a Pi0W compared to Hermione’s 3B: a single CPU at a lower frequency, and she’s struggling to keep up.  In particular, the LiDAR / video distances increasingly lag behind the IMU:

Distance

As a result, once fused, Zoe is reacting to historic movements and so drifts.  The IMU data processing takes priority, and slowing it down to 333Hz (from 500Hz) frees enough CPU time to process the LiDAR / video distances in sync with the IMU.  Here’s the result for a simple 10-second hover.

There is still lagging drift, but much less than seen previously; the drift is still cumulative, hence my next step is to reduce the video frame size a little more.

While this might not be the reason behind the RC drift, it cannot have been helping.
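For the curious, the slowdown itself is just the IMU’s sample-rate divider; here’s a minimal sketch, assuming an MPU-9250-style register map and the smbus library – the addresses come from the datasheet, not my flight code:

    import smbus

    MPU9250_ADDRESS = 0x68  # default I2C address of the IMU
    SMPLRT_DIV      = 0x19  # sample-rate divider register

    # With the DLPF enabled, output rate = 1kHz / (1 + divider), so a
    # divider of 1 gives 500Hz and a divider of 2 gives 333Hz.
    bus = smbus.SMBus(1)
    bus.write_byte_data(MPU9250_ADDRESS, SMPLRT_DIV, 2)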


By the way, the fact that the incrementally lagging drift is consistently left / right strongly suggests I need to reduce the port / starboard PID gain to account for the weight balance, primarily the LiPo battery aligned fore / aft in the frame.  On the plus side, without this flaw, I’d never have been able to diagnose the drift problem so clearly and quickly!

Penelope’s progress

I’ve done nothing on Penelope since my introductory post, but now I have reason to proceed: the new Raspberry Pi 3B+ is already on its way courtesy of Pimoroni, to be combined with the new Garmin LiDAR-Lite v3HP and the new Raspbian Stretch O/S.  Together these provide the motivation I need, though only in the background; my main focus is object avoidance with Hermione, and I hope to post the first results imminently.  Oddly, what’s holding things back is hardware, not software: construction of the obstacle.

“Penelope” or “Lucy”…

…as in the Amazon series “Lucifer”?  I’ll stick with Ms. Pitstop despite the colour scheme; Lucifer never shows up on Tuesdays.

Separated at birth?

She’s still pending the new version of the Garmin LiDAR-Lite v3HP – the lower-profile, higher-accuracy version of Hermione’s and Zoe’s height-tracking LiDAR.  She’s also waiting for a new PCB so she can have a buzzer, though that’s not holding her back in the same way.  She’ll intentionally not have a Scanse Sweep, as it’s very expensive for a non-critical sensor.

My intent had been to make her lower-profile, sleek and stealthy to enable longer flights per battery, hence the shorter legs, lower hat and the 13 x 4.4 CF props (compared to Hermione’s 12 x 4.4 Beechwoods).  However, her hat and feet prevent this – the feet are true lacrosse balls, heavier than Hermione’s indoor ones, and her salad-bowl top also seems heavier.  Overall then, ‘H’ weighs in at 4.8kg fully installed, and Penelope at 4.7kg.  Thus the main benefit is likely that she’ll be nippier, due to slightly more power from the lighter, larger CF props combined with the raised centre of gravity.  In fact, the raised CoG and lighter, larger props may well reduce the power needed – we shall see.

In the background, I am working on the “Pond Problem”: fusing GPS distance / direction with the other sensors.  The code is nigh-on complete, but I’ve yet to convince myself it will work well enough to test it immediately over the local gravel lakes.

The lady’s not for turning

Around here in the South Cotswolds there are lakes, hundreds of them, left behind once the Cotswold stone and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat-race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain, as shown in this gravel pit just 2 minutes’ walk from my house: ‘H’ would start at the left, move over the pond, climb up and over the gravel, and land on the far side:

Surface gravel mining

But there’s a significant problem: neither the ground-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect back to the receiver, and the height readings stop working.

But there is a solution already in hand that I suspect is easy to implement, with little code-performance impact but a huge impact on over-water survival: GPS is currently used in the autopilot process to compare her current location with the target location, passing the required speed and direction through to the motion process; it would be nigh-on trivial to also pass the horizontal distance and the altitude difference since takeoff through to the motion process.
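The distance / direction part is a few lines of spherical geometry; here’s a hedged sketch using the equirectangular approximation, which is plenty for lake-sized distances (this isn’t the autopilot code itself):

    import math

    EARTH_RADIUS = 6371000.0  # metres

    def gps_vector(lat1, lon1, lat2, lon2):
        # Equirectangular approximation: accurate to well under 1% over
        # a few hundred metres, and much cheaper than full haversine.
        lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)  # east component
        y = lat2 - lat1                                  # north component
        distance = EARTH_RADIUS * math.hypot(x, y)
        direction = math.atan2(x, y)  # radians clockwise from north
        return distance, direction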

These fuse with the existing ed*_input code values thus (a minimal sketch follows the list):

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for the video frames to overlap), the GPS will still keep things moving in the right direction at the right speed.
  • Vertically it’s more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface reflection problem or simply because her maximum height of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where GPS kicks in, reporting the current altitude since takeoff until the LiDAR starts getting readings again.
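Roughly, the selection logic looks like this in Python – a minimal sketch with invented names, since the real ed*_input fusion is spread throughout the motion process:

    LIDAR_ERROR_HEIGHT = 0.01  # the GLL reports 1cm when it gets no reflection

    def fused_height(lidar_height, gps_altitude):
        # Trust the LiDAR whenever it has a valid reading; otherwise fall
        # back to the GPS altitude-since-takeoff until the laser returns.
        if lidar_height <= LIDAR_ERROR_HEIGHT:
            return gps_altitude
        return lidar_height

    def fused_distance(video_distance, gps_distance, video_ok):
        # Horizontally, GPS always contributes; the down-facing video
        # dominates when it has enough contrast and frame overlap.
        if not video_ok:
            return gps_distance
        return 0.8 * video_distance + 0.2 * gps_distance  # illustrative weights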

Like I’ve said, it’s only a few lines of relatively simple code.  The question is whether I have the puppies’ plums to try it out over the local lakes.  I am highly tempted, as it’s a lot more real than the object avoidance code, for which there will never be a suitable maze.  I think my mind is changing direction rapidly.

Refinement time.

Sorry it’s been quiet; the weather’s been awful, so no videos to show.  Instead, I’ve been tinkering to ensure the code is as good as it can be prior to moving on to object avoidance / maze tracking.

Zoe is back to life once more to help test the “face the direction you’re flying” tweaks, as these don’t quite work yet.  She’s back because my first few attempts with ‘H’ kept breaking props.  The first job for ‘Z’ was to have her run the same code as Hermione, but with the autopilot moved back inline to reduce the number of processes as far as possible for her single-CPU Pi0W, in contrast to Hermione’s 4-CPU 3B.

Then:

  • I’ve started running the code with ‘sudo python -O ./qc.py’ to enable optimisation.  This disables assertion checking, and hopefully other stuff, for better performance.
  • I’ve tweaked the Butterworth parameters to track gravity changes faster, as Zoe’s IMU is exposed to the cold winds and her accelerometer values rise rapidly.
  • I’ve refined the Garmin LiDAR-Lite V3 code to cope with an occasional ‘no reading’, caused when no laser reflection is detected; this does happen occasionally (and correctly) if she’s tilted and the surface bouncing the laser points the wrong way.
  • I’d also hoped to add a “data ready interrupt” to the LiDAR to reduce the number of I2C requests made; however, the interrupts still don’t happen despite trying all 12 config options.  I think the problem is Garmin’s, so I’m awaiting a response from them on whether this flaw is fixed in a new model to be launched in the next few weeks.  In the meantime, I only call the GLL over I2C when there are video results, which need the GLL vertical height to convert the video pixel movements into horizontal movement in metres (see the sketch below).
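That pixel-to-metres conversion is simple trigonometry; a sketch assuming the PiCamera’s horizontal field of view – the numbers here are illustrative rather than lifted from the flight code:

    import math

    CAMERA_FOV  = math.radians(53.5)  # assumed PiCamera horizontal field of view
    FRAME_WIDTH = 320                 # pixels across a video frame

    def pixels_to_metres(pixel_shift, height):
        # The strip of ground a frame covers grows linearly with height...
        ground_width = 2 * height * math.tan(CAMERA_FOV / 2)
        # ...so each pixel of shift corresponds to this many metres.
        return pixel_shift * ground_width / FRAME_WIDTH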

Having added and tested all the above sequentially, the net result was failure: a less bad failure than previously, but failure nonetheless; the video tracking lagged in order to avoid the IMU FIFO overflowing.  So in the end, I changed Zoe’s video resolution to 240 pixels² @ 10fps (‘H’ is at 320 pixels² @ 10fps), and she can now hover on the grass, which means I can get on with the “face where you’re going” code.

I do think all the other changes listed are valid and useful, and as a result, I’ve updated the code on GitHub.

In passing, I had also been investigating whether the magnetometer could be used to back up the pitch, roll and yaw angles long-term, but that’s an abject failure; with the props idling prior to takeoff, it works fine, giving the orientation to feed to the GPS tracking process, but once airborne, the magnetometer values shift by ±40° and vary depending on which way she’s travelling while facing the same direction – presumably magnetic interference from the motors’ currents.
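For reference, the pre-takeoff orientation comes from a standard tilt-compensated compass heading; here’s a sketch of the textbook maths (sign conventions vary with the IMU’s axis orientation, so treat this as illustrative rather than my exact code):

    import math

    def compass_heading(mx, my, mz, pitch, roll):
        # Rotate the magnetometer vector back to the horizontal plane
        # using the current pitch / roll, then take the horizontal parts.
        xh = mx * math.cos(pitch) + mz * math.sin(pitch)
        yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
              - mz * math.sin(roll) * math.cos(pitch))
        return math.atan2(yh, xh)  # radians from magnetic north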

Sixty seconds sanity check

A 60-second hover in preparation for my next GPS tracking flight on Wednesday, when the sun will be out:

Weather conditions are poor today: light contrast is low due to thick clouds, and the temperature is about 3°C.  The code is using the Butterworth filter to extract gravity from the accelerometer, and the hover height is set higher at 1.5m to avoid the longer grass in the field next door on Wednesday: takeoff is at 50cm/s for 3 seconds to get away from the ground quickly; descent is at 0.25m/s for 6 seconds for a gentle landing.  The aim here is to move the down-facing LiDAR into a zone where it’ll be more accurate / stable vertically, while checking the down-facing video still provides accurate horizontal stability at this greater height over the lower-contrast lawn surface.
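For those curious about the gravity extraction, it’s just a low-pass filter with a very low cutoff; a minimal scipy sketch with illustrative values rather than my tuned parameters:

    from scipy.signal import butter, lfilter

    def extract_gravity(accel_samples, sample_rate=500.0, cutoff=0.1, order=2):
        # Gravity changes far more slowly than flight accelerations and
        # vibration, so a very low cutoff keeps gravity and little else.
        b, a = butter(order, cutoff / (sample_rate / 2), btype='low')
        return lfilter(b, a, accel_samples)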

The check passed well, boding well for Wednesday’s GPS field flight!

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

Five years later, phase one is all but done; everything below, barring the first item, is a minor, mostly optional extra:

  • Track down the GPS tracking instability – my best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyro 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she was facing at takeoff (see the sketch after this list).  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating system to Raspbian Stretch, with the corresponding requirements for the I2C fix and the network WAP / udhcpd / dnsmasq configuration, which currently mean the OS is stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera resolution from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz.  However, every previous attempt to upgrade one leaves the scheduling unable to process the others – I suspect I’ll need to wait for a Raspberry Pi 4 or 5 for the increased performance.
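On the “point the way she’s flying” item, the target yaw is simply the direction of the earth-frame velocity vector; a hedged sketch of the idea, not the flight code:

    import math

    def yaw_target(vx, vy, current_yaw, min_speed=0.3):
        # Below min_speed the direction of travel is mostly noise, so
        # hold the current yaw rather than spin on the spot.
        if math.hypot(vx, vy) < min_speed:
            return current_yaw
        return math.atan2(vy, vx)  # earth-frame direction of travel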

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ to fit through, and the AI required to remember and react so as to explore only unexplored areas in the search for the centre.
  • Fuse GPS latitude, longitude and altitude with the down-facing LiDAR + video and the ΣΣ acceleration δt δt (double-integrated acceleration) for vertical + horizontal distance – this requires further connections between the various processes, such that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic (see the sketch after this list).
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video, which currently isn’t possible with the RPi 3B.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!
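The ΣΣ acceleration δt δt in the second bullet is just acceleration integrated twice over time; a toy sketch of the running sums (the flight code does this per axis in the earth frame):

    def double_integrate(accelerations, dt):
        # First integral gives velocity; second gives distance travelled.
        velocity = 0.0
        distance = 0.0
        for a in accelerations:
            velocity += a * dt
            distance += velocity * dt
        return distance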

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Finally found the fecking wobbles

I ‘think’ there are two factors: the colder temperature reduced the power of the LiPo; this made the system a little less able to react to distance errors, causing it to rotate more to correct horizontal distance drift, and this in turn exposed a long-standing bug that completely failed to compensate for horizontal distance changes due to changes in pitch / roll angles (the code was there, but had a crass bug).

The cold LiPo has been fixed with a walkers’ / skiers’ pocket-warmer strapped firmly on top, keeping it lovely and cosy.

The crass video horizontal tracking error has been fixed too.  As a result, GitHub has been updated and, naively optimistic once more, I can continue working on the GPS tracking.

More on mapping

It’s been a frustrating week – despite lovely weather, lots broke, and once each fault was fixed, something else would break.  To top it all, an update to the latest version of Jessie yesterday locked the RPi as soon as I kicked off a passive flight; I backed this ‘upgrade’ out as a result.  I now have everything back and working, confirmed by hover and 10m lateral flights this morning, although the latter aborted half-way through with an I2C error.  Underlying it all is power to the GLL and RPi 3B – I was seeing lots of brown-out LED flashes from the 3B and lots of I2C and GLL errors.  I’m considering swapping back to a 2B overclocked to 1.2GHz as a result.

In the meantime I have been looking at mapping in more detail as it’s complex and it needs breaking down into easy pieces.  Here’s the general idea:

Polystyrene block layout

Each polystyrene block is 2.5m long, 1.25m high and 10cm thick.  They are pinned together with screw-in camping tent pegs.  The plan is to:

  • fly 10m at 1m height without the ‘maze’, logging compass and GPS to check the results, in particular to see whether
    • GPS can be gently fused with the RPi’s ground-facing motion tracking to enhance lateral motion distance measurements
    • the compass can be fused with the IMU gyro yaw rate to enforce a straighter flight (see the sketch after this list)
  • fly 10m without the ‘maze’ again but with fused compass and GPS (assuming the above is OK)
  • add the ‘maze’ and fly in a straight 10m line from bottom to top again as a sanity check
  • add the Sweep and log its contents while doing the same 10m again
  • build the flight map in Excel based upon the GPS, compass and Sweep logs – the results should look like the map above, with the addition of whatever garden clutter lies beyond the end of each exit from the ‘maze’
  • add a new mapping process to do dynamically what has been done in Excel above
  • add object avoidance from Sweep and repeat – this is the hardest bit as it introduces dynamic updates to preconfigured flight plans
  • add ‘maze’ tracking code to reach a target GPS position, nominally the centre of the ‘maze’ – this stage requires further thought to break it down further.
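As for the compass / gyro yaw fusion mentioned in the first step, it’s a textbook complementary filter; a minimal sketch with an illustrative blend factor (angle wrap-around handling omitted):

    def fuse_yaw(yaw, gyro_rate, compass_yaw, dt, alpha=0.98):
        # Gyro integration is smooth but drifts over time; the compass is
        # noisy but drift-free.  Blend them, trusting the gyro short-term.
        return alpha * (yaw + gyro_rate * dt) + (1 - alpha) * compass_yaw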