The lady’s not for turning

Around here in the South Cotswold, there are lakes – hundreds of them – left behind once the Cotswold stone and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain as shown in this gravel pit just a 2 minute walk from my house.  ‘H’ starts at the left, moves over the pond, climbs up and over the gravel, and lands on the far side:

Surface gravel mining

But there’s a significant problem: neither the down-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect back to the receiver, and the height reading stops working.

But there is a solution already in hand that I suspect is easy to implement, with little code performance impact but an amazing impact on surviving over water: GPS is currently used in the autopilot process to compare where she is currently located against the target location, and to pass the required speed and direction through to the motion process; it would be nigh on trivial to also pass the horizontal distance and altitude difference since takeoff through to the motion process too.

These fuse with the existing ed*_input code values thus:

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for video frames to overlap), the GPS will still keep things moving in the right direction and at the right speed.
  • Vertically is more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface reflection problem or simply because her maximum height of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where GPS kicks in, reporting the altitude gained since takeoff until the LiDAR starts getting readings again (see the sketch below).
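
For illustration only, the fusion rules above boil down to something like this; the function and variable names are mine, not the real ed*_input plumbing:

LIDAR_ERROR_HEIGHT = 0.01  # the GLL reports 1cm when no laser reflection is seen

def fuse_height(lidar_height, gps_altitude_gain):
    # Trust the LiDAR while it gets reflections; fall back to the GPS
    # altitude gained since takeoff when it reports its 1cm error value.
    return gps_altitude_gain if lidar_height <= LIDAR_ERROR_HEIGHT else lidar_height

def fuse_horizontal(video_velocity, gps_velocity, video_weight = 0.8):
    # GPS always contributes, so motion continues even when the video
    # tracking finds no contrast (e.g. water) between frames.
    return video_weight * video_velocity + (1.0 - video_weight) * gps_velocity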

Like I’ve said, it’s only a few lines of relatively simple code.  The question is whether I have the puppies’ plums to try it out over the local lakes?  I am highly tempted, as it’s a lot more real than the object avoidance code, for which there will never be a suitable maze.  I think my mind is changing direction rapidly.

Obstruction Avoidance Overview

Here’s what ‘H’ does on detecting an obstacle within one meter:

Now it’s time to get her to move around the obstacle and continue to her destination.  All sensors are installed and working; this is pure code in the autopilot process*.

Here’s the general idea for how the code should work for full maze exploration (a rough Python sketch follows the list):

  • Takeoff, then…
    • hover
    • record the current GPS location in the map dictionary, including the previous location (i.e. index “here I am” : content “here I came from”, in units of meters, held in a python dictionary)
    • do a Sweep 360° scan for obstacles
    • find the next direction based on the current map contents, either…
      • an unexplored, unobstructed direction (beyond 2m), biased towards the target GPS point (e.g. the centre of the maze)
      • a previously visited location, marking the current location in the map dictionary as blocked to avoid further return visits
    • head off in the new direction until either…
      • an obstacle is found in the new direction
      • an unexplored direction (i.e. not in the map so far) is found
    • repeat
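
Roughly, in Python, the map bookkeeping might look like the following; the 1m grid, helper names and Sweep interface are my inventions for the sketch rather than working code:

import math

DIRECTIONS = [(1, 0), (0, 1), (-1, 0), (0, -1)]    # 1m grid steps: E, N, W, S

def next_direction(visited, here, target, clear_steps):
    # visited: {location: location we came from, or None if marked blocked}
    # here / target: (x, y) tuples in meters from takeoff
    # clear_steps: the subset of DIRECTIONS the Sweep scan found unobstructed
    def distance_to_target(step):
        nx, ny = here[0] + step[0], here[1] + step[1]
        return math.hypot(target[0] - nx, target[1] - ny)

    # Prefer an unexplored, unobstructed step, biased towards the target.
    unexplored = [step for step in clear_steps
                  if (here[0] + step[0], here[1] + step[1]) not in visited]
    if unexplored:
        return min(unexplored, key = distance_to_target)

    # Dead end: mark this location as blocked and retrace the inbound step.
    came_from = visited[here]
    visited[here] = None
    return (came_from[0] - here[0], came_from[1] - here[1])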

And in fact, this same set of rules is required just for avoiding obstacles, which is good, as I doubt I’ll ever be able to find / build a suitably sized maze, and if I did, the LiPo would run out long before the centre of the maze was reached.


* The fact it’s pure code means it’s going to be quiet on the blogging front apart from GPS tracking videos when the weather warms up.  I’m also considering building a new version of Hermione from the spares I have in stock, provisionally called “Penelope”.  She’s there for shows only; I can then use Hermione purely for testing new features without worrying about breaking her prior to an event.

Refinement time.

Sorry it’s been quiet; the weather’s been awful, so no videos to show.  Instead, I’ve been tinkering to ensure the code is as good as it can be prior to moving on to object avoidance / maze tracking.

Zoe is back to life once more to help with testing the “face the direction you’re flying” tweaks, as these don’t quite work yet.  She’s back because my first few attempts with ‘H’ kept breaking props.  The first job for ‘Z’ was to have her run the same code as Hermione, but with the autopilot moved back inline to reduce the number of processes as far as possible for her single-CPU Pi0W, in comparison with Hermione’s 4-CPU 3B.

Then

  • I’ve started running the code with ‘sudo python -O ./qc.py’ to enable optimisation.  This disables assertion checking, and hopefully other stuff, for better performance (see the sketch after this list).
  • I’ve tweaked the Butterworth parameters to track gravity changes faster as Zoe’s IMU is exposed to the cold winds and her accelerometer values rise rapidly.
  • I’ve refined the Garmin LiDAR-Lite V3 code to cope with occasional ‘no reading’ errors caused by no laser reflection being detected; this does happen occasionally (and correctly) if she’s tilted and the surface bouncing the laser points the wrong way.
  • I’d also hoped to add a “data ready interrupt” to the LiDAR to reduce the number of I2C requests made; however, the interrupts still don’t happen despite trying all 12 config options.  I think the problem is Garmin’s, so I’m awaiting a response from them on whether this flaw is fixed in a new model to be launched in the next few weeks.  In the meantime, I only call the GLL over I2C when there are video results which need the GLL vertical height to convert the video pixel movements into horizontal movement in meters.
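
As a trivial illustration of what -O buys (the integrate function below is made up for the example, not flight code):

def integrate(samples, dt):
    # This sanity check runs normally, but 'python -O' strips assert
    # statements entirely, so it costs nothing in optimised flights.
    assert dt > 0.0, "time increment must be positive"
    return [sample * dt for sample in samples]

if __debug__:
    # __debug__ is False under -O, so this diagnostic only runs unoptimised.
    print "integrate =", integrate([0.0, 1.0, 2.0], 0.01)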

Having added and tested all the above sequentially, the net result was failure: a less bad failure than previously, but failure nonetheless; the video tracking lagged in order to avoid the IMU FIFO overflowing.  So in the end, I changed Zoe’s video resolution to 240 pixels² @ 10fps (‘H’ is at 320 pixels² @ 10fps), and she can now hover on the grass, which means I can get on with the “face where you’re going” code.

I do think all the other changes listed are valid and useful, and as a result, I’ve updated the code on GitHub.

In passing, I had also been investigating whether the magnetometer could be used to back up the pitch, roll and yaw angles long term, but that’s an abject failure; with the props idling prior to takeoff, it works fine, giving the orientation to feed to the GPS tracking process, but once airborne, the magnetometer values shift by ±40°, varying depending on which way she’s going while facing in the same direction.

Whoohoo!


What’s left?

  • Refine the u-blox NEO-M8T to use the “pedestrian” model algorithm, which is more accurate than the “portable” model currently used (see the sketch after this list).
  • Test the yaw code I’ve been working on in the background so ‘H’ always points in the way she’s going.
  • Stop swearing at Microsoft for the Spectre and Meltdown updates they installed this morning, which have crippled my PC just loading web pages!
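
For what it’s worth, switching dynamic model boils down to one UBX CFG-NAV5 message.  This sketch assumes a pyserial connection on /dev/ttyAMA0, which is not necessarily how my rig is wired, so treat it as an illustration rather than the real code:

import struct
import serial   # pyserial - /dev/ttyAMA0 below is an assumed port

def ubx_checksum(body):
    # 8-bit Fletcher checksum over the class, id, length and payload bytes.
    ck_a = ck_b = 0
    for byte in bytearray(body):
        ck_a = (ck_a + byte) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return struct.pack('<BB', ck_a, ck_b)

# UBX-CFG-NAV5 (class 0x06, id 0x24): mask bit 0 says "apply dynModel only";
# dynModel 3 = "pedestrian" (0 = "portable").  The remaining 32 payload bytes
# are ignored because their mask bits are clear.
payload = struct.pack('<HBB32x', 0x0001, 3, 0)
body = struct.pack('<BBH', 0x06, 0x24, len(payload)) + payload
frame = b'\xb5\x62' + body + ubx_checksum(body)

gps = serial.Serial('/dev/ttyAMA0', 9600, timeout = 1)
gps.write(frame)
gps.close()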

Winter hat.

Hermione’s innards have been exposed all winter; her summer hat doesn’t fit with the LiPo heating element installed.  Today, I finally got round to cutting down my spare salad bowl to make her winter hat, which also required a longer CF pole for the u-blox NEO-M8T GPS receiver and case.

H’s winter hat

She is now as ready as she can be for tomorrow’s GPS tracking testing.


P.S. There are a couple of other benefits beyond how great ‘H’ looks: her hat provides wind protection, keeping the IMU temperature more stable, and she’s also more visible from above, i.e. on the DJI Mavic down-facing video I’ll be taking tomorrow.

Sixty seconds sanity check

A 60 second hover in preparation for my next GPS tracking flight on Wednesday, when the sun will be out:

Weather conditions are poor today: light contrast is low due to thick cloud, and the temperature is about 3°C.  The code is using the Butterworth filter to extract gravity from the accelerometer, and the hover height is set higher at 1.5m to avoid the longer grass in the field next door on Wednesday: takeoff is at 50cm/s for 3 seconds to get away from the ground quickly; descent is at 25cm/s for 6 seconds for a gentle landing.  The aim here is to move the down-facing LiDAR into a zone where it’ll be more accurate / stable vertically, while checking that the down-facing video still provides accurate horizontal stability at this greater height over a lower contrast lawn surface.
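
The profile arithmetic checks out, as this throwaway sketch shows (the constant names are mine):

TAKEOFF_RATE = 0.50   # m/s climb
TAKEOFF_TIME = 3.0    # seconds - gives the 1.5m hover height
DESCENT_RATE = 0.25   # m/s descent
DESCENT_TIME = 6.0    # seconds - brings her gently back to the ground

assert TAKEOFF_RATE * TAKEOFF_TIME == DESCENT_RATE * DESCENT_TIME == 1.5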

The check has passed well, boding well for Wednesday’s GPS field flight!

Butterworth implementation details

I did a 1 minute hover flight today with the down-facing LiDAR + video reinstated to track the performance of the dynamic gravity values from the Butterworth as the temperature shifts during the flight:

60s hover stats

To be honest, it’s really hard for me to work out whether the Butterworth is working well at extracting gravity from acceleration as the IMU temperature falls.  There are so many different interdependent sensors here, it’s hard to see the wood for the trees!

As an example, here’s my pythonised Butterworth class and usage:

import math

####################################################################################################
#
# Butterworth IIR Filter calculator and actor - this is carried out in the earth frame as we are
# tracking gravity drift over time from 0, 0, 1 (the primer values for egx, egy and egz)
#
# Code is derived from http://www.exstrom.com/journal/sigproc/bwlpf.c
#
####################################################################################################
class BUTTERWORTH:
    def __init__(self, sampling, cutoff, order, primer):

        self.n = int(round(order / 2))
        self.A = []
        self.d1 = []
        self.d2 = []
        self.w0 = []
        self.w1 = []
        self.w2 = []

        a = math.tan(math.pi * cutoff / sampling)
        a2 = math.pow(a, 2.0)

        for ii in range(0, self.n):
            r = math.sin(math.pi * (2.0 * ii + 1.0) / (4.0 * self.n))
            s = a2 + 2.0 * a * r + 1.0
            self.A.append(a2 / s)
            self.d1.append(2.0 * (1 - a2) / s)
            self.d2.append(-(a2 - 2.0 * a * r + 1.0) / s)

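            #---------------------------------------------------------------------------------------
            # Prime the delay lines at their DC steady state - input / (4 * A) - so the
            # filter starts settled at the primer value rather than ramping up from zero.
            #---------------------------------------------------------------------------------------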
            self.w0.append(primer / (self.A[ii] * 4))
            self.w1.append(primer / (self.A[ii] * 4))
            self.w2.append(primer / (self.A[ii] * 4))

    def filter(self, input):
        #-------------------------------------------------------------------------------------------
        # Cascade the second order sections: each stage filters the previous stage's output,
        # not the raw input, otherwise only a second order filter results.
        #-------------------------------------------------------------------------------------------
        output = input
        for ii in range(0, self.n):
            self.w0[ii] = self.d1[ii] * self.w1[ii] + self.d2[ii] * self.w2[ii] + output
            output = self.A[ii] * (self.w0[ii] + 2.0 * self.w1[ii] + self.w2[ii])
            self.w2[ii] = self.w1[ii]
            self.w1[ii] = self.w0[ii]

        return output


#-------------------------------------------------------------------------------------------
# Setup and prime the Butterworth - 0.01Hz, 8th order, primed with the stable gravity values measured above.
#-------------------------------------------------------------------------------------------
bfx = BUTTERWORTH(motion_rate, 0.01, 8, egx)
bfy = BUTTERWORTH(motion_rate, 0.01, 8, egy)
bfz = BUTTERWORTH(motion_rate, 0.01, 8, egz)

#---------------------------------------------------------------------------------------
# Low pass Butterworth filter to account for long term drift of the IMU due to temperature
# change - this happens significantly in a cold environment.
#---------------------------------------------------------------------------------------
eax, eay, eaz = RotateVector(qax, qay, qaz, -pa, -ra, -ya)
egx = bfx.filter(eax)
egy = bfy.filter(eay)
egz = bfz.filter(eaz)
qgx, qgy, qgz = RotateVector(egx, egy, egz, pa, ra, ya)

Dynamic gravity is produced by rotating the quad frame accelerometer readings (qa(x|y|z)) back to earth frame values (ea(x|y|z)), passing them through the Butterworth filter (giving eg(x|y|z)), and then rotating these back to the quad frame (qg(x|y|z)).  This is then used to find velocity and distance by integrating (accelerometer – gravity) against time.

Sounds great, doesn’t it?

Trouble is, the angles used for the rotation above should be calculated from the quad frame gravity values qg(x|y|z).  See the chicken / egg problem?

The code gets around this because the angles have long been set up initially on takeoff, and then updated using the gyro rotation rates as an approximation to the rotation angle increments.  During flight, the angles derived from qa(x|y|z) are fed in via a complementary filter.
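
For illustration, a complementary filter of this shape is what’s meant; tau and the names here are my own rather than the real code’s:

def complementary(angle, gyro_rate, accel_angle, dt, tau = 0.5):
    # Integrate the gyro for short term accuracy, and bleed in the
    # accelerometer-derived angle to cancel long term gyro drift.
    fraction = tau / (tau + dt)
    return fraction * (angle + gyro_rate * dt) + (1.0 - fraction) * accel_angle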

Thursday is forecast to be cold, sunny and wind-free; I’ll be testing the above with a long GPS waypoint flight, which has so far lost stability at about the 20s point.  Fingers crossed I’m right that accelerometer drift, and hence the increasing errors in distance and velocity, is the cause, and that this resolves it.  We shall see.


P.S. I’ve updated the code on GitHub, as the Butterworth code is not having a negative effect, and it may easily be commented out if not wanted.

Worth a Betta Bitta Butter

I suspect the increasingly unstable low-temperature GPS tracking flights are partly due to the LiPo power (fixed with the hand / pocket digital warmers), and partly due to accelerometer drift over time / temperature.  The problem with the latter is that gravity is recorded at the start of the flight and hence is fixed, meaning that accelerometer drift during the flight results in velocity and distance drift as the integration of (accelerometer – gravity) grows.  This is currently overcome by having second sources of velocity and distance (down-facing LiDAR + video) fused with the drifting IMU-integrated (accelerometer – gravity) values.  The instability sets in over time as the IMU velocity and distance values drift ever further while the LiDAR and video values don’t; ultimately these sources increasingly diverge and the instability ensues.
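
To put rough illustrative numbers on that divergence (the bias figure is a made-up example, not a measurement):

# A fixed gravity estimate that is wrong by e m/s² integrates up to a
# velocity error of e * t and a distance error of 0.5 * e * t².
e = 0.01   # example accelerometer bias in m/s²
t = 20.0   # seconds into the flight, where instability has been setting in

print e * t            # 0.2 m/s velocity error
print 0.5 * e * t**2   # 2.0 m distance error - plenty to fight the LiDAR + video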

I remembered using a Butterworth IIR filter to extract gravity from the accelerometer dynamically 3 years ago to account for the accelerometer drift in cold temperatures, but it either failed or I got distracted by a better solution.

Yesterday, I tried again, but this time with a little better understanding: I now have a way to prime the filter without taking many seconds to do so.

Here’s the result: a stable hover without drift despite the LiDAR / video being disabled – only the IMU is in use:

Once the windy weather’s gone next week, I’ll be heading out into the next-door field again to fly with more GPS-waypoint intermediate targets, and this time, hopefully, I’ll be able to complete the flight without the increasing instability seen so far.

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; everything left below, barring the first item, is a minor, mostly optional extra:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyro 3D readings for long term angle stability, particularly yaw, which has no long term backup sensor beyond the gyro.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she was facing at takeoff.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating systems to Raspbian Stretch, with the corresponding requirements for the I2C fix and the network WAP / udhcpd / dnsmasq setup, which currently mean the OS is stuck with Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera resolution from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz.  However, every previous attempt to upgrade one has left the scheduling unable to process the others – I suspect I’ll need to wait for a Raspberry Pi B 4 or 5 for the increased performance.

Looking into the future…

  • Implement (the ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’s clearance, and the AI required to remember and react to explore only unexplored areas in the search for the centre.
  • Fuse GPS latitude, longitude and altitude / down-facing LiDAR + video / ΣΣ acceleration δt δt for vertical + horizontal distance – this requires further connections between the various processes such that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic.
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution videos, which currently aren’t possible with the RPi B3.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although a solution to the WiFi range may be possible now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Here we go round the Mulberry bush.

Yesterday was a cold and frosty morning, but more critically, sunny with only a light breeze:

Thursday’s weather (BBC forecast, 28/12/2017)

So I managed to squeeze in another test flight, where the plan was to fly ‘H’ between 3 GPS waypoint frisbee-markers from a takeoff in between all three.  As you can see, reality deviated significantly from the plan.


I’d made a minor change to the code such that the maximum horizontal speed was 1m/s, reducing proportionally once ‘H’ got less than 5 meters from the target, e.g. at 3m from the target, the speed is set to 0.6m/s.  That worked well approaching the first red frisbee-marker after takeoff.  However the next phase, although heading in the right direction between the red and orange frisbee-markers, was very unstable and ultimately overshot the orange frisbee-marker, so I killed the flight.  Here’s what the flight controller saw:

RTF
TAKEOFF
HOVER
GPS: WHERE AM I?
GPS TARGET 6m -69°
GPS TARGET 6m -69°
GPS TARGET 5m -70°
GPS TARGET 4m -71°
GPS TARGET 4m -73°
GPS TARGET 3m -75°
GPS TARGET 2m -78°
GPS TARGET 2m -82°
GPS TARGET 1m -86°
GPS TARGET 1m -90°
GPS TARGET 8m 85°
GPS TARGET 8m 85°
GPS TARGET 7m 85°
GPS TARGET 6m 84°
GPS TARGET 4m 86°
GPS TARGET 2m 93°
GPS TARGET 2m -117°
Flight time 20.242111

The green text is from takeoff to the red frisbee-marker.  The yellow section shows a good heading towards the second frisbee-marker.  It started at the right speed of 1m/s while more than 5m away, but the red lines show the speed increasing to 2, 2 and 4 m/s as ‘H’ got closer to the orange frisbee-marker.  On the plus side, she knew she’d overshot the target and had been told to double back at the point I killed the flight – check the angles at the end of the red lines.
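
For reference, the proportional slow-down I’d intended amounts to something like this (a sketch with my own names, not the flight code):

MAX_SPEED = 1.0        # m/s
SLOWDOWN_RADIUS = 5.0  # meters from the target

def target_speed(distance):
    # Full speed beyond 5m from the target, proportionally slower inside.
    return MAX_SPEED * min(distance / SLOWDOWN_RADIUS, 1.0)

print target_speed(3.0)  # 0.6 m/s, per the example above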

Time to go bug hunting – it’s hopefully just a stupid, crass typo on my part.  Luckily, the kids go back to school on Tuesday, and the weather forecast for that morning is looking good so far:

Tuesday’s weather forecast

I hope by Tuesday I’ll also have worked out how to get better video quality from the Mavic!


P.S. No crass bug found, so my second-best guesstimate is that the LiPo cooled to below its optimal performance temperature; this has the same effect as seen, and I had set the heater elements to the lowest level.  The plan for the next flight is identical to before, but with the heaters on high and with full logging enabled, in case this also fails and I need to diagnose the source of the problem in detail from the lateral velocity targets.