Refinement time.

Sorry it’s been quiet; the weather’s been awful, so no videos to show.  Instead, I’ve been tinkering to ensure the code is as good as it can be prior to moving on to object avoidance / maze tracking.

Zoe is back to life once more to help test the “face the direction you’re flying” tweaks, as these don’t quite work yet.  She’s back because my first few attempts with ‘H’ kept breaking props.  First job for ‘Z’ was to have her run the same code as Hermione, but with the autopilot moved back inline to reduce the number of processes as much as possible for her single-CPU Pi0W in comparison with Hermione’s 4-CPU 3B.


  • I’ve started running the code with ‘sudo python -O ./’ to enable optimisation.  This disables assertion checking and, hopefully, other stuff for better performance.
  • I’ve tweaked the Butterworth parameters to track gravity changes faster as Zoe’s IMU is exposed to the cold winds and her accelerometer values rise rapidly.
  • I’ve refined the Garmin LiDAR-Lite V3 handling to cope with the occasional ‘no reading’ caused by no laser reflection being detected; this does happen occasionally (and correctly) if she’s tilted and the surface bouncing the laser points it the wrong way.
  • I’d also hoped to add a “data ready interrupt” to the LiDAR to reduce the number of I2C requests made; however, the interrupts still don’t happen despite trying all 12 config options.  I think the problem is Garmin’s, so I’m awaiting a response from them on whether this flaw is fixed in a new model to be launched in the next few weeks.  In the meantime, I only call the GLL over I2C when there are video results which need the GLL vertical height to convert the video pixel movements into horizontal movement in metres.
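The ‘no reading’ handling can be sketched like this (the function and parameter names here are mine, not the real GLL driver’s): fall back to the last good height rather than feeding a zero into the video-to-metres conversion.

```python
# Hypothetical sketch (names are mine, not the real GLL driver's) of coping
# with an occasional 'no reading': keep the last good height rather than
# passing a zero into the video pixel-to-metres conversion.
def read_height(read_i2c, last_good, max_range=40.0):
    # read_i2c() returns a distance in metres, or raises IOError / returns 0
    # when no laser reflection is detected.
    try:
        height = read_i2c()
    except IOError:
        return last_good, False
    if height <= 0.0 or height > max_range:  # implausible => no reflection
        return last_good, False
    return height, True

# Only called when there are video results needing the vertical height:
height, fresh = read_height(lambda: 1.52, last_good=1.48)
```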

Having added and tested all the above sequentially, the net result was failure: a less bad failure than previously, but failure nonetheless; the video tracking lagged in order to avoid the IMU FIFO overflowing.  So in the end, I changed Zoe’s video resolution to 240 pixels² @ 10 fps (‘H’ is at 320 pixels² @ 10 fps), and she can now hover on the grass, which means I can get on with the “face where you’re going” code.

I do think all the other changes listed are valid and useful, and as a result, I’ve updated the code on GitHub.

In passing, I had also been investigating whether the magnetometer could be used to back up the pitch, roll and yaw angles long term, but that’s an abject failure; with the props on idle prior to takeoff, it works fine, giving the orientation to feed to the GPS tracking process, but once airborne, the magnetometer values shift by ±40° and vary depending on which way she’s going while facing the same direction.

Sixty seconds sanity check

A 60-second hover in preparation for my next GPS tracking flight on Wednesday when the sun will be out:

Weather conditions are poor today: light contrast is low due to thick clouds and the temperature is about 3°C.  The code is using the Butterworth filter to extract gravity from the accelerometer, and the hover height is set higher at 1.5m to avoid the longer grass in the field next door on Wednesday: takeoff is at 50cm/s for 3 seconds to get away from the ground quickly; descent is at 25cm/s for 6 seconds for a gentle landing.  The aim here is to move the down-facing LiDAR into a zone where it’ll be more accurate / stable vertically, while checking the down-facing video still provides accurate horizontal stability at this greater height over a lower-contrast lawn surface.

The check passed well, boding well for Wednesday’s GPS field flight!

Butterworth implementation details

I did a 1 minute hover flight today with the down-facing LiDAR + video reinstated to track the performance of the dynamic gravity values from the Butterworth as temperature shifts during the flight:

60s hover stats

To be honest, it’s really hard for me to work out whether the Butterworth is doing a good job of extracting gravity from acceleration as the IMU temperature falls.  There are so many different interdependent sensors here, it’s hard to see the wood for the trees!

As an example, here’s my pythonised Butterworth class and usage:

# Butterworth IIR Filter calculator and actor - this is carried out in the earth frame as we are
# tracking gravity drift over time from 0, 0, 1 (the primer values for egx, egy and egz)
# Code is derived from
import math

class BUTTERWORTH:
    def __init__(self, sampling, cutoff, order, primer):

        self.n = int(round(order / 2))
        self.A = []
        self.d1 = []
        self.d2 = []
        self.w0 = []
        self.w1 = []
        self.w2 = []

        a = math.tan(math.pi * cutoff / sampling)
        a2 = math.pow(a, 2.0)

        for ii in range(0, self.n):
            r = math.sin(math.pi * (2.0 * ii + 1.0) / (4.0 * self.n))
            s = a2 + 2.0 * a * r + 1.0
            self.A.append(a2 / s)
            self.d1.append(2.0 * (1 - a2) / s)
            self.d2.append(-(a2 - 2.0 * a * r + 1.0) / s)

            # Prime the filter history so its output starts at primer rather
            # than ramping up from zero over many seconds.
            self.w0.append(primer / (self.A[ii] * 4))
            self.w1.append(primer / (self.A[ii] * 4))
            self.w2.append(primer / (self.A[ii] * 4))

    def filter(self, input):
        for ii in range(0, self.n):
            self.w0[ii] = self.d1[ii] * self.w1[ii] + self.d2[ii] * self.w2[ii] + input
            output = self.A[ii] * (self.w0[ii] + 2.0 * self.w1[ii] + self.w2[ii])
            self.w2[ii] = self.w1[ii]
            self.w1[ii] = self.w0[ii]

            # Cascade the second-order sections: each section's output feeds
            # the next, giving the full order-th order response.
            input = output

        return output

# Setup and prime the Butterworth - 0.01Hz, 8th order, primed with the stable values measured above.
bfx = BUTTERWORTH(motion_rate, 0.01, 8, egx)
bfy = BUTTERWORTH(motion_rate, 0.01, 8, egy)
bfz = BUTTERWORTH(motion_rate, 0.01, 8, egz)

# Low pass butterworth filter to account for long term drift to the IMU due to temperature
# change - this happens significantly in a cold environment.
eax, eay, eaz = RotateVector(qax, qay, qaz, -pa, -ra, -ya)
egx = bfx.filter(eax)
egy = bfy.filter(eay)
egz = bfz.filter(eaz)
qgx, qgy, qgz = RotateVector(egx, egy, egz, pa, ra, ya)

Dynamic gravity is produced by rotating the quad-frame accelerometer readings (qa(x|y|z)) back to earth-frame values (ea(x|y|z)), passing them through the Butterworth filter (eg(x|y|z)), and then rotating the result back to the quad frame (qg(x|y|z)).  This is then used to find velocity and distance by integrating (accelerometer – gravity) against time.
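The integration step can be sketched as below; G, dt and the running qv / qd integrals are illustrative assumptions mirroring the symbol names above, not the actual flight code.

```python
# Sketch of the integration step: quad-frame acceleration minus the
# Butterworth-tracked quad-frame gravity, integrated twice over time.
# G and dt are assumed values; qv / qd are the running integrals.
G = 9.80665          # m/s² per g
dt = 1 / 500.0       # assumed motion-processing period

def integrate(qa, qg, qv, qd):
    # (accelerometer - gravity) in g, converted to m/s²
    accel = (qa - qg) * G
    qv += accel * dt          # velocity in m/s
    qd += qv * dt             # distance in m
    return qv, qd

qvx = qdx = 0.0
qvx, qdx = integrate(1.001, 1.000, qvx, qdx)
```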

Sounds great, doesn’t it?

Trouble is, the angles used for the rotation above should be calculated from the quad frame gravity values qg(x|y|z).  See the chicken / egg problem?

The code has got around this for a long time: the angles are set up initially on takeoff, and then updated using the gyro rotations as an approximation of the angle increments.  During flight, the qa(x|y|z)-derived angles are fed in via a complementary filter.
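A minimal sketch of that complementary filter step (the function shape and tau value are assumptions, not the flight code): gyro increments dominate short term, while the accelerometer-derived angle corrects long-term drift.

```python
# Minimal complementary-filter sketch: gyro increments dominate short term,
# the accelerometer-derived angle corrects long-term drift.  tau is assumed.
def complementary(angle, gyro_rate, accel_angle, dt, tau=5.0):
    a = tau / (tau + dt)
    return a * (angle + gyro_rate * dt) + (1.0 - a) * accel_angle

# With matching inputs the output tracks the shared angle exactly;
# with a wrong starting angle it slowly converges onto the accel angle.
angle = complementary(10.0, 0.0, 10.0, 0.002)
```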

Thursday is forecast to be cold, sunny and wind-free; I’ll be testing the above with a long GPS waypoint flight, which so far has lost stability at about the 20s point.  Fingers crossed I’m right that the drift of the accelerometer, and hence the increasing errors in distance and velocity, is the cause, and that this resolves it.  We shall see.

P.S. I’ve updated the code on GitHub as the Butterworth code is not having a negative effect, and it can easily be commented out if not wanted.

Worth a Betta Bitta Butter

I suspect the increasingly unstable low-temperature GPS tracking flights are partly due to the LiPo power (fixed with the hand / pocket digital warmers), and partly due to accelerometer drift over time / temperature.  The problem with the latter is that gravity is recorded at the start of the flight and hence is fixed, meaning that accelerometer drift during the flight results in velocity and distance drift as the integration of (accelerometer – gravity) grows.  This is currently overcome by having second sources of velocity and distance (down-facing LiDAR + video) fused with the drifting IMU-integrated (accelerometer – gravity) values.  The instability sets in because the IMU velocity and distance values drift increasingly over time but the LiDAR and video values don’t; ultimately these sources increasingly diverge and instability ensues.
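To see why this diverges, here’s a tiny simulation (illustrative numbers, not flight data) of how even a small uncorrected accelerometer bias grows linearly in velocity error and quadratically in distance error:

```python
# Sketch: a constant 0.005g accelerometer bias, uncorrected because gravity
# is fixed at takeoff, integrates into growing velocity and distance errors.
G = 9.80665               # m/s² per g
dt = 0.002                # 500Hz motion processing
n = 10000                 # 20 seconds of samples
bias = 0.005 * G          # the uncorrected bias in m/s²
v = d = 0.0
for _ in range(n):
    v += bias * dt
    d += v * dt
# v ends near bias * 20s  (about 0.98 m/s of velocity error)
# d ends near bias * 20s² / 2  (about 9.8 m of distance error)
```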

I remembered using a Butterworth IIR filter to extract gravity from the accelerometer dynamically 3 years ago to account for the accelerometer drift in cold temperatures, but it either failed or I got distracted by a better solution.

Yesterday, I tried again, but this time with a little better understanding: I now have a way to prime the filter without taking many seconds to do so.

Here’s the result: a stable hover without drift despite the LiDAR / video being disabled – only the IMU is in use:

Once the windy weather’s gone next week, I’ll be heading out into the next-door field again to fly with more GPS-waypoint intermediate targets, and this time, hopefully I’ll be able to complete the flight without the increasing instability seen so far.

Diagnostics from indoors flights

I collected these diagnostics from a couple of indoor flights this morning.  The flight is 7.5s long:

  • 1.5s to get the props spinning to hover speed on the ground
  • 2s climb at 30cm/s
  • 2s hover
  • 2s descent at -30cm/s.
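For reference, the phases above could be encoded as a simple list; this is a sketch of mine, not the actual flight-plan format used by the code.

```python
# Hypothetical encoding of the flight plan above as (duration s, climb m/s)
# phases - not the real flight-plan format.
FLIGHT_PLAN = [
    (1.5,  0.0),   # spin the props up to hover speed on the ground
    (2.0,  0.3),   # climb at 30cm/s
    (2.0,  0.0),   # hover
    (2.0, -0.3),   # descend at 30cm/s
]

total_time = sum(t for t, _ in FLIGHT_PLAN)       # 7.5s in total
net_height = sum(t * v for t, v in FLIGHT_PLAN)   # 0.0m - back on the ground
```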

This was using the complementary filter with tau set to 5s, but the GERMS stats were collected to get a better idea of what’s going on.

GERMS stats

This shows two things – there’s a lot of noise in the raw data (blue line), and the red trend line clearly shows an oscillation which is hiding the peaks and troughs of net acceleration / deceleration at 1.5, 3.5 and 5.5 seconds.

This set shows the raw accelerometer values, and the measure of gravity after the raw data has been passed through the Butterworth filter.  Three things of interest here:

Acceleration and Gravity

  1. The grey line of raw acceleration is still noisy, but does show the peaks and troughs better at 1.5, 3.5 and 5.5s
  2. The green butterworth filter line is doing a lovely job of extracting gravity from the noisy accelerometer data
  3. There’s oscillation in both the X and Y accelerometer readings, best seen in the blue and orange trend lines.

The flights looked perfect – only these diagnostics reveal these subtle problems.

The next steps, then, are to increase the P gain and decrease the I gain to stop the oscillations.  With those gone, it’ll hopefully allow the GERMS stats to show only the deviations from real gravity, and thereby filter them out of the angle calculation as the complementary filter does now.
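For context, here’s a minimal PID sketch showing where those two gains act; the gains and names below are made up for illustration, not the flight values.

```python
# Minimal PID sketch to illustrate the P / I trade-off: the P gain reacts to
# the current error, while the I gain accumulates error history - too much of
# which drives the slow oscillation.  Gains here are made up.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = 0.0

    def compute(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=1.0, ki=0.1, kd=0.0)
out = pid.compute(target=0.0, actual=-0.5, dt=0.01)   # error = 0.5
```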

One point in passing: I had to drop the hover speed PWM value from the long-standing 1500µs to 1400µs; the new 4S batteries are clearly showing that 3S is not enough.


First the good bit so you can skip the boring explanation if you wish!

Low drift Phoebe

I did some experimentation to test which of the four different ways the IMU hardware interrupt and my GPIO code can work together gives the most efficient and accurate readings.

The IMU data ready interrupt can either generate a 50us pulse each time the data registers are updated, or it can produce a rising edge which latches high, only falling when the data is read over I2C.

My GPIO tweaks can either capture these interrupts continuously, or just capture the next one, and then capture no more until called again.

So I ran 4 tests to see which pairing worked best, based upon the number of I2C and data corruption errors I got, and how accurate the timing was, which is directly related to how many sensor readings are missed during motion processing.

Other than the latching + continuous option, which just blocked (predictably), it was marginal between the other three in lab testing, but with a slight advantage shown by the 50us pulse + continuous reading pairing.
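The blocking behaviour can be illustrated with a toy model of the interrupt line; this is purely illustrative, not the GPIO code itself.

```python
# Toy model (not the real GPIO code) of the two interrupt modes: a 50us pulse
# per data-ready versus a latched edge that only falls when the data is read
# over I2C.  If a read is missed in latched mode, the line never falls again,
# so edge-triggered capture waits forever - which is why that pairing blocked.
def rising_edges(line_levels):
    # Count 0 -> 1 transitions on a sampled interrupt line.
    return sum(1 for a, b in zip(line_levels, line_levels[1:]) if (a, b) == (0, 1))

pulsed  = [0, 1, 0, 1, 0, 1, 0]   # one short pulse per data-ready sample
latched = [0, 1, 1, 1, 1, 1, 1]   # latched high after a missed read

pulsed_count = rising_edges(pulsed)    # every data-ready generates an edge
latched_count = rising_edges(latched)  # only the first edge ever fires
```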

And a few test flights with Phoebe confirmed this: much reduced drift and improved timing, and at 500Hz sampling she missed only 2% of samples, which is the best yet by a very long way – previously it’s been more like 20%.

Two final tweaks: I loosened the tests for duff data and changed the Butterworth filter values:

  • duff data can be identified when the sequence of ax, ay, az, temp, gx, gy, gz has some values set to -1 (0xFFFF), starting from the gz end.  Previously I’d been checking gy and gz; now I check just az, the reasoning being that a -1 error in the gyro has a very short-term effect on the accuracy of angles, whereas -1 for az (which represents freefall in a vacuum) will plague a flight thereafter due to the integration for velocity
  • The Butterworth changes were experimental, to see which settings might provide better flights by filtering more acceleration out of gravity; a 0.05Hz (20s) 6th order filter was used in these flights compared to the previous 0.1Hz (10s) 4th order, resulting in much better drift control.
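The loosened duff-data check from the first bullet can be sketched as follows; the byte order and sample values here are assumptions, not the real FIFO layout.

```python
# Sketch of the loosened duff-data check (byte order and values are
# assumptions).  Corrupt IMU FIFO blocks have trailing -1 (0xFFFF) values
# starting from the gz end; checking just az tolerates short-lived gyro
# corruption but catches anything deep enough to poison velocity integration.
import struct

def is_duff(fifo_block):
    ax, ay, az, temp, gx, gy, gz = struct.unpack(">7h", fifo_block)
    return az == -1

good = struct.pack(">7h", 12, -3, 4096, 2100, 5, -2, 1)    # plausible block
duff = struct.pack(">7h", 12, -3, -1, -1, -1, -1, -1)      # corrupt from az on
```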

Fly drive

I took Chloe out onto the drive – horizontal drift didn’t exist – a perfect vertical rise and fall.  There were only two flights; the first rose to 50cm but then descended slowly during the hover phase, landing gently even before the descent phase of the flight plan had started.  The second flight also didn’t drift laterally, but in contrast it rose to 2 metres during the ascent and continued to climb during the hover phase; at that point, I killed the flight.

There are a few possible causes I can think of off the top of my head:

  • the Butterworth filter needs tuning, as its filtering of gravity is lagging behind sensor drift – this is my prime suspect
  • the complementary filter needs tuning, as its merging of angle sources is lagging behind sensor drift – this is much less likely however, if only because it would affect lateral drift also
  • IMU temperature drift between the first flight and the second – again I’m doubtful, as although it could affect hover height, there shouldn’t be vertical drift during the hover phase – this also points to the Butterworth lagging.

Zero-calibration vs. Zero-G calibration?

I’ve been so focussed on sensor calibration, and specifically accelerometer calibration, that I may be missing the blindingly obvious.  As I’ve said many times, I simply cannot believe that any of the off-the-shelf quads out there bother with sensor calibration to anything like the extent I do, and yet there are commercial UAVs out there which can hover at a fixed height without lateral drift.

It’s the fact these exist that’s stopping me from accepting that HoG is as good as she can be.

So what if I abandon calibration, and instead just rely on the Butterworth filters to track the drifting offsets in the accelerometers?  That code is currently there, working in parallel with the calibration code, and seemingly working well to manage vertical drift despite this being the axis with the least reliable sensor calibration.  So what would happen if I threw away all my calibration code and just used the Butterworth filters to extract the accelerometer offsets?  The answer is this:

Zero-G calibration vs. Zero-calibration from Andy Baker on Vimeo.

Not perfect, but at the same time not much different from the calibrated flights, and possibly actually slightly better.  The wind was picking up at this point, making life hard.  I also think there are some extensions I can add, since the angle calculations are still using the raw values in this flight, and that’s almost certainly the reason for the drift to the right.  If I can use the Butterworth gravity offset readings to correct the angles too, I think I can get even better performance against drift.

I2C errors continued

I’m still getting I2C errors logged, and the quality of each flight seems to depend on how many of these I get and when.  They all occur during the warm-up code, and there can be anywhere from zero to five or six.  If they occur in the first few seconds of warm-up, the impact on the quality of the flight is small – later in the sequence, and it becomes larger.  That’s because the 20s warm-up is primarily there to fill the Butterworth filter history – earlier errors are less significant.

I do wonder whether it’s the altimeter, also on the same I2C bus, that may be causing the problem.  My code doesn’t bother initializing it, so the next step is to add code to do just that.  Fingers crossed.

I added the few lines of code to initialize just the barometer, but that made no difference whatsoever to the I2C errors, so for the moment, I’m out of ideas.