What Zoe saw.

I’ve finally found and almost resolved the instability problem with Zoe.  Zoe is a Pi0W compared to Hermione’s B3; as a result, she’s a single CPU at a lower frequency, and she’s struggling to keep up.  In particular, the LiDAR / video distances increasingly lag behind the IMU:

Distance

As a result, once fused, Zoe is reacting to historic movements and so drifting.  The IMU data processing takes priority, and slowing it down to 333Hz (from 500Hz) allows enough space for the LiDAR / video distance processing to stay in sync with the IMU.  Here’s the result for a simple 10 second hover.

There is still lagging drift but much less than seen previously; this drift is still cumulative; hence my next step is to reduce the video frame size a little more.

While this might not be the reason behind the RC drift, it cannot have been helping.
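As an aside, the slightly odd 333Hz figure comes from how these IMU sample rates are set; here’s a minimal sketch of the arithmetic, assuming an InvenSense-style IMU (such as the MPU-9250) whose output rate is a 1kHz base clock divided down by a register value – the names below are mine, not the piDrone code:

    IMU_BASE_CLOCK_HZ = 1000

    def sample_rate_divider(target_hz):
        # output rate = base / (1 + divider)  =>  divider = base / target - 1
        return round(IMU_BASE_CLOCK_HZ / target_hz) - 1

    print(sample_rate_divider(500))  # 1 => 1000 / 2 = 500Hz
    print(sample_rate_divider(333))  # 2 => 1000 / 3 = 333Hz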


By the way, the fact that the incrementally lagging drift is consistently left / right strongly suggests I need to reduce the port / starboard PID gain due to the weight balance, primarily the LiPo battery aligned fore / aft in the frame.  On the plus side, without this flaw, I’d never have been able to diagnose the drift problem so clearly and quickly!

The lady’s not for turning

Around here in the South Cotswolds there are hundreds of lakes, left behind once the Cotswold stone rocks and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain as shown in this gravel pit just 2 minutes’ walk from my house.  ‘H’ starts at the left, moves over the pond, climbs up and over the gravel, and lands on the far side:

Surface gravel mining

But there’s a significant problem: neither the ground-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect off the water back to the receiver, and the height reading stops working.

But there is a solution already in hand that I suspect is easy to implement, with little code-performance impact but an amazing impact on over-water survival: GPS is currently used in the autopilot process to compare where she is currently located with the target location, passing the speed and direction through to the motion process; it would be nigh-on trivial to also pass the horizontal distance and altitude difference since takeoff through to the motion process too.

These fuse with the existing ed*_input code values thus:

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for video frames to overlap), the GPS will still keep things moving in the right direction and at the right speed.
  • Vertically is more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface-reflection problem or simply because her maximum height of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where GPS kicks in, reporting the current altitude since takeoff until the LiDAR starts getting readings again – see the sketch below.
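Here’s a minimal sketch of that vertical fallback; the names and units are illustrative, not the actual ed*_input code:

    LIDAR_ERROR_HEIGHT = 0.01  # the LiDAR reports 1cm when it has no valid return

    def fused_height(lidar_height, gps_altitude_since_takeoff):
        # Over water (no laser return) or above the 40m ceiling, the LiDAR
        # reports its 1cm error value; fall back to the GPS altitude until
        # the LiDAR starts getting readings again.
        if lidar_height <= LIDAR_ERROR_HEIGHT:
            return gps_altitude_since_takeoff
        return lidar_height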

Like I’ve said, it’s only a few lines of relatively simple code.  The question is whether I have the puppies’ plums to try it out over the local lakes.  I am highly tempted, as it’s a lot more real than the object-avoidance code, for which there will never be a suitable maze.  I think my mind is changing direction rapidly.

More on mapping

It’s been a frustrating week – despite lovely weather, lots broke, and once each thing was fixed, something else would break.  To top it all, an update to the latest version of Jessie yesterday locked the RPi as soon as I kicked off a passive flight; I backed this ‘upgrade’ out as a result.  I now have everything back and working, confirmed by hover and 10m lateral flights this morning, although the latter aborted half-way through with an I2C error.  Underlying it all is power to the GLL and RPi 3B – I was seeing lots of brown-out LED flashes from the B3 and lots of I2C and GLL errors.  I’m considering swapping back to a 2B+ overclocked to 1.2GHz as a result.

In the meantime I have been looking at mapping in more detail as it’s complex and it needs breaking down into easy pieces.  Here’s the general idea:

Polystyrene block layout

Each polystyrene block is 2.5m long, 1.25m high and 10cm thick.  They are pinned together with screw-in camping tent pegs.  The plan is to

  • fly 10m at 1m height without the ‘maze’, logging compass and GPS to check the results, in particular to see whether
    • GPS can be gently fused with RPi ground facing motion tracking to enhance lateral motion distance measurements
    • compass can be fused with IMU gyro yaw rate to enforce a better linear flight
  • fly 10m without the ‘maze’ again but with fused compass and GPS (assuming the above is OK)
  • add the ‘maze’ and fly in a straight 10m line from bottom to top again as a sanity check
  • add the Sweep and log its contents when doing the same 10m again
  • build the flight map in Excel based upon the GPS, compass and Sweep logs – the results should look like the map with the addition of whatever garden clutter lies beyond the end of each exit from the ‘maze’ (see the coordinate sketch after this list)
  • add a new mapping process to do dynamically what has been done in Excel above
  • add object avoidance from Sweep and repeat – this is the hardest bit as it introduces dynamic updates to preconfigured flight plans
  • add ‘maze’ tracking code to reach a target GPS position, nominally the center of the ‘maze’ – this stage requires further thought to break it down further.
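For the map-building step, the core conversion takes each Sweep sample plus the piDrone’s GPS position and compass heading and produces a world-frame point; here’s a sketch under my own naming assumptions, not the actual flight code:

    import math

    def map_point(drone_x, drone_y, heading, sweep_angle, sweep_distance):
        # heading: compass heading in radians (0 = north, the +y axis)
        # sweep_angle: Sweep beam angle relative to the nose, radians
        # sweep_distance: range reported by the Sweep, metres
        bearing = heading + sweep_angle  # absolute bearing of the obstacle
        return (drone_x + sweep_distance * math.sin(bearing),
                drone_y + sweep_distance * math.cos(bearing))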

A busy week…

First, the result: autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and thus was descending; she’s not far wrong – the start and end points are the two round stones placed a measured 10m apart to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow.  The aim here is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid the IMU drift over temperature.  Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield – you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR Lite V3 so their lasers don’t interfere with each other; finally the camera has been moved far away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera was struggling to spot motion in dim light, the LiDAR height was not fused either, and on some flights Hermione would climb during hover.
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fusing with the gyro yaw rate and interacting with the below….
  • I’ve added a new process tracking the GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now; a sketch follows this list.  The reason both run in their own process is that each blocks reading its sensor – one second for GPS and 0.1 seconds for video – and that degree of blocking must be completely isolated from the core motion processing.  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from the target end-point, the combination of compass and GPS will provide the sensor inputs to ensure the flight heads in the right direction and recognises when it’s reached its goal.
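Here’s a rough sketch of that GPS process; the FIFO path, record format and read_gps_fix() helper are all hypothetical placeholders, not the real code:

    import json
    import os

    FIFO_PATH = "/tmp/gps_fifo"  # hypothetical path

    def gps_process():
        # Child process: blocks for ~1s per GPS fix, completely isolated
        # from the motion process, which reads the FIFO when it chooses.
        if not os.path.exists(FIFO_PATH):
            os.mkfifo(FIFO_PATH)
        with open(FIFO_PATH, "w") as fifo:
            while True:
                lat, lon, alt = read_gps_fix()  # hypothetical blocking helper
                fifo.write(json.dumps({"lat": lat, "lon": lon, "alt": alt}) + "\n")
                fifo.flush()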

As a result of all the above, I’ve updated GitHub.

On a cold winter’s morning…


Both flights use identical code.  There are two tweaks compared to the previous videos:

  1. I’ve reduced the gyro rate PID P gain from 25 to 20, which has hugely increased the stability
  2. Zoe is using my refined algorithm for picking out the peaks in the macro-block frames – I think this is working better but there’s one further refinement I can make which should make it better yet.

I’d have liked to show Hermione doing the same, but for some reason she’s getting FIFO overflows.  My best guess is that her A+ overclocked to turbo (1GHz CPU) isn’t as fast as a Zero’s default setting of 1GHz – no idea why.  My first attempt at a fix has been improved scheduling, splitting the macro-block vector processing into two phases:

  1. build up the dictionary of the set of macro-blocks
  2. process the dictionary to identify the peaks.

Zoe does this in one fell swoop; Hermione schedules each phase independently, checking in between that the FIFO hasn’t filled to a significant level, and if it has, dealing with that first.  This isn’t quite working yet in passive tests, even on Zoe, and I can’t yet work out why!  More anon.
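In outline, the split looks something like this – every helper named here is a placeholder for the real processing, not the piDrone code:

    BACKLOG_THRESHOLD = 512  # placeholder "significant level" in bytes

    def handle_video_frame(frame, imu_fifo):
        # Phase 1: just build the dictionary of macro-block vectors.
        vector_dict = build_vector_dictionary(frame)   # placeholder

        # Between phases, check whether the IMU FIFO has filled to a
        # significant level; if it has, deal with that first.
        if imu_fifo.backlog() > BACKLOG_THRESHOLD:     # placeholder API
            process_imu_batch(imu_fifo.drain())        # placeholder

        # Phase 2: process the dictionary to identify the peaks.
        return find_peaks(vector_dict)                 # placeholder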

30s with, <10s without

A few test runs.  In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight.  Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (a kill ensued).  I think that’s pretty conclusive proof the fusion code works!

With Fusion:

Without Fusion:

Better macro-block interpretation

Currently, getting lateral motion from a frame full of macro-blocks is very simplistic: find the average SAD value for the frame, and then only include those vectors whose SAD is lower.
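In sketch form, assuming each block arrives as a (vx, vy, sad) tuple (my layout, not the real one):

    def filter_by_sad(blocks):
        # blocks: list of (vx, vy, sad) macro-block tuples for one frame
        mean_sad = sum(sad for _, _, sad in blocks) / len(blocks)
        # Keep only the vectors whose SAD beats the frame average
        return [(vx, vy) for vx, vy, sad in blocks if sad < mean_sad]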

I’m quite surprised this works as well as it does, but I’m fairly sure it can be improved.  There are four factors behind the content of a frame of macro-blocks:

  • yaw change: all macro-block vectors will circle around the centre of the frame
  • height change: all macro-block vectors will point towards or away from the centre of the frame.
  • lateral motion change: all macro-block vectors point in the same direction in the frame.
  • noise: the whole purpose of macro-blocks is simply to find the best-matching blocks between two frames; doing this with a chessboard (for example) could well have any block from the first frame matching any of the 50% of the second frame that shares its colour.

Given a frame of macro-blocks, the yaw increment between frames can be found from the gyro, and its contribution thus removed easily.

The same goes for the height change, derived from the LiDAR.

That leaves either noise or a lateral motion vector.  By averaging these remaining values, we can pick out the vectors that are similar in distance / direction to the average vector; SAD doesn’t come into the matter.
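Putting that together as a sketch (the per-block yaw / height predictions would come from the gyro and LiDAR as described; the tolerance is an arbitrary illustrative value):

    import math

    def lateral_motion(vectors, yaw_pred, height_pred, tolerance=2.0):
        # Subtract the gyro-predicted yaw component and the LiDAR-predicted
        # height-change component from each macro-block vector.
        residuals = [(vx - yx - hx, vy - yy - hy)
                     for (vx, vy), (yx, yy), (hx, hy)
                     in zip(vectors, yaw_pred, height_pred)]

        # Average what's left...
        avg_x = sum(x for x, _ in residuals) / len(residuals)
        avg_y = sum(y for _, y in residuals) / len(residuals)

        # ...and keep only the vectors close to that average: lateral motion
        # agrees with the average, noise doesn't.
        kept = [(x, y) for x, y in residuals
                if math.hypot(x - avg_x, y - avg_y) < tolerance]
        if not kept:
            return avg_x, avg_y
        return (sum(x for x, _ in kept) / len(kept),
                sum(y for _, y in kept) / len(kept))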

This won’t be my first step, however: that’s to work out why the height of the flight wasn’t anything like as stable as I’d been expecting.

Strutting her stuff

Finally, fusion worth showing.

Yes, height’s a bit variable, as she doesn’t get accurate height readings below about 20cm.

Yes, it’s a bit jiggery, because the scales of the IMU and the other sensors aren’t quite in sync.

But fundamentally, it works – nigh on zero drift for 20s.  With just the IMU, I couldn’t get this minimal level of drift for more than a few seconds.

Next steps: take her out for a longer, higher flight to really prove how well this is working.

Live Cold Fusion Test

I took Zoe outside (temperature 0°C – freezing point) to fly this morning with fusion enabled to see what the effect was.  The fused values weren’t used to control these flights; they were purely for compare and contrast of the two independent sensor sources.

Fusion was a complementary filter with the -3dB crossover set to 1s.
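For reference, a minimal sketch of that filter, where tau is the 1s crossover time constant and dt is the per-sample interval set by the IMU rate:

    def complementary_fuse(prev_fused, imu_increment, sensor_value, dt, tau=1.0):
        # Short term, the integrated IMU increment dominates; long term, the
        # filter converges on the LiDAR / camera value, crossing over at ~tau.
        alpha = tau / (tau + dt)
        return alpha * (prev_fused + imu_increment) + (1.0 - alpha) * sensor_value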

In all graphs,

  • blue comes from the Garmin LiDAR (Z axis) or Camera (X and Y axes)
  • orange comes from the accelerometer with necessary subtraction of gravity and integration.
  • grey is the fused value – orange dominates in the short term, fusing towards blue over the longer term.

Fusion

In general, the shapes match closely, but there are some oddities I need to understand better:

  • horizontal velocity from the Camera is very spiky – the average is right, and the fusion is hiding the spikes well – I’m assuming the problem lies in my code’s handling of the change of tilt of the camera relative to the ground.
  • the vertical height is wrong – at 3 seconds, Zoe should be at 90cm and levelling out.

I need to continue dissecting these stats – more anon.

Fusion thoughts

Here’s my approach to handling the tilt for downward facing vertical and horizontal motion tracking.

Vertical compensation

This is what I’d already worked out to take into account any tilt of the quadcopter frame, and therefore of the LEDDAR sensor readings.

With this in place, this is the equivalent processing for the video motion tracking:

Horizontal compensation

The results of both are in the earth frame, as are the flight plan targets, so I think I’ll swap to earth-frame processing until the last step.
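Here’s a sketch of the two conversions (pitch and roll come from the IMU, in radians; the function names are mine, not the flight code):

    import math

    def earth_frame_height(leddar_distance, pitch, roll):
        # The LEDDAR measures along the (tilted) quad Z axis; the true
        # vertical height is the reading scaled by the cosines of the tilts.
        return leddar_distance * math.cos(pitch) * math.cos(roll)

    def tilt_shift(height, d_pitch, d_roll):
        # A change in tilt shifts the down-facing camera's view of the
        # ground by height * tan(angle change); this apparent motion must
        # be removed before the rest is treated as real lateral movement.
        return height * math.tan(d_pitch), height * math.tan(d_roll)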

One problem: I’m now pushing the limit of the code keeping up with the sensors – with diagnostics on and 1kHz IMU sampling, the IMU FIFO overflows as the code can’t keep up.  This is with Zoe (1GHz CPU speed) and without LEDDAR.

LEDDAR has already forced me to drop the IMU sample rate to 500Hz on Chloe; I really hope this is enough to also allow LEDDAR, macro-block processing and diagnostics to work without FIFO overflow.  I really don’t want to drop to the next level of 333Hz if I can help it.

Coding is already in progress.