Yaw(n)

I think I’ve finished writing the code to support GPS tracking: a target GPS location is stored prior to the flight; the flight takes off pointing in any direction, and once it has learned its own GPS takeoff position, it heads towards the target, updating the direction to the target each time it gets a new GPS fix of where it is, and finally landing when its current position matches the target GPS position.

There’s a lot of testing to be done before I can try it live.  The new autopilot code is pretty much a rewrite from scratch.  It now has multiple partial flight plans:

  • take-off
  • landing
  • file defined direction + speed + time
  • GPS initial location finding
  • GPS tracking to destination
  • Scanse abort

It switches between these depending on either external Scanse or GPS inputs, or on the current flight plan section completing, e.g. the GPS flight plan reaches its destination, or the file-based flight plan completes its various time-based lateral movements and hovers (both of which then swap to the landing flight plan).  Luckily, most of this testing can be carried out passively.
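For illustration only, here’s a minimal Python sketch of that switching logic – the plan objects and their completed() / velocity_target() methods are hypothetical stand-ins, not the real autopilot code:

```python
class Autopilot:
    """Hypothetical sketch of swapping between partial flight plans."""

    def __init__(self, plans, first_plan="take-off"):
        self.plans = plans                 # name -> plan object
        self.current = first_plan

    def update(self, scanse_proximity, gps_fix):
        plan = self.plans[self.current]

        if scanse_proximity:
            # An external Scanse input preempts everything else.
            self.current = "scanse-abort"
        elif plan.completed(gps_fix):
            # e.g. the GPS flight plan reached its destination, or the
            # file-based plan finished its timed lateral movements and
            # hovers; both swap to landing unless a successor is named.
            self.current = plan.next_plan or "landing"

        return self.plans[self.current].velocity_target()
```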

The first test preempts all of these; as per yesterday’s post, I should be able to use my magnetometer, combined with its magnetic declination, to find Hermione’s initial orientation compared with true (i.e. GPS) north.  At the same time, I could use the magnetometer to provide long-term yaw values to be fused with the integrated yaw rate from the gyro.  Here’s what I got from two sequential passive tests:

Compass & integrated Gyro yaw

I’m not hugely impressed with the results of either.

  • The compass and integrated gyro yaw don’t match as tightly as I was expecting – in its current state, I won’t be using the compass direction as a long-term fuse with the integrated gyro yaw unless I can improve this.
  • The blobs in the lower pair are compass orientation values as she’s sitting on the ground – halfway through, I rotated her clockwise by roughly 90°.  The rotation angle is pretty good, as are the directions (NW -> NE -> SE), but I don’t like the distributed density of the blobs while she sat still on the ground at each location – I think I’ll have to use an average value for the starting orientation passed to the autopilot for GPS tracking processing (see the sketch below this list).
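As a sketch of both ideas – averaging the static compass samples for the starting orientation, and (if the match improves) fusing compass yaw with the integrated gyro yaw – something along these lines could work; the tau value is a placeholder, not a tuned number:

```python
import math

def circular_mean(angles):
    # Average the static compass samples safely: naively averaging angles
    # either side of +/-180 degrees gives nonsense, so average the unit
    # vectors instead.  Angles are in radians.
    s = sum(math.sin(a) for a in angles)
    c = sum(math.cos(a) for a in angles)
    return math.atan2(s, c)

def fuse_yaw(gyro_yaw, compass_yaw, tau=0.99):
    # Complementary filter: trust the integrated gyro yaw short term, and
    # drag it slowly towards the noisy-but-drift-free compass yaw.
    error = math.atan2(math.sin(compass_yaw - gyro_yaw),
                       math.cos(compass_yaw - gyro_yaw))
    return gyro_yaw + (1.0 - tau) * error
```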

Magnetic Declination

The one thing holding me back with GPS tracking is the direction Hermione is pointing at launch; in particular, she can work out the flight direction to the target based on the difference between the GPS positions of the takeoff and target (landing) points, but Hermione’s compass only gives orientation angles relative to magnetic north, and the difference between true and magnetic north varies over time and space.

After digging around, it turns out “Magnetic Declination” is the name of the angle between magnetic and true north, which for me is currently -1° 5′, i.e. at takeoff, when Hermione’s compass says she’s pointing at magnetic north, she’s actually pointing +1° 5′ from true north.  For me, that means it’s irrelevant for the short-distance flights I’m implementing.
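In code terms, the correction is a one-liner; a sketch, with the caveat that I’d double-check the sign convention against a local declination source before trusting it:

```python
# Magnetic declination for my location: -1 degree 5 minutes (from above).
DECLINATION = -(1.0 + 5.0 / 60.0)

def true_heading(magnetic_heading):
    # Convert a compass heading (degrees clockwise from magnetic north)
    # into a heading from true north; with a declination this small, the
    # correction is negligible for a few-meter flight.
    return (magnetic_heading + DECLINATION) % 360.0
```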

In a perfect world, I’d like a magnetic declination table mapping GPS latitude / longitude to declination angle, or a GPS receiver with one of these built in (they do exist, apparently), but for now, simply knowing the magnetic declination is negligible for my few-meter flights is enough.

Just FYI, Cambridge’s magnetic declination is -0° 30′, which is even more negligible!

Global Warming

In early spring this year (March – mid-April), conditions were good for flying: light, stable winds, with sunny days and rainy days all forecast accurately days in advance.  The net of this was lots of test flights and lots of videos; the only downside was the need to put Hermione’s HoG RPi B3 and PCB in a customized Pimoroni Tangerine PiBow to try to keep temperatures in the IMU above 20°C, where its sensor offsets and gains are most stable.

In the second half of spring, the temperature soared to the high twenties / early thirties; there was no breeze and no rain; the weather was Mediterranean midsummer.  The lawn was parched and featureless for the down-facing lateral-motion-tracking RPi camera, so flights moved to the gravel drive; more damage to the props ensued, but progress was good.

It’s now nominally mid-summer here.  The weather’s moved to blustery showers, torrential rain storms, wind speeds in the teens and sometimes gale force, with bursts of thunder and lightning, and to make it worse, it changes every half an hour.  The temperature is mid-to-high teens.  It’s standard autumnal weather in all but name, and it’s impossible to fly in; I get my kit out in bright sunshine, low twenties and no breeze, and by the time it’s set up, the sky has clouded over threatening rain, the wind has risen into the teens and the temperature has dropped into the teens.

Why am I telling you?  Because now I’ve pretty much finished the GPS routing branch code, and in doing so, I’ve spotted a couple of fixes / enhancements I’ve back-applied to the main line; if I’m right, I may have fixed the feature where Hermione always faces the way she’s going rather than the direction she was pointing at takeoff: to change her direction of flight, she continues to fly ‘forwards’ from her point of view, but yaws her body round to point her nose in the new direction.  Completely pointless really, just another challenge to tick off the list.

Here’s to decent English summer weather again, so I can post something that might actually be of interest to you, dear readers!

GPS routing design

Starting requirements:

  • a prerecorded GPS target location (and possible intermediate waypoints) saved to file
  • the takeoff GPS location
  • an arbitrary takeoff flight direction, akin to an aeroplane taking off down a runway, i.e. the autopilot sends the motion process a velocity vector of (0.3, 0) = 0.3m/s forwards from Hermione’s point of view
  • the autopilot receives updates from GPS when there is a change from the previous one (GPS resolution is roughly a meter, its rate is typically 1Hz, and the flight speed is fixed at about 0.3m/s, so each GPS update won’t necessarily be a change from the known location)

GPS routing

With the above, when the autopilot receives a changed GPS location update, a GPS direction of flight (θ1) is calculated from the difference between the previous (L1) and current (L2) GPS locations.  Another direction (θ2) is determined from the current (L2) to the target GPS location.  Knowing these two directions – the direction she is going and the direction she should be going – the autopilot sends motion processing a yawed version of the current (initially (0.3, 0)) velocity vector.  This repeats until the current GPS location matches the target GPS location, at which point a standard landing is sent by the autopilot to the motion processor – or, in future, another waypoint is read from file and the above repeats.
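As a rough sketch of the maths (the function names and the equirectangular approximation are mine, and the rotation sign would need checking against the real body axes):

```python
import math

EARTH_RADIUS = 6371000.0    # meters

def gps_to_meters(from_fix, to_fix):
    # Equirectangular approximation - fine over the few meters flown here.
    # Fixes are (latitude, longitude) in degrees; returns (east, north) meters.
    lat1, lon1 = map(math.radians, from_fix)
    lat2, lon2 = map(math.radians, to_fix)
    east = (lon2 - lon1) * math.cos(0.5 * (lat1 + lat2)) * EARTH_RADIUS
    north = (lat2 - lat1) * EARTH_RADIUS
    return east, north

def yawed_velocity(l1, l2, target, vx, vy):
    # theta1: the direction actually flown (L1 -> L2); theta2: the direction
    # we should be flying (L2 -> target), both as bearings from GPS north.
    east1, north1 = gps_to_meters(l1, l2)
    east2, north2 = gps_to_meters(l2, target)
    theta1 = math.atan2(east1, north1)
    theta2 = math.atan2(east2, north2)

    # Yaw the current velocity vector (initially (0.3, 0)) by the difference.
    dtheta = theta2 - theta1
    return (vx * math.cos(dtheta) - vy * math.sin(dtheta),
            vx * math.sin(dtheta) + vy * math.cos(dtheta))
```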

The above is relatively simple to implement; however, there are a lot of finicky bits to shuffle around before the code works:

  • The autopilot needs to be told it’s using GPS rather than file-based routing
  • GPS waypoints are collected by the main process
  • flight GPS locations are processed by the autopilot
  • GPS is started afresh for each flight / waypoint collection, as the GPS process feeds into a different process per flight or waypoint collection
  • Autopilot takeoff hovers while waiting to acquire enough satellites at the starting position, only then triggering the movement along the “runway”.

There’s probably others I haven’t thought of yet.

The result will be a bit of a zig-zaggy flight towards the target, primarily because the GPS target is likely to be less than 10m away and GPS resolution is down at about 1m.  If I get this to work, that’s the last-but-one step of full autonomous flight completed!  However, the last step is perhaps the hardest (perhaps impossible) to achieve: the route through a maze.

Video Distance + Compass Direction ≈ GPS

Distance + Direction = GPS

By human measurement, the distance was about 7m at about 45° (i.e. NE).  GPS says 8.6m; video camera tracking says 5m, which is the flight-plan-defined length to travel.

It was never going to be perfect, due to the difference between magnetic and true north, the GPS resolution of around 1m, and the fact that video distance tracking will always be a best guess, but it’s more than good enough for my cunning plan to work.

However, the plan’s taking a premature diversion; during this test, I was less careful and she ended up (in vertical descent mode) clipping 5 props against the drive’s stone wall.  The next step (after replacing the props!) is now to deploy my Scanse Sweep code, which will trigger an orderly landing if any object is detected less than 1.5m away – Hermione’s radius is 50cm, prop tip to prop tip diagonally, so that’s 1m of clearance.
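The trigger itself will be trivial – a hypothetical sketch, assuming one full rotation of Sweep range readings in meters:

```python
# 1.5m detection limit minus Hermione's 50cm prop-tip radius = 1m clearance.
PROXIMITY_LIMIT = 1.5

def sweep_abort(ranges):
    # True if anything in this rotation of Sweep samples is too close.
    return any(distance < PROXIMITY_LIMIT for distance in ranges)
```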

One interesting point: the compass readings are mostly in a very dense cluster, with just a few (relatively) pointing in very different directions – that’s as Hermione passed the family car!

Resistance is futile…


Given her previous hover flight was so good, I couldn’t resist a 10m flight directly away from me.  She overshot the 10m target significantly, probably due to being unable to track the ground motion in the dust-storm section.  I killed the flight before she hit the wall.  Nevertheless, this confirms she’s good to go with GPS tracking: firstly, tracking where she’s going; next, actually defining where she should be going, based on a preset waypoint collected prior to the flight.

As a result, I’ve updated the code on GitHub.

Surprise Surprise…

the unexpected hits you between the eyes* (no, not literally!).

She’s sans chapeau as this was just a quick test.  However, the flight was quite surprising, so I thought I’d share.

I’ve been tinkering with the scheduling of video processing, GPS, autopilot and the main sensor processing; I’d spotted these were getting out of sync, the primary fix being to read the autopilot and GPS OS FIFO shared memory data streams often enough for them to stay in sync, yet not so often that the read() blocked.  The trivial drift over this 23s hover proved this is working – double-integrated acceleration can only hold back drift for a second or two.

What surprised me, though, is that this level of stability took place over gravel, and on checking the config, it became even more of a surprise: the video was running at 320 x 320 pixels, 10Hz and 50% contrast, and it worked brilliantly.  I’d assumed higher resolution was needed, and that she’d been flying at 640 x 640 (i.e. 4 times the resolution); at that level she was both drifting and struggling to process the video frames fast enough.  I’m finally at a confident place where I can move GPS to feed the autopilot, such that the autopilot can direct the core motion processing where to go.


*courtesy of Cilla Black

Cotswold Raspberry Jam

So there I was this morning, on the bus heading to Cheltenham to co-host the Cotswold Raspberry Jam, gazing at the beautiful Cotswold countryside, and my mind began to wander – well, actually, it was unleashed from the constraints of dealing with details it couldn’t fix while on a bus.  And out came a better idea about where to go for the next step towards autonomy.

What’s been bugging me is the difference between magnetic and GPS North.  The ‘bus’ solution is to ignore magnetic north completely.  The compass still has a role to play maintaining zero yaw throughout a long flight, but it plays no part in the direction of flight – in fact, this is mandatory for the rest of the plan to work.

Instead, the autopilot translates between the GPS north / south / east / west coordinate system and Hermione’s forwards / backwards / right / left coordinate system.  The autopilot only speaks to Hermione in her own coordinates when telling her where to go.  It learns Hermione’s coordinate system at the start of each flight by asking her to fly forwards a meter or so and comparing that to the GPS vector change it sees.  This rough translation is refined continuously throughout the flight.  Hermione can start a flight pointing in any direction relative to the target GPS point, and the autopilot will get her to fly towards the GPS target regardless of the way she’s pointing.
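A minimal sketch of the idea – the names are mine, and as ever the sign conventions would need checking against the real axes:

```python
import math

def learned_yaw(gps_east, gps_north):
    # After asking Hermione to fly ~1m 'forwards' in her own frame, the
    # observed GPS delta (meters east / north) tells us which true
    # direction 'forwards' is, as a bearing clockwise from GPS north.
    return math.atan2(gps_east, gps_north)

def earth_to_body(v_north, v_east, yaw):
    # Translate a north / east velocity request into Hermione's
    # forwards / right frame using the learned yaw.
    v_forward = v_north * math.cos(yaw) + v_east * math.sin(yaw)
    v_right = -v_north * math.sin(yaw) + v_east * math.cos(yaw)
    return v_forward, v_right
```

The continuous refinement is then just repeating the comparison between the commanded body-frame vector and the observed GPS delta, and nudging the yaw estimate accordingly.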

Sadly now, I’m back from today’s Jam, and the details confront me once more, but the bus ride did yield a great view of where to go next.


P.S. As always, the jam was great; over 100 parents and kids, lots of cool things for them to see and do, including a visit this time by 2 members of local BBC micro:bit clubs.  Next one is 30th September.

OK, so this is weird…

When I first added the autopilot process, it updated the main process at 100Hz with the current distance vector target; the main process couldn’t quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog and meaning there was a very late reaction to lateral drift.

So I changed the autopilot process to send only velocity vector targets; that meant the autopilot sent an update to the main process every few seconds (i.e. ascent, hover, descent and stop updates) rather than 100 times a second for the distance increments; as a result, video processing was running at full speed again.

But when I turned on diagnostics, the main process couldn’t keep up with the autopilot, despite the fact the updates are only sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process’ select() didn’t pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages – way beyond the point the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes’ data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared memory files to store / send their data to the main process; having more than one in high demand (the autopilot at 100Hz distance targets, or diagnostics at 100Hz) seemed to cause one of the lower-frequency shared memory sources simply not to be spotted by the main process’ select().  I have no idea why this happens, and that troubles me.
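To show what I mean by the fan-in, here’s a stripped-down, self-contained sketch using named pipes – the real code uses shared memory files, and these paths and the record handling are made up:

```python
import os
import select

PATHS = ("/tmp/autopilot_fifo", "/tmp/gps_fifo", "/tmp/video_fifo")

for path in PATHS:
    if not os.path.exists(path):
        os.mkfifo(path)

# Non-blocking opens so a missing writer can't hang the main process.
fds = {os.open(path, os.O_RDONLY | os.O_NONBLOCK): path for path in PATHS}

def handle(name, data):
    # Hypothetical dispatch - the real code unpacks per-stream records here.
    print(name, len(data))

while True:
    # Wait until at least one producer has written, then drain each ready
    # stream without blocking, so a fast producer can't starve the rest.
    readable, _, _ = select.select(list(fds), [], [], 1.0)
    for fd in readable:
        data = os.read(fd, 4096)
        if data:
            handle(fds[fd], data)
```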

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem     1K-blocks  Used  Available  Use%  Mounted on
tmpfs             441384     4     441380    1%  /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid      owner    cpid    lpid
0          root     625     625
32769      root     625     625
65538      root     625     625
98307      root     625     625
131076     root     625     625

ps -eaf | grep python shows the processes in use by Hermione.  Note that none of these process IDs appears in the list of shared memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it’s the GPS daemon, gpsd, that has the shared memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance; the main process would then only have diagnostics as its main high-speed shared memory usage.  The autopilot can keep the revised behaviour of only sending low-frequency velocity vector target changes.  The autopilot would get high-frequency input from the Sweep, but convert that to changes of low-frequency velocity targets sent to the main process.  This way, main has only diagnostics as a fast input, and the autopilot has only the Sweep.  This is a speculative solution, but I don’t like the idea of moving forward with an undiagnosed weird problem.

Odds and sods

Progress, and thus blog updates, have been delayed by the weather: too windy last week; this week, too hot and sunny.  When the sun is high, the camera can’t resolve sufficient contrast between the clover flowers and the grass in the mown lawn.  Because sunrise is 5am, that means I can only do test flights in the evening.  It also means that in coming to this conclusion, I’ve broken 5 props so far.  Very frustrating and expensive.

As a result, I’ve still to really confirm that the autopilot process is working well, and Sweep still lies on my office table.

On the slight plus side, I’ve enhanced the GPS data stream to the main process; I suspect I was throwing too much away by using first ‘string’ and then ‘float’ to pack the data, so I’ve just upped it to a 64-bit float.  If this works as I hope, that may be all that’s necessary to track “where am I?” accurately in GPS units alone, using Sweep + compass just to spot the orientation of the current paths / hedges in the maze, allowing the autopilot to choose “which way now?”.  Any mapping for “have I been here before?” can be added as an enhancement; initially it will be a random choice between the various path directions available, regardless of whether they’ve been visited already.  But this is all a long way in the future.
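The precision argument is easy to demonstrate with Python’s struct module: at ~51° latitude, a 32-bit float resolves about 4e-6 degrees, i.e. tens of centimetres, which eats into GPS’s ~1m resolution, whereas a 64-bit double is effectively exact:

```python
import struct

lat = 51.1234567                     # ~1e-5 degrees of latitude is ~1m

single = struct.unpack('f', struct.pack('f', lat))[0]   # 32-bit round trip
double = struct.unpack('d', struct.pack('d', lat))[0]   # 64-bit round trip

METERS_PER_DEGREE = 111320.0         # approximate meters per degree latitude
print(abs(single - lat) * METERS_PER_DEGREE)   # a fraction of a meter lost
print(abs(double - lat) * METERS_PER_DEGREE)   # effectively zero
```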


A little later, after writing the above, a speckled shade from a large tree was cast over part of the back garden, and I managed to collect this GPS plot of a supposed 5m flight NE over the course of 20s, i.e. 20 GPS samples:

GPS resolution

She struggled with the video tracking, and once she left the shade, chaos ensued, but I did get the GPS stats, which clearly show a much higher resolution initially than I was getting before.  So that’s good.  Yet another prop snapped on the aborted flight as she headed towards the wall.  That’s prop number 6, or £110 in real terms – completely unaffordable.

The multiple breakages are again weather-based: two weeks of no rain means the lawn is once more rock solid, and props clipping the ground on an aborted landing snap instead of embedding themselves in soft soil.