GPS routing design

Starting requirements:

  • a prerecorded GPS target location (and possible intermediate waypoints) saved to file
  • the takeoff GPS location
  • an arbitrary takeoff flight direction, akin to an aeroplane taking off down a runway, i.e. the autopilot sends the motion process a velocity vector of (0.3, 0) = 0.3m/s forwards from Hermione’s point of view
  • the autopilot receives updates from GPS only when there is a change from the previous one (GPS resolution is roughly a meter, its rate is typically 1Hz, and the flight speed is fixed at about 0.3m/s, so each GPS update won’t necessarily differ from the last known location)
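That last bullet can be sketched as a simple filter. This is a minimal sketch with made-up names, not Hermione’s actual code, and it assumes fixes have already been converted to (x, y) metres in a local frame:

```python
import math

def changed_fix(previous, current, resolution=1.0):
    """Treat a new GPS sample as real movement only if it is more than
    the ~1m GPS resolution away from the previous accepted fix.
    Fixes are (x, y) metres in a local frame (an assumption here)."""
    if previous is None:
        return True  # first fix of the flight is always accepted
    return math.hypot(current[0] - previous[0],
                      current[1] - previous[1]) > resolution
```

At a fixed 0.3m/s and a 1Hz update rate, consecutive fixes often land inside the same ~1m cell, so most samples are discarded.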
GPS routing


With the above, when the autopilot receives a changed GPS location update, a GPS direction of flight (θ1) is calculated from the difference between the previous (L1) and current (L2) GPS locations.  Another direction (θ2) is determined from the current (L2) to the target GPS location.  Knowing these two directions – the direction she is going and the direction she should be going – the autopilot sends motion processing a yawed version of the current (initially (0.3, 0)) velocity vector.  This repeats until the current GPS location matches the target GPS location, at which point a standard landing is sent by the autopilot to the motion processor – or, in future, another waypoint is read from file and the above repeats.
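The yawing step might look something like this – a hypothetical helper, not the real code, assuming the velocity vector is (vx, vy) in Hermione’s frame and both directions are in radians:

```python
import math

def yawed_velocity(vx, vy, theta1, theta2):
    """Rotate the current velocity vector by the error between the
    direction of travel (theta1) and the direction to the target
    (theta2). Speed is preserved; only the heading changes."""
    dtheta = theta2 - theta1
    cos_d, sin_d = math.cos(dtheta), math.sin(dtheta)
    return (vx * cos_d - vy * sin_d,
            vx * sin_d + vy * cos_d)
```

For example, if she is travelling along her own X axis but the target is 90° to her left, the (0.3, 0) vector becomes (0, 0.3).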

The above is relatively simple to implement, however there’s a lot of finicky bits to shuffle around so the code works:

  • the autopilot needs to be told it’s using GPS rather than file-based routing
  • GPS waypoints are collected by the main process
  • flight GPS locations are processed by the autopilot
  • GPS is started fresh for each flight / waypoint collection, as the GPS process feeds into a different process per flight or waypoint
  • on takeoff, the autopilot hovers while waiting to acquire enough satellites at the starting position, only then triggering the movement along the “runway”.

There’s probably others I haven’t thought of yet.

The result will be a bit of a zig-zaggy flight towards the target, primarily because the GPS target is likely to be less than 10m away, and GPS resolution is down at about 1m.  If I get this to work, that’s the last-but-one step of fully autonomous flight completed!  However, the last is perhaps the hardest (perhaps impossible) to achieve: the route through a maze.

Hello pcworld.com readers!

First, I have to apologise on pcworld.com’s behalf – they’ve republished an article from over 2 years ago without bothering to update it in any way – they didn’t even bother to check that the links to here still existed – so well done y’all for making it here!

Luckily the world has moved on here since then.  Have a browse, watch the videos, but I would strongly advise you not to follow in my footsteps – here’s why.

Cheers,

Hove

Encroaching Hermione’s personal space

With the Scanse Sweep installed underneath (yes, she has blind spots from her legs and the WiFi antenna), any object detected between 50cm (the distance to the tips of her props) and 1m (her personal space boundary) now triggers a controlled landing.  The same thing would happen if the obstacle weren’t me approaching her, but instead her approaching a brick wall: a vertical controlled descent to ground.

There’s a lot more that can be built on this; the Sweep is rotating at 1Hz (it can do up to 10Hz), and it’s taking about 115 samples per loop, each reporting both the rotation position (azimuth) and the distance to the nearest object at that rotation.  Currently the code only collects the shortest distance per loop, and if that’s under 1m, the standard file-based flight plan is replaced with a dynamically created descent flight plan based upon the height Hermione should have reached at that point in the file-based flight plan.
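The per-loop logic is simple enough to sketch – illustrative names only, and the 1m personal-space boundary is the figure from the text:

```python
def min_distance_this_loop(samples):
    """samples: iterable of (azimuth_deg, distance_m) pairs from one
    full rotation of the Sweep (~115 of them at 1Hz)."""
    return min(distance for _, distance in samples)

def check_abort(samples, personal_space=1.0):
    """True if anything this rotation is inside the 1m boundary,
    at which point the descent flight plan takes over."""
    return min_distance_this_loop(samples) < personal_space
```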

Here’s the layout of communication between the 5 processes involved:

          +—————+     +—————————+
          |Sweep|——>——|Autopilot|——>——+
          +—————+     +—————————+     |
                                      |
                         +———+     +——————+
                         |GPS|——>——|Motion|
                         +———+     +——————+
                                      |
                          +—————+     |
                          |Video|——>——+
                          +—————+

The latest code updates are on GitHub.

Next step is to move GPS to also feed into Autopilot.  The move itself is easy, just a couple of minutes to change which process starts the GPS process; the difficult bit is how the autopilot should handle that extra information.  Currently the plan is that before a flight, Hermione is taken to the desired end-point of the flight, and she captures the GPS coordinates.  Then she’s moved somewhere else, pointing in any direction; on take-off, she finds her current GPS position, and the autopilot builds a dynamic flight plan to the end-point.  All the constituent parts of the code are already in place; it’s just the plumbing that needs careful creation.
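For the dynamic flight plan, the first job is turning a pair of GPS coordinates into metres east and north; over the few tens of metres these flights cover, an equirectangular approximation is plenty. A sketch with hypothetical names, not the real plumbing:

```python
import math

EARTH_RADIUS = 6371000.0  # metres, mean Earth radius

def gps_offset_m(origin, target):
    """Approximate (east, north) metres from origin to target, each a
    (latitude, longitude) pair in degrees. Equirectangular
    approximation: fine over short distances, not for long hauls."""
    lat0, lon0 = map(math.radians, origin)
    lat1, lon1 = map(math.radians, target)
    east = (lon1 - lon0) * math.cos(0.5 * (lat0 + lat1)) * EARTH_RADIUS
    north = (lat1 - lat0) * EARTH_RADIUS
    return east, north
```

As a sanity check, 0.001° of latitude comes out at roughly 111m.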


P.S. That was the first live test flight, hence the slightly nervous look on my face, and my step backwards once she’d detected my intrusions!

P.P.S: Proof that the abort was triggered courtesy of the logs:

[CRITICAL] (MainThread) fly 3467, ASCENT
[CRITICAL] (MainThread) fly 3467, HOVER
[CRITICAL] (MainThread) fly 3467, ABORT (0.88m)
[CRITICAL] (MainThread) fly 3467, STOP
[CRITICAL] (MainThread) fly 4087, Flight time 16.627974

An uninterrupted flight would have run for 22s where descent would have started at 18s.

Video Distance + Compass Direction ≈ GPS

Distance + Direction = GPS

By human measurements, the distance was about 7m at about 45° (i.e. NE).  GPS says 8.6m; video camera tracking says 5m, which is the distance defined in the flight plan.

It was never going to be perfect due to the difference between magnetic and true north, the resolution of GPS of around 1m, and how video distance tracking will always be a best guess, but it’s more than good enough for my cunning plan to work.

However, the plan’s taking a premature diversion; during this test, I was less careful and she ended up (in vertical descent mode) clipping 5 props against the drive’s stone wall.  Next step (after replacing the props!) is now to deploy my Scanse Sweep code, which will trigger an orderly landing if any object is detected less than 1.5m away – Hermione’s radius is 50cm from centre to prop tip, so that’s 1m of clearance.

One interesting point: the compass readings are mostly in a very dense cluster, with just a few (relatively) pointing in very different directions – that’s as Hermione passed the family car!

Resistance is futile…


Given her previous hover flight was so good, I couldn’t resist a 10m flight directly away from me. She overshot the 10m target significantly, probably due to being unable to track the ground motion in the dust storm section.  I killed the flight before she hit the wall. Nevertheless, this confirms that she’s good to go with GPS tracking, firstly where she’s going, and next, actually defining where she should be going, based on a preset waypoint collected prior to a flight.

As a result, I’ve updated the code on GitHub.

Surprise Surprise…

the unexpected hits you between the eyes* (no, not literally!).

She’s sans chapeau as this was just a quick test.  However, the flight was quite surprising, so I thought I’d share.

I’ve been tinkering with the scheduling of video processing, GPS, autopilot and the main sensor processing; I’d spotted these were getting out of sync.  The core problem was reading the autopilot and GPS OS FIFO shared-memory data streams often enough for them to stay in sync, yet not so often that the read() blocked.  The trivial drift over this 23s hover proved the fix is working – double-integrated acceleration can only hold back drift for a second or two.

What surprised me, though, is that this level of stability took place over gravel, and on checking the config, it became even more of a surprise: the video was running at 320 x 320 pixels, 10Hz and 50% contrast, and it worked brilliantly.  I’d assumed higher resolution was needed and that she’d been flying at 640 x 640 (i.e. 4 times the resolution); at that level she was both drifting and struggling to process the video frames fast enough.

I’m finally at a confident place where I can move GPS to feed the autopilot, such that the autopilot can direct the core motion processing where to go.
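The FIFO-draining idea can be sketched like this – a hypothetical helper, not the actual scheduling code, using select() with a zero timeout so read() never blocks:

```python
import os
import select

def drain_fifo(fd, chunk=4096):
    """Read whatever is currently available on an OS FIFO without
    blocking: poll with a zero timeout first, and only call read()
    when data is already waiting."""
    data = b""
    while select.select([fd], [], [], 0)[0]:
        part = os.read(fd, chunk)
        if not part:
            break  # writer closed the FIFO
        data += part
    return data
```

Called once per scheduling loop, this keeps each consumer in step with its producer: it never blocks waiting for data, and it never falls behind because it takes everything that has arrived so far.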


*courtesy of Cilla Black

Compass & integrated gyro yaw fusion compatibility tests.

OK, back to the plan; the next step is to get the compass readings in sync with the gyro readings for yaw as per my bus inspired plan:

Yaw sources

I’m pretty impressed with this; the compass is measuring orientation relative to the magnetic north pole, where north-to-east clockwise rotation is positive; in contrast, the gyro follows the right-hand rule, so anti-clockwise rotation is positive.  Also, the integrated gyro has no bounds on the yaw values it produces (±∞°), whereas the magnetometer results (after some trig on the vector it outputs) are bounded to ±180°.
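Reconciling the two conventions amounts to a sign flip plus a wrap-safe angle difference. An illustrative sketch, not the fusion code itself:

```python
def compass_to_gyro_yaw(compass_yaw_deg):
    """Magnetometer yaw is clockwise-positive and bounded to +/-180 deg;
    the integrated gyro is anti-clockwise-positive. Flip the sign to
    move between the two conventions."""
    return -compass_yaw_deg

def yaw_error(gyro_yaw_deg, compass_yaw_deg):
    """Shortest signed angle from the (sign-converted) compass yaw to
    the gyro yaw, safe across the +/-180 deg wrap: the modulo folds any
    difference back into (-180, 180]."""
    diff = gyro_yaw_deg - compass_to_gyro_yaw(compass_yaw_deg)
    return (diff + 180.0) % 360.0 - 180.0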

Note to self: gonna have to cancel the fusion when the integrated gyro and compass yaw are significantly either side of zero.  I do have one further concern: this test was run without props / ESCs powered up; I’m worried that once the motors are running, the magnetic fields that make the motors spin will completely ruin the brilliant synchronicity above.  The next test for the plan will be GPS tracking over a 10m flight which, in passing, should also show the effect the motors’ magnetism has on the compass readings.

A blast from the past

For the last few days, the outdoor temperature dropped from the twenties to the teens.  As a result, she didn’t get off the ground; she just jittered around.  The graph of double-integrated (accelerometer – gravity) shows she believed she was climbing right at the point the IMU temperature dropped.

Not again :-(

This looks awfully similar to this.

The problem is that in these cooler temperatures, once the props start spinning, the IMU cools, and the temperature-sensitive accelerometer output shifts as a result.  Because I take a snapshot of gravity prior to the props spinning, net acceleration (accelerometer – gravity) is wrong, so integrating it twice to get distances is very, very wrong.
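A quick back-of-envelope shows why it’s so wrong: a constant acceleration error integrated twice gives an apparent displacement of half the error times time squared, so even a tiny temperature-induced shift blows up fast.

```python
def apparent_climb(bias_mps2, seconds):
    """Displacement error from double-integrating a constant
    accelerometer bias: 0.5 * bias * t**2. Illustrative arithmetic,
    not flight code; the 0.1 m/s^2 figure below is a made-up example."""
    return 0.5 * bias_mps2 * seconds ** 2
```

A bias of just 0.1 m/s² – around 1% of gravity – looks like a 5m climb after 10 seconds, which is more than enough to explain the jittering.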

The problem showed up this time because, to add the salad bowl lid, I had to remove her Pimoroni Tangerine Dream PiBow, thus exposing her to the elements.  For now, I’ve just moved the initial read of gravity to immediately before take-off, but I’m also working out how to reintroduce the PiBow case (or similar) in a lower-profile version so that the salad bowl still fits neatly on top.

Beauty is in the eye of the beholder,

so behold her(mione):

Gorgeous

Here’s a quick test flight just to check I’ve not broken anything; more details on Vimeo.

It all started with the props: I’d broken 6 of the CF ones in the last week or two (costing £110 to replace) and on hunting for cheaper alternatives*, I found the white beechwood T-motor props.  They are half the price, stronger, and less likely to split on impact – the CF ones are actually a sandwich with a middle layer of what looks like wood fibre and any lateral contact with the ground and the sandwich splits in three.

The new props’ span is an inch shorter but with slightly higher pitch; after an initial test flight, it was clear they were more powerful as a result. They also look nicer, and that’s what triggered the rest of the makeover.

I’ve been looking for a chapeau to cover her beret PCB for ages (the PCB doesn’t conform to the standard HAT definition, hence beret), and over time I’ve built up a collection of yellow and orange plastic salad bowls as a result, but none quite fitted right.  But with the new white colour scheme, I found one that fitted nigh on perfectly, just a little dremel trimming required.

Finally, the feet: these are rubber lacrosse balls.  They are heavier and stronger than the previous yellow dodgeball foam feet that Hermione forever punched holes through on landing.  The increased prop power more than copes with the extra foot weight.

I think the new look added nearly a kilo in total, and measuring her on the scales, she now weighs 4.6kg, so it’s amazing she takes off at all!


*The problem with most more-affordable props is they don’t fit upside-down on the ground facing motors in Hermione’s X8 format frame.

Cotswold Raspberry Jam

So there I was this morning on the bus heading to Cheltenham to co-host the Cotswold Raspberry Jam, gazing at the beautiful Cotswold countryside, and my mind began to wander – well actually it was unleashed from the constraints of dealing with details it couldn’t fix while on a bus.  And out came a better idea about where to go for the next step to autonomy.

What’s been bugging me is the difference between magnetic and GPS North.  The ‘bus’ solution is to ignore magnetic north completely.  The compass still has a role to play maintaining zero yaw throughout a long flight, but it plays no part in the direction of flight – in fact, this is mandatory for the rest of the plan to work.

Instead, the autopilot translates between the GPS north / south / east / west coordinate system and Hermione’s forwards / backwards / right / left coordinate system.  The autopilot only speaks to Hermione in her own coordinates when telling her where to go.  It learns Hermione’s coordinate system at the start of each flight by asking her to fly forwards a meter or so and comparing that to the GPS vector change it gets back.  This rough translation is refined continuously throughout the flight.  Hermione can start a flight pointing in any direction relative to the target GPS point, and the autopilot will get her to fly towards the target regardless of the way she’s pointing.
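The “learn the frame by flying forwards” trick boils down to a single atan2. A sketch under my own naming, not the eventual plumbing:

```python
import math

def frame_rotation(body_vec, gps_vec):
    """Angle (radians) rotating body-frame vectors into the GPS
    east/north frame, estimated from one calibration move: the
    commanded body displacement vs the GPS-observed displacement."""
    return (math.atan2(gps_vec[1], gps_vec[0])
            - math.atan2(body_vec[1], body_vec[0]))

def body_to_gps(vec, rotation):
    """Apply the learned rotation to a body-frame vector."""
    c, s = math.cos(rotation), math.sin(rotation)
    return (vec[0] * c - vec[1] * s,
            vec[0] * s + vec[1] * c)
```

So if a commanded 1m forwards shows up on GPS as 1m north, the learned rotation is 90°, and every subsequent body-frame command can be checked against GPS in the same way, refining the estimate as the flight goes on.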

Sadly now, I’m back from today’s Jam, and the details confront me once more, but the bus ride did yield a great view of where to go next.


P.S. As always, the jam was great; over 100 parents and kids, lots of cool things for them to see and do, including a visit this time by 2 members of local BBC micro:bit clubs.  Next one is 30th September.