Autonomous Autopilot, Almost

So the autopilot is now fully autonomous, able to direct the main motion processor between various intermediate GPS waypoints to the final destination.  It will soft-abort a flight via a controlled immediate landing if objects are in the way, if the number of 'visible' satellites drops below a safety limit, or at the end of the flight.

Or it would, if my GPS receiver / the weather would behave.  It's only ever seen 11 satellites at most, and today it was struggling to keep 8.  Roughly 10 seems to be the lower limit my Mavic will accept for GPS control, and today it had no problem finding 11+.  Hermione, on the other hand, could only see 8 at best, and this often drops mid-flight.  Without a constant set of satellites the position drifts – I've seen 40m+ errors today.  Luckily I have code to abort a flight if the number of satellites drops below the minimum, and that's what happened; in addition, the GPS is only used to produce the direction of travel, not the speed, which is fixed at a gentle 0.3m/s so the down-facing video can track lateral motion accurately.
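As an aside, here's roughly what that direction-only use of GPS looks like – a minimal sketch, assuming the earth-frame distances to the target are already in metres (the names are illustrative, not the actual code):

import math

FIXED_SPEED = 0.3  # m/s - gentle enough for the down-facing video to track

def velocity_target(dx, dy):
    # dx, dy: earth-frame distance to the GPS target in metres.
    # GPS supplies only the direction; the speed stays pinned at 0.3m/s.
    distance = math.hypot(dx, dy)
    if distance < 1e-6:
        return 0.0, 0.0
    return FIXED_SPEED * dx / distance, FIXED_SPEED * dy / distance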

Here's the GPS data I got for my standard 2m square flight; as you can see, the return to base – which in real life was very accurate – was 4m out by GPS' account:

GPS accuracy & resolution

So I need the skies to clear and, ideally, to find a better GPS receiver, and herein lies the problem: I'm off to Disneyland Paris next week.  Sod's law says GPS conditions will be perfect here in England while I'm away!


P.S. Just found myself a USB receiver that covers both GPS and GLONASS.  GPS is USA-owned and has 31 satellites; GLONASS is the Russian equivalent with 24 satellites; hopefully together I'll be able to see a much higher number of satellites, and hence get much better accuracy as a result.  Sadly, the proof of this pudding will only come after a week of Disney junk-food eating. 🙁

Yaw(n)

I think I've finished writing the code to support GPS tracking: a target GPS location is stored prior to the flight; the flight takes off pointing in any direction, and once it has learned its own GPS takeoff position, it heads towards the target, updating the direction to the target each time it gets a new GPS fix for where it is, and finally landing when its current position matches the target GPS position.
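The core of the tracking maths is small; here's a sketch of converting the latest fix and the target into earth-frame metres, using a small-distance (equirectangular) approximation – the names are mine, not the actual code's:

import math

EARTH_RADIUS = 6371000.0  # metres

def fix_to_target(lat, lon, target_lat, target_lon):
    # Small-distance approximation: convert the latitude / longitude
    # difference into metres north and east of the current position.
    north = math.radians(target_lat - lat) * EARTH_RADIUS
    east = math.radians(target_lon - lon) * EARTH_RADIUS * math.cos(math.radians(lat))
    return north, east

Landing then just means waiting until both components are within GPS accuracy of zero.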

There’s a lot of testing to be done before I can try it live.  The new autopilot code is pretty much a rewrite from scratch.  It now has multiple partial flight plans:

  • take-off
  • landing
  • file defined direction + speed + time
  • GPS initial location finding
  • GPS tracking to destination
  • Scanse abort.

It switches between these depending either on external Scanse or GPS inputs, or on the current flight plan section completing, e.g. the GPS flight plan reaches its destination, or the file-based flight plan has completed its various time-based lateral movements and hovers (both of which then swap to the landing flight plan).  Luckily, most of this testing can be carried out passively.
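To illustrate the switching between these partial flight plans (the names and structure here are mine for clarity, not lifted from the new code):

def next_plan(current, scanse_blocked, sats_ok, section_complete):
    # Swap partial flight plans on external inputs or section completion.
    if scanse_blocked:
        return "SCANSE_ABORT"    # object in the way: controlled landing
    if not sats_ok:
        return "LANDING"         # satellite count below the safety limit
    if section_complete:
        if current == "TAKEOFF":
            return "GPS_LOCATE"  # learn the takeoff GPS position
        if current == "GPS_LOCATE":
            return "GPS_TRACK"   # head off towards the GPS target
        if current in ("FILE", "GPS_TRACK"):
            return "LANDING"     # file plan done / destination reached
    return current               # otherwise stick with the current section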

The first test preempts all of these: as per yesterday's post, I should be able to use my magnetometer, combined with the local magnetic declination, to find Hermione's initial orientation compared with true (i.e. GPS) north.  At the same time, I could use the magnetometer to provide long-term yaw values to be fused with the integrated yaw rate from the gyro.
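As a minimal sketch of that orientation step (a roughly level quad is assumed, and the axis signs depend on how the IMU is mounted – none of these names come from the actual code):

import math

def compass_yaw(mx, my, declination_rad):
    # The horizontal magnetometer components give yaw against magnetic
    # north; adding the local magnetic declination converts that to
    # true (i.e. GPS) north.
    return math.atan2(-my, mx) + declination_rad

Here's what I got from two sequential passive tests: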

Compass & integrated Gyro yaw

I’m not hugely impressed with the results of either.

  • The compass and integrated gyro yaw don't match as tightly as I was expecting – in its current state, I won't be using the compass direction as a long-term fuse with the integrated gyro yaw unless I can improve this.
  • The blobs in the lower pair are compass orientation values as she sits on the ground – half way through, I rotate her clockwise by roughly 90°.  The rotation angle is pretty good, as is the direction (NW -> NE -> SE), but I don't like how spread out the blobs are while she sat still at each position – I think I'll have to pass an average value as the starting orientation to the autopilot for its GPS tracking processing (see the sketch below).
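One note on that averaging: compass angles wrap at ±180°, so naively averaging samples that straddle the wrap gives nonsense; a circular mean avoids this.  A minimal sketch, not taken from the actual code:

import math

def average_yaw(yaws):
    # Average the unit vectors rather than the raw angles so that
    # samples either side of the +/-180 degree wrap don't cancel out.
    s = sum(math.sin(y) for y in yaws)
    c = sum(math.cos(y) for y in yaws)
    return math.atan2(s, c)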

Surprise Surprise…

the unexpected hits you between the eyes* (no, not literally!).

She's sans chapeau as this was just a quick test.  However, the flight was quite surprising, so I thought I'd share.

I've been tinkering with the scheduling of the video processing, GPS, autopilot and main sensor processing; I'd spotted these were getting out of sync, the primary cause being the need to read the autopilot and GPS OS FIFO shared-memory data streams often enough for them to stay in sync, yet not so often that the read() blocked.  The trivial drift over this 23s hover proves this is now working – double-integrated acceleration alone can only hold back drift for a second or two.

What surprised me, though, is that this level of stability took place over gravel, and on checking the config, it became even more of a surprise: the video was running at 320 x 320 pixels at 10Hz with 50% contrast, and it worked brilliantly.  I'd assumed higher resolution was needed, and that she'd been flying at 640 x 640 (i.e. 4 times the resolution); at that level she was both drifting and struggling to process the video frames fast enough.  I'm finally at a confident place where I can move GPS to feed the autopilot, such that the autopilot can direct the core motion processing where to go.
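For reference, the scheduling fix boils down to multiplexing the FIFO reads with select(), so a read() is only issued once data is already waiting.  A minimal sketch of the technique, with illustrative names rather than the actual qc.py code:

import os, select

def service_streams(fifo_fds, handlers, timeout):
    # fifo_fds: dict of stream name -> file descriptor.
    # select() returns only the descriptors with data already waiting,
    # so the following os.read() is guaranteed not to block.
    readable, _, _ = select.select(list(fifo_fds.values()), [], [], timeout)
    for name, fd in fifo_fds.items():
        if fd in readable:
            handlers[name](os.read(fd, 4096))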


*courtesy of Cilla Black

OK, so this is weird…

When I first added the autopilot process, it updated the main process at 100Hz with the current distance vector target; the main process couldn't quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog, meaning there was a very late reaction to lateral drift.

So I changed the autopilot process to send only velocity vector targets; that meant the autopilot sent an update to the main process every few seconds (i.e. ascent, hover, descent and stop updates) rather than 100 times a second for the distance increments.  As a result, video processing was running at full speed again.

But when I turned on diagnostics, the main process couldn't keep up with the autopilot, despite the updates only being sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process' select() didn't pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages – way beyond the point where the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes' data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared-memory files to store / send their data to the main process; having more than one of them in high demand (autopilot at 100Hz distance targets, or diagnostics at 100Hz) seems to be enough for one of the lower-frequency shared-memory sources simply not to be spotted by the main process' select().  I have no idea why this happens, and that troubles me.

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem 1K-blocks Used Available Use% Mounted on
tmpfs 441384 4 441380 1% /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid owner cpid lpid 
0 root 625 625 
32769 root 625 625 
65538 root 625 625 
98307 root 625 625 
131076 root 625 625

ps -eaf | grep python shows the processes in use by Hermione.  Note that none of these process IDs appear in the list of shared-memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it's the GPS daemon that has the shared-memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance: the main process would then have diagnostics as its only high-speed shared-memory usage.  The autopilot can keep the revised approach of only sending low-frequency velocity vector target changes.  It would get high-frequency input from the Sweep, but convert that into low-frequency velocity target changes sent to the main process.  This way, main has only diagnostics as a fast input, and the autopilot has only Sweep.  This is a speculative solution, but I don't like the idea of moving forward with an undiagnosed weird problem.
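To sketch that speculative split (all the names here are mine, not real code): the autopilot consumes Sweep samples at high frequency, but only sends the main process a new target when its decision actually changes:

SAFE_DISTANCE = 1.5  # metres - an illustrative threshold

class SweepFilter:
    def __init__(self, send_to_main):
        self.send_to_main = send_to_main  # writes to the main process FIFO
        self.last_decision = None

    def on_sample(self, nearest_obstacle_m):
        # Called at the Sweep's high sample rate; output is low frequency.
        decision = "LAND" if nearest_obstacle_m < SAFE_DISTANCE else "CONTINUE"
        if decision != self.last_decision:
            self.send_to_main(decision)
            self.last_decision = decision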

Boules

Flight plan processing has now been moved to a new autopilot process, ultimately with the aim of feeding Sweep object detection into the autopilot, which in turn swaps from the current flight plan to an 'emergency landing' flight plan, updating the main process to hover and descend to the ground.  This flight proves the autopilot feed works, though it does raise other problems, primarily the drift over the course of a 3 second takeoff, 5 second hover and 4 second landing.

I suspect the drift is caused by the video processing: it was handling the 10Hz output at only 7Hz, leading to lag.  This reduced processing speed may be down to the RPi CPU overheating and throttling back.

I’d set

force_turbo=1

in /boot/config.txt to make sure the CPU always runs at full speed, rather than only when it 'thinks' it's busy.  This, combined with the fact that the temperature today is 24° in the shade, may well be enough for the CPU to throttle back due to overheating instead.  Certainly, just a few days ago, when the outdoor temperature was in the mid-teens and a cool breeze was blowing, there was no drift.  The CPU has a heatsink, but it is in a case.  I may have to either remove the case (exposing the electronics to the elements) or add a cooling fan to the case.  If possible, the latter would be better, as the case is there to provide a constant temperature to limit IMU temperature drift.  I've found this one which might just fit in.
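To confirm or rule out throttling, the SoC temperature could be logged alongside the flight diagnostics; a minimal sketch using the standard Raspberry Pi sysfs path (not part of my code yet):

def cpu_temperature():
    # The RPi exposes the SoC temperature here in millidegrees Celsius;
    # the firmware starts throttling as it approaches 85C, so sustained
    # readings near that during a flight would settle the question.
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read()) / 1000.0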

Note the kids' pétanque kit is deliberately on the ground to provide weighty, high-contrast areas that the grass no longer supplies to the video.  Off to mow the lawn now.

Brain dump

Just jotting down some thoughts to share…

The top two sketches show yesterday's and the future processes used by my code.  Today, the code is more like the left one, except the pilot has been broken out into its own process: it still just reads the file-based flight plan.  It's been tested passively, and the logs show correct data fed to, read by and logged by the motion process – the fact the flight plan ends 'underground' is intentional, to ensure landing always touches down:

Autopilot flight plan targets

The lower left of my sketch defines several steps adding 'AI' to the autopilot.  This example uses a normal file-based flight plan to move 10m forwards; part of the way along, there is a diagonal wall in the way.  The first step is to feed Sweep data from the motion process to the autopilot so the pilot can see the wall.  The next step is to hover / land when the pilot spots an obstacle that's too close.  Finally, the pilot leaves the motion process hovering for an indeterminate amount of time while the wall is mapped and a diversion is created to redirect Hermione around the obstacle.

Intelligent Autopilot

I’m posting this to crystallise a lot of interrelated parallel thoughts into a working serial plan.  What you read below is probably the 20th draft revision:


Currently, each flight has a flight plan written by hand and saved to file beforehand: lines of 3D velocities, plus how long to fly at those velocities, e.g.

# - evx is fore / aft velocity in m/s; fore is positive
# - evy is port / starboard velocity in m/s; port is positive 
# - evz is up / down velocity in m/s; up is positive
# - time is how long to maintain that speed in seconds
# - name is an arbitrary name put out to console to show flight progress

# evx, evy, evz, time, name
 0.0, 0.0, 0.3, 2.0, ASCENT
 0.0, 0.0, 0.0, 6.0, HOVER
 0.0, 0.0,-0.3, 2.0, DESCENT

The main motion process uses this to calculate the distance PID targets at a given point in time; it's efficient, but inflexible – it doesn't react to the world around it.  To incorporate object avoidance, GPS waypoint targets, and ultimately maze mapping and tracking, I've decided to add a well-informed autopilot process to do the hard work, and leave the motion process in ignorant bliss.  As a separate process, the Linux kernel scheduler should run the autopilot on a different RPi 3B CPU core from the one used by the already overworked motion process.
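To make that concrete, here's a sketch of the parsing and integration involved – turning the velocity lines above into a distance target at time t.  This is my own illustrative code, not the real implementation:

def parse_flight_plan(path):
    # Each non-comment line is: evx, evy, evz, time, name (see above).
    plan = []
    with open(path) as f:
        for line in f:
            line = line.split("#")[0].strip()
            if not line:
                continue
            evx, evy, evz, period, name = [x.strip() for x in line.split(",")]
            plan.append((float(evx), float(evy), float(evz), float(period), name))
    return plan

def distance_target(plan, t):
    # Integrate each phase's velocity for the time spent in it so far.
    x = y = z = 0.0
    for evx, evy, evz, period, name in plan:
        dt = min(t, period)
        x += evx * dt
        y += evy * dt
        z += evz * dt
        t -= dt
        if t <= 0.0:
            break
    return x, y, z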

Initially, the autopilot process will use the same input file as before, feeding the latest distance targets over an OS FIFO to be picked up promptly by the motion process, much as GPS, video and Sweep do now.  That will ease the load on the motion process slightly as a result.

Once tested, GPS and Sweep will move from the motion process to the autopilot so it can make intelligent decisions dynamically about the distance targets it sends to the motion process; the time-critical motion processing remains oblivious to brick walls, hedges and where it's been – it just blindly does what the intelligent autopilot tells it, based upon the brick walls, hedges and mapping the autopilot knows of.

The only minor problem I can see currently is that the autopilot needs to know the direction the quad is pointing, so it can yaw the quad-frame Sweep data to align with the earth-frame GPS flight data; because the compass is part of the IMU code in the motion process, I either need a FIFO feeding this back to the autopilot, or to access the IMU compass directly from the autopilot via I2C.  Plenty of time to resolve that while I get the basics working first.
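The yaw transform itself is trivial once that direction is known – a sketch, with the sign conventions being my assumption rather than taken from the code:

import math

def quad_to_earth(x_qf, y_qf, yaw):
    # Rotate a quad-frame Sweep point by the yaw angle so it lines up
    # with the earth-frame GPS flight data.
    c, s = math.cos(yaw), math.sin(yaw)
    return c * x_qf - s * y_qf, s * x_qf + c * y_qf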

The motivation for doing this is that I don't want to attach the expensive Sweep to Hermione until all the significant future code changes have been made and tested thoroughly, and any significant bugs found and fixed.

For the record, the current code on GitHub has been updated.