OK, so this is weird…

When I first added the autopilot process, it updated the main process at 100Hz with the current distance vector target; the main process couldn’t quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog, which meant a very late reaction to lateral drift.

So I changed the autopilot process to only send velocity vector targets; that meant autopilot sent an update to the main process every few seconds (i.e. ascent, hover, descent and stop updates) rather than 100 times a second for the distance increments; as a result, video processing was running at full speed again.

But when I turned on diagnostics, the main process couldn’t keep up with the autopilot despite the fact its updates are only sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process’ select() didn’t pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages – way beyond the point the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes’ data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared memory files to store / send their data to the main process; having more than one in high demand (autopilot at 100Hz distance targets, or diagnostics at 100Hz) seemed to cause one of the lower frequency shared memory sources simply not to be spotted by the main process’ select().  I have no idea why this happens, and that troubles me.
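For reference, this is the shape of the select() loop the main process runs – a minimal standalone sketch rather than my actual flight code, with two os.pipe()s standing in for the shared-memory-backed streams:

import os
import select

# Two pipes stand in for e.g. the autopilot and video data streams.
autopilot_r, autopilot_w = os.pipe()
video_r, video_w = os.pipe()

os.write(autopilot_w, b"hover 0.0 0.0 0.0\n")
os.write(video_w, b"dx 0.01 dy -0.02\n")

inputs = [autopilot_r, video_r]
while inputs:
    # Block until at least one source has data ready, with a 1s timeout.
    readable, writable, exceptional = select.select(inputs, [], [], 1.0)
    if not readable:
        break                      # nothing arrived within the timeout
    for fd in readable:
        data = os.read(fd, 4096)   # drain whatever is available
        print(fd, data)
        inputs.remove(fd)          # demo only: each source read once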

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem     1K-blocks  Used  Available  Use%  Mounted on
tmpfs             441384     4     441380    1%  /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid      owner      cpid       lpid
0          root       625        625
32769      root       625        625
65538      root       625        625
98307      root       625        625
131076     root       625        625

ps -eaf | grep python shows the processes in use by Hermione. Note that none of these processes’ IDs appear in the list of shared memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it’s the GPS daemon, gpsd, that has the shared memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance; the main process would then have diagnostics as its only high speed shared memory usage.  Autopilot can keep the revised approach of only sending low frequency velocity vector target changes.  Autopilot would get high frequency input from the Sweep, but convert that into low frequency velocity target changes sent to the main process.  This way, main has only diagnostics as a fast input, and autopilot has only the Sweep.  This is a speculative solution, though, and I don’t like the idea of moving forward with an undiagnosed weird problem.

Boules

Flight plan processing has now been moved to a new autopilot process, ultimately with the aim of feeding Sweep object detection into the autopilot, which in turn swaps from the current flight plan to an ‘emergency landing’ flight plan, updating the main process to hover and descend to ground. This flight proves the autopilot feed works, though it does raise other problems, primarily the drift over the course of a 3 second takeoff, 5 second hover and 4 second land.

I suspect the drift is caused by video processing: it was processing the 10Hz output at 7Hz, leading to lag.  This reduced processing speed may be caused by overheating of the RPi CPU, causing it to throttle back.

I’d set

force_turbo=1

in /boot/config.txt to make sure the CPU always runs at full speed, rather than only when it ‘thinks’ it’s busy.  This, combined with the fact the temperature today is 24° in the shade, may well be enough for the CPU to overheat and throttle back instead.  Certainly just a few days ago, when the outdoor temperature was in the mid-teens and a cool breeze was blowing, there was no drift.  The CPU has a heatsink, but it’s in a case.  I may have to either remove the case (exposing it to the elements) or add a cooling fan to the case.  If possible, the latter would be better as the case is there to provide a constant temperature, limiting IMU temperature drift.  I’ve found this one which might just fit in.
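If overheating is the culprit, it should show up in the core temperature during a flight; here’s a generic RPi snippet (not part of my code) that logs it – the RPi 3 starts throttling somewhere around 80°C:

#!/usr/bin/env python
# Log the RPi CPU core temperature once a second via sysfs.
import time

while True:
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        millidegrees = int(f.read().strip())
    print("CPU temperature: %.1fC" % (millidegrees / 1000.0))
    time.sleep(1)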

Note the kids’ pétanque kit is deliberately on the ground to provide weighty contrasting areas that the grass no longer supplies to the video.  Off to mow the lawn now.

Brain dump

Just jotting down some thoughts to share…

The top two sketches are yesterday’s and future processes used by my code.  Today, the code is more like the left one, except the pilot has been broken out into its own process: it still just reads the file based flight plan.  It’s been tested passively, and the logs show correct data fed to, read by and logged by the motion process – the flight plan ending ‘underground’ is intentional, to ensure landing always touches down:

Autopilot flight plan targets

The lower left in my sketch defines several steps adding ‘AI’ to the autopilot.  This example uses a normal file based flight plan to move 10m forward; part of the way along, a diagonal wall blocks the route.  First step is to feed Sweep to the autopilot from the motion process so the pilot can see the wall.  Next step is to hover / land when the pilot spots an obstacle that’s too close (the proximity check is sketched below).  Finally, the pilot leaves the motion process hovering for an indeterminate amount of time while the wall is mapped and a diversion created to redirect Hermione around the obstacle.
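As a taster of step two, the proximity check itself is tiny; this is a hedged sketch with made-up names (sweep_samples, emergency_plan), not working code from the repo:

# One Sweep revolution gives a list of (angle, distance) samples.
# If anything is closer than the limit, swap to the emergency plan.
PROXIMITY_LIMIT = 1.0   # metres

def choose_flight_plan(sweep_samples, current_plan, emergency_plan):
    for angle, distance in sweep_samples:
        if distance < PROXIMITY_LIMIT:
            return emergency_plan   # hover, then descend to ground
    return current_plan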

Intelligent Autopilot

I’m posting this to crystallise a lot of interrelated parallel thoughts into a working serial plan.  What you read below is probably the 20th draft revision:


Currently, each flight has a flight plan written by hand and saved to file beforehand: lines of 3D velocities plus how long to fly at those velocities e.g.

# - evx is fore / aft velocity in m/s; fore is positive
# - evy is port / starboard velocity in m/s; port is positive 
# - evz is up / down velocity in m/s; up is positive
# - time is how long to maintain that speed in seconds
# - name is an arbitrary name put out to console to show flight progress

# evx, evy, evz, time, name
 0.0, 0.0, 0.3, 2.0, ASCENT
 0.0, 0.0, 0.0, 6.0, HOVER
 0.0, 0.0,-0.3, 2.0, DESCENT

The main motion process uses this to calculate the distance PID targets at a given point in time; it’s efficient, but inflexible – it doesn’t react to the world around it.  To incorporate object avoidance, GPS waypoint targets, and ultimately maze mapping and tracking, I’ve decided to add a well-informed autopilot process to do the hard work, and leave the motion process in ignorant bliss.  As a separate process, the Linux kernel scheduler should run the autopilot on a different RPi 3B CPU core from the one used by the already overworked motion process.
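To make that concrete, here’s a hedged sketch (illustrative names, not the real qc.py code) of parsing the flight plan above and integrating the velocity targets into distance targets at elapsed time t:

def load_flight_plan(filename):
    plan = []
    with open(filename) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                     # skip comments and blanks
            evx, evy, evz, period, name = line.split(",")
            plan.append((float(evx), float(evy), float(evz),
                         float(period), name.strip()))
    return plan

def distance_targets(plan, t):
    # Integrate the velocity targets up to elapsed flight time t seconds.
    x = y = z = 0.0
    for evx, evy, evz, period, name in plan:
        dt = min(t, period)
        x += evx * dt
        y += evy * dt
        z += evz * dt
        t -= dt
        if t <= 0.0:
            break
    return x, y, z

With the example plan above, distance_targets(plan, 3.0) gives (0.0, 0.0, 0.6): two seconds of 0.3m/s ascent followed by one second of hover.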

Initially, the autopilot process will use the same input file as before, feeding the latest distance targets over an OS FIFO to be picked up promptly by the motion process, much as GPS, video and Sweep data are now.  That will ease the load on the motion process slightly.
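A minimal sketch of the FIFO plumbing, assuming a named FIFO path and a fixed struct layout (both illustrative – the real format is whatever qc.py defines):

import os
import struct

FIFO_PATH = "/dev/shm/autopilot_fifo"   # illustrative path

# Autopilot side: pack the latest targets and write them to the FIFO.
if not os.path.exists(FIFO_PATH):
    os.mkfifo(FIFO_PATH)

fifo = os.open(FIFO_PATH, os.O_WRONLY)  # blocks until a reader opens
os.write(fifo, struct.pack("<dddd", 0.0, 0.0, 0.6, 2.0))
os.close(fifo)

# Motion side, inside the select() loop: a ready fd yields one record.
#   fifo_r = os.open(FIFO_PATH, os.O_RDONLY | os.O_NONBLOCK)
#   x, y, z, period = struct.unpack("<dddd", os.read(fifo_r, 32))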

Once tested, GPS and Sweep will move from the motion process to the autopilot so it can make intelligent decisions dynamically about the distance targets it sends to the motion process; the time-critical motion processing remains oblivious to brick walls, hedges and where it’s been – it just blindly does what the intelligent autopilot tells it, based upon the brick walls, hedges and mapping the autopilot knows of.

The only minor problem I can see currently is that the autopilot needs to know the direction the quad is pointing so it can yaw the quad frame Sweep data to align with the earth frame GPS flight data; because the compass is part of the IMU code on the motion process, I either need a FIFO feeding this back to the autopilot, or to access the IMU compass directly from the autopilot via I2C.  Plenty of time to resolve that though while I get the basics working first.
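The rotation itself is straightforward once the yaw angle is to hand; a sketch, assuming yaw is measured in radians anticlockwise from the earth frame X axis (the sign convention here is an assumption, not checked against the code):

import math

def quad_to_earth(x_q, y_q, yaw):
    # Rotate a quad frame (x, y) Sweep sample into the earth frame.
    x_e = x_q * math.cos(yaw) - y_q * math.sin(yaw)
    y_e = x_q * math.sin(yaw) + y_q * math.cos(yaw)
    return x_e, y_e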

The motivation for doing this is that I don’t want to attach the expensive Sweep to Hermione until all significant future code changes have been made and tested thoroughly, and any significant bugs fixed.

For the record, the current code on GitHub has been updated.