Cotswold Raspberry Jam

So there I was this morning on the bus to Cheltenham to co-host the Cotswold Raspberry Jam, gazing at the beautiful Cotswold countryside, and my mind began to wander; or rather, freed from the constraints of dealing with details it couldn’t fix while on a bus, it was unleashed. Out came a better idea about where to go for the next step towards autonomy.

What’s been bugging me is the difference between magnetic and GPS north.  The ‘bus’ solution is to ignore magnetic north completely.  The compass still has a role to play in maintaining zero yaw throughout a long flight, but it plays no part in the direction of flight; in fact, this is mandatory for the rest of the plan to work.

Instead, the autopilot translates between the GPS north / south / east / west coordinate system and Hermione’s forwards / backwards / right / left coordinate system.  The autopilot only speaks to Hermione in her own coordinates when telling her where to go.  The autopilot learns Hermione’s coordinate system at the start of each flight by asking her to fly forwards a meter or so, and comparing that to the GPS vector change it sees.  This rough translation is refined continuously throughout the flight.  Hermione can start a flight pointing in any direction compared to the target GPS point, and the autopilot will get her to fly towards the GPS target regardless of the way she’s pointing.
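As a rough sketch of what I have in mind (names hypothetical, and assuming yaw is measured clockwise from GPS north), the initial calibration and subsequent translation could look something like this:

import math

def learn_yaw_offset(gps_delta_north, gps_delta_east):
    # After commanding a metre or so of pure 'forwards' flight, the angle
    # between Hermione's forwards axis and GPS north falls out of the
    # observed GPS movement in metres north / east.
    return math.atan2(gps_delta_east, gps_delta_north)

def earth_to_body(target_north, target_east, yaw_offset):
    # Rotate an earth-frame (north, east) target into Hermione's own
    # (forwards, right) frame so the autopilot speaks her language.
    forwards =  target_north * math.cos(yaw_offset) + target_east * math.sin(yaw_offset)
    right    = -target_north * math.sin(yaw_offset) + target_east * math.cos(yaw_offset)
    return forwards, right

The continuous refinement would then just be a long-term average of learn_yaw_offset() applied to each commanded move and the corresponding GPS delta.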

Sadly now, I’m back from today’s Jam, and the details confront me once more, but the bus ride did yield a great view of where to go next.


P.S. As always, the jam was great; over 100 parents and kids, lots of cool things for them to see and do, including a visit this time by 2 members of local BBC micro:bit clubs.  Next one is 30th September.

OK, so this is weird…

When I first added the autopilot process, it updated the main process at 100Hz with the current distance vector target; the main process couldn’t quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog and hence a very late reaction to lateral drift.

So I changed the autopilot process to send only velocity vector targets; that meant the autopilot sent an update to the main process every few seconds (i.e. ascent, hover, descent and stop updates) rather than 100 times a second for the distance increments; as a result, video processing was running at full speed again.

But when I turned on diagnostics, the main process couldn’t keep up with the autopilot, despite the fact the updates are only sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process’ select() didn’t pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages, way beyond the point the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes’ data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared memory files to store / send their data to the main process; having more than one with high demand (autopilot at 100Hz distance targets, or diagnostics at 100Hz) seemed to cause one of the lower-frequency shared memory sources simply not to be spotted by the main process’ select().  I have no idea why this happens, and that troubles me.

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem 1K-blocks Used Available Use% Mounted on
tmpfs 441384 4 441380 1% /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid owner cpid lpid 
0 root 625 625 
32769 root 625 625 
65538 root 625 625 
98307 root 625 625 
131076 root 625 625

ps -eaf | grep python shows the processes in use by Hermione. Note that none of these process IDs appears in the list of shared memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it’s the gps daemon with the shared memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance; the main process would then have diagnostics as its only high-speed shared memory usage.  The autopilot can retain the revised behaviour of only sending low-frequency velocity vector target changes.  It would get high-frequency input from the Sweep, but convert that into low-frequency velocity target changes sent to the main process.  This way, main has only diagnostics as a fast input, and autopilot only has Sweep.  This is a speculative solution, though, and I don’t like the idea of moving forward with an undiagnosed weird problem.

Odds and sods

Progress, and thus blog updates, have been delayed by the weather: too windy last week; this week, too hot and sunny.  When the sun is high, the camera can’t resolve sufficient contrast between the clover flowers and the grass in the mown lawn.  Because sunrise is 5am, that means I can only do test flights in the evening.  It also means that in coming to this conclusion, I’ve broken 5 props so far.  Very frustrating, and expensive.

As a result, I’ve still to properly confirm that the autopilot process is working well, and Sweep still lies on my office table.

On the slight plus side, I’ve enhanced the GPS data stream to the main process; I suspect I was throwing too much away by using first ‘string’ and then ‘float’ to pack the data, so I’ve just upped it to a 64-bit float.  If this works as I hope, that may be all that’s necessary to track “where am I?” in accurate GPS units alone, using Sweep + compass just to spot the orientation of the current paths / hedges in the maze, allowing the autopilot to choose “which way now?”.  Any mapping for “have I been here before?” can be added as an enhancement; initially the choice between the available path directions will be random, regardless of whether they’ve been visited already.  But this is all a long way in the future.
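For illustration (a minimal sketch using Python’s struct module, not the actual packing code), here’s why a 32-bit float isn’t enough for latitude / longitude in degrees while a 64-bit double is:

import struct

lat = 51.8996451  # degrees; one degree of latitude is roughly 111km

lat32 = struct.unpack('f', struct.pack('f', lat))[0]  # 32-bit round trip
lat64 = struct.unpack('d', struct.pack('d', lat))[0]  # 64-bit round trip

print(abs(lat32 - lat) * 111000)  # tens of centimetres lost in packing
print(abs(lat64 - lat) * 111000)  # effectively zero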


A little later, after writing the above, speckled shade from a large tree was cast over part of the back garden, and I managed to collect this GPS plot of a supposed 5m flight NE over the course of 20s, i.e. 20 GPS samples:

GPS resolution

She struggled with the video tracking, and once she left the shade, chaos ensued, but I did get the GPS stats, which clearly show a much higher resolution initially than I was getting before.  So that’s good.  Yet another prop snapped on the aborted flight as she headed towards the wall.  So that’s prop number 6, or £110 in real terms: completely unaffordable.

The multiple breakages are again weather based: two weeks of no rain means once more the lawn is rock solid, and props clipping the ground on an aborted landing snap instead of embedding in soft soil.

Intelligent Autopilot

I’m posting this to crystallise a lot of interrelated parallel thoughts into a working serial plan.  What you read below is probably the 20th draft revision:


Currently, each flight has a flight plan written by hand and saved to file beforehand: lines of 3D velocities plus how long to fly at those velocities e.g.

# - evx is fore / aft velocity in m/s; fore is positive
# - evy is port / starboard velocity in m/s; port is positive 
# - evz is up / down velocity in m/s; up is positive
# - time is how long to maintain that speed in seconds
# - name is an arbitrary name put out to console to show flight progress

# evx, evy, evz, time, name
 0.0, 0.0, 0.3, 2.0, ASCENT
 0.0, 0.0, 0.0, 6.0, HOVER
 0.0, 0.0,-0.3, 2.0, DESCENT

The main motion process uses this to calculate the distance PID targets at a given point in time; it’s efficient, but inflexible: it doesn’t react to the world around it.  To incorporate object avoidance, GPS waypoint targets, and ultimately maze mapping and tracking, I’ve decided to add a well-informed autopilot process to do the hard work, and leave the motion process in ignorant bliss.  As a separate process, the Linux kernel scheduler should run the autopilot on a different RPi 3B CPU core than the one used by the already overworked motion process.
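For context, a minimal sketch (hypothetical names, ignoring the name column) of that calculation: the distance target at time t is just the flight plan velocities integrated up to t.

def distance_target(flight_plan, t):
    # flight_plan is a list of (evx, evy, evz, duration) rows as per the
    # file above; returns the integrated (x, y, z) distance target in
    # metres after t seconds of flight.
    x = y = z = 0.0
    for evx, evy, evz, duration in flight_plan:
        dt = min(t, duration)  # time spent in this phase so far
        x += evx * dt
        y += evy * dt
        z += evz * dt
        t -= dt
        if t <= 0.0:
            break
    return x, y, z

With the example plan above, distance_target(plan, 3.0) gives (0.0, 0.0, 0.6): two seconds of 0.3m/s ascent followed by one second of hover.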

Initially, the autopilot process will use the same input file as before, feeding the latest distance targets over an OS FIFO to be picked up promptly by the motion process, much as GPS, video and Sweep do now.  That will ease the load on the motion process slightly as a result.
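The plumbing would be along these lines (a sketch only: the path and record format are stand-ins, and the real code also has to cope with the writer coming and going):

import os, select, struct

FIFO = '/dev/shm/qc_autopilot'  # a named pipe on tmpfs
if not os.path.exists(FIFO):
    os.mkfifo(FIFO)
fd = os.open(FIFO, os.O_RDONLY | os.O_NONBLOCK)

while True:
    # Wait at most 10ms for an autopilot update; IMU processing carries
    # on every loop regardless of whether one arrives.
    readable, _, _ = select.select([fd], [], [], 0.01)
    if fd in readable:
        data = os.read(fd, 32)  # four packed doubles per update
        if len(data) == 32:
            evx, evy, evz, period = struct.unpack('=dddd', data)
            # ...feed the new targets into the PIDs here...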

Once tested, GPS and Sweep will move from the motion process to the autopilot so it can make intelligent decisions dynamically about the distance targets it sends to the motion process; the time-critical motion processing remains oblivious to brick walls, hedges and where it’s been; it just blindly does what the intelligent autopilot tells it based upon the brick walls, hedges and mapping the autopilot knows of.

The only minor problem I can see currently is that the autopilot needs to know the direction the quad is pointing so it can yaw the quad frame Sweep data to align with the earth frame GPS flight data; because the compass is part of the IMU code on the motion process, I either need a FIFO feeding this back to the autopilot, or to access the IMU compass directly from the autopilot via I2C.  Plenty of time to resolve that though while I get the basics working first.

The motivation for doing this is that I don’t want to attach the expensive Sweep to Hermione until all significant future code changes have been tested and any significant bugs have been fixed and retested thoroughly.

For the record, the current code on GitHub has been updated.

“To mow, or not to mow, that is the question:

Whether ’tis easier for the macro-blocks to track
The clovers and daisies of contrasting colour…”

The answer is no, I shouldn’t have mown the lawn.  With the kids’ toys moved out of the way, and any ungreen contrasting features slain, there was nothing to distinguish one blade of shorn grass from another and Hermione drifted wildly.  Reinstating the kids’ chaos restored high quality tracking over 5m.

The point of the flight was to compare GPS versus macro-block lateral tracking.  Certainly over this 5m flight, down-facing video beat GPS hands down:

Lateral tracking

My best guess interpretation of the GPS graph is that the flight actually ran from 2m to 7m diagonally, heading north-west.  The camera POV doesn’t include compass data, so it’s correctly showing her flying forwards by 5m.  The compass code is not working accurately yet (it needs more investigation as to why not): it was showing ~90° (i.e. East) rather than the true 45° (i.e. North East) shown by the GPS and a handheld compass.

I’ve done some more refinements to the scheduling of the sensor reads, and also to the accuracy of the GPS data streamed from the GPS process.  It’s worth viewing this graph full screen: each spike shows the time in seconds between motion processing loops, i.e. the time spent processing other sensors; 10ms indicates just IMU data was processed.  The fact no loop takes more than 0.042s* even with full diagnostics running means I could up the sampling rate back to 1kHz (it’s at 500Hz at the moment).  More importantly, it shows processing is nicely spread out: each sensor is getting its fair share of the processing and nobody is hogging the limelight.

Motion processing

As a result, I’ve updated the code on GitHub.


*42ms is the point where the IMU FIFO overflows at 1kHz sampling: the 512 byte FIFO holds 512 / 12 ≈ 42 samples of 12 bytes each, and at 1,000 samples per second that’s ≈ 42ms.

Another bath time Eureka moment

and also premature, but I’ve just solved the problem of how to get Hermione to find her way through a maze.  It obviously involves GPS to define the centre of the maze, and Scanse Sweep to avoid the hedges, but what’s new is how to use both to build a map of where she’s been.

Both GPS and Sweep provide input about once a second.  A combination of the two data streams is sent to yet another process.  This combination of GPS location, the sweep at that location (and, thinking on the fly, also compass orientation) builds a map.  Building and storing the map isn’t trivial, but it’s not going to be difficult either.  That map process can then be queried by the motion processing code, i.e. Q: “Where next?”; A: “Left”, or maybe “Compass point 118 degrees”.  The answer to the question is determined by areas that have not been explored yet, i.e. areas that aren’t on the map; there’s a first sketch of this below.
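As a first sketch of that map process (names hypothetical, with a crude 1m grid standing in for real mapping):

import math

class MazeMap:

    CELL = 1.0  # metres per grid cell

    def __init__(self):
        self.visited = set()

    def update(self, north, east):
        # Record the cell containing the current GPS position, expressed
        # in metres north / east of the start point.
        self.visited.add((round(north / self.CELL), round(east / self.CELL)))

    def where_next(self, north, east, open_headings):
        # open_headings are compass headings in degrees which Sweep says
        # are hedge-free; prefer one whose neighbouring cell is unvisited.
        here_n = round(north / self.CELL)
        here_e = round(east / self.CELL)
        for heading in open_headings:
            n = here_n + round(math.cos(math.radians(heading)))
            e = here_e + round(math.sin(math.radians(heading)))
            if (n, e) not in self.visited:
                return heading
        return open_headings[0] if open_headings else None

So “Where next?” at (3.2, 1.9) with openings at 0°, 118° and 270° would answer the first heading leading somewhere new.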

Once more, some future fun defined.

 

Persistent /dev/tty* names for GPS and Scanse Sweep

The details for this post are stolen from here; I’ve just tweaked them into my context.

The problem it solves is that both GPS and Sweep are USB UART /dev/ttyUSB? devices, and which is which is indeterminate and can change between boots.  The stolen solution below sets up a symlink to each device based upon its device attributes, such that the code only ever refers to the symlinks, i.e. /dev/ttySWEEP and /dev/ttyGPS, rather than trying to guess which /dev/ttyUSB? underlies them.  I’ll hand you over now to my edition of the original author’s post:


Persistent names for usb-serial devices

I own a bunch of devices that appear as /dev/ttyUSB? in the system, e.g. GPS and Scanse Sweep.  As I plug them into and pull them out from the USB ports, they get names like /dev/ttyUSB0 or ttyUSB1. Sadly the device names are not persistent: whether the Sweep pops up as /dev/ttyUSB0 or /dev/ttyUSB1 depends on the order in which the devices are discovered by the kernel. That makes things difficult; it usually requires a trial and error approach to find out what the hell the GPS board’s tty name is this time.

Wouldn’t it be nice to have persistent, descriptive device name for each of these toys? Like /dev/ttyGPS and /dev/ttySWEEP?

usb-serial devices

To distinguish between them we need some other unique identifier — in this case a vendor, product and serial number. These are the messages recorded at the end of /var/log/messages when Sweep (for example) is plugged in:

May 13 06:22:27 general kernel: [ 30.629485] usb 1-1.4: new full-speed USB device number 5 using dwc_otg
May 13 06:22:27 general kernel: [ 30.756984] usb 1-1.4: New USB device found, idVendor=0403, idProduct=6015
May 13 06:22:27 general kernel: [ 30.757006] usb 1-1.4: New USB device strings: Mfr=1, Product=2, SerialNumber=3
May 13 06:22:27 general kernel: [ 30.757019] usb 1-1.4: Product: FT230X Basic UART
May 13 06:22:27 general kernel: [ 30.757031] usb 1-1.4: Manufacturer: FTDI
May 13 06:22:27 general kernel: [ 30.757044] usb 1-1.4: SerialNumber: DO004VY5

UDEV rules

Now with the list of identifiers in hand let’s create a UDEV ruleset that’ll make a nice symbolic link for each of these devices. UDEV rules are usually scattered into many files in /etc/udev/rules.d. Create a new file called 99-usb-serial.rules and put the following lines in there:

SUBSYSTEM=="tty", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", ATTRS{serial}=="011806AE", SYMLINK+="ttyGPS"
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6015", ATTRS{serial}=="DO004VY5", SYMLINK+="ttySWEEP"

By now it should be obvious what these lines mean. Perhaps just a note for the last entry on each line: SYMLINK+="ttySWEEP" means that UDEV should create a symlink /dev/ttySWEEP pointing to the actual /dev/ttyUSB* device. In other words, the device names will continue to be assigned ad hoc, but the symbolic links will always point to the right device node.
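(A quick aside from me: as I understand it, the new rules only take effect when a device event fires, so either replug the devices, reboot, or ask UDEV to reload and re-trigger:

sudo udevadm control --reload-rules
sudo udevadm trigger

ls -l /dev/ttyGPS /dev/ttySWEEP should then show the symlinks pointing at whichever /dev/ttyUSB? nodes the kernel picked this time.)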


Right back to me – this is a preemptive strike for when I add the Scanse Sweep to Hermione’s existing GPS.  However, that’s not going to happen until the Garmin LiDAR-Lite stops f’ing up the I2C (again), and I’ve tested all the variants of the GPS flight plan code.

GPS + compass + yaw control code

Coding this combination of features has taken me several days, covering the complex interactions depending on which of the three is installed and how they are configured.  This has required a rework of the GPS processing and Flight Plan objects to ensure they interact cleanly, along with enhanced configuration parameters and use of the compass.  Here’s the overview of how each pair interacts:

  • yaw control alone: defines always-forward flight, with yaw to turn.
  • compass alone: defines N, S, E & W during the flight.
  • gps control alone: defines the flight plan waypoints; tracks locations during flights.
  • yaw control + compass: flight plan X, Y are defined as N, S, E & W; the two are fused to prevent integrated IMU yaw rate drift.
  • compass + gps control: defines the direction of flight between waypoints.
  • yaw control + gps control: no interaction without the compass.

I _think_ I’ve got all bases covered now in the code, but there are a lot of new pieces to be tested individually, and in all possible combinations, passively first, and then in real flights. For the first time in this project, I’m going to need to plan my testing so I don’t miss a combination until it’s too late.

On the plus side (of sorts), Hermione’s shoulder joint somehow broke after her last video while I was testing higher video frame sizes and rates, and the camera stopped working.  Until the replacement parts arrive, there’s a lot of spare time for thinking and coding, and also occasionally tickling my fancy; more on the latter anon.

GPS + compass plan

Here’s what I’ll be attempting to achieve with GPS and compass data integrated into the code; their data are available now, but just logged.

  • In a large, open field, I’ll take Hermione to a random point, and she’ll record the GPS location; this will be the destination for the flight.
  • I’ll then carry her to any other position in that field and pop her down on the ground pointing in any direction i.e. not necessarily towards the destination.
  • As part of the current take-off procedure, she already records her GPS take-off location; from this and the destination GPS location, she’ll build up a flight plan in much the same format as she reads from file now: she’ll convert the calculated distance and direction between the GPS points into a fixed velocity vector and flight period, wrapped in the standard take-off, hover and landing flight plan sequence (a sketch of the distance / direction step follows this list).
  • During the initial hover of the flight, she’ll rotate so she is pointing towards the target destination based upon the compass input.
  • Once actually heading towards this destination, compass and yaw fuse to ensure she’s always pointing in the direction she’s flying.
  • At the same time, the fused combination of the double-integrated acceleration, the down-facing video and GPS tracks the current location, comparing it to the predetermined destination position and thus tightly tracking her way to the destination.
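A minimal sketch of that distance-and-direction step (a hypothetical helper, using an equirectangular approximation which is fine over a field-sized flight):

import math

RADIUS = 6371000.0  # mean earth radius in metres

def leg_between(lat1, lon1, lat2, lon2, speed=0.3):
    # Convert take-off and destination GPS fixes in degrees into one
    # constant-velocity flight plan leg: north / east velocities in m/s
    # plus how many seconds to hold them.
    dn = math.radians(lat2 - lat1) * RADIUS
    de = math.radians(lon2 - lon1) * RADIUS * math.cos(math.radians((lat1 + lat2) / 2))
    distance = math.hypot(dn, de)
    if distance == 0.0:
        return 0.0, 0.0, 0.0
    period = distance / speed
    return speed * dn / distance, speed * de / distance, period

Wrapped in the standard take-off, hover and landing sequence, that single leg is the whole generated flight plan.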

All the above isn’t difficult unless I’m missing something hidden in plain sight; a large part is in preflight checks, getting the destination and takeoff GPS points and compass orientation.  The changes to the flight based on this target are pretty much in place already.  Just minor changes are required to fuse the compass into the integrated yaw rate, both for the post-takeoff hover to point her towards the destination, and to maintain that during the flight to that destination.
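The fusing itself can be as simple as a complementary filter; a sketch (alpha is a guess, not a tuned value), called once per motion loop after integrating the gyro yaw rate:

def fuse_yaw(integrated_yaw, compass_yaw, alpha=0.98):
    # Trust the integrated gyro yaw for short-term changes, and lean on
    # the compass to stop long-term drift.  The difference is wrapped so
    # fusion works across the +/-180 degree boundary.
    error = (compass_yaw - integrated_yaw + 180.0) % 360.0 - 180.0
    return integrated_yaw + (1.0 - alpha) * error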

This breaks down into several easy safe stages:

  • The automated flight plan file creation can be done in the back garden with the motors not even powered up.
  • Initially, the flight plan will only cover take-off, hover with reorientation, and landing, just to check she’s pointing the right way.
  • Then the full flight plan holds further incremental yaw at zero while she flies forwards towards the target, for the time and at the speed defined in the flight plan.

I’d better stop blogging and get on with it.


After a little more thought, I’m going to break it down further.

  1. I’ll still take Hermione around the park, collecting a destination GPS location along with possible intermediate waypoints.
  2. Once at the arbitrary starting point, her initial GPS location is also recorded as part of her take-off routine.
  3. Together these GPS locations are passed to a sub-class of the FlightPlan() object to produce a flight plan structure identical in looks to the file based flight plan used now.  The only difference is the coordinate system is GPS North, South, East and West in meters rather than piDrone take-off orientation of forward, backward, left and right.
  4. The compass will be used to determine the initial orientation WRT NSEW (magnetic) and set the base level yaw angle thus aligning it with the earth frame GPS flight plan (note to self – consider adding gain as well as the current offsets to the compass calibration; also what’s the angular error between compass and GPS north here?)
  5. The flight then runs using only the existing yaw control code – compass and GPS information are just logged as now.

These intermediate steps have clarified a few uncertainties, and I definitely do now need to get on with it.

A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and thus was descending; she’s not far wrong: the start and end points are the two round stones placed a measured 10m apart to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my Pi3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow.  The aim is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid IMU drift over temperature.  Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield; you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR Lite V3 so their lasers don’t interfere with each other; finally the camera has been moved far away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera was struggling to spot motion in dim light, the LiDAR height wasn’t fused either, and on some flights Hermione would climb during hover.
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fusing with the gyrometer yaw rate, and interacting with the below….
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now; a sketch of the sending side follows this list.  The reason both live in their own processes is that each blocks while reading its sensor (one second for GPS, 0.1 seconds for video), and that degree of blocking must be completely isolated from the core motion processing.  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from the target end-point, the combination of compass and GPS together will provide the sensor inputs to ensure the flight heads in the right direction, and recognises when it’s reached its goal.
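The sending side of that GPS process looks roughly like this (a sketch: the path, record format and the stubbed read are all stand-ins):

import os, struct, time

FIFO = '/dev/shm/qc_gps'  # named pipe on tmpfs, matching the reader's end

def read_gps_fix():
    # Stand-in for the real blocking gpsd read, which takes ~1 second.
    time.sleep(1.0)
    return 51.8996451, -1.9026911, 104.3, 9.0

if not os.path.exists(FIFO):
    os.mkfifo(FIFO)
fd = os.open(FIFO, os.O_WRONLY)  # blocks until the reader opens its end

while True:
    lat, lon, alt, sats = read_gps_fix()
    os.write(fd, struct.pack('=dddd', lat, lon, alt, sats))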

As a result of all the above, I’ve updated GitHub.