OK, so this is weird…

When I first added the autopilot process, it updated the main process at 100Hz with the current distance vector target; the main process couldn’t quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog and meaning there was a very late reaction to lateral drift.

So I changed the autopilot process to send only velocity vector targets; that meant the autopilot sent an update to the main process every few seconds (i.e. ascent, hover, descent and stop updates) rather than 100 times a second for the distance increments; as a result, video processing was running at full speed again.

But when I turned on diagnostics, the main process couldn’t keep up with the autopilot, despite the fact its updates are only sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process’ select() didn’t pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages – way beyond the point the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes’ data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared memory files to store / send their data to the main process; having more than one high-demand source (the autopilot at 100Hz distance targets, or diagnostics at 100Hz) seemed to cause one of the lower-frequency shared memory sources simply not to be spotted by the main process’ select().  I have no idea why this happens and that troubles me.
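
For illustration, here’s a minimal sketch of the kind of select() loop in question – the file names, message format and FIFO choice are my own invention rather than the real qc.py code:

import os
import select
import struct

# Each child process writes to its own FIFO under /dev/shm; the main
# process opens them all and sleeps in select() until data arrives.
FIFOS = ["/dev/shm/qc_autopilot", "/dev/shm/qc_gps", "/dev/shm/qc_video"]

fds = []
for name in FIFOS:
    if not os.path.exists(name):
        os.mkfifo(name)
    fds.append(os.open(name, os.O_RDONLY | os.O_NONBLOCK))

while True:
    readable, _, _ = select.select(fds, [], [])
    for fd in readable:
        # A real loop would also handle EOF and partial reads.
        data = os.read(fd, 24)   # e.g. one evx, evy, evz triplet of doubles
        if len(data) == 24:
            evx, evy, evz = struct.unpack("ddd", data)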

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem     1K-blocks  Used  Available  Use%  Mounted on
tmpfs             441384     4     441380    1%  /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid    owner    cpid    lpid
0        root     625     625
32769    root     625     625
65538    root     625     625
98307    root     625     625
131076   root     625     625

ps -eaf | grep python shows the processes in use by Hermione.  Note that none of these process IDs is in the list of shared memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it’s the GPS daemon that holds the shared memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance; the main process would then only have diagnostics as its main high-speed shared memory user.  The autopilot can keep the revised behaviour of only sending low-frequency velocity vector target changes: it would get high-frequency input from the Sweep, but convert that into low-frequency velocity target changes sent to the main process.  This way, main has only diagnostics as a fast input, and the autopilot only has the Sweep.  This is a speculative solution though, and I don’t like the idea of moving forward with an undiagnosed weird problem.

Odds and sods

Progress, and thus blog updates, have been delayed by the weather: too windy last week; this week, too hot and sunny.  When the sun is high, the camera can’t resolve sufficient contrast between the clover flowers and the grass in the mown lawn.  Because sunrise is 5am, that means I can only do test flights in the evening.  It also means that in coming to this conclusion, I’ve broken 5 props so far.  Very frustrating and expensive.

As a result, I’ve still to really confirm that the autopilot process is working well, and the Sweep still lies on my office table.

On the slight plus side, I’ve enhanced the GPS data stream to the main process; I suspect I was throwing too much away by using first ‘string’ and then ‘float’ to pack the data.  I’ve just upped it to a 64-bit ‘float’.  If this works as I hope, that may be all that’s necessary to track “where am I?” accurately in GPS units alone, using Sweep + compass just to spot the orientation of current paths / hedges in the maze, allowing the autopilot to choose “which way now?”.  Any mapping for “have I been here before?” can be added as an enhancement; initially it will be a random choice of the various path directions available, regardless of whether they’ve been visited already.  But this is all a long way in the future.
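
The sums behind that hunch, as a quick Python check (the latitude value here is made up):

import struct

lat = 51.1234567   # degrees; at this latitude, 1e-6 degrees is roughly 0.1m

f32 = struct.unpack("f", struct.pack("f", lat))[0]
f64 = struct.unpack("d", struct.pack("d", lat))[0]

print(lat - f32)   # ~1e-6 degrees lost, i.e. 10cm+ of position
print(lat - f64)   # effectively zero

Packing a latitude around 51° with ‘f’ quantises it to roughly 0.4m steps; ‘d’ keeps far more precision than the GPS itself can deliver.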


A little later, after writing the above, speckled shade from a large tree was cast over part of the back garden, and I managed to collect this GPS plot of a supposed 5m flight NE over the course of 20s, i.e. 20 GPS samples:

GPS resolution

She struggled with the video tracking, and once she left the shade, chaos ensued, but I did get the GPS stats, which clearly show a much higher resolution than I was getting before.  So that’s good.  Yet another prop snapped on the aborted flight as she headed towards the wall.  So that’s prop number 6, or £110 in real terms – completely unaffordable.

The multiple breakages are again weather-based: two weeks of no rain means the lawn is once more rock solid, and props clipping the ground on an aborted landing snap instead of embedding themselves in soft soil.

Boules

Flight plan processing has now been moved to a new autopilot process, ultimately with the aim of feeding Sweep object detection into the autopilot, which in turn swaps from the current flight plan to an ’emergency landing’ flight plan, telling the main process to hover and then descend to the ground.  This flight proves the autopilot feed works, though it does raise other problems, primarily the drift over the course of a 3 second takeoff, 5 second hover and 4 second landing.

I suspect the drift is caused by video processing: it was handling the 10Hz camera output at only 7Hz, leading to lag.  This reduced processing speed may be caused by the RPi CPU overheating and throttling back.

I’d set

force_turbo=1

in /boot/config.txt to make sure the CPU always runs at full speed, rather than only when it ‘thinks’ it’s busy.  That, combined with the fact the temperature today is 24° in the shade, may well be enough for the CPU to throttle back due to overheating instead.  Certainly just a few days ago, when the outdoor temperature was in the mid-teens and a cool breeze was blowing, there was no drift.  The CPU has a heatsink, but it’s in a case.  I may have to either remove the case (exposing the electronics to the elements) or add a cooling fan to the case.  If possible, the latter would be better, as the case is there to provide a constant temperature, protecting the IMU from temperature drift.  I’ve found this one which might just fit in.
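
An easy way to check the overheating theory on the next flight – the thermal zone file is standard on the RPi, and the 3B starts throttling in the 80-85°C region:

# Read the CPU temperature (reported in millidegrees C) from the kernel.
with open("/sys/class/thermal/thermal_zone0/temp") as f:
    print("CPU temperature: %.1fC" % (int(f.read()) / 1000.0))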

Note the kids’ pétanque kit is deliberately on the ground to provide weighty contrasting areas that the grass no longer supplies to the video.  Off to mow the lawn now.

Brain dump

Just jotting down some thoughts to share…

The top two sketches show yesterday’s and the future processes used by my code.  Today, the code is more like the left one, except the pilot has been broken out into its own process: it still just reads the file-based flight plan.  It’s been tested passively, and the logs show correct data fed to, read by and logged by the motion process – the fact the flight ends ‘underground’ is intentional, ensuring landing always touches down:

Autopilot flight plan targets

The lower left sketch defines several steps adding ‘AI’ to the autopilot.  This example uses a normal file-based flight plan to move 10m forwards; part of the way along, there is a diagonal wall in the way.  The first step is to feed Sweep data to the autopilot from the motion process so the pilot can see the wall.  The next step is to hover / land when the pilot spots an obstacle that’s too close.  Finally, the pilot leaves the motion process hovering for an indeterminate amount of time while the wall is mapped and a diversion created to redirect Hermione around the obstacle.

Intelligent Autopilot

I’m posting this to crystallise a lot of interrelated parallel thoughts into a working serial plan.  What you read below is probably the 20th draft revision:


Currently, each flight has a flight plan written by hand and saved to file beforehand: lines of 3D velocities, plus how long to fly at those velocities, e.g.

# - evx is fore / aft velocity in m/s; fore is positive
# - evy is port / starboard velocity in m/s; port is positive 
# - evz is up / down velocity in m/s; up is positive
# - time is how long to maintain that speed in seconds
# - name is an arbitrary name put out to console to show flight progress

# evx, evy, evz, time, name
 0.0, 0.0, 0.3, 2.0, ASCENT
 0.0, 0.0, 0.0, 6.0, HOVER
 0.0, 0.0,-0.3, 2.0, DESCENT
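
Parsing that is trivial; here’s a minimal sketch (my own illustration, not the real autopilot code) of turning the file above into a list of flight plan steps:

def load_flight_plan(file_name):
    steps = []
    with open(file_name) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue   # skip blanks and comments
            evx, evy, evz, period, name = (field.strip() for field in line.split(","))
            steps.append((float(evx), float(evy), float(evz), float(period), name))
    return steps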

The main motion process uses this to calculate the distance PID targets at a given point in time; it’s efficient, but inflexible – it doesn’t react to the world around it.  To incorporate object avoidance, GPS waypoint targets, and ultimately maze mapping and tracking, I’ve decided to add a well-informed autopilot process to do the hard work, and leave the motion process in ignorant bliss.  As a separate process, the Linux kernel scheduler should put the autopilot on a different RPi 3B CPU core from the already overworked motion process.
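
Spawning it is no different from the existing GPS and video processes – something along these lines, though subprocess.Popen here is my illustration rather than a quote from qc.py:

import subprocess

# The autopilot runs as its own process; the Linux scheduler is then free
# to place it on a different core from the motion process.
autopilot = subprocess.Popen(["python", "/home/pi/QCAPPlus.pyc",
                              "AUTOPILOT", "fp.csv", "100"])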

Initially, the autopilot process will use the same input file as before, feeding the latest distance targets over an OS FIFO to be picked up promptly by the motion process, much as GPS, video and Sweep do now.  That will ease the load on the motion process slightly.

Once tested, GPS and Sweep will move from the motion process to the autopilot so it can make intelligent decisions dynamically about the distance targets it sends to the motion process; the time-critical motion processing remains oblivious to brick walls, hedges and where it’s been – it just blindly does what the intelligent autopilot tells it, based upon the brick walls, hedges and mapping the autopilot knows of.

The only minor problem I can see currently is that the autopilot needs to know the direction the quad is pointing, so it can yaw the quad-frame Sweep data to align with the earth-frame GPS flight data; because the compass is part of the IMU code in the motion process, I either need a FIFO feeding this back to the autopilot, or to access the IMU compass directly from the autopilot via I2C.  Plenty of time to resolve that though while I get the basics working first.
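
The transformation itself is just a 2D rotation by the yaw angle; a sketch, where the sign convention is my assumption:

import math

def quad_to_earth(x, y, yaw):
    # Rotate a quad-frame (x, y) Sweep point into the earth frame, given
    # the compass yaw in radians (anti-clockwise positive).
    return (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))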

The motivation for doing this is that I don’t want to attach the expensive Sweep to Hermione until all significant code changes have been made and any significant bugs have been found, fixed and tested thoroughly.

For the record, the current code on GitHub has been updated.

“To mow, or not to mow, that is the question:

Whether ’tis easier for the macro-blocks to track
The clovers and daisies of contrasting colour…”

The answer is no, I shouldn’t have mown the lawn.  With the kids’ toys moved out of the way and any ungreen contrasting features slain, there was nothing to distinguish one blade of shorn grass from another, and Hermione drifted wildly.  Reinstating the kids’ chaos restored high quality tracking over 5m.

The point of the flight was to compare GPS versus macro-block lateral tracking.  Certainly over this 5m flight, down-facing video beat GPS hands down:

Lateral tracking

My best guess interpretation of the GPS graph is that the flight actually ran from 2 – 7m diagonally, heading north west.  The camera POV doesn’t include compass data, so it’s correctly showing her flying forwards by 5m.  The compass code is not working accurately yet – more investigation is needed into why – it was showing ~90° (i.e. east) rather than the true 45° (i.e. north east) shown by the GPS and a handheld compass.

I’ve done some more refinements to the scheduling of sensor reads, and to the accuracy of GPS data streamed from the GPS process.  It’s worth viewing this graph full screen – each spike shows the time in seconds between motion processing loops, i.e. the time spent processing other sensors – 10ms indicates just IMU data was processed.  The fact no loop takes more than 0.042s* even with full diagnostics running means I could put the sampling rate back up to 1kHz – it’s at 500Hz at the moment.  More importantly, it shows processing is nicely spread out: each sensor is getting its fair share of the processing and nobody is hogging the limelight.

Motion processing

As a result, I’ve updated the code on GitHub.


*42ms is the point at which the IMU FIFO overflows at 1kHz sampling: 512-byte FIFO ÷ 12 bytes per sample ÷ 1kHz sampling rate ≈ 0.042s

Power supply

I’ve an ongoing concern about the power requirements to run an RPi, IMU, Camera, LiDAR, WiFi, GPS, ESCs and soon the Sweep.

Today, I bit the bullet: I cut open a USB cable, split the red wire, and attached my multimeter to the split to measure the current:

This shot was taken during a passive flight with the RPi 3B running.  Everything but the LiDAR was powered by the snipped power line, i.e. RPi 3B, IMU, GPS, WiFi, ESCs and camera.  When my code wasn’t running, it dropped to about 0.4A.  I was stunned at how low it was!  Adding the LiDAR and Scanse Sweep (135mA and 650mA max respectively, according to their specs) makes a grand maximum total of 1.37A drawn.

And that means I should be able to simplify the system significantly, with just one 2.4A battery bank running everything.  I don’t quite believe it, so the next step is simply to try it!


P.S. I couldn’t resist it: I plugged the Scanse Sweep in and ran another flight; 1.07A was the highest reached.  The Sweep wasn’t operating its LiDAR, just rotating, but its LiDAR is just another Garmin LiDAR-Lite V3, so the total with those two comes out as 1.34A, confirming the best guesstimate above!


P.P.S. That went so well, I swapped PCBs so now everything is fed via the RPi micro USB cable from a 2.4A battery bank.  I even attached the Sweep, and it still only just touched 1A.  Next step is to get the rain to stop, and try this for real.


P.P.P.S. I did a very short real flight and all worked well, confirming each problem over the last couple of weeks was down to SD card corruption.  I hope to get a video tomorrow (weather permitting), and then move on to add Scanse Sweep object avoidance.

I am Midas…

except this week, everything I touch has turned to fool’s gold.

It’s a bizarre, broad range of problems, the latest of many being Python problems with the PiCamera and RPIO code libraries.  It was these last two that finally convinced me I had a corrupted SD card, probably due to power brown-outs caused by running an RPi 3B with all the sensors.

The new SD card is on its way, but due to the joys of UK bank holidays, it won’t be with me until Tuesday.  Thereafter starts a new install from scratch, following my own instructions.

Until then, it’s all quiet on the western front of the Cotswolds.

DIY vs DJI

What would you spend £1500 on?

Just the core hardware for Hermione?

  • CF Frame £240
  • 8 x T-motor P13x4.4 CF Props £145
  • 8 x T-motor U3 Motors £515
  • 8 x T-motor AIR 40A ESCs £215
  • 2 x Gens-Ace Tattu LiPo £180
  • Garmin LiDAR-Lite V3 £115
  • Scanse Sweep £270

Or 3 complete kits of these?

DJI Spark

My advice: don’t try a DIY drone unless you are as stupid as I am!