2m square

While waiting for half-decent weather, I’ve been tinkering and tweaking to make sure everything is as good as possible before moving on to the GPS tracking code. This is the result: a 2m square.

As a result, I’ve updated the code on GitHub.

Encroaching on Hermione’s personal space

With the Scanse Sweep installed underneath (yes, she has blind spots from her legs and the WiFi antenna), any object detected between 50cm (the distance to the tip of her props) and 1m (her personal space boundary) now triggers a controlled landing.  The same would happen if the obstacle wasn’t me approaching her but her approaching a brick wall: a vertical controlled descent to ground.

There’s a lot more that can be built on this: the Sweep is rotating at 1Hz (it can do up to 10Hz), taking about 115 samples per revolution, each reporting both the rotation position (azimuth) and the distance to the nearest object at that azimuth.  Currently the code only collects the shortest distance per revolution; if that’s under 1m, the standard file-based flight plan is replaced with a dynamically created descent flight plan, based upon the height Hermione should have reached at that point in the file-based flight plan.
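
Here’s a minimal sketch of that trigger logic; the names (sweep_samples(), estimated_height(), the constants) are hypothetical stand-ins, not the actual code on GitHub:

# Hypothetical sketch of the per-revolution proximity check; names are
# illustrative, not the real GitHub code.
PROP_RADIUS = 0.5      # metres: anything nearer is her own props / legs
PERSONAL_SPACE = 1.0   # metres: anything nearer triggers the abort

def nearest_obstacle(samples):
    # samples: list of (azimuth_degrees, distance_metres) tuples,
    # roughly 115 of them per 1Hz Sweep revolution.
    distances = [d for azimuth, d in samples if d > PROP_RADIUS]
    return min(distances) if distances else None

def descent_flight_plan(height, rate=0.3):
    # Build the dynamic replacement plan from the height the file-based
    # plan says she should have reached by now.
    return [(0.0, 0.0, 0.0, 1.0, "HOVER"),
            (0.0, 0.0, -rate, height / rate, "DESCENT"),
            (0.0, 0.0, 0.0, 1.0, "STOP")]

nearest = nearest_obstacle(sweep_samples())      # sweep_samples() assumed
if nearest is not None and nearest < PERSONAL_SPACE:
    flight_plan = descent_flight_plan(estimated_height())  # also assumed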

Here’s the layout of communication between the 5 processes involved:

          +-----+     +---------+
          |Sweep|-->--|Autopilot|-->--+
          +-----+     +---------+     |
                                      |
                         +---+     +------+
                         |GPS|-->--|Motion|
                         +---+     +------+
                                      |
                          +-----+     |
                          |Video|-->--+
                          +-----+

The latest code updates are on GitHub.

Next step is to move GPS to also feed into the autopilot.  The move itself is easy, just a couple of minutes to change which process starts the GPS process; the difficult bit is how the autopilot should handle that extra information.  Currently the plan is that before a flight, Hermione is taken to the desired end-point of the flight, where she captures the GPS coordinates.  Then she’s moved somewhere else, pointing in any direction; on take-off, she finds her current GPS position, and the autopilot builds a dynamic flight plan to the end-point.  All the constituent parts of the code are already in place; it’s just the plumbing that needs careful creation.
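
For illustration, here’s one way the dynamic flight plan could derive its length and direction from the two GPS fixes; this is a speculative sketch using a flat-earth approximation (plenty accurate over tens of metres), with the takeoff_* and target_* variables assumed:

import math

EARTH_RADIUS = 6371000.0  # metres

def gps_vector(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: returns (east, north) metres from
    # fix 1 to fix 2; fine over a garden-sized flight.
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    east = (lon2 - lon1) * math.cos((lat1 + lat2) / 2) * EARTH_RADIUS
    north = (lat2 - lat1) * EARTH_RADIUS
    return east, north

# takeoff_* / target_* fixes assumed captured as described above.
east, north = gps_vector(takeoff_lat, takeoff_lon, target_lat, target_lon)
distance = math.hypot(east, north)                      # flight plan length
bearing = math.degrees(math.atan2(east, north)) % 360   # degrees from north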


P.S. That was the first live test flight, hence the slightly nervous look on my face, and my step backwards once she’d detected my intrusions!

P.P.S. Proof that the abort was triggered, courtesy of the logs:

[CRITICAL] (MainThread) fly 3467, ASCENT
[CRITICAL] (MainThread) fly 3467, HOVER
[CRITICAL] (MainThread) fly 3467, ABORT (0.88m)
[CRITICAL] (MainThread) fly 3467, STOP
[CRITICAL] (MainThread) fly 4087, Flight time 16.627974

An uninterrupted flight would have run for 22s, with the descent starting at 18s.

Resistance is futile…


Given her previous hover flight was so good, I couldn’t resist a 10m flight directly away from me.  She overshot the 10m target significantly, probably because she was unable to track the ground motion in the dust-storm section.  I killed the flight before she hit the wall.  Nevertheless, this confirms she’s good to go with GPS: first tracking where she’s going, and next, actually defining where she should be going, based on a preset waypoint collected prior to the flight.

As a result, I’ve updated the code on GitHub.

Surprise Surprise…

…the unexpected hits you between the eyes* (no, not literally!).

She’s sans chapeau as this was just a quick test.  However, the flight was surprising enough that I thought I’d share it.

I’ve been tinkering with the scheduling of video processing, GPS, autopilot and the main sensor processing; I’d spotted these were getting out of sync.  The fix is to read the autopilot and GPS OS FIFO shared-memory data streams often enough that they stay in sync, yet not so often that the read() blocks.  The trivial drift over this 23s hover proves this is working; double-integrated acceleration alone can only hold back drift for a second or two.

What surprised me, though, is that this level of stability took place over gravel, and on checking the config it became even more of a surprise: the video was running at 320 x 320 pixels, 10Hz and 50% contrast, and it worked brilliantly.  I’d assumed higher resolution was needed, so she’d been flying at 640 x 640 (i.e. 4 times the resolution), and at that level she was both drifting and struggling to process the video frames fast enough.

I’m finally at a confident place that I can now move GPS to feed the autopilot, such that the autopilot can direct the core motion processing where to go.
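
For the curious, the read scheduling looks roughly like this; a minimal sketch assuming each feeder process writes fixed-size records to its own shared-memory FIFO, with the paths and record format being my assumptions:

import os
import select
import struct

# One read-end per feeder process; paths and record layout are assumed.
streams = {}
for path in ("/dev/shm/autopilot_fifo", "/dev/shm/gps_fifo", "/dev/shm/video_fifo"):
    streams[os.open(path, os.O_RDONLY | os.O_NONBLOCK)] = path

RECORD = struct.Struct("=3d")  # e.g. three doubles per update (assumed)

while True:
    # Zero timeout: check what's ready between motion-loop iterations
    # without ever blocking the 100Hz sensor processing.
    readable, _, _ = select.select(list(streams), [], [], 0.0)
    for fd in readable:
        data = os.read(fd, RECORD.size)
        if len(data) == RECORD.size:
            vx, vy, vz = RECORD.unpack(data)
            # ...feed the update into motion processing...

    # ...one iteration of IMU reads and motion processing here;
    # the real loop ends with the flight plan's STOP phase.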


*courtesy of Cilla Black

OK, so this is weird…

When I first added the autopilot process, it would update the main process at 100Hz with the current distance vector target; the main process couldn’t quite keep up with what it was being fed, but it was close.  The downside was that the video processing rate dropped through the floor, building up a big backlog, meaning a very late reaction to lateral drift.

So I changed the autopilot process to send only velocity vector targets; that meant the autopilot sent an update to the main process every few seconds (i.e. at the ascent, hover, descent and stop transitions) rather than 100 times a second for the distance increments; as a result, video processing was running at full speed again.
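
As a sketch of what the change amounts to (the message format and names here are hypothetical, not what’s on GitHub):

import os
import struct

TARGET = struct.Struct("=3d8s")  # vx, vy, vz (m/s) plus a phase name: assumed

def send_target(fd, vx, vy, vz, phase):
    # fd: write end of the autopilot-to-main FIFO, assumed already open.
    # One write per phase change: a handful per flight instead of 100Hz.
    os.write(fd, TARGET.pack(vx, vy, vz, phase))

send_target(fd, 0.0, 0.0, 0.3, b"ASCENT")    # climb at 0.3 m/s
send_target(fd, 0.0, 0.0, 0.0, b"HOVER")
send_target(fd, 0.0, 0.0, -0.3, b"DESCENT")
send_target(fd, 0.0, 0.0, 0.0, b"STOP")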

But when I turned on diagnostics, the main process couldn’t keep up with the autopilot, despite the fact its updates are only sent once every few seconds.  Printing the messages to screen showed they were being sent correctly, but the main process’ select() didn’t pick them up: in a passive flight, it stayed at a fixed ascent velocity for ages, way beyond the point where the autopilot prints indicated the hover, descent and stop messages had been sent.  Without diagnostics, the sending and receipt of the messages were absolutely in sync.  Throughout all this, the GPS and video processes’ data rates to the main process were low and worked perfectly.

The common factor between autopilot, GPS, video and diagnostics is that they all use shared memory files to store / send their data to the main process; having more than one in high demand (autopilot at 100Hz distance targets, or diagnostics at 100Hz) seemed to cause one of the lower-frequency shared-memory sources simply not to be spotted by the main process’ select().  I have no idea why this happens, and that troubles me.

This useful link shows the tools to query shared memory usage stats.

df -k /dev/shm shows only 1% of shared memory is used during a flight:

Filesystem     1K-blocks  Used  Available  Use%  Mounted on
tmpfs             441384     4     441380    1%  /dev/shm

ipcs -pm shows the processes owning the shared memory:

------ Shared Memory Creator/Last-op PIDs --------
shmid      owner      cpid       lpid
0          root       625        625
32769      root       625        625
65538      root       625        625
98307      root       625        625
131076     root       625        625

ps -eaf | grep python shows the processes in use by Hermione.  Note that none of these processes’ IDs are in the list of shared memory owners above:

root 609 599 0 15:43 pts/0 00:00:00 sudo python ./qc.py
root 613 609 12 15:43 pts/0 00:02:21 python ./qc.py
root 624 613 0 15:43 pts/0 00:00:03 python /home/pi/QCAPPlus.pyc GPS
root 717 613 1 16:00 pts/0 00:00:01 python /home/pi/QCAPPlus.pyc MOTION 800 800 10
root 730 613 14 16:01 pts/0 00:00:00 python /home/pi/QCAPPlus.pyc AUTOPILOT fp.csv 100

Oddly, it’s the gps daemon with the shared memory creator process ID:

gpsd 625 1 4 15:43 ? 00:01:00 /usr/sbin/gpsd -N /dev/ttyGPS

I’m not quite sure yet whether there’s anything wrong here.

I could just go ahead with object avoidance; the main process would then have diagnostics as its only high-speed shared-memory input.  Autopilot can keep the revised behaviour of only sending low-frequency velocity vector target changes; it would receive high-frequency input from the Sweep, but convert it to low-frequency velocity target changes sent to the main process.  That way, main has only diagnostics as a fast input, and autopilot has only the Sweep.  It’s a speculative solution, though, and I don’t like the idea of moving forward with an undiagnosed weird problem.

Boules

Flight plan processing has now been moved to a new autopilot process, ultimately with the aim of feeding Sweep object detection into the autopilot, which in turn swaps the current flight plan for an ‘emergency landing’ flight plan, updating the main process to hover and descend to ground.  This flight proves the autopilot feed works, though it does raise other problems, primarily the drift over the course of a 3 second take-off, 5 second hover and 4 second landing.

I suspect the drift is caused by video processing: it was handling the 10Hz output at only 7Hz, leading to lag.  This reduced processing speed may be caused by the RPi CPU overheating and throttling back.

I’d set

force_turbo=1

in /boot/config.txt to make sure the CPU always runs at full speed, rather than only when it ‘thinks’ it’s busy.  This, combined with the fact the temperature today is 24° in the shade, may well be enough for the CPU to throttle back due to overheating instead.  Certainly just a few days ago, when the outdoor temperature was in the mid-teens and a cool breeze was blowing, there was no drift.  The CPU has a heatsink, but it’s in a case.  I may have to either remove the case (exposing the CPU to the elements) or add a cooling fan to the case.  If possible, the latter would be better, as the case is there to provide the constant temperature that avoids IMU temperature drift.  I’ve found this one which might just fit in.
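
To check whether that’s actually happening, the Pi’s firmware can report both temperature and throttling state; these vcgencmd commands are standard on Raspbian (the output values below are just examples):

vcgencmd measure_temp      # e.g. temp=82.2'C; the Pi 3 throttles around 85°C
vcgencmd get_throttled     # e.g. throttled=0x40004: bit 2 = throttled right
                           # now, bit 18 = throttling has occurred since boot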

Note the kids’ pétanque kit is deliberately on the ground to provide the weighty, contrasting areas that the grass no longer supplies to the video.  Off to mow the lawn now.

A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and thus was descending; she’s not far wrong: the start and end points are the two round stones placed a measured 10m apart, to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow.  The aim here is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid IMU drift over temperature.  Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield; you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR Lite V3 so their lasers don’t interfere with each other; finally the camera has been moved far away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently (see the sketch after this list); previously, if the camera was struggling to spot motion in dim light, the LiDAR height was not fused either, and on some flights Hermione would climb during hover.
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fusing with the gyrometer yaw rate and interacting with the below…
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now.  The reason both live in their own processes is that each blocks while reading its sensor (one second for GPS, 0.1 seconds for video), and that degree of blocking must be completely isolated from the core motion processing.  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from that end-point, the combination of compass and GPS will provide the sensor inputs to ensure the flight heads in the right direction and recognises when it’s reached its goal.
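
Regarding the fusion change above, here’s a minimal sketch of what fusing vertical and lateral values independently might look like; the weights and names are illustrative assumptions, not the actual fusion code:

# Illustrative complementary fusion with independent lateral / vertical
# gates; weights and names are assumptions, not the GitHub code.
def fuse(integrated, measured, weight):
    return (1.0 - weight) * integrated + weight * measured

def fuse_motion(state, camera_ok, lidar_ok):
    # Lateral: only fuse the camera macro-block velocities when the
    # camera actually spotted motion (enough light and contrast).
    if camera_ok:
        state.vx = fuse(state.vx, state.camera_vx, 0.8)
        state.vy = fuse(state.vy, state.camera_vy, 0.8)
    # Vertical: fuse the LiDAR height regardless of the camera, so dim
    # light no longer stops height fusion and causes a climb in hover.
    if lidar_ok:
        state.z = fuse(state.z, state.lidar_height, 0.9)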

As a result of all the above, I’ve updated GitHub.

Cloudy conditions

The sun wasn’t shining brightly, so there was no high contrast on the lawn; the lawn had been mowed, removing the contrasting grass clumps too.  Yet she still made a great attempt at a 1m square.  I think this is about as good as she can get; it’s time for me to move on to adding compass, GPS and object avoidance.  The code has been updated on GitHub.

Ö

I chose to name my piDrones Phoebe, Chloe and Zoe as they can all be spelt with an umlaut – don’t ask me why, I have no idea.  I ran out of umlaut names (despite research), so I opted for Hermione for the latest, greatest model as it sounds similar, although she can never bear an umlaut as she lacks the critical ‘oe’.

Anyway, Phoebe, Zoe and the always short-lived* Chloe have all been merged into the best of each; the result is ‘Ö’, pronounced like the French for ‘yes’.  She has Phoebe’s ESCs, motors and props, Chloe’s amazing frame, and Zoe’s Pi0W and PCB.

Ö’s build is virtually indestructible as she weighs just 1kg fully loaded.  Because she’s so light, crash torques are tiny compared to the strength of the frame; the only perceivable damage is broken props, and these are cheap from eBay and I already have a vast stock of them.  In comparison, Hermione weighs 4kg; this, and the fact she’s so large, means crash torque is huge, damage occurs on anything but a perfect landing, and replacement frame parts and props are expensive.  Ultimately I still want Hermione as queen piDrone because of her X8 format and her B3’s 4 cores allowing further sensors**, but while I’m still diagnosing the current problems, I think little miss indestructible is better suited financially to the task in hand.

Sadly, there’s one problem: Ö’s Pi0W isn’t fast enough to cope with video resolution higher than about 400² pixels, ruling out lawn / gravel etc.  This is what she can do successfully:
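
To put that limit in numbers: the camera’s motion vectors come per 16 x 16 pixel macro-block, so (assuming the frame sizes divide evenly) the per-frame workload grows with the square of the resolution:

400² / 16² = 25 x 25 =   625 macro-blocks per frame
640² / 16² = 40 x 40 = 1,600 macro-blocks per frame, i.e. 2.56x the work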

On the plus side, I think that’s just enough to sort out my understanding of Hermione’s yaw flaws.


*Chloe got retired (again) as the 1.5A regulator on the PCB was insufficient to drive the A+, IMU, camera and LiDAR-Lite.  The same I2C errors I’d had before returned.  Swapping Chloe’s A+ for Ö’s Pi0W resolved this.

** i.e. GPS, compass and Scanse Sweep