Remote control

I’d run out of ideas of what to do next, and then it struck me: I could finally add an RC, and after just a few seconds’ thought I realised it would be relatively simple.

Back at the end of 2013, about a year into my piDrone project, I’d clearly considered this – and yes, that is a Raspberry Pi 1A!!!

Remote Control Prototype


I’m guessing this was the point I realised I was way too ignorant, and that, ironically, autonomous control was easier.  I still have these I2C Grayhill 67A joysticks, and RC control is now much easier to add safely – it’s just another poll.poll() input to the Autopilot process, like the Sweep and GPS flight-plan inputs are now.  And this is perfect for Penelope.  Hermione will remain the OTT autonomous control, and ‘P’ will have some degree of manual control.  In a way, this is much like my DJI Mavic – it hands control to the human only when it is safe to do so.
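To illustrate what “just another poll.poll() input” means, here’s a minimal sketch of poll()-based multiplexing; the stream names are mine, and ordinary pipes stand in for the piDrone’s shared-memory FIFO streams:

```python
import os
import select

# Hypothetical stand-ins for the Sweep, GPS and new RC input streams.
sweep_r, sweep_w = os.pipe()
gps_r, gps_w = os.pipe()
rc_r, rc_w = os.pipe()

# One poll object watches all the inputs; adding RC is one more register().
poller = select.poll()
for stream in (sweep_r, gps_r, rc_r):
    poller.register(stream, select.POLLIN)

os.write(rc_w, b"throttle=0.5\n")       # pretend the RC sent an update

for fd, event in poller.poll(100):      # 100ms timeout
    if fd == rc_r and event & select.POLLIN:
        print(os.read(fd, 64).decode().strip())
```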

The RC will be based on the new RPi 3B+ with its improved WiFi.  I’ve already done a prototype of how the joysticks attach to a Pimoroni PiBow Ninja body.

Pibow Joysticks


And actually, this feeds into the GPS fusion with the IMU’s ∑∑(acceleration – gravity)δtδt and the LiDAR’s distance input when the LiDAR is out of range.  I’m a lot more comfortable testing it this way rather than flying her over a lake!
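For clarity, that double integration – acceleration (less gravity) summed into velocity, and velocity summed into distance – can be sketched as follows; the function name and sample values are illustrative, not from the piDrone code:

```python
def integrate_distance(accelerations, dt):
    """Double-integrate (acceleration - gravity) samples into a distance."""
    velocity = 0.0   # m/s, the first summation
    distance = 0.0   # m, the second summation
    for a in accelerations:      # each a is acceleration minus gravity
        velocity += a * dt
        distance += velocity * dt
    return distance

# A constant 1 m/s^2 for one second, sampled every 10ms:
print(integrate_distance([1.0] * 100, 0.01))
```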

My lord, I have a cunning plan!


P.S. I’m calling the RC “Phoebe”, who, alongside Penelope, makes up team “Pi²”.

P.P.S. In one of those convenient coincidences, my one unknown was how to power Phoebe.  Then this appeared via Twitter.  I hope they do another production run.  It’s these synchronised occurrences that force you to consider whether there is a higher being playing chess with your life!

P.P.P.S. Changed my mind: the RC is called “Ivy” and together with “Penelope”, they make the “PI” team!

Obstruction avoidance test 2 – PASSED!!!!!

After a minor tweak to the handling resolution of the Scanse Sweep data, all works brilliantly.

This is a five-metre forwards flight, with the flight paused and later resumed once the obstacle has been avoided.  Note that ‘H’ tracks the obstruction at about one metre away.  Hence she flies a quarter circle around the circular cardboard tube, before continuing the forward flight once the obstruction is behind her.

The code is updated on GitHub as a result.

WAP RPi 3B+ and Stretch

So Hermione is a 3B running the February 2017 Jessie release, due to the network and I2C problems introduced in the March release.

Penelope is now a 3B+ running Stretch.  As a result, I want to update to the latest WAP software and I2C.  The biggest problem for me was setting up the isolated Wireless Access Point; I’ve finally solved it with a lot of help from my friends on the RPi Forum.  Here’s how:

To set up an isolated AP:

  • Follow these instructions up to, but not including, the “ADD ROUTING AND MASQUERADE” section.
  • Comment out any “network={” blocks in /etc/wpa_supplicant/wpa_supplicant.conf
  • reboot

To disable the isolated AP so she can contact the internet for apt-get updates etc.:

  • In /etc/default/hostapd set DAEMON_CONF=""
  • Reinstate the “network={” blocks in /etc/wpa_supplicant/wpa_supplicant.conf for your home network SSID etc.
  • Comment out the AP interface IP details in /etc/dhcpcd.conf
  • reboot
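For reference, here’s the rough shape of the files involved, following the official Raspberry Pi access-point documentation; the IP range, SSID and passphrase are placeholders, and your interface name may differ:

```
# /etc/dhcpcd.conf - static IP for the AP interface
# (comment these lines out again to rejoin the home network)
interface wlan0
    static ip_address=192.168.4.1/24
    nohook wpa_supplicant

# /etc/hostapd/hostapd.conf - the AP definition
interface=wlan0
ssid=piDroneNet
hw_mode=g
channel=7
wpa=2
wpa_passphrase=ChangeMe
wpa_key_mgmt=WPA-PSK

# /etc/default/hostapd - point the daemon at the config above
DAEMON_CONF="/etc/hostapd/hostapd.conf"
```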

To solve the I2C problem, I’m waiting for the Garmin LiDAR-Lite v3HP, which has been consistently delayed by up to 8 weeks since (at least) the start of February – i.e. two months later, it’s still delayed by up to 8 weeks!  The Garmin LiDAR-Lite v3 exposed the I2C problem: it sticks strictly to the I2C rules, whereas the RPi I2C implementation includes an incompatible bending of the rules that many other devices, e.g. the MPU-9250 IMU, happily support.  I’m hoping the delay is because they are updating their I2C implementation to support the bent rules.

THBAPSA


P.S. Penelope has a new salad bowl:

Penelope’s Salad Bowl

OA test 1 analysis

My fault: when an obstacle is detected, the autopilot should tell ‘H’ to move at 90° to it, parallel to the obstacle, until she reaches the safe zone.  That’s not what the logs from the autopilot say:

AP: PHASE CHANGE: RTF
AP: PHASE CHANGE: TAKEOFF
AP: PHASE CHANGE: HOVER
AP: FILE FLIGHT PLAN
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 6 DEGREES.
AP: PHASE CHANGE: AVOID @ 103 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 7 DEGREES.
AP: PHASE CHANGE: AVOID @ 94 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 13 DEGREES.
AP: PHASE CHANGE: AVOID @ 81 DEGREES
AP: AVOIDING OBSTACLE @ 11 DEGREES.
AP: PHASE CHANGE: AVOID @ 79 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 15 DEGREES.
AP: PHASE CHANGE: AVOID @ 76 DEGREES
AP: AVOIDING OBSTACLE @ 15 DEGREES.
AP: PHASE CHANGE: AVOID @ 72 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 14 DEGREES.
AP: PHASE CHANGE: AVOID @ 70 DEGREES
AP: AVOIDING OBSTACLE @ 18 DEGREES.
AP: PHASE CHANGE: AVOID @ 72 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 23 DEGREES.
AP: PHASE CHANGE: AVOID @ 72 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 31 DEGREES.
AP: PHASE CHANGE: AVOID @ 73 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 31 DEGREES.
AP: PHASE CHANGE: AVOID @ 65 DEGREES
AP: AVOIDING OBSTACLE @ 37 DEGREES.
AP: PHASE CHANGE: AVOID @ 59 DEGREES
AP: OBSTACLE AVOIDED, RESUME PAUSED
AP: PHASE CHANGE: FORE
AP: AVOIDING OBSTACLE @ 40 DEGREES.
AP: PHASE CHANGE: AVOID @ 71 DEGREES
AP: PROXIMITY LANDING 1.46 METERS
AP: PHASE CHANGE: PROXIMITY CRITICAL 0.67m
AP: LANDING COMPLETE
AP: FINISHED

Frankly, it’s a miracle the video looked as good as it did!  Clearly, there’s something wrong with how the direction angles are compensated, as the OBSTACLE and AVOID angles should differ by ±90°.  Superficially this should be simple to fix, fingers crossed.


P.S. Based on the video, the Sweep angles are correct; it’s my code’s calculation that’s wrong; for example, the final sample of the object at +40° should result in an avoid angle of -50°, not 71°.

P.P.S.  The bug was crass; the log reported the distance, not the direction, and the functional code is correct.  I suspect what remains is fine-tuning of the critical vs. warning ranges.
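For what it’s worth, the intended ±90° relationship is tiny to express in code.  This sketch is my reconstruction of the rule as described above (the sign convention for obstacles on the left is a guess), not the actual piDrone code:

```python
def avoid_direction(obstacle_bearing):
    """Given the obstruction's bearing in degrees relative to the direction
    of flight, return the avoid direction, perpendicular to the obstruction
    so the flight runs parallel to it."""
    if obstacle_bearing >= 0:
        return obstacle_bearing - 90   # obstacle to the right: track left
    return obstacle_bearing + 90       # mirrored for obstacles on the left

print(avoid_direction(40))   # -50, matching the example in the first P.S.
```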

Penelope’s progress

I’ve done nothing on Penelope since my introductory post, but now I have reason to proceed: the new Raspberry Pi 3B+ is already on its way courtesy of Pimoroni, to be combined with the new Garmin LiDAR-Lite v3HP and the new Raspbian Stretch O/S.  Together these provide the motivation I need, though only in the background; my main focus is obstacle avoidance with Hermione, and I hope to post the first results imminently.  Oddly, what’s holding things back is hardware, not software: construction of the obstacle.

Back to the mundane

After a great @CotswoldJam 6th #PiParty yesterday, it’s back to bug fixing; trouble is I only have symptoms and no idea of the cure.

Here are the various processes, all connected by shared-memory FIFO streams:

+-----+     (1)
|Sweep|------>------+
+-----+             |
              +-----+---+     (3)     +------+
              |Autopilot|------>------|Motion|
              +-----+---+             +---+--+
+---+               |                     |
|GPS|------->-------+                     |
+---+      (2)                            |
                                          |
                      +-----+     (4)     |
                      |Video|------>------+
                      +-----+

The problem is this: when using the GPS and Sweep processes individually, the system works beautifully, but together, the Motion process poll.poll() doesn’t pick up what the Autopilot process sends it.  The logs for each process show that GPS, Sweep, Video and Autopilot are all working, sending their data to their neighbour; it’s just Motion that picks up only the Video data from the FIFOs, not the Autopilot data.

I can’t help but suspect this is hardware in some way: the Raspberry Pi 3B has 4 CPUs, and I have 5 processes with 4 FIFOs between them.
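While digging, one generic FIFO gotcha is worth ruling out (this is standard POSIX behaviour, not the piDrone code): open() on a FIFO blocks until both a reader and a writer have it open, so process start-up order matters.  A minimal demonstration:

```python
import os
import tempfile
import threading

# A throwaway FIFO standing in for one of the inter-process streams.
fifo = os.path.join(tempfile.mkdtemp(), "motion_input")
os.mkfifo(fifo)

def writer():
    with open(fifo, "w") as f:    # this open() unblocks the reader's open()
        f.write("velocity=1.0\n")

t = threading.Thread(target=writer)
t.start()
with open(fifo) as f:             # blocks here until writer() opens its end
    line = f.read().strip()
t.join()
print(line)                       # velocity=1.0
```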

While I could continue working on the pond or obstacle avoidance enhancements without GPS and Sweep working together, I’d very much like not to.  There will now be a short delay while I work out what to do next.

P.S. Ideas gratefully received!


Bug fixed, code updated on GitHub

“Penelope” or “Lucy”…

…as in the Amazon series “Lucifer”?  I’ll stick with Ms. Pitstop despite the colour scheme; Lucifer never shows up on Tuesdays.

Separated at birth?


She’s still pending the new Garmin LiDAR-Lite v3HP – the lower-profile, higher-accuracy version of Hermione’s and Zoe’s height-tracking LiDAR.  She’s also waiting for a new PCB so she can have a buzzer, though that’s not holding her back in the same way.  She’ll intentionally not have a Scanse Sweep, as it’s very expensive for a non-critical sensor.

My intent had been to make her lower-profile, sleek and stealthy to enable longer flights per battery – hence the shorter legs, the lower hat and the 13 x 4.4 CF props (compared to ‘H’’s 12 x 4.4 Beechwoods).  However her hat and feet prevent this: the feet are true lacrosse balls, so heavier than Hermione’s indoor ones, and her salad-bowl top also seems heavier.  Overall, ‘H’ weighs in at 4.8kg fully installed, and Penelope at 4.7kg.  The main benefit is thus likely that she’ll be nippier, thanks to slightly more power from the lighter, larger CF props combined with the raised centre of gravity.  In fact, this raised CoG and the lighter, larger props may well reduce the power needed – we shall see.

In the background, I am working on the “Pond Problem”: fusing GPS distance / direction with the other sensors.  Code’s nigh on complete but I’m yet to convince myself it will work well enough to test it immediately over the local gravel lakes.

The lady’s not for turning

Around here in the South Cotswolds, there are lakes – hundreds of them, left behind once the Cotswold stone and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain, as shown in this gravel pit just 2 minutes’ walk from my house.  ‘H’ would start at the left, move over the pond, climb up and over the gravel, and land on the far side:

Surface gravel mining


But there’s a significant problem: neither the ground-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect back to the receiver, and height readings stop working.

But there is a solution already in hand that I suspect is easy to implement, has little code-performance impact, and makes an amazing difference to survival over water: GPS is currently used in the autopilot process to compare where she currently is with the target location, passing the required speed and direction through to the motion process; it would be nigh on trivial to also pass the horizontal distance and the altitude difference since takeoff through to the motion process.

These fuse with the existing ed*_input code values thus:

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for the video frames to overlap), the GPS still keeps things moving in the right direction at the right speed.
  • Vertically is more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface-reflection problem or simply because her maximum range of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where the GPS kicks in, reporting the altitude gained since takeoff until the LiDAR starts getting readings again.
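The vertical half of that fusion is essentially a guard clause.  This is a sketch of the fallback described above; the constant and function names are mine, not from the ed*_input code:

```python
LIDAR_NO_READING = 0.01  # metres - the "problem" value the LiDAR returns

def fused_height(lidar_height, gps_altitude_since_takeoff):
    if lidar_height <= LIDAR_NO_READING:   # no laser return / out of range
        return gps_altitude_since_takeoff  # fall back to GPS altitude
    return lidar_height                    # otherwise trust the LiDAR

print(fused_height(1.23, 1.30))  # healthy LiDAR -> 1.23
print(fused_height(0.01, 1.30))  # over water -> 1.3
```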

Like I’ve said, it’s only a few lines of relatively simple code.  The problem is whether I have the puppies’ plums to try it out over the local lakes?  I am highly tempted, as it’s a lot more real than the object avoidance code for which there will never be a suitable maze.  I think my mind is changing direction rapidly.