Penelope’s progress

I’ve done nothing on Penelope since my introductory post, but now I have reason to proceed: the new Raspberry Pi 3B+ is already on its way courtesy of Pimoroni, along with the new Garmin LiDAR-Lite v3HP and the new Raspbian Stretch OS.  Together these provide the motivation I need, though only in the background; my main focus is object avoidance with Hermione, and I hope to post the first results imminently.  Oddly, what’s holding things back is hardware, not software: construction of the obstacle.

“Penelope” or “Lucy”…

…as in the Amazon series “Lucifer”?  I’ll stick with Ms. Pitstop despite the colour scheme; Lucifer never shows up on Tuesdays.

Separated at birth?

She’s still pending the new Garmin LiDAR-Lite v3HP – the lower-profile, higher-accuracy version of Hermione’s and Zoe’s height-tracking LiDAR.  She’s also waiting for a new PCB so she can have a buzzer, though that’s not holding her back in the same way.  She’ll intentionally not have a Scanse Sweep as it’s very expensive for a non-critical sensor.

My intent had been to make her lower-profile, sleek and stealthy to enable longer flights per battery – hence the shorter legs, lower hat and the 13 x 4.4 CF props (versus the 12 x 4.4 Beechwoods on ‘H’).  However her hat and feet work against this: the feet are true lacrosse balls, so heavier than Hermione’s indoor ones, and her salad-bowl top also seems heavier.  Overall then, ‘H’ weighs in at 4.8kg fully installed, and Penelope 4.7kg.  Thus the main benefit is likely that she’ll be nippier, due to slightly more power from the lighter, larger CF props combined with the raised centre of gravity.  In fact, the raised CoG and lighter, larger props may well also reduce the power needed – we shall see.

In the background, I am working on the “Pond Problem”: fusing GPS distance / direction with the other sensors.  The code is nigh-on complete, but I’ve yet to convince myself it will work well enough to test it immediately over the local gravel lakes.

The lady’s not for turning

Around here in the South Cotswolds, there are lakes – hundreds of them – left behind once the Cotswold stone, rocks and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat-race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain, as shown in this gravel pit just 2 minutes’ walk from my house.  ‘H’ starts at the left, moves over the pond, climbs up and over the gravel and lands on the far side:

Surface gravel mining

But there’s a significant problem: neither the ground-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect back to the receiver, and height readings stop working.

But there is a solution already in hand that I suspect is easy to implement, has little code-performance impact, yet makes an amazing difference to survival over water: GPS is currently used in the autopilot process to compare where she is currently located with the target location, and to pass the required speed and direction through to the motion process; it would be nigh-on trivial to also pass the horizontal distance and the altitude difference since takeoff through to the motion process.

These fuse with the existing ed*_input code values thus:

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for video frames to overlap), the GPS will still keep things moving in the right direction at the right speed.
  • Vertically is more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface-reflection problem or simply because her maximum height of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where GPS kicks in, reporting the altitude gained since takeoff until the LiDAR starts getting readings again (a minimal sketch follows this list).
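
Here’s that sketch of the vertical fallback – the variable names are my own invention, not the real ed*_input code:

    # Trust the LiDAR unless it reports its 1cm 'problem' value, in which
    # case fall back to the GPS altitude gained since takeoff.
    LIDAR_ERROR_HEIGHT = 0.01  # metres - the GLL's 'no reading' report

    def fused_height(lidar_height, gps_altitude_since_takeoff):
        if lidar_height <= LIDAR_ERROR_HEIGHT:
            return gps_altitude_since_takeoff  # LiDAR lost the surface
        return lidar_height                    # normal case: LiDAR wins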

Like I’ve said, it’s only a few lines of relatively simple code.  The question is whether I have the puppies’ plums to try it out over the local lakes.  I am highly tempted, as it’s a lot more real than the object-avoidance code, for which there will never be a suitable maze.  I think my mind is changing direction rapidly.

Refinement time

Sorry it’s been quiet; the weather’s been awful, so no videos to show.  Instead, I’ve been tinkering to ensure the code is as good as it can be prior to moving on to object avoidance / maze tracking.

Zoe is back to life once more to help test the “face the direction you’re flying” tweaks, as these don’t quite work yet.  She’s back because my first few attempts with ‘H’ kept breaking props.  First job for ‘Z’ was to have her run the same code as Hermione, but with the autopilot moved back inline to reduce the number of processes as far as possible for her single-CPU Pi0W, in contrast to Hermione’s 4-CPU 3B.

  • I’ve started running the code with ‘sudo python -O ./’ to enable optimisation.  This disables assertion checking and, hopefully, other overheads, for better performance.
  • I’ve tweaked the Butterworth parameters to track gravity changes faster, as Zoe’s IMU is exposed to the cold winds and her accelerometer values rise rapidly (there’s a sketch of the filter after this list).
  • I’ve refined the Garmin LiDAR-Lite v3 code to cope with the occasional ‘no reading’ caused by no laser reflection being detected; this does happen occasionally (and correctly) if she’s tilted and the surface bouncing the laser points the wrong way.
  • I’d also hoped to add a “data ready interrupt” to the LiDAR to reduce the number of I2C requests made; however, the interrupts still don’t happen despite trying all 12 config options.  I think the problem is Garmin’s, so I’m awaiting a response from them on whether this flaw is fixed in a new model to be launched in the next few weeks.  In the meantime, I only call the GLL over I2C when there are video results which need the GLL vertical height to convert the video pixel movements into horizontal movement in metres.
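
On the Butterworth tweak above: here’s a minimal sketch of low-pass filtering accelerometer samples to extract gravity; the cutoff and order are illustrative guesses, not the values the code actually uses:

    # Low-pass Butterworth to isolate the slowly varying gravity component
    # from raw accelerometer samples; cutoff / order are illustrative only.
    from scipy.signal import butter, lfilter

    def gravity_estimate(accel_samples, sample_rate_hz, cutoff_hz=0.1, order=2):
        b, a = butter(order, cutoff_hz / (0.5 * sample_rate_hz))
        return lfilter(b, a, accel_samples)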

Having added and tested all the above sequentially, the net result was failure: less bad a failure than previously, but failure nonetheless; the video tracking lagged in order to avoid the IMU FIFO overflowing.  So in the end, I changed Zoe’s video resolution to 240 pixels² @ 10fps (‘H’ is at 320 pixels² @ 10fps), and she can now hover on the grass, which means I can get on with the “face where you’re going” code.

I do think all the other changes listed are valid and useful, and as a result, I’ve updated the code on GitHub.

In passing, I’d also been investigating whether the magnetometer could be used to back up the pitch, roll and yaw angles long-term, but that’s an abject failure; with the props on idle prior to takeoff, it works fine, giving the orientation to feed to the GPS tracking process, but once airborne, the magnetometer values shift by ±40° and vary depending on which way she’s going, even while facing the same direction.

Sixty-second sanity check

A 60-second hover in preparation for my next GPS tracking flight on Wednesday, when the sun will be out:

Weather conditions are poor today: light contrast is low due to thick clouds, and the temperature is about 3°C.  The code is using the Butterworth filter to extract gravity from the accelerometer, and the hover height is set higher at 1.5m to avoid the longer grass in the field next door on Wednesday: takeoff at 50cm/s for 3 seconds to get away from the ground quickly; descent at 25cm/s for 6 seconds for a gentle landing.  The aim here is to move the down-facing LiDAR into a zone where it’ll be more accurate / stable vertically, while checking the down-facing video still provides accurate horizontal stability at this greater height over the lower-contrast lawn surface.
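
Expressed as a hypothetical flight-plan table (the real code’s format may well differ), the profile looks like this:

    # Each entry: (duration s, x m/s, y m/s, z m/s, phase name).
    # 50cm/s x 3s = 1.5m up; 25cm/s x 6s = 1.5m back down.
    FLIGHT_PLAN = [
        (3.0,  0.0, 0.0,  0.50, "takeoff"),
        (60.0, 0.0, 0.0,  0.00, "hover"),
        (6.0,  0.0, 0.0, -0.25, "descent"),
    ]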

The check passed, boding well for Wednesday’s GPS field flight!

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; barring the first, the items below are minor, mostly optional extras:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps add GPS support, though this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyro 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she was facing at takeoff.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating system to Raspbian Stretch, with the corresponding requirements for the I2C fix and the WAP / udhcpd / dnsmasq networking, which currently keep the OS stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera resolution from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz.  However, every previous attempt to upgrade one leaves the scheduling no longer able to process the others – I suspect I’ll need to wait for the Raspberry Pi 4 or 5 for the increased performance.

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ to clear, and the AI required to remember and react so as to explore only unexplored areas in the search for the centre.
  • Fuse GPS latitude, longitude and altitude with the down-facing LiDAR + video and the ΣΣ acceleration δt δt values for vertical + horizontal distance – this requires further connections between the various processes, such that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic (a toy sketch follows this list).
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video which currently isn’t possible with the RPi 3B.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!
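
As a toy sketch of that fuzzy-logic swap – entirely my assumption, reusing the 40m LiDAR ceiling mentioned earlier – the fusion weight might blend on height like this:

    # Fully trust video / LiDAR low down, fully trust GPS above the LiDAR's
    # 40m ceiling, and blend linearly in between.
    LIDAR_MAX_M = 40.0

    def lidar_weight(height_m, blend_start_m=30.0):
        if height_m <= blend_start_m:
            return 1.0            # video / LiDAR fully trusted
        if height_m >= LIDAR_MAX_M:
            return 0.0            # GPS only
        return (LIDAR_MAX_M - height_m) / (LIDAR_MAX_M - blend_start_m)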

These future aspirations are dreams unlikely to become reality due either to power supply, CPU performance or WiFi reach.  Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Finally found the fecking wobbles

I ‘think’ there are two factors: the colder temperature reduced the power of the LiPo; this then made the system a little less able to react to distance errors, causing it to rotate more to correct horizontal drift, and this in turn exposed a long-standing bug that completely failed to compensate for the apparent horizontal distance changes caused by changes in pitch / roll angles (the code was there, but had a crass bug).
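
For illustration, here’s a minimal sketch of the kind of compensation involved, with names of my own choosing rather than the actual code: when the piDrone tilts, the down-facing camera sees the ground shift by roughly height × tan(angle change) even with no lateral movement, and that apparent shift must be subtracted from the video-tracked distance.

    import math

    # Remove the apparent ground shift caused purely by a pitch / roll change
    # from the video-tracked horizontal distance increment.
    def tilt_compensate(video_shift_m, height_m, prev_angle_rad, curr_angle_rad):
        apparent = height_m * (math.tan(curr_angle_rad) - math.tan(prev_angle_rad))
        return video_shift_m - apparent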

The cool LiPo has been fixed with a walkers’ / skiers’ pocket warmer strapped firmly on top, keeping it lovely and cosy.

The crass video horizontal-tracking error has been fixed too.  As a result, GitHub has been updated and, naively optimistic once more, I can continue working on the GPS tracking.

More on mapping

It’s been a frustrating week – despite lovely weather, lots broke, and once each thing was fixed, something else would break.  To top it all, an update to the latest version of Jessie yesterday locked the RPi as soon as I kicked off a passive flight; I backed this ‘upgrade’ out as a result.  I now have everything back and working, confirmed by hover and 10m lateral flights this morning, although the latter aborted half-way through with an I2C error.  Underlying it all is power to the GLL and RPi 3B – I was seeing lots of brown-out LED flashes from the 3B and lots of I2C and GLL errors.  I’m considering swapping back to a 2B+ overclocked to 1.2GHz as a result.

In the meantime I have been looking at mapping in more detail as it’s complex and it needs breaking down into easy pieces.  Here’s the general idea:

Polystyrene block layout

Each polystyrene block is 2.5m long, 1.25m high and 10cm thick.  They are pinned together with screw-in camping tent pegs.  The plan is to

  • fly 10m at 1m height without the ‘maze’, logging compass and GPS to check the results, in particular to see whether
    • GPS can be gently fused with RPi ground facing motion tracking to enhance lateral motion distance measurements
    • compass can be fused with the IMU gyro yaw rate to enforce a better linear flight (a fusion sketch follows this list)
  • fly 10m without the ‘maze’ again but with fused compass and GPS (assuming the above is OK)
  • add the ‘maze’ and fly in a straight 10m line from bottom to top again as a sanity check
  • add the Sweep and log its contents while doing the same 10m again
  • build the flight map in Excel based upon the GPS, compass and Sweep logs – the results should look like the map above, plus whatever garden clutter lies beyond the end of each exit from the ‘maze’
  • add a new mapping process to do dynamically what has been done in Excel above
  • add object avoidance from Sweep and repeat – this is the hardest bit as it introduces dynamic updates to preconfigured flight plans
  • add ‘maze’ tracking code to reach a target GPS position, nominally the centre of the ‘maze’ – this stage requires more thought to break it down further.
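
On the compass / gyro fusion point above, here’s a minimal complementary-filter sketch – the time constant and names are my assumptions, not the actual code:

    # The gyro integral is smooth but drifts; the compass is noisy but
    # drift-free. Blend them with a time constant TAU.
    TAU = 2.0  # seconds: crossover between gyro and compass trust

    def fuse_yaw(fused_yaw, gyro_yaw_rate, compass_yaw, dt):
        alpha = TAU / (TAU + dt)
        return alpha * (fused_yaw + gyro_yaw_rate * dt) + (1 - alpha) * compass_yaw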

£55.71 import duty* later…

and this is what UPS handed over to me in return.

Scanse Sweep 2D LiDAR tracker

First impressions are extremely good, which bodes well for meeting my high requirements and expectations.  The box is sturdy, so the contents are not going to get damaged in transit; in addition to the Sweep itself, there’s

  • a custom UART / USB adaptor that’s small and slick
  • one USB A to micro-B cable for a PC connection – as cables like this go, it’s probably the slickest-looking I’ve seen: flat cable with very tidy connectors
  • one JST to bare wires to add your own connection – this is probably what I’ll be using to attach to the RPi UART ttyAMA0, as it’s the same wires that my Garmin LiDAR-Lite uses, so I have spares
  • one cable whose purpose I can’t work out – JST at one end, clearly for the Sweep, but no idea what the other end is for
  • a sticker, oddly requiring microwaving to make it sticky – I’ll be keeping this boxed!
  • a quick start guide.

I’m not in a rush to install it on Hermione; it’s too expensive to risk while I’m busy testing out the GPS, compass and yaw code – but I will put together a basic system, probably on my piPad.  That’ll let me check whether my double 2.4A battery bank is sufficient for both a 3B and the 0.5A the Sweep needs – I suspect this will be the first problem I’ll have to fix; certainly, the Garmin LiDAR already runs off its own 2.4A feed due to power problems.

*I’m glad we are part of the EU and the free trade agreement means there is no import duty – oh no, wait, feck, argh – BREXIT! }:-{(>

Still stuck

Hermione is still causing trouble with yaw control flights despite lots of refinements.  Here’s the latest.

Hermione’s troubles

@ 3s she’s climbed to about a metre high and then hovered for a second.  All the X, Y and Z flight-plan targets and sensor inputs are nicely aligned.

The ‘fun’ starts at 4 seconds.  The flight plan, written from my point of view, says move left by 1m over 4 seconds.  From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds.  The yaw graph from the sensors shows the ACW rotation is happening correctly.  The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero.  Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s.  All’s good as far as the targets are concerned, from both her POV and mine.

But there’s a severe discrepancy on the sensor-input side.  From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me instead of left.  The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this; it shows she moves right by about 0.8m from her POV.  But the rusty terracotta line in the Y graph (the double-integrated accelerometer-minus-gravity readings) shows exactly the opposite.  The grey fusion of the blue and terracotta lines cancels out, thus following the target perfectly, but for completely the wrong reasons.

There are similar discrepancies in the X graph, where the LiDAR + video blue line is the best match to what I saw: virtually no forward movement from H’s POV, except for some slight forward movement after 8s when she should be hovering.

So the net of this?  The LiDAR / video processing is working perfectly.  The double-integrated IMU accelerometer results are wrong, and I need to work out why.  The results shown are taken directly from the accelerometer and double-integrated in Excel (much as the code does), and I’m pretty convinced I’ve got this right.  Yet more digging to be done.
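
For reference, the double integration itself is simple – a minimal sketch of what the spreadsheet (and, roughly, the code) does, with names of my own choosing:

    # Double-integrate earth-frame accelerations (gravity already removed):
    # acceleration -> velocity -> distance.
    def double_integrate(accel_samples, dt):
        velocity = 0.0
        distance = 0.0
        for a in accel_samples:        # m/s^2 per sample
            velocity += a * dt         # first integration
            distance += velocity * dt  # second integration
        return distance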

In other news…

  • Ö has ground-facing lights, much like Zoe had.  Currently they are always on, but ultimately I intend to use them in various ways, such as flashing during calibration etc. – this requires a new PCB, however, to connect a MOSFET gate to a GPIO pin.
  • piNet has changed direction somewhat: I’m testing, within the bounds of my garden, whether I can define a target destination with GPS and have enough accuracy for a subsequent flight from elsewhere to reach that target accurately (see the sketch below).  This is step one in taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.
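
The distance check behind that test is the standard haversine formula – this sketch is my addition, not the blog’s code:

    import math

    EARTH_RADIUS_M = 6371000.0

    # Great-circle distance in metres between the current GPS fix and the
    # target waypoint.
    def gps_distance_m(lat1, lon1, lat2, lon2):
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))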

That’s all for now, folks.  Thanks for sticking with me during these quiet times.

P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3 seconds after the yaw-to-move-left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.