Refinement time.

Sorry it’s been quiet; the weather’s been awful, so no videos to show.  Instead, I’ve been tinkering to ensure the code is as good as it can be prior to moving on to object avoidance / maze tracking.

Zoe is back to life once more to help test the “face the direction you’re flying” tweaks, as these don’t quite work yet.  She’s back because my first few attempts with ‘H’ kept breaking props.  First job for ‘Z’ was to have her run the same code as Hermione, but with the autopilot moved back inline to keep the number of processes as low as possible on her single-CPU Pi0W, in contrast to Hermione’s quad-core 3B.

Then

  • I’ve started running the code with ‘sudo python -O ./qc.py’ to enable optimisation.  This disables assertion checking and, hopefully, other overheads, for better performance.
  • I’ve tweaked the Butterworth parameters to track gravity changes faster as Zoe’s IMU is exposed to the cold winds and her accelerometer values rise rapidly.
  • I’ve refined the Garmin LiDAR-Lite V3 code to cope with the occasional ‘no reading’, caused when no laser reflection is detected; this does happen occasionally (and correctly) if she’s tilted and the surface bouncing the laser points the wrong way.
  • I’d also hoped to add a “data ready interrupt” to the LiDAR to reduce the number of I2C requests made; however, the interrupts still don’t happen despite trying all 12 config options.  I think the problem is Garmin’s, so I’m awaiting a response from them on whether the flaw is fixed in a new model to be launched in the next few weeks.  In the meantime, I only call the GLL over I2C when there are video results which need the GLL vertical height to convert the video pixel movements into horizontal movement in metres – see the sketch after this list.
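
In passing, the pixel-to-metres conversion in that last bullet is just trigonometry; here’s a minimal sketch of the idea – the field of view and frame width below are illustrative placeholders rather than the real config:

    import math

    def pixels_to_metres(dx_pixels, dy_pixels, height_m,
                         fov_degrees=53.5, frame_width=240):
        # At height h, a frame with horizontal field-of-view theta spans
        # 2 * h * tan(theta / 2) metres across frame_width pixels, so the
        # GLL height reading scales pixel shifts into metres directly.
        metres_per_pixel = (2.0 * height_m *
                            math.tan(math.radians(fov_degrees) / 2.0) /
                            frame_width)
        return dx_pixels * metres_per_pixel, dy_pixels * metres_per_pixel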

Having added and tested all the above sequentially, the net result was failure: a less bad failure than previously, but failure nonetheless; the video tracking lagged in order to avoid the IMU FIFO overflowing.  So in the end, I changed Zoe’s video resolution to 240 pixels² @ 10fps (‘H’ is at 320 pixels² @ 10fps), and she can now hover on the grass, which means I can get on with the “face where you’re going” code.

I do think all the other changes listed are valid and useful, and as a result, I’ve updated the code on GitHub.

In passing, I had also been investigating whether the magnetometer could be used to back up the pitch, roll and yaw angles long term, but that’s an abject failure; with the props on idle prior to takeoff, it works fine, giving the orientation to feed to the GPS tracking process, but once airborne, the magnetometer values shift by ±40° and vary depending on which way she’s going, even while facing the same direction.

Sixty-second sanity check

A 60-second hover in preparation for my next GPS tracking flight on Wednesday when the sun will be out:

Weather conditions are poor today: light contrast is low due to thick clouds, and the temperature is about 3°C.  The code is using the Butterworth filter to extract gravity from the accelerometer, and the hover height is set higher at 1.5m to avoid the longer grass in the field next door on Wednesday: takeoff at 50cm/s for 3 seconds to get away from the ground quickly; descent at 25cm/s for 6 seconds for a gentle landing.  The aim here is to move the down-facing LiDAR into a zone where it’ll be more accurate / stable vertically, while checking the down-facing video still provides accurate horizontal stability at this greater height over the lower-contrast lawn surface.
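
For context, that plan boils down to a short list of velocity targets over time – something like this hypothetical encoding (the tuple format here is mine for illustration, not the actual flight plan file):

    # (x, y, z velocity targets in m/s, duration in s, phase name)
    flight_plan = [
        (0.0, 0.0,  0.5,   3.0, "ASCENT"),   # 0.5m/s up for 3s => 1.5m
        (0.0, 0.0,  0.0,  48.0, "HOVER"),    # hold for the bulk of the 60s
        (0.0, 0.0, -0.25,  6.0, "DESCENT"),  # 0.25m/s down for 6s => gentle touchdown
        (0.0, 0.0,  0.0,   3.0, "STOP"),     # motors back to idle on the ground
    ]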

The check passed well, boding well for Wednesday’s GPS field flight!

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; of what remains below, all but the first are minor, mostly optional extras:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyro 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro – a sketch of the idea follows this list.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points in the direction she faced at takeoff.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating systems to Raspbian Stretch, with the corresponding rework of the I2C fix and the network WAP / udhcpd / dnsmasq configuration; these currently keep the OS stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera resolution from 320² to 480² pixels, and the IMU sampling from 500Hz to 1kHz.  However, every previous attempt to upgrade one leaves the scheduling unable to process the others – I suspect I’ll need to wait for a Raspberry Pi 4 or 5 for the increased performance.
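
For the fusion bullet above, the obvious tool is a complementary filter: trust the integrated gyro short term, and let an absolute sensor pull it back long term.  A minimal sketch of the idea – the time constant is illustrative, and the magnetometer yaw would need tilt compensation first:

    def fuse_yaw(yaw, gyro_z, mag_yaw, dt, tau=2.0):
        # Complementary filter: the gyro integration tracks fast changes,
        # while the magnetometer's absolute yaw corrects slow gyro drift.
        alpha = tau / (tau + dt)
        return alpha * (yaw + gyro_z * dt) + (1.0 - alpha) * mag_yaw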

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with the Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’ to fit through, and the AI required to remember what’s been explored and search only unexplored areas on the way to the centre.
  • Fuse the GPS latitude, longitude and altitude, the down-facing LiDAR + video, and the ΣΣ acceleration δt δt results for vertical + horizontal distance – this requires further connections between the various processes so that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic; a sketch of that height-based swap follows below.
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher resolution video, which currently isn’t possible with the RPi 3B.
  • Find a cold-fusion nuclear battery bank for flight from the Cotswolds, UK to Paris, France landing in Madrid, Spain or Barcelona, Catalonia!

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although a solution to the WiFi range may be achievable now, the others need future technology, at least one of which may not be available within my lifetime :-).
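
To make the GPS vs LiDAR / video swap concrete, here’s one hypothetical fuzzy weighting based purely on height – the thresholds are made up for illustration:

    def fusion_weight(height_m, video_ceiling=3.0, blend=1.0):
        # Below video_ceiling, trust the LiDAR / video fully; fade
        # linearly to GPS-only over the next 'blend' metres above that.
        if height_m <= video_ceiling:
            return 1.0
        if height_m >= video_ceiling + blend:
            return 0.0
        return 1.0 - (height_m - video_ceiling) / blend

    # w = fusion_weight(height)
    # x_fused = w * x_lidar_video + (1.0 - w) * x_gps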

Wishing you all a happy New Year and a great 2018!

Finally found the fecking wobbles

I ‘think’ there are two factors: the colder temperature reduced the power of the LiPo; this made the system a little less able to react to distance errors, causing it to rotate more to correct horizontal distance drift, and this in turn exposed a long-standing bug which completely failed to compensate for horizontal distance changes due to changes in pitch / roll angles (the code was there, but had a crass bug).
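
For anyone wondering what the compensation actually is: when the quad’s tilt changes between video frames, the down-facing camera’s view of the ground shifts by roughly height × tan(tilt change) even though she hasn’t moved laterally, and that apparent shift needs subtracting.  A sketch of the idea – the axes and signs here are illustrative, and getting them wrong is exactly the sort of crass bug involved:

    import math

    def tilt_compensate(dx_video, dy_video, height, d_pitch, d_roll):
        # A change in tilt between video frames shifts the ground image
        # by ~height * tan(delta angle) with no real lateral movement,
        # so subtract that apparent motion from the video's measurement.
        dx = dx_video - height * math.tan(d_pitch)
        dy = dy_video - height * math.tan(d_roll)
        return dx, dy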

The cool LiPo has been fixed with a walkers’ / skiers’ pocket-warmer strapped firmly on top, keeping it lovely and cosy.

The crass video horizontal tracking error has been fixed too.  As a result, GitHub has been updated, and naively once more, I can continue working on the GPS tracking.

More on mapping

It’s been a frustrating week – despite lovely weather, lots broke, and once each thing was fixed, something else would break.  To top it all, an update to the latest version of Jessie yesterday locked the RPi as soon as I kicked off a passive flight.  I backed this ‘upgrade’ out as a result.  I now have everything back and working, confirmed by hover and 10m lateral flights this morning, although the latter aborted half-way through with an I2C error.  Underlying it all is power to the GLL and RPi 3B – I was seeing lots of brown-out LED flashes from the 3B and lots of I2C and GLL errors.  I’m considering swapping back to a 2B+ overclocked to 1.2GHz as a result.

In the meantime, I’ve been looking at mapping in more detail, as it’s complex and needs breaking down into easy pieces.  Here’s the general idea:

Polystyrene block layout

Each polystyrene block is 2.5m long, 1.25m high and 10cm thick.  They are pinned together with screw-in camping tent pegs.  The plan is to

  • fly 10m at 1m height without the ‘maze’, logging compass and GPS to check the results, in particular to see whether
    • GPS can be gently fused with the RPi ground-facing motion tracking to enhance lateral distance measurements
    • the compass can be fused with the IMU gyro yaw rate to enforce a straighter flight
  • fly 10m without the ‘maze’ again but with fused compass and GPS (assuming the above is OK)
  • add the ‘maze’ and fly in a straight 10m line from bottom to top again as a sanity check
  • add the Sweep and log its contents while doing the same 10m again
  • build the flight map in Excel based upon the GPS, compass and Sweep logs – the results should look like the map above, plus whatever garden clutter lies beyond each exit from the ‘maze’; the sketch after this list shows the geometry
  • add a new mapping process to do dynamically what has been done in Excel above
  • add object avoidance from Sweep and repeat – this is the hardest bit as it introduces dynamic updates to preconfigured flight plans
  • add ‘maze’ tracking code to reach a target GPS position, nominally the centre of the ‘maze’ – this stage needs further thought to break it down.
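
The geometry behind that Excel map step is straightforward: each Sweep sample is a bearing + distance in her frame, which her position and yaw convert into earth-frame points.  A minimal sketch – the angle conventions are illustrative:

    import math

    def sweep_to_map(x, y, yaw, samples):
        # Convert Sweep (bearing, distance) samples, taken at position
        # (x, y) with heading yaw (angles in radians, distances in
        # metres), into earth-frame obstacle coordinates for the map.
        return [(x + d * math.cos(yaw + bearing),
                 y + d * math.sin(yaw + bearing))
                for bearing, d in samples]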

£55.71 import duty* later…

and this is what UPS handed over to me in return.

Scanse Sweep 2D LiDAR tracker

First impressions are extremely good, which bodes well for meeting my high requirements and expectations.  The box is sturdy, and the contents are not going to get damaged in transit; in addition to the Sweep itself, there’s

  • a custom UART / USB adaptor that’s small and slick
  • one USB A to micro-B for a PC connection – as cables like this go, this is probably the slickest looking I’ve seen – flat cable with very tidy connectors
  • one JST to bare wires to add your own connection – this is probably what I’ll be using to attach to the RPi UART ttyAMA0, as it’s the same wiring my Garmin LiDAR-Lite uses, so I have spares
  • one cable I have no idea what it’s for – JST at one end clearly for the Sweep, but no idea what the other end’s for
  • a sticker, oddly requiring microwaving to make it sticky – I’ll be keeping this boxed!
  • a quick start guide.

I’m not in a rush to install it on Hermione; it’s too expensive to risk while I’m busy testing out the GPS, compass and yaw code – but I will put together a basic system, probably on my piPad.  That’ll let me check whether my double 2.4A battery bank is sufficient for both a 3B and the 0.5A the Sweep needs – I suspect this will be the first problem I’ll have to fix; certainly, the Garmin LiDAR already runs off its own 2.4A feed due to power problems.


*I’m glad we are part of the EU and the free trade agreement means there is no import duty – oh no, wait, feck, argh – BREXIT! }:-{(>

Still stuck

Hermione is still causing trouble with yaw control flights despite lots of refinements.  Here’s the latest.

Hermione’s troubles

@3s she’s climbed to about a metre high and then hovered for a second.  All the X, Y and Z flight plan targets and sensor inputs are nicely aligned.

The ‘fun’ starts at 4 seconds.  The flight plan, written from my point of view, says move left by 1m over 4 seconds.  From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds.  The yaw graph from the sensors shows the ACW rotation is happening correctly.  The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero.  Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s.  All’s good as far as the targets are concerned, from her POV and mine.

But there’s a severe discrepancy from the sensor inputs’ POV.  From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me, instead of left.  The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this; it shows she moves right by about 0.8m from her POV.  But the rusty terracotta line in the Y graph (the double-integrated accelerometer-minus-gravity readings) shows exactly the opposite.  The grey fusion of the blue and terracotta lines cancels them out, thus following the target perfectly but for completely the wrong reasons.

There are similar discrepancies in the X graph, where the LiDAR + Video blue line is the best match to what I saw: virtually no forward movement from H’s POV except for some slight forward movement after 8s when she should be hovering.

So the net of this?  The LiDAR / video processing is working perfectly.  The double-integrated IMU accelerometer results are wrong, and I need to work out why.  The results shown are taken directly from the accelerometer and double-integrated in Excel (much like the code does too), and I’m pretty convinced I’ve got this right.  Yet more digging to be done.
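
For the record, the earth-to-quad conversion at the heart of all this is just a 2D rotation by yaw; here’s the form I believe it should take – though the sign convention is precisely where a bug in my understanding could hide:

    import math

    def earth_to_quad(x_earth, y_earth, yaw):
        # Rotate an earth-frame target into the quad frame, with yaw in
        # radians and positive yaw anti-clockwise (convention assumed).
        c, s = math.cos(yaw), math.sin(yaw)
        return c * x_earth + s * y_earth, -s * x_earth + c * y_earth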

In other news…

  • Ö has ground-facing lights much like Zoe had.  Currently they are always on, but ultimately I intend to use them in various ways, such as flashing during calibration etc. – this requires a new PCB, however, to plug a MOSFET gate into a GPIO pin.
  • piNet has changed direction somewhat: I’m testing within the bounds of my garden whether I can define a target destination with GPS, and have enough accuracy for the subsequent flight from elsewhere to get to that target accurately.  This is step one in taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.

That’s all for now, folks.  Thanks for sticking with me during these quiet times.


P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3s after the yaw to move left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.

POV

The problem: the camera point of view is in the quad frame; the Garmin point of view is in the earth frame.  They both need to be in the same frame to produce a vector that’s meaningful.  A pretty radical rewrite of this area last night resulted.  Sadly, a test flight this morning was pretty much the same as yesterday’s: a very stable hover, but shooting off right when she should have gone left.  More stats:

POV

The top pair, accelerometer vs camera, show pretty good alignment, right up to the final position of 0.4m to the right.  I believe this is correct, but I wouldn’t put money on it yet!

The middle pair are accelerometer vs LiDAR height over time, which is excellent.

The bottom pair are the flight plans in the earth and quad frames (the quad one is simply the earth one rotated from my POV to hers) – this is where there’s clearly a problem – they should match, but the quad one goes wrong once the flight rotates.  I can’t see an obvious bug in the code, which makes me suspect there’s an obvious bug in my understanding instead.

Doesn’t bode well

I’ve been tinkering with the existing PCB layout (shown below) to see if power supply changes fix the Garmin LiDAR Lite (GLL) I²C interference.

Hermione’s closeup

I’ve basically removed the black cuboid 1.5A 5V switching power regulator, using the space instead to add a 680µF electrolytic capacitor between 5V and ground near the GLL, as specified in the docs.  I then powered her up directly via the PCB, rather than indirectly via the RPi micro-USB and the 5V output GPIO pin.  I used a lab power supply which can handle up to 5A.

Hermione showed identical I²C errors with the GLL plugged in, even when it wasn’t being used.  The PSU showed only 0.45A at boot, 0.5A with Hermione’s code running passively, and about 0.6A while she was ‘flying’.  This strongly suggests the new PCB won’t solve the problem either.  I am running out of fingers to cross 🙁

The option of using the GLL PWM output is still open, but I’m not keen to swap over to the pigpio library this requires, as it uses a daemon process in the background, unlike RPi.GPIO and RPIO, which both directly access the /dev/gpio devices.  I wonder how hard it would be to write my own PWM C library with a Python wrapper to epoll the GLL PWM pin?  There’s a rough sketch of the idea below.  It would require a new PCB revision, as would pigpio, so there’s a PITA there too.  For some reason, probably because this all just works on Zoe, I still want to keep banging my head against the I²C problems.
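
To give a flavour of the epoll idea, here’s a rough Python 3 sketch using the kernel’s sysfs GPIO interface – the pin number is hypothetical, the 10µs-per-cm scaling is the GLL’s nominal PWM output, and userspace timing jitter is exactly why a C version would be preferable:

    import os, select, time

    PWM_PIN = 24  # hypothetical BCM pin wired to the GLL PWM output

    def gll_pwm_open(pin=PWM_PIN):
        # Export the pin via sysfs and ask for kernel events on both edges.
        gpio = "/sys/class/gpio/gpio%d" % pin
        if not os.path.isdir(gpio):
            with open("/sys/class/gpio/export", "w") as f:
                f.write(str(pin))
        with open(gpio + "/direction", "w") as f:
            f.write("in")
        with open(gpio + "/edge", "w") as f:
            f.write("both")
        return os.open(gpio + "/value", os.O_RDONLY)

    def gll_pwm_distance(fd):
        # Time the high pulse between rising and falling edges; the GLL
        # PWM output is nominally 10us per cm of distance.
        poller = select.epoll()
        poller.register(fd, select.EPOLLPRI | select.EPOLLERR)
        poller.poll(0)  # sysfs fires one spurious event on registration
        rise = None
        while True:
            poller.poll()
            now = time.time()
            os.lseek(fd, 0, os.SEEK_SET)
            level = os.read(fd, 1)
            if level == b"1":
                rise = now
            elif rise is not None:
                return (now - rise) / 10e-6  # centimetres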

9 flights

For the record, here are 9 sequential flights, each lasting 10s: 3s ascent at 0.3m/s, 4s hover and 3s descent at 0.3m/s.  The drift is different in each.  There are actually only 8 flights in the video; in the 9th, she never took off, and she lost WiFi, so I had to unplug her and take her indoors.  I may post that one tomorrow, though it mostly consists of me grumbling quietly while I pick her up with my arse facing the camera!

Hermione had Garmin and RaspiCam disabled – videoing the ground for lateral tracking is pointless if the height is not accurately known.

On the plus side, the new foam balls did an amazingly successful job of softening some quite high falls.