That’s all, folks…*

I think I’ve done all that can be done with my Raspberry Pi, Python piDrones.  The code is updated on GitHub as a result.  Here are the Vimeo links to the proof of the pudding, as it were:

The hardest by far, though, was the simplest in concept: a stable autonomous hover beyond a few seconds.  Each of the cool functions listed above took a few weeks on average; in contrast, the one-minute-plus hover took years.

There are lots more videos on Vimeo, linked to the blog via pidrone.io/video.

I’ve achieved my personal target and then some: taking a Raspberry Pi and Python to create a piDrone, starting from absolute ignorance, and doing it my way without using others’ code, ideas or suggestions.

What else could be done?  My only idea is long distance / time flights requiring:

  1. GPS fused with the existing distance sensors
  2. Compass fused with the existing orientation sensors
  3. Long range wireless connectivity
  4. Nuclear-fusion batteries.

Lack of #4 renders 1-3 pointless.

Also pointless, sadly, is Penelope; Hermione, my big 3B, is the queen of autonomous control, and Zoe, my Pi0W, the queen of human control.  Two’s company, three’s a crowd.  The only unique thing I could do with ‘P’ is to complete her RPi 3B+ and Stretch O/S upgrade, and my motivation is lacking; that makes Penelope the queen of spare parts 😥

Time for me to find another hobby to hold back my terminal-boredom-syndrome.  On my bike, methinks.

So long, and thanks for all the fish!


* …nearly.  I’m doing some refinement for Zoe, primarily so I can take her to the Cotswold Raspberry Jams, along with anything new and exciting the RPi folks release next.

“Penelope” or “Lucy”…

…as in the Amazon series “Lucifer”?  I’ll stick with Ms. Pitstop despite the colour scheme; Lucifer never shows up on Tuesdays.

Separated at birth?

She’s still pending the new version of the Garmin LIDAR-Lite v3HP – the lower-profile, higher-accuracy version of Hermione’s and Zoe’s height-tracking LiDAR.  She’s also waiting for a new PCB so she can have a buzzer, though that’s not holding her back in the same way.  She’ll intentionally not have a Scanse Sweep as it’s very, very expensive for a non-critical sensor.

My intent had been to make her lower-profile, sleek and stealthy to enable longer flights per battery, hence the shorter legs, lower hat and the 13 x 4.4 CF props (compared to ‘H’’s 12 x 4.4 Beechwoods).  However, her hat and feet work against this – the feet are true lacrosse balls, so heavier than Hermione’s indoor ones, and her salad-bowl top also seems heavier.  Overall then, ‘H’ weighs in at 4.8kg fully installed, and Penelope at 4.7kg.  Thus the main benefit is likely that she’ll be nippier, due to slightly more power from the lighter, larger CF props combined with the raised centre of gravity.  And in fact, this raised CoG and the lighter, larger props may well reduce the power needed – we shall see.

In the background, I am working on the “Pond Problem”: fusing GPS distance / direction with the other sensors.  The code is nigh-on complete, but I’ve yet to convince myself it will work well enough to test it immediately over the local gravel lakes.

The lady’s not for turning

Around here in the South Cotswolds there are lakes, hundreds of them, left behind once the Cotswold stone and gravel have been mined from the ground.  People swim, yacht, canoe, windsurf and powerboat-race around the area.  It’d be cool for a piDrone to fly from one side of a lake to the other, tracking the terrain, as shown in this gravel pit just 2 minutes’ walk from my house.  ‘H’ would start at the left, move over the pond, climb up and over the gravel, and land on the far side:

Surface gravel mining

But there’s a significant problem: neither the ground-facing video nor the LiDAR works over water.  For the down-facing video, there’s no contrast on the water surface for it to track horizontal movement.  For the LiDAR, the problem comes when moving: the piDrone leans to move, the laser beam doesn’t reflect back to the receiver, and the height readings stop working.

But there is a solution already in hand that I suspect is easy to implement, has little code-performance impact, yet makes an amazing difference to survival over water: GPS is currently used in the autopilot process to compare where she is currently located against the target location, and to pass the required speed and direction through to the motion process; it would be nigh-on trivial to also pass the horizontal distance and the altitude difference since takeoff through to the motion process.

These fuse with the existing ed*_input code values thus (a rough sketch in Python follows the list):

  • Horizontally, the GPS always fuses with the down-facing PiCamera, such that if / when the ground surface doesn’t have enough contrast (or she’s travelling too fast for the video frames to overlap), the GPS still keeps things moving in the right direction and at the right speed.
  • Vertically is more subtle; as mentioned above, the LiDAR fails when the ground surface doesn’t bounce the laser back to the receiver, perhaps due to a surface-reflection problem or simply because her maximum range of 40m has been exceeded.  In both cases, the LiDAR returns 1cm as the height to report the problem.  Here’s where the GPS kicks in, reporting the current altitude since takeoff until the LiDAR starts getting readings again.
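Something like this minimal Python sketch – the function and value names are mine for illustration (the real fusion happens with the ed*_input values in the motion process):

LIDAR_FAILURE_HEIGHT = 0.01  # the LiDAR reports 1cm when it loses the surface

def fuse_height(lidar_height, gps_altitude_since_takeoff):
    # Trust the LiDAR while it gets a reflection; when it reports the 1cm
    # failure value, fall back to the GPS altitude gained since takeoff.
    if lidar_height <= LIDAR_FAILURE_HEIGHT:
        return gps_altitude_since_takeoff
    return lidar_height

def fuse_horizontal(video_velocity, gps_velocity, video_tracking_ok):
    # The down-facing PiCamera wins while it can see contrast and the frames
    # overlap; otherwise the GPS keeps the direction and speed honest.
    return video_velocity if video_tracking_ok else gps_velocity

print(fuse_height(0.01, 12.3))                          # over water: GPS wins
print(fuse_horizontal((0.0, 0.0), (0.5, 0.1), False))   # no contrast: GPS wins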

Like I’ve said, it’s only a few lines of relatively simple code.  The question is whether I have the puppies’ plums to try it out over the local lakes.  I am highly tempted, as it’s a lot more real than the object-avoidance code, for which there will never be a suitable maze.  I think my mind is changing direction rapidly.

Obstruction Avoidance Overview

Here’s what ‘H’ does on detecting an obstacle within one meter:

Now it’s time to get her to move around the obstacle and continue to her destination.  All sensors are installed and working; this is pure code in the autopilot process*.

Here’s the general idea for how the code should work for full maze exploration (a rough sketch follows the list):

  • Takeoff, then…
    • hover
    • record the current GPS location in the map dictionary, including the previous location (i.e. index “here I am” : content “here I came from”, in units of meters, held in a Python dictionary)
    • do a Sweep 360° scan for obstacles
    • find the next direction based on the current map contents, either…
      • an unexplored, unobstructed direction (beyond 2m), biased towards the target GPS point (e.g. the centre of the maze)
      • a previously visited location, marking the current location in the map dictionary as blocked to avoid further return visits
    • head off in the new direction until either…
      • an obstacle is found in the new direction
      • an unexplored direction (i.e. not in the map so far) is found
    • repeat
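In Python, the bookkeeping might look like the minimal sketch below; the 1m grid, the helper names and the stand-in for the Sweep’s 360° scan results are all assumptions for illustration, not the autopilot code:

# Locations are (x, y) tuples in meters; the map dictionary holds
# “here I am” -> “here I came from”, exactly as described above.
HEADINGS = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}

def next_move(here, visited, blocked, clear_headings, target):
    # Prefer unexplored, unobstructed neighbours…
    candidates = []
    for name in clear_headings:   # headings the Sweep scan found clear
        dx, dy = HEADINGS[name]
        neighbour = (here[0] + dx, here[1] + dy)
        if neighbour not in visited and neighbour not in blocked:
            candidates.append(neighbour)
    if candidates:
        # …biased towards the target point (e.g. the centre of the maze).
        return min(candidates,
                   key=lambda n: (n[0] - target[0]) ** 2 + (n[1] - target[1]) ** 2)
    # Dead end: mark this cell blocked and double back whence we came.
    blocked.add(here)
    return visited[here]

# One step of the loop: took off at (0, 0), came from nowhere.
visited = {(0, 0): None}
blocked = set()
print(next_move((0, 0), visited, blocked, ["N", "E"], target=(5, 5)))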

And in fact, this same set of rules is required just for avoiding obstacles, which is good, as I doubt I’ll ever be able to find / build a suitably sized maze; and even if I did, the LiPo would run out long before the centre of the maze was reached.


* The fact it’s pure code means it’s going to be quiet on the blogging front, apart from GPS tracking videos when the weather warms up.  I’m also considering building a new version of Hermione from the spares I have in stock, provisionally called “Penelope”.  She’d be there for shows only; I could then use Hermione purely for testing new features without worrying about breaking her prior to an event.

Whoohoo!


What’s left?

  • Refine the u-blox NEO-M8T to use the “pedestrian” model algorithm, which is more accurate than the “portable” model currently used.
  • Test the yaw code I’ve been working on in the background, so ‘H’ always points the way she’s going.
  • Stop swearing at Microsoft for the Spectre and Meltdown updates they installed this morning, which have crippled my PC just loading web pages!

5 years later…

It was around Christmas 2012 that I started investigating an RPi drone, and the first post was at the end of January ’13.

5 years later, phase ‘one’ is all but done; all but the first of the following are minor, mostly optional extras:

  • Track down the GPS tracking instability – best guess is reduced LiPo power as the flight progresses in near-zero temperature conditions.
  • Get Zoe working again – she’s been unused for a while – and perhaps, if possible, add GPS support, although this may not be possible because she’s just a single-CPU Pi0W.
  • Fuse the magnetometer / gyrometer 3D readings for long-term angle stability, particularly yaw, which has no long-term backup sensor beyond the gyro.
  • Add a level of yaw control such that ‘H’ always points the way she’s flying – currently she always points the way she faced at takeoff.  I’ve tried this several times, and it’s always had a problem I couldn’t solve.  Third time lucky.
  • Upgrade the operating systems to Raspbian Stretch, with the corresponding requirements for the I2C fix and the network WAP / udhcpd / dnsmasq setup, which currently mean the OS is stuck on Jessie from the end of February 2017.
  • Upgrade the camera + LiDAR sampling from 10Hz to 20Hz, the camera from 320² to 480² pixels, and the IMU from 500Hz to 1kHz sampling.  However, every previous attempt to upgrade one has left the scheduling unable to process the others – I suspect I’ll need to wait for the Raspberry Pi B 4 or 5 for the increased performance.

Looking into the future…

  • Implement (ironically named) SLAM object mapping and avoidance with the Sweep, ultimately aimed at maze navigation – just two problems here: no mazes wide enough for ‘H’’s clearance, and the AI required to remember and react so as to explore only unexplored areas in the search for the centre.
  • Fuse GPS latitude, longitude and altitude / down-facing LiDAR + video / ΣΣ acceleration δt δt for vertical + horizontal distance – this requires further connections between the various processes, such that GPS talks to the motion process, which does the fusion.  It enables higher-altitude flights where the LiDAR / video can’t ‘see’ the ground – there are subtleties here in swapping between GPS and video / LiDAR depending on which is working best at a given height above the ground, based on some fuzzy logic.
  • Use the down-facing camera for height and yaw as well as lateral motion – this is more a proof of concept, restrained by the need for much higher-resolution videos, which currently aren’t possible with the RPi B3.
  • Find a cold-fusion nuclear battery bank for a flight from the Cotswolds, UK to Paris, France, landing in Madrid, Spain or Barcelona, Catalonia!

These future aspirations are dreams unlikely to become reality due to power supply, CPU performance or WiFi reach.  Although the WiFi range may be solvable now, the others need future technology, at least one of which may not be available within my lifetime :-).

Wishing you all a happy New Year and a great 2018!

Here we go round the Mulberry bush.

Yesterday it was a cold and frosty morning, but more critically, sunny with only a light breeze:

Thursday’s weather: 28/12/2017 BBC weather forecast

So I managed to squeeze in another test flight, where the plan was to fly ‘H’ between 3 GPS waypoint frisbee-markers from a takeoff point in between all three.  As you can see, reality deviated significantly from the plan.


I’d made a minor change to the code such that the maximum horizontal speed was 1 m/s, reducing proportionally once ‘H’ was less than 5 meters from the target, e.g. at 3m from the target, the speed is set to 0.6m/s.  That worked well approaching the first red frisbee-marker after takeoff.  However, the next phase, although heading in the right direction between the red and orange frisbee-markers, was very unstable and ultimately overshot the orange frisbee-marker, so I killed the flight.
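Under the hood, that’s just a clamp on the autopilot’s velocity target; a minimal sketch (the 1 m/s maximum and 5m slow-down radius are as described above, the rest is my illustration rather than the actual autopilot code):

MAX_SPEED = 1.0        # meters / second
SLOWDOWN_RADIUS = 5.0  # start slowing within this distance of the target

def target_speed(distance_to_target):
    # Full speed beyond 5m, then proportionally slower as the target
    # approaches: 3m away gives 1.0 * 3 / 5 = 0.6 m/s.
    return MAX_SPEED * min(1.0, distance_to_target / SLOWDOWN_RADIUS)

print(target_speed(8.0))  # 1.0
print(target_speed(3.0))  # 0.6

Here’s what the flight controller saw: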

RTF
TAKEOFF
HOVER
GPS: WHERE AM I?
GPS TARGET 6m -69°
GPS TARGET 6m -69°
GPS TARGET 5m -70°
GPS TARGET 4m -71°
GPS TARGET 4m -73°
GPS TARGET 3m -75°
GPS TARGET 2m -78°
GPS TARGET 2m -82°
GPS TARGET 1m -86°
GPS TARGET 1m -90°
GPS TARGET 8m 85°
GPS TARGET 8m 85°
GPS TARGET 7m 85°
GPS TARGET 6m 84°
GPS TARGET 4m 86°
GPS TARGET 2m 93°
GPS TARGET 2m -117°
Flight time 20.242111

The green text is from takeoff to the red frisbee-marker.  The yellow section shows a good heading towards the second frisbee-marker.  It started at the right speed of 1m/s while more than 5m away, but the red lines show the velocity increasing to 2, 2 and 4 m/s as ‘H’ got closer to the orange frisbee-marker.  On the plus side, she knew she’d overshot the target and had been told to double back at the point I killed the flight – check the angles at the end of the red lines.

Time to go bug hunting – it’s hopefully just a stupid, crass typo on my part.  Luckily, the kids go back to school on Tuesday, and the weather forecast so far is looking good for that morning:

Tuesday’s weather forecast

I hope by Tuesday I’ll also have worked out how to get better video quality from the Mavic!


P.S. No crass bug found, so my second-best guesstimate is that the LiPo cooled to below its optimal performance temperature; this has the same effect as seen, and I had set the heater elements to the lowest level.  The plan for the next flight is identical to before, but with the heaters on high and with full logging enabled, in case this also fails and I need to diagnose the source of the problem in detail from the lateral velocity targets.

Perfect, nearly!

With a minor tweak to the u-blox NEO-M8T GPS receiver configuration, ‘H’ headed off in the right direction, making subtle changes of direction as she got closer to the pre-recorded GPS target point, and, when within one meter, hovered briefly before landing.  The only downside was that the LiPo was under 40% by then (it started at 48%), and that seems to be the point where there’s not enough power for a stable descent, hence the very chaotic landing.

This is probably the last test for 2017 due both to the weather forecast and the chaos of Christmas.  See you all in the New Year – have a great holiday break!


P.S. The latest code has been updated on GitHub.

Same shᴉt, different day.

I went to the neighbouring field that the farmer isn’t using because they are digging for gravel in 90% of it, still leaving a sizeable 10% right next to my house; I again tried GPS tracking, and again she overshat (overshotted? overshooted? overshot?) significantly.  She started 13 meters away from the orange frisbee based on the GPS positions of both, correlating nicely with the video.  She was facing almost exactly away from the target, as passed from the autopilot and logged by the motion process:

GPS TARGET 13m 173°

The target is in a NNE direction by gut feel and confirmed by the GPS log stats:

GPS tracking

GPS thought it had travelled about 5.3 meters when in fact it was more than 16 meters, based both on the video and on the GPS itself showing 33 samples at 1Hz, with ‘H’ programmed to fly at 0.5 m/s.

Now, what’s interesting here is the spacing between the dots on the graph.  Since each dot is one second apart, this gives the speed the GPS thought it was moving at:

GPS speed

As already mentioned, ‘H’ was flying at a constant, stable 0.5m/s, confirmed by the video, but the GPS ‘speed’ climbed slowly and never reached the stable 0.5m/s.
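For what it’s worth, each dot-to-dot speed is just the distance between consecutive 1Hz fixes; a minimal sketch of that calculation (the equirectangular approximation is plenty accurate over a few meters – the function name is mine):

import math

EARTH_RADIUS = 6371000.0  # meters

def speed_between_fixes(lat1, lon1, lat2, lon2, dt=1.0):
    # Convert the small latitude / longitude deltas to meters and divide
    # by the 1 second between fixes to get speed in m/s.
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return EARTH_RADIUS * math.hypot(x, y) / dt

# Two fixes 0.5m apart in latitude, 1 second apart: ~0.5 m/s.
print(speed_between_fixes(51.7, -1.9, 51.7000045, -1.9))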

As a result of all the above, I finally have a clue why ‘H’ keeps overshooting her GPS target: it looks like the NEO-M8T algorithm uses some form of low-pass filter, which gives brilliant accuracy for a stable location (i.e. the waypoints and at takeoff), but when moving, each new reading is fused with historic readings, causing significant lag.
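To illustrate the suspicion – and this is purely a guess at the NEO-M8T’s internals, not its documented behaviour – a single-pole low-pass filter on position produces exactly this slow climb in apparent speed:

# Simulate a receiver low-pass-filtering the position of a drone moving
# at a steady 0.5 m/s; alpha is a made-up smoothing factor.
alpha = 0.2
reported = 0.0
for second in range(1, 11):
    true_position = 0.5 * second
    previous = reported
    reported += alpha * (true_position - reported)
    print("t=%2ds apparent speed: %.2f m/s" % (second, reported - previous))

The apparent speed starts well below 0.5 m/s and only crawls towards it – just like the dot spacing on the graph.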

The NEO-M8T has a vast number of config parameters accessible via its u-center app.  Time for me to explore the options.


P.S. A post to the u-blox forum quickly yielded the solution: UBX-CFG-NAV5 can be set via the u-center app (or other means, I’m sure).  By default, the NEO-M8T algorithm uses a “stationary” model, but there are many other options:

u-center

I’ll try the “portable” model first, based on the descriptions of the different models in the NEO-M8T spec, section 7, followed by “pedestrian” and finally “automotive”.
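For anyone without u-center to hand, the same change can be made by writing a UBX-CFG-NAV5 message to the receiver’s serial port; a minimal sketch (the message layout and dynamic-model values are from the u-blox protocol spec; the device path and baud rate are assumptions):

import serial  # pyserial

# UBX dynamic models: 0=portable, 2=stationary, 3=pedestrian, 4=automotive…
PORTABLE = 0

def ubx_message(msg_class, msg_id, payload):
    # UBX framing: sync chars, class, id, little-endian length, payload,
    # then an 8-bit Fletcher checksum over class..payload.
    body = bytes([msg_class, msg_id, len(payload) & 0xFF, len(payload) >> 8]) + payload
    ck_a = ck_b = 0
    for b in body:
        ck_a = (ck_a + b) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return b"\xb5\x62" + body + bytes([ck_a, ck_b])

# CFG-NAV5 (class 0x06, id 0x24) has a 36-byte payload; mask bit 0 means
# "apply the dynModel field", which sits at byte 2.
payload = bytearray(36)
payload[0:2] = (0x0001).to_bytes(2, "little")  # mask: dynamic model only
payload[2] = PORTABLE                          # dynModel

with serial.Serial("/dev/ttyACM0", 9600) as gps:  # assumed device / baud
    gps.write(ubx_message(0x06, 0x24, bytes(payload)))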


P.P.S. The “portable” model worked perfectly, as shown by the next post.  There was one additional config change to make, in the “CFG” section: the changes need to be saved to all possible storage options so the update to “portable” survives a reboot.

Winter Wondering

It’s worth reminding yourself of yesterday’s down-facing Mavic video of the flight first.

This graph is made from the raw GPS data from the NEO-M8T – no processing by me other than saving the results to file:

GPS waypoints and flight

The grey line joins the 3 preset waypoints: orange, red and purple correspond to the same-coloured frisbees you can see in the down-facing video from yesterday.  The GPS waypoints on the graph are a very plausible match for their places in the video, both in the location of each and in the distance and direction between them.

The blue line is Hermione recorded live as she flew to the orange waypoint.

Here are the problems:

  1. Hermione took off from the purple frisbee in real life.  As she took off, she determined her GPS takeoff point dynamically.  This GPS point on the graph is about 4m away from the purple waypoint(s).
  2. In the video, you can see she flew in the right direction towards and beyond the orange frisbee.  In contrast, she thought from her GPS tracking that she was only halfway to the orange waypoint, hence the real-world overshoot, and my termination of the flight.
  3. Hermione was travelling at 1m/s, and the video backs this up.  However, the intermediate GPS locations she read suggest more like 0.3m/s, based on the fact that GPS updates happen at 1Hz.

Both for presetting the waypoints, and during the flight itself, 9 satellites were in use.

While the 4m difference in take-off location is just about tolerable as a fixed offset, the fact that the GPS points suggest 0.3m/s (compared with the real, correct 1m/s) is not, and I have absolutely no idea why the NEO-M8T is doing this.