9 flights

For the record, here are 9 sequential flights, each lasting 10s: 3s ascent at 0.3m/s, 4s hover and 3s descent at 0.3m/s.  The drift is different in each.  There are actually only 8 flights in the video; in the 9th she never took off and then lost WiFi, so I had to unplug her and take her indoors.  I may post it tomorrow, though it mostly consists of me grumbling quietly while I pick her up with my arse facing the camera!

Hermione had Garmin and RaspiCam disabled – videoing the ground for lateral tracking is pointless if the height is not accurately known.

On the plus side, the new foam balls did an amazingly successful job of softening some quite high falls.

The GPS + compass plan

My intent with GPS and compass is that Hermione flies from an arbitrary take-off location to a predetermined target GPS location, oriented in the direction she’s flying.

Breaking that down into a little more detail:

  • Turn Hermione on, calibrate the compass, and wait for enough GPS satellites to be acquired.
  • Carry her to the destination landing point and capture the GPS coordinates, saving them to file.
  • Move to a random place in the open flying area and kick off the flight.
  • Before take-off, acquire the GPS coordinates of the starting point, and from those and the target coordinates, derive the 3D flight direction vector (see the sketch after this list).
  • On take-off, climb to 1m and, while hovering, yaw to point along the target direction vector, using the compass as the only tool to give a N(X), W(Y), Up(Z) orientation vector – some account needs to be taken of magnetic north (compass) vs. true north (GPS).
  • Once done, fly towards the target, always pointing the way she’s flying (i.e. the yaw target is linked to the velocity sensor input), with the current GPS position updating during the flight, continuously realigning the direction vector towards the destination.
  • On arrival at the target GPS location, hover for a second (i.e. braking) and descend.
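Here’s a minimal sketch of the take-off maths, assuming a flat-earth (equirectangular) approximation – plenty good enough over garden-sized distances.  The names are mine for illustration, not lifted from the real code:

```python
import math

EARTH_RADIUS = 6371000.0  # metres

def flight_vector(start, target):
    """
    Unit direction vector from start to target in the N(X), W(Y), Up(Z)
    frame; both arguments are (latitude, longitude) pairs in degrees.
    """
    lat1, lon1 = math.radians(start[0]), math.radians(start[1])
    lat2, lon2 = math.radians(target[0]), math.radians(target[1])

    north = (lat2 - lat1) * EARTH_RADIUS                   # X: metres north
    west = -(lon2 - lon1) * EARTH_RADIUS * math.cos(lat1)  # Y: metres west
    distance = math.hypot(north, west)
    return north / distance, west / distance, 0.0          # Z: level flight

def true_yaw(magnetic_yaw, declination):
    """Magnetic (compass) yaw corrected to true (GPS) north by the local declination."""
    return magnetic_yaw + declination
```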

There’s a lot of detail hidden in the summary above, not least the fact that GPS provides yet another feed of 3D distance and velocity vectors to be fused with the accelerometer / PiCamera / LiDAR, so I’m going to have to go through it step by step.

The first step is to fly a square again, but with her yawing to face the next corner at each hover and, once moving, having yaw follow the direction of movement.  Next comes compass calibration, and a flight plan based upon magnetic north, west and up.

However, someone’s invoked Murphy’s / Sod’s law on me again: Hermione is seeing the I2C errors again despite no hardware or software changes in this area.  Zoe is behaving better, and I’m trying to double the motion tracking rate by doubling both the video frame rate and the sampling rate for the Garmin LiDAR-Lite; the rate change is working for both, but the LiDAR readings seem to be duff, reading 60cm when the flight height is less than 10cm.  Grr 🙁

One last sanity check

Before moving on to compass and GPS usage, there’s one last step I want to ensure works: lateral movement.

The flight plan is defined thus – there’s a sketch of how it might be encoded after the list:

  • take off in the centre of a square flight plan, to about 1m height
  • move left by 50cm
  • move forward by 50cm – this places her in the top left corner of the square
  • move right by 1m
  • move back by 1m
  • move left by 1m
  • move forwards by 50cm
  • move right by 50cm
  • land back at the take-off point.
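For illustration, here’s a minimal sketch of how a plan like this could be encoded as a list of timed velocity targets; the hover phases between legs are omitted, and the names and format are hypothetical rather than the real flight plan file:

```python
# Hypothetical encoding: each leg is (duration s, forward m/s, left m/s, up m/s).
# 50cm at 0.25m/s takes 2s; 1m takes 4s.
SQUARE_PLAN = [
    (2.0,  0.0,   0.0,   0.5),  # take off to ~1m
    (2.0,  0.0,   0.25,  0.0),  # left 50cm
    (2.0,  0.25,  0.0,   0.0),  # forward 50cm - top left corner
    (4.0,  0.0,  -0.25,  0.0),  # right 1m
    (4.0, -0.25,  0.0,   0.0),  # back 1m
    (4.0,  0.0,   0.25,  0.0),  # left 1m
    (2.0,  0.25,  0.0,   0.0),  # forwards 50cm
    (2.0,  0.0,  -0.25,  0.0),  # right 50cm - back over the take-off point
    (2.0,  0.0,   0.0,  -0.5),  # land
]

def velocity_target(plan, elapsed):
    """Return the (vx, vy, vz) velocity target for a given elapsed flight time."""
    for duration, vx, vy, vz in plan:
        if elapsed < duration:
            return vx, vy, vz
        elapsed -= duration
    return 0.0, 0.0, 0.0  # plan complete: hover
```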

The result’s not perfect, despite running the ground-facing camera at 640 x 640 pixels; to be honest, with lawn underneath her, I still think she did pretty well.  She’s still a little lurchy, but I think some pitch / roll rotation PID tuning over the IKEA mat should resolve this quickly.  Once again, you be the judge of whether she flew this 34 second flight well enough.

Zoe++

Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt, and she’s running the video at 400 x 400 px at 10fps.

The results?

And this, peops, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong: I can (will?) probably add minor tweaks to process the compass data – the code is already collecting it; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forwards movement in no wind.  But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep.  I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.


* That’s not quite true: PyPy is Python with a just-in-time (JIT) compiler.  Apparently it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums.  Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed.  Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.

Better macro-block interpretation

Currently, getting lateral motion from a frame full of macro-blocks is very simplistic: find the average SAD (sum of absolute differences) value for the frame, and then include only those vectors whose SAD is lower.
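In sketch form (hypothetical names, numpy for brevity), that filter is no more than:

```python
import numpy as np

def filter_by_sad(vectors, sads):
    """Keep only the motion vectors whose SAD beats the frame average."""
    sads = np.asarray(sads)
    keep = sads < sads.mean()  # lower SAD = better block match
    return np.asarray(vectors)[keep]
```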

I’m quite surprised this works as well as it does, but I’m fairly sure it can be improved.  There are four factors contributing to the content of a frame of macro-blocks:

  • yaw change: all macro-block vectors circle around the centre of the frame
  • height change: all macro-block vectors point towards or away from the centre of the frame
  • lateral motion: all macro-block vectors point in the same direction in the frame
  • noise: the whole purpose of macro-blocks is simply to find the best matching blocks between two frames; doing this over a chess board (for example) could well have any block from the first frame matching any of the 50% of like-coloured squares in the second frame.

Given a frame of macro-blocks, the yaw increment between frames can be found from the gyro, and its contribution thus easily removed.

The same goes for the height change, derived from the LiDAR.

That leaves either noise or lateral motion.  By averaging what remains, we can pick the vectors that are similar in distance and direction to the average vector; SAD doesn’t come into the matter.
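Here’s a rough sketch of that improved filter, assuming each macro-block comes with its position relative to the frame centre; the signs of the yaw and height corrections depend on camera orientation, so treat this as the shape of the idea rather than flight-ready code:

```python
import numpy as np

def lateral_vectors(blocks, yaw_increment, height_old, height_new):
    """
    blocks: array of rows (px, py, dx, dy) - block position relative to the
    frame centre plus its motion vector.  Strip the yaw and height-change
    components, then keep the vectors that agree with the average of the rest.
    """
    blocks = np.asarray(blocks, dtype=float)
    pos, vec = blocks[:, :2], blocks[:, 2:]

    # Yaw: a small rotation about the frame centre moves each block
    # tangentially, perpendicular to its position vector.
    yaw_component = yaw_increment * np.column_stack((-pos[:, 1], pos[:, 0]))

    # Height: climbing shrinks the image of the ground, so blocks slide
    # radially towards the centre by the change of scale.
    height_component = pos * (height_old / height_new - 1.0)

    residual = vec - yaw_component - height_component

    # What's left is lateral motion plus noise: keep the vectors close to the
    # average in both distance and direction - SAD doesn't come into it.
    mean = residual.mean(axis=0)
    keep = np.linalg.norm(residual - mean, axis=1) < np.linalg.norm(mean)
    return residual[keep]
```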

This won’t be my first step however: that’s to work out why the height of the flight wasn’t anything like as stable as I’d been expecting.

Strutting her stuff

Finally, fusion worth showing.

Yes, the height’s a bit variable, as she doesn’t get accurate height readings below about 20cm.

Yes, it’s a bit jiggery, because the scales of the IMU and the other sensors aren’t quite in sync.

But fundamentally, it works – nigh on zero drift for 20s.  With just the IMU, I couldn’t get this minimal level of drift for more than a few seconds.

Next steps: take her out for a longer, higher flight to really prove how well this is working.

Irritations and Innovations

No breakthroughs to report but:

  • Zoe is now running indoors safely with or without motion fusion installed
  • Without the fusion, she drifts horizontally and continues to rise during the hover phase: this suggests the value for gravity captured at take-off has drifted during the flight, perhaps temperature related?  It’s only about 15°C in our house currently, which is outside the range she works well in.  The first test is to add a blob of Blu-Tack on the IMU so it isn’t cooled by the breeze from the props.
  • With fusion, her height is much better, but she swings laterally around her take-off point – the Garmin LiDAR-Lite is doing its job well, but there’s some tuning required for the lateral motion from the Raspberry Pi camera.  Also, it’s dark in the play room even with the lighting on, so I’m going to add LED lighting under her motors to give the camera better sight.  She’s flying over an IKEA LEKPLATS play mat, but ours seems very faded, so I’ll be getting her a new one.
  • I’ve added a whole bunch of safety trip wires so that, for example, if she’s 50cm above where the flight plan says she should be, the flight dies.  Together these make her much safer for flights indoors.
  • I’ve added enhanced scheduling to prioritise IMU over camera input when the IMU FIFO is reading half-full; this is to prevent FIFO overflows as camera processing sometimes takes a while, and the overflows have been happening a lot recently.
  • I’ve also added another couple of pairs of PIDs – I’m not sure how I got away without them before.  The equivalent PIDs control yaw perfectly, but the pitch and roll angle PIDs were missing, skipping straight to the rotation rates instead.  The full cascade per axis now runs like this (with a code sketch after the list):
    • distance (target – input) =PID=> corrective velocity target
    • velocity (target – input) =PID=> corrective acceleration target
    • acceleration target => angular target (maths to choose an angle for a desired acceleration)
    • angle (target – input) =PID=> corrective rotation target
    • rotation (target – input) =PID=> PWM output
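In sketch form, with a deliberately minimal proportional-only PID class standing in for the real thing and placeholder gains, one pass down the pitch (or roll) axis looks like this:

```python
import math

GRAVITY = 9.80665  # m/s^2

class PID:
    """Proportional-only stand-in so the sketch runs; the real PIDs have I and D terms."""
    def __init__(self, kp):
        self.kp = kp

    def compute(self, target, actual):
        return self.kp * (target - actual)

# One PID per step of the cascade; gains are placeholders, not tuned values.
distance_pid = PID(1.0)
velocity_pid = PID(1.0)
angle_pid = PID(1.0)
rotation_pid = PID(1.0)

def pitch_axis(distance_target, distance, velocity, angle, rotation):
    """One pass down the pitch-axis cascade: distance error in, PWM correction out."""
    velocity_target = distance_pid.compute(distance_target, distance)
    acceleration_target = velocity_pid.compute(velocity_target, velocity)

    # Maths, not a PID: the tilt angle whose horizontal component of thrust
    # gives the desired acceleration, with gravity as the vertical component.
    angle_target = math.atan2(acceleration_target, GRAVITY)

    rotation_target = angle_pid.compute(angle_target, angle)
    return rotation_pid.compute(rotation_target, rotation)  # => PWM output
```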

Together all these changes require a lot of tuning, tinkering and testing; I hope to report back with a video when there’s something worth sharing.

Salad bowl

Nothing exciting to report I’m afraid – still waiting for the rain to stop so I can take the girls out.  So I thought I’d show you what they look like.  Both are now kitted out with LiDAR and RaspiCam for vertical and lateral distance sampling.  They run the same software – the code enables 8 ESC objects if the hostname is hermione.local, or 4 otherwise:

Salad bowl
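That hostname check boils down to something as simple as this sketch (ESC and ESC_PINS stand in for the real class and GPIO pin assignments):

```python
import socket

# Hypothetical pin assignments - the real ones live in the quadcopter code.
ESC_PINS = [5, 6, 13, 19, 26, 16, 20, 21]

class ESC:
    def __init__(self, pin):
        self.pin = pin  # the real class drives a PWM output on this pin

# Hermione is the octocopter; everyone else gets four ESCs.
num_escs = 8 if socket.gethostname().startswith("hermione") else 4
escs = [ESC(pin) for pin in ESC_PINS[:num_escs]]
```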

As you may have guessed, Hermione’s lid is a salad bowl from Italy – I spotted one we had and it fitted perfectly, so I got myself an orange one: £5 + £15 shipping to avoid the Italian postal service stealing it in transit.  The bowl is concave on the underside (in its role as a salad bowl).  Luckily, its 10cm diameter matches some black acrylic discs I have (one is on Hermione’s underside, carrying the Garmin LiDAR-Lite and the RaspiCam).  Ultimately, the Scanse Sweep will be attached to the top black disc.

On that train of thought, I need to get a move on as the Scanse Sweep could well arrive before Christmas according to the latest update note.

Motion tracking is now working

Courtesy of a discussion with 6by9 on the Raspberry Pi forum, the motion tracking for Zoe is now working, using raspivid, which has an option not to buffer the macro-block output.  Here’s a plot of a passive flight where she moves forward and back (X) a couple of times, and then left and right (Y).

Macro-block vs accelerometer

The plot clearly shows the macro-blocks and accelerometer X and Y readings are in sync (if not quite at the same scale), so tomorrow’s the day to set Zoe loose in the garden with the motors powered up – fingers crossed no cartwheels across the lawn this time!
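For the record, the unbuffered capture amounts to spawning raspivid with the macro-block vectors directed at a FIFO; the flags below are from memory, so check raspivid --help before trusting them:

```python
import os
import subprocess

FIFO = "/tmp/motion_fifo"
if not os.path.exists(FIFO):
    os.mkfifo(FIFO)

# The H.264 stream itself is binned; the macro-block vectors (-x) go to the
# FIFO, and -fl flushes the output buffers so each frame's vectors arrive
# as soon as they're encoded rather than in batches.
raspivid = subprocess.Popen([
    "raspivid",
    "-w", "400", "-h", "400",  # 400 x 400 px
    "-fps", "10",              # 10 frames per second
    "-t", "0",                 # no timeout: run until killed
    "-fl",                     # flush buffers after each write
    "-o", "/dev/null",         # discard the video
    "-x", FIFO,                # macro-block motion vectors out here
])
```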

Video buffering

I flew Zoe over the weekend with the camera motion tracking running, and it was pretty exciting watching her cartwheel across the lawn – clearly there’s more to do in the testing before I try again!

So I did a passive flight in the ‘lab’ just now and got these stats; I had hoped to show a comparison of the accelerometer measured distances vs the camera video distances, but that’s not what I got:

Motion stats

The lower two graphs are the interesting ones: the left one shows how bad the integration error from the accelerometer is – all the detail in the accelerometer data is swamped by integrated offset errors.  It also shows that we are only getting data from the video every four seconds.

So I did a test with my raw motion code (i.e. no overheads from the other sensors in the quadcopter code), and it showed those 4 second batches contain 39 samples – almost exactly four seconds’ worth at the configured 10fps – so clearly something is buffering the video frames.
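The counting itself is trivial: read fixed-size frames of vectors off the FIFO and log when each one turns up; if the 10fps stream is being buffered, the log shows clumps rather than a steady 0.1s beat.  A sketch (the frame size assumes the encoder’s extra column of vectors):

```python
import time

# Each macro-block is 4 bytes (int8 dx, int8 dy, uint16 SAD); a 400 x 400
# frame at 16px blocks gives (400/16 + 1) * (400/16) = 650 of them - the
# extra column is a quirk of the encoder's output.
FRAME_BYTES = (400 // 16 + 1) * (400 // 16) * 4

with open("/tmp/motion_fifo", "rb") as fifo:
    start = time.time()
    while True:
        frame = fifo.read(FRAME_BYTES)
        if len(frame) < FRAME_BYTES:
            break
        print("frame at %.2fs" % (time.time() - start))
```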

So the next step is to work out whether it’s the FIFO or the video that’s doing the buffering, and how to stop it!