Buttercups and daisies…

are still lacking this spring, and having mown the lawn yesterday, features are hard for the video lateral tracking to find.  So I think this is a pretty good 37s hover; in fact, I think it’s as good as it can be until the daisies start sprouting:

This is with a frame size of 640² pixels.  There’s a check in the code which reports whether the processing keeps up with the video frame rate.  At 640² it does; I tried 800² and 720², but at both the code failed to keep up with the 20fps frame rate.
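
For anyone curious, the keep-up check boils down to comparing per-frame processing time against the frame budget – 50ms at 20fps.  Here’s a minimal sketch of the idea using picamera’s motion-vector callback; the class and function names are mine for illustration, not the actual code on GitHub:

    import time
    import picamera
    import picamera.array

    FPS = 20
    FRAME_BUDGET = 1.0 / FPS  # 50ms per frame at 20fps

    def process_macro_blocks(a):
        pass  # stand-in for the real macro-block processing

    class MotionTracker(picamera.array.PiMotionAnalysis):
        # picamera calls analyse() once per frame, passing the H.264
        # macro-block vectors as a numpy array.
        def analyse(self, a):
            start = time.time()
            process_macro_blocks(a)
            if time.time() - start > FRAME_BUDGET:
                print("Lagging behind the video frame rate!")

    with picamera.PiCamera() as camera:
        camera.resolution = (640, 640)
        camera.framerate = FPS
        with MotionTracker(camera) as tracker:
            camera.start_recording('/dev/null', format='h264',
                                   motion_output=tracker)
            camera.wait_recording(37)  # the 37s hover
            camera.stop_recording()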

As a result, I’ve uploaded the changes to GitHub.  There’s work-in-progress code there for calibrating the compass – calibrateCompass() – although that’s turning out to be a right PITA.  I’ll explain more another time.
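
For what it’s worth, the usual first step in compass calibration is hard-iron offset estimation: rotate the compass through every orientation and take the midpoint of the min / max readings on each axis.  Whether calibrateCompass() ends up doing exactly this is another matter – treat this as a generic sketch, not the GitHub code:

    def hard_iron_offsets(samples):
        # samples: raw (mx, my, mz) magnetometer readings collected while
        # rotating the compass through all orientations.
        mins = [min(s[i] for s in samples) for i in range(3)]
        maxs = [max(s[i] for s in samples) for i in range(3)]
        # The midpoint on each axis is the fixed (hard-iron) offset to
        # subtract from every subsequent raw reading.
        return tuple((lo + hi) / 2 for lo, hi in zip(mins, maxs))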

As a side note, my Mavic uses two forward-facing cameras to track horizontal movement stereoscopically, combined with GPS, a corresponding ground-facing pair of cameras, and IMU accelerometer integration; yet if you watch the frisbee / baseball bat to the left, even the Mavic drifts.

That’s better…

not perfect, but dramatically better.  The flight plan was:

  • take off to 1m
  • move left over 6s at 0.25m/s while simultaneously rotating ACW 90° to point in the direction of travel – see the sketch after this list
  • land.
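
The yaw target for that second leg falls straight out of the flight plan velocity.  A one-function sketch – the axis and sign conventions here are my illustration, not Hermione’s actual frame definitions:

    import math

    def yaw_target(evx, evy):
        # Point the nose along the earth-frame velocity vector; atan2
        # copes with all four quadrants, so a purely leftwards leg
        # yields the expected 90 degree target.
        return math.atan2(evy, evx)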

I’d spent yesterday’s wet weather rewriting the macro-block processing code, breaking it up into 5 phases:

  1. Upload the macro-block vectors into a list
  2. For each macro-block vector in the list, undo the yaw that had happened between this frame and the previous one
  3. Fill up a dictionary indexed by the un-yawed macro-block vectors
  4. Scan the dictionary, identifying clusters of vectors, assigning scores, and building a list of the highest-scoring vector clusters
  5. Average the few highest-scoring clusters, reapply the yaw removed in step 2, and return the resultant vector

Although this is quite a lot more processing, splitting it into five phases, compared to yesterday’s two, means that between each phase the IMU FIFO can be checked, and processed if it’s filling up, thus avoiding a FIFO overflow.
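
To give a feel for the shape (though not the detail) of those five phases, here’s a much-simplified sketch; rotate(), the crude cluster scoring and check_imu_fifo() are stand-ins for the real thing:

    import math

    def check_imu_fifo():
        # Stand-in: drain and process the IMU FIFO if it's filling up,
        # so it can't overflow while the macro-block maths runs.
        pass

    def rotate(v, angle):
        # Rotate a 2D vector by the given angle in radians.
        c, s = math.cos(angle), math.sin(angle)
        return (v[0] * c - v[1] * s, v[0] * s + v[1] * c)

    def process_frame(macro_blocks, yaw_delta):
        # Phase 1: upload the macro-block vectors into a list.
        vectors = [(mb[0], mb[1]) for mb in macro_blocks]
        check_imu_fifo()

        # Phase 2: undo the yaw between this frame and the previous one.
        unyawed = [rotate(v, -yaw_delta) for v in vectors]
        check_imu_fifo()

        # Phase 3: fill a dictionary indexed by the un-yawed vectors.
        counts = {}
        for v in unyawed:
            key = (round(v[0]), round(v[1]))
            counts[key] = counts.get(key, 0) + 1
        check_imu_fifo()

        # Phase 4: find the highest-scoring clusters - crudely
        # approximated here by the three most populous bins.
        best = sorted(counts, key=counts.get, reverse=True)[:3]
        check_imu_fifo()

        # Phase 5: average the best clusters, then redo the yaw undone
        # in phase 2 before returning the result.
        if not best:
            return (0.0, 0.0)
        avg = (sum(v[0] for v in best) / len(best),
               sum(v[1] for v in best) / len(best))
        return rotate(avg, yaw_delta)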

Two more subtle problems remain:

  1. She should have stayed in frame
  2. She didn’t quite rotate the 90° to head left.

Nonetheless, I once more have a beaming success smile.

Weird yaw behaviour

I’ve implemented the yaw code such that Hermione points in the direction she should be travelling, based upon the flight plan velocity vector.  She should take off, then move left at 0.25m/s for 6 seconds, while also rotating anti-clockwise by 90° to face the way she’s supposed to be travelling.  However, here’s what my Mavic & I saw:

My best guess is that the cause is the camera lateral tracking, which simply looks for peaks in the macro-block vectors after stashing them all in a dictionary indexed by vector.  This ignores yaw, which was fine up to now, as I’d set the yaw target to zero.  I think I need to add an extra stage which un-yaws each macro-block vector before adding them to the dictionary and looking for peaks.  That’s relatively easy code, involving tracking yaw between video frames, but costly, as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks.  Time will tell.
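
In outline, tracking yaw between frames just means remembering the fused yaw angle at the previous video frame and taking the difference; that delta is what the un-yaw phase would rotate each vector by.  A sketch with illustrative names:

    class YawTracker:
        # Remember the yaw angle at the previous video frame so each new
        # frame's macro-block vectors can be rotated back through the
        # yaw change since then.
        def __init__(self):
            self.prev_yaw = None

        def delta(self, yaw):
            d = 0.0 if self.prev_yaw is None else yaw - self.prev_yaw
            self.prev_yaw = yaw
            return d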

FMR!

I haven’t smiled this much since my kids were born.  I’m still beaming now, with the occasional burst of delighted laughter too; my jaws are even starting to ache.  It’s just over 4 years since the project was born, and it has finally shown that a Raspberry Pi running standard Raspbian Linux and programmed in Python is more than capable of running a fully autonomous quadcopter (though strictly speaking, Hermione is an X8 drone). 🙂 😛 😀 😎


P.S. This isn’t the end but the opening up of the next phase:

  • Test higher resolutions for the ground-facing RaspiCam video, ideally up to the maximum of 1076 x 1076 pixels, and see whether that’s good enough for lateral flights on the gravel drive.  This is relatively simple and could well happen in the next few days.
  • Add GPS, both for setting multi-step targets and, combined with the compass, for tracking the flight location / orientation between those targets.  There are several phases here to keep me busy.
  • Add the imminently arriving Scanse Sweep for object avoidance
  • Set her loose in a maze!

But for the moment, I just need to wait for my rabid grin to ease!


P.P.S. Code’s updated on GitHub


P.P.P.S. I’ve stashed the video on the front page so I can carry on blogging.

9 flights

For the record, here are 9 sequential flights, each lasting 10s: 3s ascent at 0.3m/s, 4s hover, and 3s descent at 0.3m/s.  The drift is different in each.  Only 8 of the flights appear in the video; on the 9th, she never took off and then lost WiFi, so I had to unplug her and take her indoors.  I may post that one tomorrow, though it mostly consists of me grumbling quietly while I pick her up with my arse facing the camera!

Hermione had the Garmin and RaspiCam disabled – videoing the ground for lateral tracking is pointless if the height isn’t accurately known.
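
The reason height matters so much: the patch of ground each pixel covers grows linearly with height, so converting a macro-block shift into metres requires the height estimate – get the height wrong and the lateral velocity is wrong by the same factor.  Roughly, with an assumed (not measured) field of view:

    import math

    FOV = math.radians(62.2)  # assumed RaspiCam horizontal field of view
    PIXELS = 640              # frame width in pixels

    def pixel_shift_to_metres(shift, height):
        # The frame spans 2 * height * tan(FOV / 2) metres of ground,
        # so the metres-per-pixel scale rises linearly with height.
        return shift * 2 * height * math.tan(FOV / 2) / PIXELS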

On the plus side, the new foam balls did an amazingly successful job of softening some quite high falls.

One last sanity check

Before moving on to compass and GPS usage, there’s one last step I want to ensure works: lateral movement.

The flight plan is defined thus:

  • take off in the centre of the square flight plan, climbing to about 1m
  • move left by 50cm
  • move forward by 50cm – this places her in the top left corner of the square
  • move right by 1m
  • move back by 1m
  • move left by 1m
  • move forwards by 50cm
  • move right by 50cm
  • land back at the take-off point.
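
All these flight plans are just lists of velocity / duration legs.  A hypothetical encoding of the square – the real flight plan format on GitHub differs, and the axis conventions here are mine:

    # Each leg: (duration s, evx m/s, evy m/s, evz m/s) in the earth frame.
    SQUARE_PLAN = [
        (2.0,  0.0,   0.0,   0.5),   # take off to ~1m
        (2.0, -0.25,  0.0,   0.0),   # left 50cm
        (2.0,  0.0,   0.25,  0.0),   # forward 50cm - top left corner
        (4.0,  0.25,  0.0,   0.0),   # right 1m
        (4.0,  0.0,  -0.25,  0.0),   # back 1m
        (4.0, -0.25,  0.0,   0.0),   # left 1m
        (2.0,  0.0,   0.25,  0.0),   # forward 50cm
        (2.0,  0.25,  0.0,   0.0),   # right 50cm
        (2.0,  0.0,   0.0,  -0.5),   # land back at the take-off point
    ]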

The result’s not perfect despite running the ground-facing camera at 640 x 640 pixels; to be honest, with lawn underneath her, I still think she did pretty well.  She’s still a little lurchy, but I think some pitch / roll rotation PID tuning over the IKEA mat should resolve that quickly.  Once again, you be the judge of whether she flew this 34 second flight well enough.
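
For reference, the tuning in question is on textbook rotation-rate PIDs; the gains (the bit that gets tuned) below are placeholders, not Hermione’s real values:

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = None

        def update(self, target, actual, dt):
            # Standard PID update: tuning hunts for the kp / ki / kd
            # that kill the lurching without introducing oscillation.
            error = target - actual
            self.integral += error * dt
            d = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * d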

Hermione’s proof of the pudding

Here, finally, is her flying in a stable hover for a long time without rocketing off into space.  Yes, she’s wobbly, but that’s a simple pitch / roll rotation rate PID tune, much like the one I had to do with Zoe.  She’s running the video at 560 x 560 pixels at 10fps, hence no need for the IKEA play mat.

Finally I can move on to adding the compass and GPS into the mix.

First lateral flight plan

For the first time in 4 years, I tried a lateral flight plan, fairly confident in the belief that stable hover in a headwind is no different to intentional movement forward in no wind.  The flight plan was:

  • Climb at 30cm/s for 2s
  • Hover for 1s
  • Move forwards at 30cm/s for 2s
  • Hover for 1s
  • Move backwards at 30cm/s for 2s
  • Hover for 1s
  • Descend at 30cm/s for 2s

Was I right?  You tell me:

Zoe++

Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt and she’s running the video at 400 x 400 px at 10fps.

The results?

And this, peops, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong, I can (and probably will) add minor tweaks to process the compass data – the code is already collecting it; and adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forward movement in no wind.  But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep.  I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.


* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler.  Apparently it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums.  Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries it needs.  Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.