Before moving on to compass and GPS usage, there’s one last step I want to ensure works: lateral movement.
The flight plan is defined thus:
- take off in the center of a square flight plan to about 1m height
- move left by 50cm
- move forward by 50cm – this places her in the top left corner of the square
- move right by 1m
- move back by 1m
- move left by 1m
- move forwards by 50cm
- move right by 50cm
- land back at the take-off point.
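The square plan above can be sketched as a list of lateral legs; the names and axis convention here are my own illustration, not the real flight code. The point worth checking is that the legs sum to zero, so she ends up back over the take-off point:

```python
# The square plan above, encoded as (forward, left) displacements in
# metres.  Names and axis convention are illustrative assumptions.
LEGS = [
    (0.0,  0.5),   # left 50cm
    (0.5,  0.0),   # forward 50cm - top left corner
    (0.0, -1.0),   # right 1m
    (-1.0, 0.0),   # back 1m
    (0.0,  1.0),   # left 1m
    (0.5,  0.0),   # forward 50cm
    (0.0, -0.5),   # right 50cm
]

# The lateral legs must cancel out for her to land where she took off.
dx = sum(forward for forward, left in LEGS)
dy = sum(left for forward, left in LEGS)
assert (dx, dy) == (0.0, 0.0)
```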
The result’s not perfect despite running the ground-facing camera at 640 x 640 pixels; to be honest, with lawn underneath her, I still think she did pretty well. She’s still a little lurchy, but I think some pitch / roll rotation PID tuning over the IKEA mat should resolve this quickly. Once again, I’ll let you judge whether she achieved this 34 second flight well enough.
Here finally is her flying in a stable hover for a long time without rocketing off into space. Yes, she’s wobbly, but that’s a simple pitch / roll rotation rate PID tune much like I had to do with Zoe. She’s running the video at 560 x 560 pixels at 10 fps, hence no need for the IKEA play mat.
Finally I can move on to adding the compass and GPS into the mix.
For the first time in 4 years, I tried a lateral flight plan, fairly confident in the belief that stable hover in a headwind is no different to intentional movement forward in no wind. The flight plan was:
- Climb at 30cm/s for 2s
- Hover for 1s
- Move forwards at 30cm/s for 2s
- Hover for 1s
- Move backwards at 30cm/s for 2s
- Hover for 1s
- Descend at 30cm/s for 2s
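One way to encode the plan above is as a list of (vx, vy, vz, duration) velocity targets; this is a minimal sketch with names of my own choosing, not the real flight code, with x forwards and z up in m/s:

```python
# Hypothetical encoding of the flight plan as velocity targets plus
# durations: (vx, vy, vz, seconds), x forwards and z up, in m/s.
FLIGHT_PLAN = [
    ( 0.0, 0.0,  0.3, 2.0),  # climb at 30cm/s for 2s
    ( 0.0, 0.0,  0.0, 1.0),  # hover for 1s
    ( 0.3, 0.0,  0.0, 2.0),  # forwards at 30cm/s for 2s
    ( 0.0, 0.0,  0.0, 1.0),  # hover for 1s
    (-0.3, 0.0,  0.0, 2.0),  # backwards at 30cm/s for 2s
    ( 0.0, 0.0,  0.0, 1.0),  # hover for 1s
    ( 0.0, 0.0, -0.3, 2.0),  # descend at 30cm/s for 2s
]

def target_at(t):
    """Velocity target for elapsed flight time t; hover once the plan ends."""
    so_far = 0.0
    for vx, vy, vz, dt in FLIGHT_PLAN:
        so_far += dt
        if t < so_far:
            return (vx, vy, vz)
    return (0.0, 0.0, 0.0)
```

This is why intentional movement costs nothing extra over hover: each leg is just a different velocity target fed to the same PIDs.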
Was I right? You tell me:
Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks. She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt and she’s running the video at 400 x 400 px at 10fps.
And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*. Don’t get me wrong, I can (will?) probably add minor tweaks to process compass data – the code is already collecting that; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forwards movement in no wind. But beyond that, I need more CPU cores without significant additional power requirements to support GPS and Scanse Sweep. I hope that’s what the A3 eventually brings.
I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.
* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler. Apparently, it’s the dog’s bollocks, the mutt’s nuts, the puppy’s plums. Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries needed. Integrating those with PyPy requires a lot of work which up until now simply hasn’t been necessary.
Both flights use identical code. There are two tweaks compared to the previous videos:
- I’ve reduced the gyro rate PID P gain from 25 to 20 which has hugely increased the stability
- Zoe is using my refined algorithm for picking out the peaks in the macro-block frames – I think this is working better but there’s one further refinement I can make which should make it better yet.
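To show where that P gain sits, here’s a minimal PID sketch – purely illustrative, not the real flight code – with the proportional term scaled by kp:

```python
class PID:
    """Minimal PID sketch - illustrative only, not the real flight code."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Dropping kp from 25 to 20 softens the reaction to each error,
        # trading a touch of responsiveness for much less overshoot.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

gyro_rate_pid = PID(kp=20.0, ki=0.0, kd=0.0)  # hypothetical gains
```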
I’d have liked to show Hermione doing the same, but for some reason she’s getting FIFO overflows. My best guess is that her A+ overclocked to turbo (1GHz CPU) isn’t as fast as a Zero’s default setting of 1GHz – no idea why. My first attempt at a fix has been improved scheduling, splitting the macro-block vector processing into two phases:
- build up the dictionary of the set of macro-blocks
- process the dictionary to identify the peaks.
Zoe does this in one fell swoop; Hermione schedules each phase independently, checking in between that the FIFO hasn’t filled to a significant level, and if it has, dealing with that first. This isn’t quite working yet in passive testing, even on Zoe, and I can’t work out why! More anon.
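The two phases above could look something like this – a sketch with made-up data structures, not the real code: phase 1 bins each frame’s macro-block vectors into a dictionary keyed by (dx, dy); phase 2 picks the most common vector as the frame’s overall motion.

```python
def build_histogram(vectors):
    """Phase 1: count how many macro-blocks share each (dx, dy) vector."""
    histogram = {}
    for dx, dy in vectors:
        histogram[(dx, dy)] = histogram.get((dx, dy), 0) + 1
    return histogram

def find_peak(histogram):
    """Phase 2: the dominant vector is taken as the frame's motion."""
    return max(histogram, key=histogram.get)

# Between the two phases there's a natural point to check the motion
# FIFO and drain it first if it's filling up, rather than risk overflow.
```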
Rather than spending ages trying to work out how to get rid of Hermione’s electronic errors, I’ve opted for a much simpler solution of powering her HoG from a 5V 2.1A battery bank. I used to do this in the olden days, and only swapped to LiPo + regulator when I needed space for the LiDAR and camera. But Hermione has plenty of spare space for both. Here’s what happened:
She was doing pretty well tracking against the lawn in windy conditions, right up to the point she went nuts and zoomed off into the sky. Shortly after, the LiDAR detected she was higher than the flight plan said she should be and killed the flight. Once she vanishes from the shot, watch for her reflection in the window as she falls down to earth. She landed in a soft flower bed but that still resulted in two cut motor wires. No I/O error happened (good), but this type of radical error normally accompanies a FIFO overflow, and one wasn’t reported either. That leaves me stuck having to keep risking these dodgy flights until I can track down the cause.
A few test runs. In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight. Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (kill ensued). I think that’s pretty conclusive proof the fusion code works!
Finally, fusion worth showing.
Yes, height’s a bit variable as she doesn’t get accurate height readings below about 20cm.
Yes, it’s a bit jittery because the scales of the IMU and the other sensors aren’t quite in sync.
But fundamentally, it works – nigh on zero drift for 20s. With just the IMU, I couldn’t get this minimal level of drift for more than a few seconds.
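The post doesn’t spell out the fusion algorithm, but a classic way to blend a fast-drifting IMU with slower, drift-free sensors like the LiDAR and camera is a complementary filter – a minimal sketch under that assumption:

```python
def complementary(imu_value, sensor_value, tau, dt):
    """One complementary-filter step: trust the IMU short-term and the
    slower sensor (LiDAR / camera) long-term.  tau sets the crossover:
    larger tau leans on the IMU for longer before the sensor pulls it back."""
    alpha = tau / (tau + dt)
    return alpha * imu_value + (1.0 - alpha) * sensor_value
```

With tau equal to dt, each step splits the trust 50/50; the IMU’s drift gets continuously corrected towards the absolute sensor readings, which would explain the nigh-on zero drift over 20s.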
Next steps: take her out for a longer, higher flight to really prove how well this is working.
Bloomberg HelloWorld published their programme on English technology yesterday.
Zoe only appears at 2:18 for a fraction of a second, despite a 2 hour chat I had with these guys; on the plus side, they did take my advice and visit the Cotswold Raspberry Jam where they found David Pride and his amazing creations – have a look at 7:25 onwards and again at 11:12.
The paint had hardly dried on the raspivid solution for unbuffered macro-block streaming before a conversation on the Raspberry Pi forums yielded a way to get picamera to stream the video macro-blocks without buffering: explicitly open an unbuffered output file, rather than leaving picamera to second-guess what’s needed.
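The trick looks something like this – a sketch, with the file paths and resolution as my assumptions; the key point is opening the motion output yourself with buffering=0 before handing it to picamera’s motion_output:

```python
try:
    import picamera          # only available on a Raspberry Pi
except ImportError:
    picamera = None

def record_motion(seconds=20, path='motion.dat'):
    """Record H.264 video, streaming macro-blocks to `path` unbuffered."""
    with picamera.PiCamera() as camera:
        camera.resolution = (400, 400)
        camera.framerate = 10
        # The key trick: open the motion-vector output ourselves with
        # buffering=0 so each frame's macro-blocks arrive as soon as
        # they're produced, instead of sitting in a Python file buffer.
        with open(path, 'wb', buffering=0) as motion:
            camera.start_recording('/dev/null', format='h264',
                                   motion_output=motion)
            camera.wait_recording(seconds)
            camera.stop_recording()
```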
Cutting to the chase, here’s the result.
Down-facing RaspiCam stabilized Zoe flight from Andy Baker on Vimeo.
Yes, she drifted forward into the football goal over the course of the 20 seconds, but that’s 4 times longer to drift that far, so frankly I’m stunned how good the flight was! I have ideas as to why, and those are the subject of my next post.
And the best bit? No damage was done to my son’s goal net!