Well, tangerine actually!
With the IMU FIFO code taking the pressure off the critical timing of IMU reads, a huge range of options has opened up for what to do next with the spare time the FIFO has created:
- A simple keyboard remote control is tempting: the QC code would poll stdin periodically during a hover phase and amend the flight plan dynamically; ultimately, I’d like to do this via a touch screen app on my Raspberry Pi providing joystick buttons. However, for this to work well, I really need to add further sensor input providing feedback on longer-term horizontal and vertical motion…
- A downward facing Ultrasonic Range Finder (URF) would provide the vertical motion tracking when combined with angles from the IMU. I’d looked at this a long time ago, but it stalled because I’d read the sensors could only run at I2C baud rates up to 100kbps, which would rule out the 400kbps required for reading the FIFO. However, a quick test just now shows the URF working perfectly at 400kbps.
- A downward facing RPi camera, combined with the URF, would provide horizontal motion tracking. Again, I’d written this off because of the URF’s supposed baud rate limit, but now it’s worth progressing with. This is the Kitty++ code I started looking at during the summer and abandoned almost immediately, due both to the lack of spare time in the code and to the need for extra CPU cores to do the camera motion processing; Chloe, with her Raspberry Pi B2 and her tangerine PiBow case, more than satisfies that requirement now.
- The magnetometer / compass on the IMU can provide longer-term yaw stability, which currently relies solely on the integrated Z-axis gyro.
I do also have barometer and GPS sensors ‘in stock’, but their application is primarily for long-distance flights over variable terrain at heights above the range of the URF. That’s well out of scope for my current aims, so for the moment I’ll leave those parts shelved.
I have a lot of work to do, so I’d better get on with it.