Whether ’tis easier for the macro-blocks to track
The clovers and daisies of contrasting colour..”
The answer is no, I shouldn’t have mown the lawn. With the kids’ toys moved out of the way, and any ungreen contrasting features slain, there was nothing to distinguish one blade of shorn grass from another and Hermione drifted wildly. Reinstating the kids’ chaos restored high quality tracking over 5m.
The point of the flight was to compare GPS versus macro-block lateral tracking. Certainly over this 5m flight, down-facing video beat GPS hands down:
My best guess interpretation of the GPS graph is that the flight actually ran diagonally north west from the 2m to the 7m point. The camera POV doesn’t include compass data, so it’s correctly showing her flying forwards by 5m. The compass code is not working accurately yet – it needs more investigation as to why – it was showing ~90° (i.e. East) rather than the true 45° (i.e. North East) shown by the GPS and a handheld compass.
I’ve made some more refinements to the scheduling of the sensor reads, and to the accuracy of the GPS data streamed from the GPS process. It’s worth viewing this graph full screen – each spike shows the time in seconds between motion processing loops – i.e. the time spent processing other sensors – 10ms indicates just IMU data was processed. The fact no loop takes more than 0.042s* even with full diagnostics running means I could up the sampling rate back to 1kHz – it’s at 500Hz at the moment. More importantly, it shows processing is nicely spread out: each sensor is getting its fair share of the processing and nobody is hogging the limelight.
As a result, I’ve updated the code on GitHub.
*42ms is the point where the IMU FIFO overflows at 1kHz sampling: 512 byte FIFO / 12 bytes per sample / 1kHz sampling rate ≈ 42ms
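That overflow arithmetic can be sanity-checked in a couple of lines – the 512 byte FIFO and 12 byte sample size come from the footnote above, the rest is just division:

```python
# FIFO overflow point: how long before a full FIFO starts losing data.
FIFO_BYTES = 512        # IMU FIFO size, per the footnote
BYTES_PER_SAMPLE = 12   # one batch of IMU samples
samples = FIFO_BYTES // BYTES_PER_SAMPLE   # 42 whole samples fit

for rate_hz in (500, 1000):
    overflow_s = samples / float(rate_hz)
    print("%4dHz sampling: FIFO overflows after %.3fs" % (rate_hz, overflow_s))
```

At 500Hz the budget is a comfortable 0.084s; at 1kHz it shrinks to the 0.042s mentioned above.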
First, the result: an autonomous 10m linear flight forwards:
You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine. At the point chaos broke loose, she believed she had reached her 10m target and thus was descending; she’s not far wrong – the start and end points are two round stones placed a measured 10m apart to within a few centimetres.
So here’s what’s changed in the last week:
As a result of all the above, I’ve updated GitHub.
The sun wasn’t shining brightly, so there was no high contrast on the lawn; the lawn had been mowed too, removing the contrasting grass clumps. Yet she still made a great attempt at a 1m square. I think this is about as good as she can get – it’s time for me to move on to adding compass, GPS and object avoidance. The code has been updated on GitHub.
Daisies are as yet lacking this spring, and having mown the lawn yesterday, features are hard for the video lateral tracking to find. So I think this is a pretty good 37s hover. In fact, I think it’s as good as it can be until the daisies start sprouting:
This is with a frame size of 640² pixels. There’s a check in the code which reports whether the processing keeps up with the video frame rate. At 640² it does; I tried 800² and 720², but at both the code failed to keep up with the 20fps video frame rate.
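The keep-up check boils down to comparing per-frame processing time against the frame budget. A minimal sketch of the idea – `keeps_up`, `process_frame` and the frame source are my placeholder names, not the real code:

```python
import time

FPS = 20
FRAME_BUDGET = 1.0 / FPS   # 50ms per frame at 20fps

def keeps_up(process_frame, frames):
    """Return True if every frame is processed within the frame budget."""
    ok = True
    for frame in frames:
        start = time.time()
        process_frame(frame)
        if time.time() - start > FRAME_BUDGET:
            ok = False   # this frame overran; video processing is lagging
    return ok
```

At 800² or 720² the per-frame macro-block processing simply exceeds the 50ms budget, hence the fallback to 640².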
As a result, I’ve uploaded the changes to GitHub. There’s work-in-progress code there for calibrating the compass “calibrateCompass()”, although that’s turning out to be a right PITA. I’ll explain more another time.
As a side note, my Mavic uses two forward-facing cameras to track horizontal movement stereoscopically, combined with GPS, a corresponding ground-facing pair of cameras and IMU accelerometer integration; yet if you watch the frisbee / baseball bat to the left, even the Mavic drifts.
Up to now, I’ve deferred adding code to GitHub until a particular development phase is over. However, so much has changed in the past few months, I ought to share. Features:
- X8 optional configuration
- more precise and efficient scheduling using select() allowing for extra sensors…
- LEDDAR – fully tested
- PX4FLOW – tested to the extent the quality of the PX4FLOW allows
- ViDAR – testing in progress
- Garmin LIDAR-Lite V3 – arrival imminent
- Compass – tested but unused except for logging
- Fusion – tested, but each sensor source requires different fusion parameters
The compass function is unused except for logging. The ViDAR and Fusion features require at least a height sensor and further calibration. Therefore, I strongly recommend setting
self.camera_installed = False
unless you want to see how well it isn’t working yet.
You can enable logging of the ViDAR stats without including them in the Fusion by setting the above to True and also setting these two flags to False:
# Set the flags for horizontal distance and velocity fusion
hdf = False
hvf = False
This code comes with absolutely no warranty whatsoever – even less than it normally does. Caveat utilitor.
I flew Phoebe earlier with the LEDDAR working well. As the graph below shows, she repeatedly drifted left and self-corrected. This is the expected behaviour: the top level PID controls velocity, not distance, so drift is stopped but not reversed. On that basis, I’ve updated the code on GitHub.
To get her back to where she took off from, I need another set of ‘pseudo-PIDs’ to recognise and correct the distance drifted. I’m going to keep this as simple as possible:
- I’ll continue to use my velocity flight plan – integrating this over time will provide the ‘target’ for the distance PIDs in the earth reference frame
- I’ll integrate the velocity (rotated back to earth frame) over time to get the distance PID ‘input’ – although this is double integration of the accelerometer, it should be good enough short-term based upon the close match between the graph and the flight I watched.
- critically, I’ll be using fixed velocity output from the distance ‘pseudo PID’, rotated back to the quad-frame as the inputs to the existing velocity PIDs – the input and target of the distance ‘pseudo PID’ only provide the direction, but not speed of the correction.
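The scheme above can be sketched in a few lines. The class and variable names, and the fixed correction speed, are my illustrative choices rather than the real flight code:

```python
import math

class DistancePseudoPID(object):
    """Direction-only 'pseudo PID': the distance error sets only the
    direction of a fixed-speed velocity correction, per the plan above."""

    def __init__(self, correction_speed):
        self.correction_speed = correction_speed   # m/s, fixed magnitude

    def compute(self, target_x, target_y, input_x, input_y):
        # Earth-frame error: integrated flight plan minus integrated velocity.
        error_x = target_x - input_x
        error_y = target_y - input_y
        distance = math.sqrt(error_x * error_x + error_y * error_y)
        if distance == 0.0:
            return 0.0, 0.0
        # Normalise: only the direction survives; the speed is fixed.
        return (self.correction_speed * error_x / distance,
                self.correction_speed * error_y / distance)
```

So a drift of 1m east with a 0.3m/s correction speed yields a steady 0.3m/s westward velocity target, which then gets rotated to the quad frame and fed to the existing velocity PIDs.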
This should be a relatively simple change which will have a visible effect on killing hover drift, or allowing intentional horizontal movement in the flight plan.
After that, I’ll add the compass for yaw control so Phoebe is always facing the way she’s going, but that’s for another day.
I’ve updated the code on GitHub. The only significant change is the Raspberry Pi camera works correctly with the restructured code.
As far as flight quality is concerned, it’s better than the earlier versions from before I added the class code and switched to the IMU FIFO, but there’s a long way to go before it’s good enough.
I don’t think there’s much else I can do without adding the laser sensors, and that means it’s going to continue to be quiet here for a while.
I’ve finally got the FIFO buffer code working, with lots of protection against overflow and a guaranteed way to ensure the code and FIFO are always synchronised. It works perfectly, so I’ve updated the code on GitHub.
Then I took Zoe and Phoebe out; with the floppy props Zoe flew OK but not as well as usual and, as usual, neither would fly at all with the CF props. Some stats unsurprisingly revealed that it’s Z-axis noise from the props; Zoe’s floppy props aren’t so floppy at freezing temperatures, but when I brought her indoors, she was fine again.
The problem is the motors / props can’t react fast enough to sharp spikes in acceleration, so drift ensues – in this case downwards vertical drift keeping them both pinned to the ground when the sensors felt the spikes. I need to find a way to soften those acceleration spikes such that the net integrated velocity is the same, and the motors can react to it.
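To picture the “soften the spikes but keep the net integrated velocity” requirement: spreading each acceleration sample over a short window flattens spikes while leaving the sum – and hence the integrated velocity – untouched. This is just a hypothetical illustration of the requirement, not either of the two approaches described next:

```python
def soften(samples, window):
    # Spread each acceleration sample evenly over `window` slots:
    # spikes flatten, but the total (integrated velocity) is unchanged.
    out = [0.0] * (len(samples) + window - 1)
    for i, a in enumerate(samples):
        share = a / float(window)
        for j in range(window):
            out[i + j] += share
    return out
```

A 12 m/s² one-sample spike spread over three samples becomes three 4 m/s² samples: gentle enough for the motors to follow, with the same net velocity change.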
There’s a couple of approaches I can take here, and as usual, I’ll be trying both.
The first is to add additional foam buffering between the HoG and the frame to soften the blows just like Zoe’s floppy props do. The second is to tweak the vertical velocity PID gains to be dominated by the I-gain and reduce the P-gain significantly.
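The effect of the second approach can be seen with a toy PID: with the I-gain dominant, a one-off acceleration spike barely moves the output, because the output tracks the integrated error rather than the instantaneous one. The gains here are illustrative, not Phoebe’s real tuning, and this isn’t the flight code’s PID class:

```python
class PID(object):
    # Minimal PID sketch for illustration only.
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def step(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# P-dominated gains chase every spike; I-dominated gains respond to the
# accumulated (velocity) error, so a single spike barely nudges the output.
i_dominant = PID(kp=0.0, ki=1.0, kd=0.0)
```

With `kp=0`, a 10 m/s² spike over one 10ms step only raises the output by 0.1, and the output stays there rather than whipping the motors around.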
Top row are variations of the breadboard PCBs; bottom row are Zoe’s and Phoebe’s new custom PCBs. Phoebe is already wearing her new HoG and I hope I can show her flying the IMU FIFO code imminently.
Gerber files and Eagle board layouts are on GitHub.
Big thanks to Ragworm for their super-fast turnaround.
I’ve tried various ways to acclimatise Zoe’s sensors prior to flight. The best so far is to set the props spinning at minimum speed and, after 5 seconds, grab a FIFO full of data (42 batches of samples, given the 512 byte FIFO and 12 byte batch size), and use these to calculate start-of-flight gravity. The props then continue to run at this base speed up to the point the flight kicks off.
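The gravity calibration then amounts to averaging that FIFO’s worth of accelerometer batches; a sketch with hypothetical names, not the real code:

```python
def estimate_gravity(batches):
    # batches: list of (ax, ay, az) accelerometer tuples - roughly 42
    # of them from a full 512 byte FIFO at 12 bytes per batch.
    n = len(batches)
    gx = sum(b[0] for b in batches) / float(n)
    gy = sum(b[1] for b in batches) / float(n)
    gz = sum(b[2] for b in batches) / float(n)
    return gx, gy, gz
```

Averaging over the full FIFO smooths out prop vibration, so the gravity estimate reflects the airframe as it will actually be at take-off, props running.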
The net result is a stable flight with no vertical drift during hover, but with horizontal drift of about a meter. Without this code, horizontal drift is half this but she continues to climb during hover.
I’m not sure how I can improve this, so I’ll leave it alone for now and instead have a look at making a DIY cardboard box to keep Zoe out of the wind.
In passing, I did a quick analysis of the code size: 1021 lines of Python code, 756 lines of comments and 301 blank lines, giving a total of 2078 lines in Quadcopter.py. Here’s the script I knocked together quickly FYI:
code = 0
comments = 0
spaces = 0
with open("Quadcopter.py", "r") as f:
    for line in f.readlines():
        line = line.strip()
        if len(line) == 0:
            spaces += 1
        elif line.startswith("#"):
            comments += 1
        else:
            code += 1
print("Code %d, Comments %d, Blank Lines %d, Total %d" % (code, comments, spaces, code + comments + spaces))
I’ve put Zoe’s code up on GitHub as the best yet, although the either / or of vertical / horizontal drift is seriously starting to pᴉss me off.
Note that since I’ve moved the IMU FIFO code into Quadcopter.py, QCIMUFIFO.py is no longer on GitHub; Quadcopter.py is the latest best working version, and QCDRI.py is the best version that uses the data ready interrupt, in case you are seeing the I2C errors like I used to.