In early spring this year (March – mid-April) conditions were good for flying: light, stable winds, with sunny days and rainy days all forecast accurately days in advance. The net result was lots of test flights and lots of videos; the only downside was the need to put Hermione’s HoG RPi B3 and PCB in a customized Pimoroni Tangerine PiBow to try to keep the IMU temperature above 20°C, where its sensor offsets and gains are most stable.
In the second half of spring, the temperature soared to the high twenties / early thirties; there was no breeze and no rain; the weather was Mediterranean midsummer. The lawn was parched and featureless for the down-facing lateral-motion-tracking RPi camera, so flights moved to the gravel drive; more damage to the props ensued, but progress was good.
It’s now nominally mid-summer here. The weather’s moved to blustery showers, torrential rain storms, wind speeds in the teens and sometimes gale force, with bursts of thunder and lightning, and to make it worse, it changes every half an hour. The temperature is mid-to-high teens. It’s standard autumnal weather in all but name, and it’s impossible to fly in: I get my kit out in bright sunshine, low twenties and no breeze, and by the time it’s set up, the sky has clouded over threatening rain, the wind has risen into the teens and the temperature has dropped into the teens.
Why am I telling you? Because now I’ve pretty much finished the GPS routing branch code, and in doing so I’ve spotted a couple of fixes / enhancements I’ve back-applied to the main line. If I’m right, I may have fixed the feature where Hermione always faces the way she’s going rather than the direction she was pointing at takeoff: to change her direction of flight, she continues to fly ‘forwards’ from her point of view, but yaws her body round to point her nose in the new direction. Completely pointless really, just another challenge to tick off the list.
Here’s to decent English summer weather again so I can post something that might actually be of interest to you, dear readers!
OK, back to the plan; the next step is to get the compass readings in sync with the gyro readings for yaw as per my bus inspired plan:
I’m pretty impressed with this; the compass is measuring orientation relative to the magnetic north pole, where clockwise rotation from north to east is positive; in contrast, for the gyro, anti-clockwise rotation is positive, following the right-hand rule. Also, the integrated gyro has no bounds on the yaw values it produces (±∞°), whereas the magnetometer results (after some trig on the vector it outputs) are bounded to ±180°.
Note to self: gonna have to cancel the fusion when the integrated gyro and compass yaw are significantly either side of zero. I do have one further concern: this test was run without the props / ESCs powered up; I’m worried that once the motors are running, the magnetic force that makes them spin will completely ruin the brilliant synchronicity above. The next test for the plan will be GPS tracking over a 10m flight which, in passing, should also show the effect the motors’ magnetism has on the compass readings.
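To make those two yaw sources comparable, two small conversions are needed: flip the compass’s clockwise-positive sign to match the gyro’s right-hand rule, and wrap the unbounded integrated-gyro yaw into ±180°. Here’s a minimal sketch – the function names are mine, not the flight code’s, and the exact magnetometer axis order / signs depend on how it’s mounted:

```python
import math

def compass_yaw(mag_x, mag_y):
    # Magnetometer X / Y to a yaw in radians.  The compass convention is
    # clockwise-positive, so negate to match the gyro's right-hand rule
    # (anti-clockwise positive).  The result is bounded to +/-pi.
    return -math.atan2(mag_y, mag_x)

def wrap_pi(angle):
    # Map the unbounded integrated-gyro yaw into +/-pi so the two
    # sources can be compared and fused.
    return (angle + math.pi) % (2 * math.pi) - math.pi
```

The wrap is also where the “either side of zero” fusion problem shows up: +179° and -179° are only 2° apart once wrapped.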
Coding this combination of features has taken me several days, covering the complex interactions that depend on whether each is installed and how they are configured. This has required a rework of the GPS processing and Flight Plan objects to ensure they interact cleanly, along with enhanced configuration parameters and use of the compass. Here’s the overview.
| | Yaw | Compass | GPS |
|---|---|---|---|
| **Yaw** | Defines always-forward flight with yaw to turn. | Defines flight plan X, Y as N, S, E & W; fused to prevent integrated IMU yaw rate drift. | No interaction without compass. |
| **Compass** | Defines flight plan X, Y as N, S, E & W; fused to prevent integrated IMU yaw rate drift. | Defines NSEW during flight. | Defines direction of flight between waypoints. |
| **GPS** | No interaction without compass. | Defines direction of flight between waypoints. | Defines flight plan waypoints; tracks locations during flights. |
I _think_ I’ve got all bases covered now in the code, but there are a lot of new pieces to be tested individually, and in all possible combinations: passively first, and then in real flights. For the first time in this project, I’m going to need to plan my testing so I don’t miss a combination until it’s too late.
On the plus side (somehow), after her last video Hermione’s shoulder joint broke while I was testing higher video frame sizes and rates, and the camera stopped working. Until the replacement parts arrive, there’s a lot of spare time for thinking and coding, and also occasionally tickling my fancy – more on the latter anon.
Hermione is still causing trouble with yaw control flights despite lots of refinements. Here’s the latest.
@3s she’s climbed to about a meter high and then hovered for a second. All the X, Y, and Z flight plan targets and sensor inputs are nicely aligned.
The ‘fun’ starts at 4 seconds. The flight plan, written from my point of view, says move left by 1m over 4 seconds. From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds. The yaw graph from the sensors shows the ACW rotation is happening correctly. The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero. Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s. All’s good as far as the targets are concerned, from both her POV and mine.
But there’s a severe discrepancy from the sensor inputs’ POV. From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me, instead of left. The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this: it shows she moves right by about 0.8m from her POV. But the rusty terracotta line in the Y graph (the double-integrated accelerometer-minus-gravity readings) shows exactly the opposite. The grey fusion of the blue and terracotta has them cancel each other out, thus following the target perfectly but for completely the wrong reasons.
There are similar discrepancies in the X graph, where the LiDAR + Video blue line is the best match to what I saw: virtually no forward movement from H’s POV except for some slight forward movement after 8s when she should be hovering.
So the net of this? The LiDAR / video processing is working perfectly. The double-integrated IMU accelerometer results are wrong, and I need to work out why. The results shown are taken directly from the accelerometer and double-integrated in Excel (much like what the code does too), and I’m pretty convinced I’ve got this right. Yet more digging to be done.
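For reference, the Excel-style double integration boils down to a few lines; this is my own simplified Euler sketch (the real code works from the per-sample timestamps), assuming gravity has already been subtracted from the readings:

```python
def double_integrate(accel, dt):
    # Integrate acceleration (gravity already removed) twice: once to
    # velocity, once more to distance, at a fixed sample interval dt.
    velocity = 0.0
    distance = 0.0
    distances = []
    for a in accel:
        velocity += a * dt
        distance += velocity * dt
        distances.append(distance)
    return distances
```

Any constant offset or sign error in the raw readings grows quadratically under this, which is why the terracotta line diverges so dramatically.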
In other news…
- Ö has ground-facing lights much like Zoe had. Currently they are always on, but ultimately I intend to use them in various ways, such as flashing during calibration etc. – this requires a new PCB, however, to plug a MOSFET gate into a GPIO pin.
- piNet has changed direction somewhat: I’m testing, within the bounds of my garden, whether I can define a target destination with GPS, and have enough accuracy for a subsequent flight from elsewhere to get to that target accurately. This is step one towards taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.
That’s all for now, folks. Thanks for sticking with me during these quiet times.
P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3s after the yaw to move left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.
I’m doing some very careful testing before I set Hermione loose live to fly in a circle. This morning, I’ve confirmed the video lateral motion block tracking is working well.
For this first unpowered flight, I walked her forwards about 3m and then left by about the same. Note that she always pointed in the same direction; I walked sideways to get the left movement:
Forward – Left
For this second unpowered flight, again I walked her forwards about 3m, but then rotated her by 90° CCW before walking another 3m. Because of the yaw, from her point of view she only ‘flew’ forwards, and the yaw is not exposed on the graph. This is exactly how it should be:
Forward – Yaw 90° CCW – Forward
So I’m happy the lateral motion tracking is working perfectly. Next I need to look at the targets; I can do that with the same stats.
The only problem I had was that the sun needs to be shining brightly for the video tracking to ‘fly’ above the lawn; clearly it needs the high contrast of sunlit grass.
From the same run as the last two posts, I’ve been analysing the stats.
First the integrated gyro showing the yaw it detected:
This shows the unintentional clockwise (negative) yaw of about -10° shortly after take-off followed by the intentional anticlockwise yaw of +90° as Hermione headed off left. I still need to have a play with the yaw rate PID to try to kill that initial -10° yaw.
More interesting, to me at least, is the motion processing:
The graph shows how often the FIFO was emptied. In an unhindered world, the FIFO is emptied every 10ms (0.01s) and the PWM to the ESCs updated. Peaks above that mean something else was taking up the spare time. The FIFO overflows at just over 85ms (512 bytes total FIFO size / 12 bytes per IMU sample / 500Hz IMU sampling rate = 85.333ms), and the highest shown here is 38ms, well within safety limits. I’m particularly delighted that the majority of the spikes are within the 10 to 20ms range – that strongly suggests the split phases of macro-block processing are working like a dream.
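The overflow arithmetic is easy to sanity-check in a couple of lines (constants as given above):

```python
FIFO_BYTES = 512.0      # total IMU FIFO size in bytes
SAMPLE_BYTES = 12.0     # bytes per IMU sample
SAMPLE_RATE = 500.0     # IMU sampling rate in Hz

# Samples the FIFO can hold before overflowing, and how long that takes.
samples_held = FIFO_BYTES / SAMPLE_BYTES
overflow_ms = samples_held / SAMPLE_RATE * 1000
print("FIFO overflows after %.3f ms" % overflow_ms)  # → FIFO overflows after 85.333 ms
```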
The 20ms norm means the PWM is updated at 50Hz. Were the PWM consistently updated at less than 50Hz, it would really start to show in the stability of the flight. But luckily it isn’t, meaning there’s probably just enough room to finally squeeze in compass and GPS processing.
In passing, it’s worth saying that such levels of stats would be impossible if I were using a microcontroller (Arduino etc.) – this 11s flight logged 1.46MB of data to shared memory, and ultimately to SD card. It logs both the initial hard-coded constants and every dynamic variable for every cycle of motion processing – that means nothing is missing, and it’s possible to diagnose any problem as long as the reader knows the code intimately. Without these logs, it would have been nigh-on impossible for the ignorant me of 4+ years ago to achieve what I have now.
* I rate myself as experienced having spent over 4 years on this.
Not perfect, but dramatically better. The flight plan was:
- take off to 1m
- move left over 6s at 0.25m/s while simultaneously rotating ACW 90° to point in the direction of travel
I’d spent yesterday’s wet weather rewriting the macro-block processing code, breaking it up into 5 phases:
- Upload the macro block vectors into a list
- For each macro-block vector in the list, undo yaw that had happened between this frame and the previous one
- Fill up a dictionary indexed with the un-yawed macro-block vectors
- Scan the dictionary, identifying clusters of vectors, assigning scores, and building a list of the highest-scoring vector clusters
- Average the few highest-scoring clusters, redo the yaw of the result from step 2, and return the resultant vector
Although this is quite a lot more processing, splitting it into five phases compared to yesterday’s code’s two means that between each phase the IMU FIFO can be checked, and processed if it’s filling up, thus avoiding a FIFO overflow.
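For illustration, here’s a heavily simplified sketch of the five phases in one function. Everything here is my own stand-in – the names, the rounding of vectors into buckets, and the neighbour-based scoring are assumptions, not the flight code – and the FIFO check between phases appears as a callback:

```python
import math
from collections import defaultdict

def process_macro_blocks(vectors, yaw, check_fifo):
    # Phase 1: the macro-block vectors arrive already uploaded in 'vectors'.
    check_fifo()                      # drain the IMU FIFO between phases

    # Phase 2: undo the yaw that happened since the previous video frame.
    c, s = math.cos(-yaw), math.sin(-yaw)
    unyawed = [(x * c - y * s, x * s + y * c) for x, y in vectors]
    check_fifo()

    # Phase 3: fill a dictionary indexed by the (rounded) un-yawed vectors.
    buckets = defaultdict(int)
    for x, y in unyawed:
        buckets[(round(x), round(y))] += 1
    check_fifo()

    # Phase 4: score each entry by its own count plus its neighbours',
    # keeping a shortlist of the highest-scoring clusters.
    def score(key):
        kx, ky = key
        return sum(buckets.get((kx + dx, ky + dy), 0)
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1))
    best = sorted(buckets, key=score, reverse=True)[:3]
    check_fifo()

    # Phase 5: average the shortlist and redo the yaw undone in phase 2.
    ax = sum(k[0] for k in best) / len(best)
    ay = sum(k[1] for k in best) / len(best)
    c, s = math.cos(yaw), math.sin(yaw)
    return (ax * c - ay * s, ax * s + ay * c)
```

The point of the structure is the phase boundaries: each `check_fifo()` call is an opportunity to empty the IMU FIFO before the next chunk of work.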
Two more subtle problems remain:
- She should have stayed in frame
- She didn’t quite rotate the 90° to head left.
Nonetheless, I once more have a beaming success smile.
I’ve implemented the yaw code such that Hermione points in the direction that she should be travelling based upon the flight plan velocity vector. She should take-off, then move left at 0.25 m/s for 6 seconds, while also rotating anti-clockwise by 90° to face the way she’s supposed to be travelling. However, here’s what my Mavic & I saw:
My best guess is the camera lateral tracking, which simply looks for peaks in the macro-blocks after stashing them all in a dictionary indexed by the vectors. This ignores yaw, which was fine up to now, as I’d set the yaw target to zero. I think I need to add an extra stage which un-yaws each macro-block vector before adding them to the dictionary and looking for peaks. That’s relatively easy code, involving tracking yaw between video frames, but costly, as it adds an extra phase to un-yaw each MB vector before dictionarying them and checking for peaks. Time will tell.
Currently, the yaw angle PID target stays set at zero throughout the flight; the piDrone always faces the same direction regardless of the way it’s travelling. The next step on the way to ultimately tracking between GPS points is to have her point in the direction she’s travelling; it’s not actually necessary for my girls. The Mavic Pro does it because the forward-facing camera video is streamed back to its remote control, giving me an FPV (first-person view – i.e. as though I were a pilot sitting on it).
My intention is that the yaw angle tracks the earth frame flight plan velocity vector, so the piDrone points the way it _should_ be going. This is roughly what the Mavic does.
I thought it would be trivial, added the one new line of code, and then realised the gotchas, which led to me blogging the details. There are 3 problems.
- Conversion of the lateral velocity vector to an angle. Arctangent only covers ±90°; this means the 1/1 and -1/-1 vectors both come out as 45° angles. Also, 1/0 would throw an exception rather than giving 90°. Luckily, math.atan2(x, y) resolves this, spitting out angles across the full ±180°. That’s my yaw angle PID target resolved.
- Alignment of the integrated gyro input to the same scale as above. If the piDrone flies 2.25 clockwise loops around the circle, the integrated gyro will read 810° when it needs to read 90° – doing yaw = (yaw + 180°) % 360° – 180° should sort this out. That’s the yaw angle PID input sorted.
- Finally, if the yaw PID input is 179° and the yaw PID target is -179°, the PID error (target – input) needs to come out at +2°, not -358°; i.e. the error angle’s magnitude must always be ≤ 180°. I’ve sorted this out in the code by adding a custom subclass overriding the default error = (target – input):
# PID algorithm subclass to cope with the yaw angle error calculations.
def Error(self, input, target):
    # An example in degrees is the best way to explain this.  If the target is -179 degrees
    # and the input is +179 degrees, the standard PID error would be -358 degrees, leading to
    # a very high yaw rotation rate to correct the -358 degrees error.  However, +2 degrees
    # achieves the same result, with a much lower rotation rate to fix the error.
    error = target - input
    return (error if abs(error) < math.pi else (error - 2 * math.pi * error / abs(error)))
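For completeness, here are fixes 1 and 3 as a standalone sketch (the helper names are mine): a bounded yaw target from the velocity vector via atan2, keeping the same (x, y) argument order as above, plus the shortest-way-round error from the subclass:

```python
import math

def yaw_target(vx, vy):
    # Bounded +/-pi yaw target from the flight-plan velocity vector,
    # using the atan2(x, y) argument order from the text: x/y of 1/0
    # gives 90 degrees rather than an exception.
    return math.atan2(vx, vy)

def yaw_error(input, target):
    # Shortest-way-round angle error, mirroring the Error() override:
    # +179 degrees to -179 degrees comes out as +2 degrees, not -358.
    error = target - input
    return error if abs(error) < math.pi else error - 2 * math.pi * error / abs(error)
```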
Now I just need the spring wind and torrential rain showers to ease up for an hour.
I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw. There are a few aspects to this:
- During flight, currently the quadcopter stays facing whichever way it was facing on the ground; there’s a yaw angle PID, and its target is 0. It should be trivial to change this yaw angle target so that the quadcopter faces the direction of travel; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying. It’s a little bit trickier than it sounds, for two reasons:
- The arctangent of the velocity vector gives 2 possible angles; only consideration of the signs of both components actually defines the absolute direction e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively.
- Care needs to be taken that the transition from one angle to the next goes the shortest way, and that when the flight transitions from movement to hover, the quad doesn’t rotate back to the takeoff orientation because the velocity is (0,0) – the angle needs to be derived from what it was doing before it stopped.
It’s also worth noting this is only for looks and there are no flight benefits from doing this.
- The camera X and Y values are operating partially in the quad frame and partially in the earth frame. I need to rotate the X and Y values fully into the earth frame by accounting for yaw.
- Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability. I think I can get away with just using the compass X and Y values to determine the yaw angle, but I’ll need to test this, and I have 2 further concerns:
- The first is that the compass needs calibration each time it boots up, just as your phone does. You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – the circle is not centred on 0, 0.
- I’m not sure how much iron in the flight zone will affect the calibration, depending on the distance of the compass from the iron; iron in my test area might be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
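On the calibration concern, the off-centre circle from the record-player spin suggests a simple hard-iron correction would do. Here’s a hedged sketch (my own helper names, not the flight code), assuming a full rotation’s worth of magnetometer samples:

```python
import math

def hard_iron_offsets(samples):
    # samples: list of (mag_x, mag_y) pairs collected while spinning the
    # piDrone through at least one full circle (e.g. on a record player).
    xs = [x for x, y in samples]
    ys = [y for x, y in samples]
    # The hard-iron offset is the centre of the (ideally circular) trace.
    return ((max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2)

def heading(mag_x, mag_y, offsets):
    # Subtract the offsets so the circle is centred on (0, 0), then take
    # the angle of the corrected vector.
    ox, oy = offsets
    return math.atan2(mag_y - oy, mag_x - ox)
```

Nearby iron distorts the trace from a circle into an ellipse (soft iron) as well as shifting its centre, so this only fixes the offset half of the problem.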
First step is to include yaw into the camera processing as a matter of urgency. The magnetometer stuff can once more wait until it becomes absolutely necessary.
FYI the rotation matrix from the earth to the quadcopter frame is as follows:
|xe|   | cos ψ, sin ψ, 0| |xq|
|ye| = |-sin ψ, cos ψ, 0| |yq|
|ze|   |   0,     0,   1| |zq|
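A quick way to exercise that matrix as written (the helper name is mine):

```python
import math

def rotate_yaw(xq, yq, zq, psi):
    # Apply the yaw rotation matrix above to the vector (xq, yq, zq)
    # for a yaw angle psi about the Z axis; Z passes through unchanged.
    xe =  math.cos(psi) * xq + math.sin(psi) * yq
    ye = -math.sin(psi) * xq + math.cos(psi) * yq
    return xe, ye, zq
```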