Yaw(n)

I think I’ve finished writing the code to support GPS tracking: a target GPS location is stored prior to the flight; the flight takes off pointing in any direction, and once it has learned its own GPS take-off position, it heads towards the target, updating the direction to the target each time it gets a new GPS fix for where it is, and finally landing when its current position matches the target GPS position.
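For the record, here’s roughly the maths behind “updating the direction to the target”. This is a sketch with names of my own rather than the real code; over the few tens of metres I’m flying, a flat-earth (equirectangular) approximation is plenty:

import math

EARTH_RADIUS = 6371000.0  # metres

def gps_vector(here, target):
    """Distance (m) and bearing (degrees clockwise from true north)
    from the current (lat, lon) GPS fix to the target fix."""
    lat1, lon1 = math.radians(here[0]), math.radians(here[1])
    lat2, lon2 = math.radians(target[0]), math.radians(target[1])

    # Flat-earth approximation - fine over tens of metres.
    dx = (lon2 - lon1) * math.cos(0.5 * (lat1 + lat2)) * EARTH_RADIUS  # metres east
    dy = (lat2 - lat1) * EARTH_RADIUS                                  # metres north

    return math.hypot(dx, dy), math.degrees(math.atan2(dx, dy)) % 360.0

Each new fix recomputes this pair, and once the distance drops to around the 1m GPS resolution, the landing takes over.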

There’s a lot of testing to be done before I can try it live.  The new autopilot code is pretty much a rewrite from scratch.  It now has multiple partial flight plans:

  • take-off
  • landing
  • file defined direction + speed + time
  • GPS initial location finding
  • GPS tracking to destination
  • Scanse abort.

It switches between these depending either on the external Scanse or GPS inputs, or on the current flight plan section completing, e.g. the GPS flight plan reaching its destination, or the file-based flight plan finishing its various time-based lateral movements and hovers (both of which then swap to the landing flight plan).  Luckily, most of this testing can be carried out passively.
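For what it’s worth, the switching boils down to something like the sketch below; the phase names mirror the list above, but the structure is illustrative rather than lifted from the real code.

# Hypothetical sketch of the autopilot phase switching - not the real code.
def next_flight_plan(current, scanse_alert, gps_fix, phase_complete):
    """Pick the next partial flight plan from the sensor inputs and
    whether the current phase has finished."""
    if scanse_alert:                         # obstacle too close: abort
        return "SCANSE_ABORT"
    if current == "TAKEOFF" and phase_complete:
        return "GPS_LOCATION" if gps_fix else "FILE_PLAN"
    if current == "GPS_LOCATION" and phase_complete:
        return "GPS_TRACKING"
    if current in ("GPS_TRACKING", "FILE_PLAN") and phase_complete:
        return "LANDING"                     # destination reached / plan done
    return current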

The first test preempts all of these; as per yesterday’s post, I should be able to use my magnetometer, combined with the local magnetic declination, to find Hermione’s initial orientation relative to true (i.e. GPS) north.  At the same time, I could use the magnetometer to provide long-term yaw values to be fused with the integrated yaw rate from the gyrometer.  Here’s what I got from two sequential passive tests:

Compass & integrated Gyro yaw

I’m not hugely impressed with the results of either.

  • The compass and integrated gyro yaw don’t match as tightly as I was expecting – in its current state, I won’t be using the compass direction as a long-term fuse with the integrated gyro yaw unless I can improve this.
  • The blobs in the lower pair are compass orientation values as she’s sitting on the ground – halfway through, I rotated her clockwise by roughly 90°.  The rotation angle is pretty good, as is the direction (NW -> NE -> SE), but I don’t like the spread of the blobs while she sat still on the ground at each location – I think I’ll have to use an average value for the starting orientation passed to the autopilot for GPS tracking processing (see the sketch after this list).
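One gotcha with that averaging: yaw angles wrap at ±180°, so a naive arithmetic mean of samples straddling the wrap is garbage.  The standard trick, sketched below with my own naming, is to average unit vectors instead:

import math

def average_yaw(yaw_samples):
    """Circular mean of yaw angles (radians): average the unit vectors
    so samples straddling the +/-180 degree wrap don't cancel out."""
    x = sum(math.cos(yaw) for yaw in yaw_samples)
    y = sum(math.sin(yaw) for yaw in yaw_samples)
    return math.atan2(y, x)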

Video Distance + Compass Direction ≈ GPS

Distance + Direction = GPS

By human measurement, the distance was about 7m at about 45° (i.e. NE).  GPS says 8.6m; video camera tracking says 5m, which is the length of travel defined in the flight plan.

It was never going to be perfect given the difference between magnetic and true north, the roughly 1m resolution of GPS, and the fact that video distance tracking will always be a best guess, but it’s more than good enough for my cunning plan to work.

However, the plan’s taking a premature diversion; during this test, I was less careful, and she ended up (in vertical descent mode) clipping 5 props against the stone wall of the drive.  The next step (after replacing the props!) is now to deploy my Scanse Sweep code, which will trigger an orderly landing if any object is detected less than 1.5m away – Hermione’s radius is 50cm from centre to prop tip diagonally, so that’s 1m of clearance.
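The trigger itself should be trivial.  Assuming each Sweep rotation arrives as (angle, distance) samples – the real Scanse API may well differ – it’s little more than:

PROXIMITY_LIMIT = 1.5  # metres: 1m clearance + 0.5m prop-tip radius

def obstacle_too_close(sweep_samples):
    """sweep_samples: iterable of (angle in degrees, distance in metres)
    pairs from one full Scanse Sweep rotation; zero distances are
    no-echo readings and are skipped."""
    return any(0.0 < distance < PROXIMITY_LIMIT
               for angle, distance in sweep_samples)

If it returns True, the autopilot swaps to the landing flight plan.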

One interesting point: the compass readings are mostly in a very dense cluster, with just a few (relatively) pointing in very different directions – that’s as Hermione passed the family car!

Cotswold Raspberry Jam

So there I was this morning on the bus heading to Cheltenham to co-host the Cotswold Raspberry Jam, gazing at the beautiful Cotswold countryside, and my mind began to wander – well actually it was unleashed from the constraints of dealing with details it couldn’t fix while on a bus.  And out came a better idea about where to go for the next step to autonomy.

What’s been bugging me is the difference between magnetic and GPS North.  The ‘bus’ solution is to ignore magnetic north completely.  The compass still has a role to play maintaining zero yaw throughout a long flight, but it plays no part in the direction of flight – in fact, this is mandatory for the rest of the plan to work.

Instead, the autopilot translates between the GPS north / south / east / west coordinate system and Hermione’s forward / backwards / right / left coordinate system.  The autopilot speaks to Hermione only in her own coordinates when telling her where to go.  It learns her coordinate system at the start of each flight by asking her to fly forwards a metre or so, and comparing that with the GPS vector change it gets back.  This rough translation is then refined continuously throughout the flight.  Hermione can start a flight pointing in any direction relative to the target GPS point, and the autopilot will get her to fly towards that target regardless of the way she’s pointing.
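In sketch form (my names, not the real code), the learning step and the translation look something like this:

import math

def learn_orientation(gps_delta):
    """After asking Hermione to fly a metre or so 'forwards', the GPS
    movement (east, north) in metres reveals the bearing her nose is
    pointing at, in radians clockwise from true north."""
    east, north = gps_delta
    return math.atan2(east, north)

def earth_to_quad(east, north, bearing):
    """Express an earth-frame (east, north) vector-to-target in
    Hermione's own (forward, right) frame."""
    forward = east * math.sin(bearing) + north * math.cos(bearing)
    right = east * math.cos(bearing) - north * math.sin(bearing)
    return forward, right

Every subsequent commanded move yields another (commanded, GPS-measured) pair with which to refine the bearing estimate.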

Sadly now, I’m back from today’s Jam, and the details confront me once more, but the bus ride did yield a great view of where to go next.


P.S. As always, the jam was great; over 100 parents and kids, lots of cool things for them to see and do, including a visit this time by 2 members of local BBC micro:bit clubs.  Next one is 30th September.

A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and so was descending; she’s not far wrong – the start and end points are the two round stones placed a measured 10m apart, to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case, as the newer one has extra vents underneath to allow better airflow.  The aim here is to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid IMU drift with temperature.  Note that normally the Pi3 has a shelf above, carrying the large LiPo and acting as a sun-shield – you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR Lite V3 so their lasers don’t interfere with each other; finally the camera has been moved far away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera was struggling to spot motion in dim light, the LiDAR height was not fused either, and on some flights Hermione would climb during hover (see the first sketch after this list).
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fused with the gyrometer yaw rate and interacting with the below…
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file, in the same way the video macro-blocks are passed across now (see the second sketch after this list).  The reason both live in their own processes is that each blocks reading its sensor – one second for GPS and 0.1 seconds for video – and that degree of blocking must be completely isolated from the core motion processing.  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from that end-point, the combination of compass and GPS together will provide the sensor inputs to ensure the flight heads in the right direction and recognises when it’s reached its goal.
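On the independent fusion point above, the gating now works roughly as sketched below (illustrative names, not the real code): dim light only disables the lateral camera fuse, never the vertical LiDAR one.

def fuse_motion(est, lidar_height, camera_xy, alpha=0.1):
    """est: dict of 'x', 'y', 'z' dead-reckoned estimates.  Vertical and
    lateral fusion are gated independently, so a camera that can't spot
    motion in dim light no longer blocks the LiDAR height fuse."""
    def blend(a, b):
        return (1 - alpha) * a + alpha * b

    if lidar_height is not None:      # LiDAR sample available
        est['z'] = blend(est['z'], lidar_height)
    if camera_xy is not None:         # camera saw enough contrast
        est['x'] = blend(est['x'], camera_xy[0])
        est['y'] = blend(est['y'], camera_xy[1])
    return est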
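And for the GPS process, the FIFO plumbing is along these lines; the path and record format here are invented for illustration:

import json
import os

FIFO_PATH = "/dev/shm/gps_fifo"  # hypothetical shared-memory path

def gps_process(fixes):
    """Child process: block on the slow (~1s) GPS reads and push each
    (lat, lon) fix down the FIFO, keeping the blocking well away from
    the core motion processing."""
    if not os.path.exists(FIFO_PATH):
        os.mkfifo(FIFO_PATH)
    with open(FIFO_PATH, "w") as fifo:  # blocks until the reader opens
        for lat, lon in fixes:          # each read blocks ~1s
            fifo.write(json.dumps({"lat": lat, "lon": lon}) + "\n")
            fifo.flush()

The motion process keeps the read end open and polls its file descriptor alongside everything else it does.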

As a result of all the above, I’ve updated GitHub.

Hermione’s progress

Here’s Hermione with her new PCB.  It’s passed the passive tests; the next step is to make sure each of the 8 motor ESCs is connected the right way round to its respective PWM output on the PCB, and finally I’ll do a quick flight with only the MPU-9250 as sensor to tune the X8 PID gains.  Then she’s getting shelved.

Hermione’s progress

Zoe’s getting a new PCB so I can run the camera and Garmin LiDAR-Lite V3 on her too.  Hermione is huge compared to Zoe, and with the winter weather setting in, I’m going to need a system that’s small enough to test indoors.

Hermione will still be built – I need her extra size to incorporate the Scanse Sweep and GPS – but I suspect only when an A3 arrives on the market: Hermione’s processing, with a new 512MB A+ overclocked to 1GHz, is nearly maxed out with the camera and diagnostics.  She’s probably just about got CPU space for the compass and Garmin LiDAR-Lite over I2C, but I think that’s it until the A3 comes along.  My hope for the A3 is that it uses the same 4-core CPU as the B2, with built-in Bluetooth and WiFi as per the B3, but no USB / ethernet hub to save power.  Fingers crossed.


yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw.  There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  It should be trivial to change this yaw angle target so that the quadcopter faces the direction it’s travelling; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter should be, or is, flying.  It’s a little trickier than it sounds for two reasons:
    • The arctangent of the velocity vector gives 2 possible angles, and only consideration of the signs of both components actually defines the absolute direction, e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively (see the sketch after this list).
    • Care needs to be taken that the transition from one angle to the next goes the shortest way round, and that when flight transitions from movement to hover, the quad doesn’t rotate back to the take-off orientation just because the velocity is (0,0) – the angle needs to be derived from what it was doing before it stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values currently operate partially in the quad frame and partially in the earth frame.  I need to rotate the X and Y values fully into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this, and I have 2 further concerns:
    • the first is that the compass needs calibration each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – note how the circle is not centred on (0, 0).
    • I’m not sure how much iron in the flight zone will affect the calibration, depending on how close the compass gets to it; the iron in my test area might be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
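For the record, both of the item 1 niggles come down to a few lines: atan2 uses the signs of both components to resolve the quadrant, and wrapping the error into ±π makes the yaw PID turn the short way round.  A sketch with my own naming (the hover threshold is plucked from the air):

import math

def yaw_target(vx, vy, previous_target):
    """Yaw angle target (radians) from the velocity vector: atan2 uses
    the signs of both components, so (1, 1) -> pi/4 (45 degrees) and
    (-1, -1) -> -3pi/4 (-135 degrees).  Near-zero velocity means hover:
    hold the previous target so she doesn't spin back to her take-off
    orientation."""
    if math.hypot(vx, vy) < 0.1:
        return previous_target
    return math.atan2(vy, vx)

def shortest_yaw_error(target, current):
    """Wrap the yaw error into +/-pi so the yaw PID always turns the
    short way round."""
    return (target - current + math.pi) % (2 * math.pi) - math.pi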

First step is to include yaw into the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the rotation matrix from the earth frame to the quadcopter frame is as follows:

|xq|   | cos ψ,  sin ψ,   0|  |xe|
|yq| = |-sin ψ,  cos ψ,   0|  |ye|
|zq|   |   0,      0,     1|  |ze|
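For item 2, the camera X and Y values need the inverse (the transpose) of that matrix to get from the quad frame back into the earth frame; in sketch form:

import math

def quad_to_earth(x_q, y_q, psi):
    """Rotate camera-derived lateral motion from the quad frame into
    the earth frame - the transpose of the matrix above."""
    x_e = x_q * math.cos(psi) - y_q * math.sin(psi)
    y_e = x_q * math.sin(psi) + y_q * math.cos(psi)
    return x_e, y_e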

The future’s bright; the future’s orange!

Well, tangerine actually!

With the IMU FIFO code taking the pressure off critical timings for reading the IMU, a huge range of options has opened up for what to do next with the spare time the FIFO has created:

  • A simple keyboard remote control is tempting, where the QC code polls stdin periodically during a hover phase and amends the flight plan dynamically (see the sketch after this list); ultimately, I’d like to do this via a touch-screen app on my Raspberry Pi providing joystick buttons.  However, for this to work well, I really need to add further sensor input providing feedback on longer-term horizontal and vertical motion…
  • A downward facing Ultrasonic Range Finder (URF) would provide the vertical motion tracking when combined with angles from the IMU.  I’d looked at this a long time ago but it stalled, as I’d read the sensors could only run at up to a 100kbps I2C baudrate, which would prevent use of the higher 400kbps required for reading the FIFO.  However, a quick test just now shows the URF working perfectly at 400kbps.
  • A downward facing RPi camera, when combined with the URF, would provide horizontal motion tracking.  Again, I’d written this off due to the URF, but now it’s worth progressing with.  This is the Kitty++ code I started looking at during the summer and abandoned almost immediately, due both to the lack of spare CPU time in the code and the need for extra CPU cores to do the camera motion processing; Chloe, with her Raspberry Pi B2 and her tangerine PiBow case, more than satisfies that requirement now.
  • The magnetometer / compass on the IMU can provide longer term yaw stability which currently relies on just the integrated Z-axis gyro.
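The keyboard polling at least is straightforward: select() with a zero timeout checks stdin without ever blocking the motion loop.  A sketch (any command letters would be mine to invent):

import select
import sys

def poll_keyboard():
    """Non-blocking check for a pending line on stdin; returns the
    stripped line or None if nothing has been typed."""
    readable, _, _ = select.select([sys.stdin], [], [], 0)
    if readable:
        return sys.stdin.readline().strip()
    return None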

I do also have barometer and GPS sensors ‘in-stock’ but their application is primarily for long distance flights over variable terrain at heights above the range of the URF.  This is well out of scope for my current aims, so for the moment then, I’ll leave the parts shelved.

I have a lot of work to do, so I’d better get on with it.


Compass

The MPU-9250 has a built-in magnetometer which up to now I’ve been ignoring.  However, while I’m stuck waiting for stuff in the post, I’ve given it more than a second’s thought, and it could be useful.

The magnetometer reads the strength and direction of the local magnetic field, primarily from the Earth’s magnetic north pole.  So it can be used to find the direction a flight is taking, which is useful if the flight is tracking between a series of GPS positions.  That aspect is of no interest to me at the moment, but there’s another trick.

Because the output of the magnetometer is a vector of the local magnetic field, it can be used to calculate the absolute quad-frame pitch, roll and yaw angles since take-off – exactly what the integrated gyro is doing short-term.  But because the magnetometer gives absolute angles based upon an absolute, fixed source, its angles can be fused with the short-term integrated gyro readings to keep them very tightly on the straight and narrow.

Use of a magnetometer as a backup yaw sensor is common, but I’ve not come across it being used as an absolute pitch, roll and yaw sensor.

There is only one problem I can see: metals in the flight area distort the earth’s magnetic field, so some care is needed when fusing the readings to ignore any rapid changes that don’t match the gyro integration.

For now though, I’ll just be adding the magnetometer readings into the diagnostics, polling it perhaps once a second just to check the validity of the readings compared to the integrated gyro readings.  If that holds out, then this could lead to much longer zero drift flights.
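The fusion I have in mind is a simple complementary blend with exactly that rapid-change guard; the gain and gate below are plucked from thin air for illustration:

import math

def fuse_yaw(gyro_yaw, compass_yaw, alpha=0.02, gate=0.2):
    """Blend the long-term compass yaw into the short-term integrated
    gyro yaw (radians), skipping compass samples that disagree wildly -
    probably local iron rather than real rotation."""
    error = (compass_yaw - gyro_yaw + math.pi) % (2 * math.pi) - math.pi
    if abs(error) > gate:
        return gyro_yaw
    return gyro_yaw + alpha * error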

Blind, deaf, disorientated, lost and scared of heights…

and yet, amazing.

So in that last video of a ~10s flight, Phoebe drifted about 1m when she shouldn’t have.  To me, that’s amazing; to you, that may be underwhelming.  So here’s why you should be amazed.  Basic school math(s):

distance = ½ × acceleration × time² ⇒ plugging in the numbers above says she was accelerating at 0.02m/s² (2cm/s²), instead of zero, to travel 1m in 10 seconds.

Gravity is roughly 10m/s².  The accelerometer is set to read up to ±4g, so that’s a range of 8g, or roughly 80m/s².

So 0.02 / 80 × 100 = 0.025% error, or a ±8 error value in the sensor range of ±32768.

Now the critical part – the sensors are only rated at 2% (±655) accuracy, and yet I’m getting 0.025%, i.e. 80 times better than their rated accuracy.

And that’s why I don’t think I can get things very much better, whatever I do.

There is a slight chance that when the A2 is released (sounds like that’s shifted to next year now), I may be able to run the sampling rate at 1kHz without missing samples (I’ve had to drop to 500Hz to ensure I don’t miss samples at the moment).

Other than that though, she needs more sensors:

  • camera for motion tracking (!blind)
  • ultrasonic sensors for range finding (!deaf)
  • compass (!disorientated)
  • GPS (!lost)
  • altimeter (!scared of heights).

but they’ll also need an A2 for the extra processing.  So this is most definitely it for now.