Hibernation

It’s Autumn, and that means the MPU-9250 is running outside of its ideal temperature zone.  Every year, what flies beautifully during the summer deteriorates once the temperature falls firmly into the teens.  That’s what happened yesterday with ambient at 15°C.  What was a rock-solid hover two days ago has become unstable.  In previous years, I’ve been able to move indoors with one of my smaller models, but that’s not an option this year.  Hermione is simply too big to fly in the house, and she needs to be this big to incorporate all the sensors and the RPi 3B.  Additionally, GPS is virtually inaccessible indoors.

To make things worse, all summer Hermione has been running an IMU which is outside its spec accuracy range: it reads gravity as 0.88g, a 12% error when her accuracy should be 3%.  She just about got away with it during the summer temperatures, but not now the temperature is down in the teens.

Net?  I’ve a new MPU-9250 on the way which should meet spec; this should help somewhat, but at the same time, I think it’s fair to say that, like the hedgehogs and squirrels round here, Hermione will be going into GPS hibernation, only waking occasionally to test the Sweep mapping indoors with very limited lateral flight movement.

As a result, I have uploaded the latest code to GitHub.

What the GP(S) saw

Hermione did two 2m straight-line flights today, and both landed about 3.1m away measured manually with a tape measure.  The stats from both flights showed Hermione thought she’d flown just the 2m:

2m from video POV

So there’s a scale factor of 1.55 that needs fixing: real distance = 1.55 x down-facing-video distance.  The scale formula I used to convert the video macroblocks to meters was an informed guess, and it may just be missing a π/2.
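
Purely as an illustration of where such a constant could creep in (none of the numbers below are the ones in my code – the field of view, frame width and macroblock size are placeholders), the macroblock-to-metres conversion is essentially a pinhole-camera scaling:

import math

def macroblock_shift_to_metres(mb_shift, height_m, fov_deg=53.5, frame_width_px=320, mb_size_px=16):
    # The width of ground covered by one video frame grows linearly with height.
    frame_width_m = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    metres_per_pixel = frame_width_m / frame_width_px
    return mb_shift * mb_size_px * metres_per_pixel

# Any fixed error in fov_deg or mb_size_px shows up as a constant scale error,
# exactly like the 1.55 (suspiciously close to pi / 2 = 1.57) seen here.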

Checking the flight plan file earlier, the 2m square flight I flew and plotted the other day is actually a 1.8m square (0.3m/s for 6s).  After applying the video tracking error scale of 1.55, here’s the comparison of the rescaled flight plan against what the GPS saw.

GPS vs. virtual reality

The square is about 3.95m diagonally, and the errors between this and the GPS plot seem to be ≈±0.75m.  Horizontally, readings are about 1m apart whereas vertically it’s more like 0.75m – I’m not sure yet whether that’s just how it works out at our latitude, or whether there’s a bug in my GPS processing.

What does this all mean?  First, I need to add the factor of 1.55 to the video scale and, second, GPS-directed flights could be made to work after all if

  1. obstacles surrounding Hermione are at least a meter away, probably two
  2. destinations are several meters away, best guess is 5 meters or more.
  3. Sweep gets turned back on!

OK, that’s better

Same 2m square flight, 10 satellites detected throughout, square direction is roughly SE → NE → NW → SW:

GPS square tracking

If you ignore any horizontal and vertical lines, what’s left are diagonals pointing SE, NE, NW and SW, matching reality.  So net, the real directions / locations are there, but so is a variable inaccuracy of a few meters.  That means that GPS tracking should be possible, but the flight target distance needs to be longer (e.g. a 10m square in order to fly the square based on its 4 GPS waypoint corners), and there needs to be some softening of the changes in GPS locations before they are passed to the motion processor as the direction to fly.
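
By “softening” I mean something as simple as exponentially smoothing the incoming GPS fixes before handing them to the motion processor; here’s a minimal sketch of the idea – the alpha value and the (x, y) local-coordinate interface are illustrative, not the actual autopilot code:

class GPSSoftener:

    def __init__(self, alpha=0.2):
        self.alpha = alpha    # 0 = ignore new fixes, 1 = no smoothing at all
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        # The first fix is taken as-is; later fixes only nudge the smoothed position.
        if self.x is None:
            self.x, self.y = raw_x, raw_y
        else:
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y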

Just in passing, the compass / magnetometer showed takeoff at 160°, which is a lot more SSE or S than the GPS’ SE – there’s clearly more work I need to do on compass calibration, as there’s only a degree or so here between true and magnetic north.

Compare and Contrast…

the 2m square flight tracked with fused IMU, LiDAR and video…

Fusion tracking

against GPS:

GPS tracking

The vertical axis for fusion is the height in meters; for the GPS, it’s the number of satellites. This is complete garbage; a standard chart shows more clearly what’s happening:

2D GPS track

The difference between the 1st and 2nd GPS fixes has an error of roughly -27 by -11 meters, after which every following dot seems to be a combination of the square tracking and some sort of settling which slowly removes that initial -27 by -11 error.  It’s time I did more investigation into how to handle a GPS warm boot.

Back to basics

I’m back from “Hell on Earth” (a.k.a. Disneyland Paris).  I’m still hunting for a decent GPS receiver that provides anything like the reception level of my DJI Mavic.  It’s a bright sunny day outside, and the Mavic GPS can see a stable 10+ satellites; the minimum level at which it will use GPS flight tracking seems to be about 9 satellites.  In comparison, Hermione seated in the same position can only see between 4 and (occasionally) 8 satellites, depending on which of the 5 GPS receivers I have.  This may explain why my 2m square flight GPS tracking was so poor.

I have one more arriving in the next week or so upon which I have high hopes.

Until that arrives, my focus is on fine-tuning the existing code.  Primarily, this is the vertical velocity PID gains; during a flight, the Z axis accelerometer hits ±4g – the maximum configured – whereas the average is just over 2g.  Because it’s clipping at that maximum, the integrated velocity is less than it should be, meaning Hermione continues to climb when the motion processing thinks she’s hovering.


Sod’s law struck again: just after first posting this, the new GPS receiver arrived, and it was no better than the others.  The best of the bunch is the GlobalSat SiRF Star IV, which has reached 11 satellites on a good day.

I also did some quick testing on the acceleration, and it’s reporting a 5.91g ascent peak and a -4.58g descent peak.  Tweaking the PID to reduce the P gain and increase the I gain just made it sluggish, but the peaks remained above ±4g.  For the moment, I’ve just updated the code to use the ±8g range.  Another possible source could be noise from the props’ minor damage.  I’ll just tolerate this for now while I’m playing with the GPS tracking.
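
For reference, the ±8g switch is just a change to the accelerometer full-scale bits in the MPU-9250’s ACCEL_CONFIG register (0x1C, bits [4:3]); this sketch talks to the IMU through smbus directly, whereas my code goes through its own I2C wrapper, so treat it as illustrative:

import smbus

MPU9250_ADDRESS = 0x68    # default I2C address with AD0 pulled low
ACCEL_CONFIG = 0x1C       # ACCEL_FS_SEL lives in bits [4:3]
AFS_8G = 0x02             # 0 = +/-2g, 1 = +/-4g, 2 = +/-8g, 3 = +/-16g

bus = smbus.SMBus(1)
config = bus.read_byte_data(MPU9250_ADDRESS, ACCEL_CONFIG)
config = (config & ~0x18) | (AFS_8G << 3)    # clear then set the full-scale bits
bus.write_byte_data(MPU9250_ADDRESS, ACCEL_CONFIG, config)

# From now on, raw readings scale at 8.0 / 32768 g per LSB rather than 4.0 / 32768.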

SLAM dunk

One last brain dump before next week’s torture in Disneyland Paris: no, not crashing into inanimate objects; quite the opposite: Simultaneous Localisation And Mapping, i.e. how to map obstacles’ locations in space, initially attempting to avoid them through a random choice of change in direction, mapping both the obstacle locations and the trial-and-error avoidance, and in doing so feeding back into future less-randomized, more-informed direction changes, i.e. AI.

My plan here, as always, ignores everything described about standard SLAM processes elsewhere and does it my way based upon the tools and restrictions I have:

  • SLAM processing is carried out by the autopilot process.
  • GPS feeds it at 1Hz as per now.
  • Sweep feeds it every time a near obstacle is spotted within a few meters – perhaps 5?
  • The map is a Python dictionary at 0.5m x 0.5m resolution, indexed by integer cell coordinates (i.e. twice the resolution of the GPS measurements), whose values are scores (resolution kept low due to GPS accuracy and Hermione’s physical size of 1m tip to tip).
  • GPS takeoff location = 0,0 on the map
  • During the flight, each GPS position is stored in the map dictionary with a score of +100 points, marking out successfully explored locations.
  • Sweep object detections are also added to the dictionary, up to a limited distance of say 5m (to limit the feed from the Sweep process and to ignore blockages too far away to matter).  These score say -1 point each, given the multiple scans per second and the low-resolution conversion of cm to 0.5m cells.
  • Together these high and low scores define clear areas passed through and identified obstructions respectively, with unexplored areas having zero points in the dictionary.
  • Height and yaw are fixed throughout the flight to keep the Sweep and GPS orientations in sync.
  • The direction to travel is towards the highest-scoring adjacent map area not yet visited.

The above processing is very similar to the existing code that processes the down-facing video macro-blocks to guess the most likely direction moved; as such, it shouldn’t be too hard to prototype.  Initially the map is just dumped to file so the plausibility of this method can be checked as a 3D chart in Excel.
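
To give a feel for how small that prototype could be, here’s a minimal sketch of the map described above; the scores come straight from the list, but the class and method names are mine, not existing code:

class ObstacleMap:

    CELL_SIZE = 0.5          # metres per grid cell
    VISITED_SCORE = 100      # a GPS-confirmed clear cell
    OBSTACLE_SCORE = -1      # one Sweep return landing in this cell

    def __init__(self):
        self.cells = {}      # (ix, iy) -> score; the takeoff cell is (0, 0)

    def _cell(self, x_m, y_m):
        return (int(round(x_m / self.CELL_SIZE)), int(round(y_m / self.CELL_SIZE)))

    def mark_visited(self, x_m, y_m):
        self.cells[self._cell(x_m, y_m)] = self.VISITED_SCORE

    def mark_obstacle(self, drone_x_m, drone_y_m, obstacle_x_m, obstacle_y_m, max_range_m=5.0):
        # Ignore Sweep returns beyond the range of interest from Hermione herself.
        dx = obstacle_x_m - drone_x_m
        dy = obstacle_y_m - drone_y_m
        if (dx * dx + dy * dy) ** 0.5 <= max_range_m:
            cell = self._cell(obstacle_x_m, obstacle_y_m)
            self.cells[cell] = self.cells.get(cell, 0) + self.OBSTACLE_SCORE

    def dump(self, file_name="map.csv"):
        # Dump to file for eyeballing the map offline.
        with open(file_name, "w") as f:
            for (ix, iy), score in sorted(self.cells.items()):
                f.write("%d, %d, %d\n" % (ix, iy, score))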


P.S. For the record, despite autonomous GPS testing being very limited, the file-based flight plan works as well as or better than the previous version, so I’ve uploaded the latest code to GitHub.

Excel’s pants

I want to make a 3D graph from my flight stats, i.e. given 3 columns, say X and Y from the ground-facing video and Z from the down-facing LiDAR, I want a 3D picture.  It turns out you can’t just plug these 3 columns into a 3D surface chart; you need a mesh, which Excel doesn’t support directly, so you have to purchase another app; the first one I stumbled across was called XYZ MESH, which costs $90 for a single user.  So instead I’ve done a simple version in 85 lines of Python code.

Python Excel Mesh

These are the results of another square flight this morning.  There’s lots more that could be done, such as working out how Excel chose these colours, and getting the points to join up, but the hard bit is done.  Here’s the code.

import csv

min_x = 0.0
max_x = 0.0
min_y = 0.0
max_y = 0.0

with open("3DSquare.csv", 'rb') as fp_csv:
    fp_reader = csv.reader(fp_csv)
    row_num = 0
    data = []
    for row in fp_reader:
        if row_num == 0:
            column_num = 0
            for column in row:
                if column.strip() == "qdx_fuse":
                    x_index = column_num
                elif column.strip() == "qdy_fuse":
                    y_index = column_num
                elif column.strip() == "qdz_fuse":
                    z_index = column_num
                column_num += 1    
            print
            print x_index
            print y_index
            print z_index    
        else:
            x = float(row[x_index].strip())
            y = float(row[y_index].strip())
            z = float(row[z_index].strip())

            if x < min_x:
                min_x = x
            if x > max_x:
                max_x = x
            if y < min_y:
                min_y = y
            if y > max_y:
                max_y = y

            data.append((x,y,z))

        row_num += 1


x_increment = (max_x - min_x) / 250        
y_increment = (max_y - min_y) / 250        

print row_num        

map_dict = {}
x_dict = {}
y_dict = {}
for (x, y, z) in data:

    x_index = int(round(x / x_increment))
    y_index = int(round(y / y_increment))


    x_dict[x_index] = True
    y_dict[y_index] = True
    map_dict[(x_index,y_index)] = z

lines = []
line = ""
for x in sorted(x_dict.keys()):
    line += ", %f" % x   

lines.append(line + "\n")   

for y in sorted(y_dict.keys()):
    line = "%f" % y

    for x in sorted(x_dict.keys()):
        if (x, y) in map_dict:
            line += ", %f" % map_dict[(x,y)]
        else:
            line += ", "

    lines.append(line + "\n")

with open("mashed.csv", "wb") as mash:
    for line in lines:
        mash.write(line)                

The reason I want this will become clear tomorrow as part of my last post for a week while I’m at Disney.


Ah, that looks better…

Way cooler!

Autonomous Autopilot, Almost

So the autopilot is now fully autonomous, able to direct the main motion processor between various intermediate GPS waypoints to the final destination.  It will soft-abort flights via a controlled immediate landing if objects are in the way or the number of ‘visible’ satellites drops below a safety limit, and it lands the same way at the end of the flight.

Or it would if my GPS receiver / the weather would behave.  It’s only ever seen 11 satellites max, and today it was struggling to keep 8.  Roughly 10 seems to be the lower limit my Mavic will accept for GPS control, and today it had no problem finding 11+.  Hermione, on the other hand, could only see 8 at best, and this often dropped mid-flight.  Without a constant set of satellites the position drifts – I’ve seen 40+m errors today.  Luckily I have code to abort a flight if the number of satellites drops below the minimum, and that’s what I’ve seen happen; in addition, the GPS is used to produce the direction of travel, but not the speed, which is fixed at a gentle 0.3m/s to allow the down-facing video to track lateral motion accurately.

Here’s the GPS data I got for my standard 2m square flight; as you can see, the return to base, which in real life was very accurate, was 4m out by the GPS’ account:

GPS accuracy & resolution

So I need the skies to clear and, ideally, to find a better GPS receiver, and herein lies the problem: I’m off to Disneyland Paris next week.  Sod’s law says GPS conditions will be perfect here in England next week!


P.S. Just found myself a USB receiver that covers both GPS and GLONASS.  GPS is USA-owned and has 31 satellites; GLONASS is the Russian equivalent with 24 satellites; hopefully together I’ll be able to see a much higher number of satellites, and hence get much better accuracy as a result.  Sadly, the proof of this pudding will only happen after a week of Disney junk-food eating. 🙁

Yaw(n)

I think I’ve finished writing the code to support GPS tracking: a target GPS location is stored prior to the flight; the flight takes off pointing in any direction, and once it has learned its own GPS takeoff position, it heads towards the target, updating the direction to the target each time it gets a new GPS fix for where it is, and finally landing when its current position matches the target GPS position.
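
For short flights like these, the distance and bearing from the current fix to the target can be approximated very simply; this is an illustrative sketch rather than the actual autopilot code:

import math

EARTH_RADIUS_M = 6371000.0

def gps_vector(lat1, lon1, lat2, lon2):
    # Equirectangular approximation - plenty good enough over a few tens of metres.
    lat1, lon1, lat2, lon2 = [math.radians(a) for a in (lat1, lon1, lat2, lon2)]
    dx = (lon2 - lon1) * math.cos((lat1 + lat2) / 2) * EARTH_RADIUS_M    # metres east
    dy = (lat2 - lat1) * EARTH_RADIUS_M                                  # metres north
    distance = math.sqrt(dx * dx + dy * dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0                   # 0 = true north
    return distance, bearing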

There’s a lot of testing to be done before I can try it live.  The new autopilot code is pretty much a rewrite from scratch.  It now has multiple partial flight plans:

  • take-off
  • landing
  • file defined direction + speed + time
  • GPS initial location finding
  • GPS tracking to destination
  • Scanse abort.

It switches between these depending on either external Scanse or GPS inputs, or on the current flight plan section completing, e.g. the GPS flight plan reaches its destination, or the file-based flight plan has completed its various time-based lateral movements and hovers (both of which then swap to the landing flight plan).  Luckily, most of this testing can be carried out passively.
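
As a rough illustration only (the phase names come from the list above, but the structure is mine rather than the new autopilot code), the switching boils down to something like this:

def next_flight_plan(current, event):
    # Scanse proximity or loss of GPS always forces the soft-abort landing.
    if event in ("scanse_proximity", "gps_lost"):
        return "landing"
    # Otherwise, move on when the current partial flight plan reports completion.
    if event == "plan_complete":
        transitions = {
            "takeoff": "gps_locate",     # or "file_plan" for a file-defined flight
            "gps_locate": "gps_track",
            "gps_track": "landing",
            "file_plan": "landing",
        }
        return transitions.get(current, "landing")
    return current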

The first test preempts all of these; as per yesterday’s post, I should be able to use my magnetometer, combined with the local magnetic declination, to find Hermione’s initial orientation compared with true (i.e. GPS) north.  At the same time, I could use the magnetometer to provide long-term yaw values to be fused with the integrated yaw rate from the gyro.  Here’s what I got from two sequential passive tests:

Compass & integrated Gyro yaw

I’m not hugely impressed with the results of either.

  • The compass and integrated gyro yaw don’t match as tightly as I was expecting – in its current state, I won’t be fusing the compass direction with the integrated gyro yaw as a long-term correction unless I can improve this; the sort of simple fusion I have in mind is sketched after this list.
  • The blobs in the lower pair are compass orientation values as she sits on the ground – halfway through I rotated her clockwise by roughly 90°.  The rotation angle is pretty good, as is the direction (NW -> NE -> SE), but I don’t like the scattered density of the blobs while she sat still at each orientation – I think I’ll have to use an average value for the starting orientation passed to the autopilot for GPS tracking processing.
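
The long-term fusion I have in mind is a simple complementary filter; the weighting here is illustrative and untuned:

def fuse_yaw(gyro_yaw_deg, compass_yaw_deg, tau=0.98):
    # The integrated gyro yaw provides the short-term shape; the compass slowly
    # pulls it back towards the long-term truth.  Work with the smallest angular
    # difference so that 359 degrees and 1 degree fuse sensibly.
    error = (compass_yaw_deg - gyro_yaw_deg + 180.0) % 360.0 - 180.0
    return gyro_yaw_deg + (1.0 - tau) * error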

Magnetic Declination

The one thing holding me back with GPS tracking is the direction Hermione is pointing at launch; in particular, she can work out the flight direction to the target based on the difference between the GPS positions of the takeoff point and the target, but Hermione’s compass only gives orientation angles relative to magnetic north, and the difference between true and magnetic north varies over time and space.

After digging around, it turns out “Magnetic Declination” is the name of the angle between magnetic and true north, which for me is currently -1° 5′, i.e. at takeoff, when Hermione’s compass says she’s pointing at magnetic north, she’s actually pointing 1° 5′ away from true north.  For me, that means it’s irrelevant for the short-distance flights I’m implementing.
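
Spelling out how small that correction is (sign conventions vary between sources, so treat the declination here simply as the angle added to a magnetic heading to obtain a true heading):

declination_deg = 1.0 + 5.0 / 60.0    # 1 degree 5 minutes is roughly 1.08 degrees

def true_heading(magnetic_heading_deg):
    # For a correction of about a degree, this is negligible over a few meters of flight.
    return (magnetic_heading_deg + declination_deg) % 360.0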

In a perfect world, I would have a magnetic declination table mapping GPS longitude / latitude to declination angle, or use a GPS receiver which has one built in (they do exist, apparently), but for now, simply knowing the magnetic declination is negligible for my few-meter flights is enough.

Just FYI, Cambridge’s magnetic declination is -0° 30′, which is even more negligible!