GPS, Compass and Object Avoidance

It’s time to down tools on

  • Zoe – she just can’t run fast enough to test on grass
  • yaw control – I just can’t work out why something goes very wrong three seconds after the yaw control kicks in

and move on to the next steps.

Compass

The compass data is already available, but needs calibrating to allow for magnetic materials in the area.  I’ve even got the calibration code written, but it fails: it uses the gyro to confirm the compass has been turned through more than 360°, finds the maximum and minimum readings over that rotation, and hence the offset caused by the local magnetic environment.  The angles from the gyro were utter rubbish, and I have no idea why – I just need to try harder here.
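
For reference, this is roughly the shape of that min / max calibration – a minimal sketch only, assuming a read_compass() call that returns raw magnetometer X, Y and Z values (the names here are illustrative, not Hermione’s actual code):

import time

def calibrate_compass_sketch(read_compass, samples=600, delay=0.05):
    # Rotate the frame slowly through more than 360 degrees while this runs.
    max_mgx = max_mgy = max_mgz = float("-inf")
    min_mgx = min_mgy = min_mgz = float("inf")

    for ii in range(samples):
        mgx, mgy, mgz = read_compass()

        max_mgx = max(max_mgx, mgx)
        max_mgy = max(max_mgy, mgy)
        max_mgz = max(max_mgz, mgz)
        min_mgx = min(min_mgx, mgx)
        min_mgy = min(min_mgy, mgy)
        min_mgz = min(min_mgz, mgz)

        time.sleep(delay)

    # The hard-iron offset is the midpoint of the extremes on each axis;
    # subtract these offsets from all subsequent readings before deriving yaw.
    return ((max_mgx + min_mgx) / 2,
            (max_mgy + min_mgy) / 2,
            (max_mgz + min_mgz) / 2)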

GPS

I’ve put together a hand-held GPS tracker and taken it for a walk outside my house*.  I also took a photo of our house with my Mavic and overlaid the results (it’s worth clicking on the image to see it full size):

GPS tracking

It’s not perfect, but the shape tracked by the GPS is reasonable enough once the GPS has settled; note that both green splodges are at the same physical point, but only the end one is plotted in the right position – the first few samples from the GPS are off, which I’ll have to remember when incorporating this into Hermione’s flight plan.  Ignoring those initial samples, the errors were mostly less than a meter once away from the house.  The GPS code for Hermione will be closely based on what I used for the handheld version:

from __future__ import division
import gps
import os
import math

###################################################################################################
# Set up the tty to be used in /etc/default/gpsd - if the GPS is via USB, this is /dev/ttyUSB0.   #
# The python script listens on port 2947 (gpsd) of localhost.  This can be reconfigured in        # 
# /etc/default/gpsd also.                                                                         #
###################################################################################################

session = gps.gps()
session.stream(gps.WATCH_ENABLE | gps.WATCH_NEWSTYLE)

num_sats = 0
latitude = 0.0
longitude = 0.0
time = ""
epx = 0.0
epy = 0.0
epv = 0.0
ept = 0.0
eps = 0.0
climb = 0.0
altitude = 0.0
speed = 0.0
direction = 0.0

lat = 0.0
lon = 0.0
alt = 0.0
new_lat = False
new_lon = False
new_alt = False

base_station_set = False

dx = 0.0
dy = 0.0
dz = 0.0

R = 6371000 # radius of the earth in meters

fp_name = "gpstrack.csv"
header = "time, latitude, longitude, satellites, climb, altitude, speed, direction, dx, dy, dz, epx, epy"

os.system("clear")

print header

with open(fp_name, "wb") as fp:
    fp.write(header + "\n")

    #---------------------------------------------------------------------------------
    # With a base-level longitude and latitude in degrees, we can derive the current X and Y
    # coordinates relative to the takeoff position thus:
    # psi = latitude => p below
    # lambda = longitude => l below
    # Using equirectangular approximation:
    #
    # x = (l2 - l1) * cos ((p1 + p2) / 2)
    # y = (p2 - p1)
    # d = R * (x*x + y*y) ^ 0.5
    #
    # More at http://www.movable-type.co.uk/scripts/latlong.html
    #---------------------------------------------------------------------------------

    while True:
        try:
            report = session.next()
#            print report
#            os.system("clear")
            if report['class'] == 'TPV':
                if hasattr(report, 'time'):  # Time
                    time = report.time

                if hasattr(report, 'ept'):   # Estimated timestamp error - seconds
                    ept = report.ept

                if hasattr(report, 'lon'):   # Longitude in degrees
                    longitude = report.lon
                    new_lon = True

                if hasattr(report, 'epx'):   # Estimated longitude error - meters
                    epx = report.epx

                if hasattr(report, 'lat'):   # Latitude in degrees
                    latitude = report.lat
                    new_lat = True

                if hasattr(report, 'epy'):   # Estimated latitude error - meters
                    epy = report.epy

                if hasattr(report, 'alt'):   # Altitude - meters
                    altitude = report.alt
                    new_alt = True

                if hasattr(report, 'epv'):   # Estimated altitude error - meters
                    epv = report.epv

                if hasattr(report, 'track'): # Direction - degrees from true north
                    direction = report.track

                if hasattr(report, 'epd'):   # Estimated direction error - degrees
                    epd = report.epd

                if hasattr(report, 'climb'): # Climb velocity - meters per second
                    climb = report.climb

                if hasattr(report, 'epc'):   # Estimated climb error - meters per seconds
                    epc = report.epc

                if hasattr(report, 'speed'): # Speed over ground - meters per second
                    speed = report.speed

                if hasattr(report, 'eps'):   # Estimated speed error - meters per second
                    eps = report.eps


            if report['class'] == 'SKY':
                if hasattr(report, 'satellites'):
                    num_sats = 0
                    for satellite in report.satellites:
                        if hasattr(satellite, 'used') and satellite.used:
                            num_sats += 1

            #-----------------------------------------------------------------------------
            # Calculate the X,Y coordinates in meters
            #-----------------------------------------------------------------------------
            if new_lon and new_lat and new_alt and num_sats > 6:

                new_lon = False
                new_lat = False
                new_alt = False

                lat = latitude * math.pi / 180
                lon = longitude * math.pi / 180
                alt = altitude


                if not base_station_set:
                    base_station_set = True

                    base_lat = lat
                    base_lon = lon
                    base_alt = alt

                dx = (lon - base_lon) * math.cos((lat + base_lat) / 2) * R
                dy = (lat - base_lat) * R
                dz = (alt - base_alt)

            else:
                continue


            output = "%s, %f, %f, %d, %f, %f, %f, %f, %f, %f, %f, %f, %f" % (time,
                                                                             latitude,
                                                                             longitude,
                                                                             num_sats,
                                                                             climb,
                                                                             altitude,
                                                                             speed,
                                                                             direction,
                                                                             dx,
                                                                             dy,
                                                                             dz,
                                                                             epx,
                                                                             epy)




            print output
            fp.write(output + "\n")
        except KeyError:
            pass
        except KeyboardInterrupt:
            break
        except StopIteration:
            session = None
            print "GPSD has terminated"
            break

The main difference will be that while this code writes to file, the final version will write to a shared memory pipe / FIFO, much like the camera video macro-blocks do now.  The GPS will run in a separate process, posting new results as ASCII lines into the FIFO; Hermione’s main process picks up these new results with the select() she already uses.  The advantage of the two processes is that they can run on different cores of Hermione’s CPU, and that the 9600 baud GPS UART won’t slow down the main motion processing, which just reads the results from the pipe.
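
As a sketch of that plumbing (the FIFO path and the three-field line format here are purely illustrative, not Hermione’s actual layout):

import os
import select

GPS_FIFO = "/dev/shm/gps_fifo"                     # illustrative path only

def gps_writer(dx, dy, dz, fifo=GPS_FIFO):
    # GPS process: post one ASCII line per fix into the FIFO.  Note that
    # opening a FIFO for writing blocks until the reader has opened it.
    if not os.path.exists(fifo):
        os.mkfifo(fifo)
    with open(fifo, "w") as fp:
        fp.write("%f, %f, %f\n" % (dx, dy, dz))
        fp.flush()

def gps_reader(fifo_fd, timeout=0.01):
    # Motion process: fifo_fd is the FIFO opened O_RDONLY | O_NONBLOCK and
    # watched by the same select() call as the video macro-block FIFO.
    readable, writeable, exceptional = select.select([fifo_fd], [], [], timeout)
    if fifo_fd not in readable:
        return None
    lines = os.read(fifo_fd, 4096).splitlines()
    if not lines or not lines[-1]:
        return None
    dx, dy, dz = [float(field) for field in lines[-1].split(",")]
    return dx, dy, dz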

Lateral object avoidance

My Scanse Sweep is due to arrive imminently, and based on her specs, I plan to attach her to Hermione’s underside – she’ll have 4 blind spots due to Hermione’s legs, but otherwise a clear view to detect objects up to 40m away.  Her data comes in over a UART like the GPS, and like the GPS, the data is ASCII text, which makes it easy to parse.  The LOA does churn out data at 115,200 bps though, so it too will live in a separate process.  Only proximity alerts will be passed to Hermione over yet another pipe, again listened to by Hermione’s select(); the LOA code will just log the rest, providing a scan of the boundaries of wherever she’s flying.
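
I haven’t seen the Sweep’s output yet, so this is just a guess at the shape of that process – the line format, serial port and threshold below are placeholders rather than the Sweep’s real protocol:

import serial                                      # pyserial

PROXIMITY_THRESHOLD = 2.0                          # meters - placeholder value

def sweep_process(alert_fp, port="/dev/ttyAMA0", log_name="sweep.csv"):
    # Read ASCII lines from the Sweep's UART, log everything, and only push
    # readings closer than the threshold into the alert pipe that Hermione's
    # select() is watching.
    uart = serial.Serial(port, baudrate=115200)
    with open(log_name, "w") as log:
        while True:
            line = uart.readline().strip()
            try:
                angle, distance = [float(field) for field in line.split(",")]
            except ValueError:
                continue                           # skip partial / status lines

            log.write("%f, %f\n" % (angle, distance))

            if distance < PROXIMITY_THRESHOLD:
                alert_fp.write("%f, %f\n" % (angle, distance))
                alert_fp.flush()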

Buttercups and daisies…

are still lacking this spring, and having mown the lawn yesterday, I’ve left the video lateral tracking with precious few features to find.  So I think this is a pretty good 37s hover.  In fact, I think it’s as good as it can be until the daisies start sprouting:

This is with a frame size of 640² pixels.  There’s a check in the code which reports whether the processing keeps up with the video frame rate.  At 640² it does; I tried 800² and 720², but at those sizes the code failed to keep up with the 20fps video frame rate.
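
The check itself is just bookkeeping; something along these lines, comparing frames pulled off the video FIFO against what the camera should have produced by now (a sketch with illustrative names, not the actual code):

import time

def keeping_up(frames_processed, start_time, fps=20):
    # Frames the camera will have produced since the start, versus frames
    # actually processed; more than a second's worth behind means the motion
    # processing isn't keeping up with the video frame rate.
    expected = (time.time() - start_time) * fps
    return (expected - frames_processed) < fps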

As a result, I’ve uploaded the changes to GitHub.  There’s work-in-progress code there for calibrating the compass “calibrateCompass()”, although that’s turning out to be a right PITA.  I’ll explain more another time.

As a side note, my Mavic uses two forward-facing cameras to track horizontal movement stereoscopically, combined with GPS, a corresponding ground-facing pair of cameras and IMU accelerometer integration, yet if you watch the frisbee / baseball bat to the left, even the Mavic drifts.

Stats analysis

From the same run as the last two posts, I’ve been analysing the stats.

First the integrated gyro showing the yaw it detected:

yaw

This shows the unintentional clockwise (negative) yaw of about -10° shortly after take-off followed by the intentional anticlockwise yaw of +90° as Hermione headed off left.  I still need to have a play with the yaw rate PID to try to kill that initial -10° yaw.

More interesting, to me at least, is the motion processing:

FIFO stats

The graph shows how often the FIFO was emptied.  In an unhindered world, the FIFO is emptied every 10ms (0.01s) and the PWM to the ESCs updated.  Peaks above that mean something else was taking up the spare time.  The FIFO overflows at just above 84ms (512 bytes total FIFO size / 12 bytes per IMU sample / 500Hz IMU sampling rate = 85.333ms), and the highest peak shown here is 38ms, well within the safety limit.  I’m particularly delighted that the majority of the spikes are within the 10 to 20ms range – that strongly suggests the split phases of macro-block processing are working like a dream.

The 20ms norm means the PWM is updated at 50Hz.  Were the PWM consistently updated at less than 50Hz, it would really start to show in the stability of the flight.  But luckily it isn’t, meaning there’s probably just enough room to finally squeeze in compass and GPS processing.

In passing, it’s worth saying that this level of stats would be impossible if I were using a microcontroller (Arduino etc.) – this 11s flight logged 1.46MB of data to shared memory, and ultimately to SD card.  It logs both the initial hard-coded constants and every dynamic variable for every cycle of motion processing – nothing is missing, so it’s possible to diagnose any problem as long as the reader knows the code intimately.  Without these logs, it would have been nigh on impossible for the ignorant me of 4+ years ago to achieve what I have now.


* I rate myself as experienced having spent over 4 years on this.

The GPS + compass plan

My intent with GPS and compass is that Hermione flies from an arbitrary take-off location to a predetermined target GPS location, oriented in the direction she’s flying.

Breaking that down into a little more detail:

  • Turn Hermione on and calibrate the compass, and wait for enough GPS satellites to be acquired.
  • Carry her to the destination landing point and capture the GPS coordinates, saving them to file.
  • Move to a random place in the open flying area and kick off the flight.
  • Before take-off, acquire the GPS coordinates of the starting point, and from those and the target coordinates, derive the 3D flight direction vector – see the sketch below.
  • On takeoff, climb to 1m, and while hovering, yaw to point in the direction of the destination target vector using the compass as the only tool to give a N(X), W(Y), Up(Z) orientation vector – some account needs to be taken for magnetic north (compass) vs. true north (GPS)
  • Once done, fly towards the target, always pointing the way she’s flying (i.e. the yaw target is linked to the velocity sensor input), with the current GPS position updating throughout the flight so that the direction vector to the destination is continuously realigned.
  • On arrival at the target GPS location, she hovers for a second (i.e. braking) and descends.

There’s a lot of detail hidden in the summary above, not least the fact that GPS provides yet another feed of 3D distance and velocity vectors to be fused with the accelerometer / PiCamera / LiDAR, so I’m going to have to go through it step by step.
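
The direction vector in the fourth bullet can reuse the same equirectangular approximation as the hand-held tracker code above – a sketch, assuming the base and target latitudes / longitudes are already in radians and the altitudes in meters (translating the result into Hermione’s north / west / up frame and allowing for magnetic declination are left out here):

import math

R = 6371000                                        # radius of the earth in meters

def flight_direction(base_lat, base_lon, base_alt, target_lat, target_lon, target_alt):
    # Equirectangular approximation, as per the tracker code above.
    dx = (target_lon - base_lon) * math.cos((target_lat + base_lat) / 2) * R   # east
    dy = (target_lat - base_lat) * R                                           # north
    dz = target_alt - base_alt                                                 # up

    distance = math.pow(dx * dx + dy * dy + dz * dz, 0.5)
    return dx / distance, dy / distance, dz / distance                         # unit vector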

The first step is to fly a square again, but with her orienting towards the next corner during each hover and, once moving, having yaw follow the direction of movement.  Next comes compass calibration, and a flight plan based upon magnetic north, west and up.

However, someone’s invoked Murphy’s / Sod’s law on me again: Hermione is seeing the I2C errors again despite no hardware or software changes in this area.  Zoe is behaving better, and I’m trying to double the motion tracking by doubling the video frame rate and the sampling rate for the Garmin LiDAR-Lite; the rate change is working for both, but the LiDAR readings seem to be duff, reading 60cm when the flight height is less than 10cm.  Grr 🙁

Hermione’s proof of the pudding

Here finally is her flying in a stable hover for a long time without rocketing off into space.  Yes, she’s wobbly, but that’s a simple pitch / roll rotation rate PID tune much like I had to do with Zoe.  She’s running the video at 560 x 560 pixels at 10 fps, hence no need for the IKEA play mat.

Finally I can move on to adding the compass and GPS into the mix.

Hermione’s progress

Here’s Hermione with her new PCB.  It’s passed the passive tests; the next step is to make sure each of the 8 motor ESCs is connected the right way round to its respective PWM output on the PCB, and finally I’ll do a quick flight with only the MPU-9250 as the sensor to tune the X8 PID gains.  Then she’s getting shelved.

Hermione’s progress

Zoe’s getting a new PCB so I can run the camera and Garmin LiDAR-Lite V3 on her too.  Hermione is huge compared to Zoe, and with the winter weather setting in, I’m going to need a system that’s small enough to test indoors.

Hermione will still be built – I need her extra size to incorporate the Scanse Sweep and GPS – but I suspect only once an A3 arrives on the market: Hermione’s processing, with a new 512MB A+ overclocked to 1GHz, is nearly maxed out with the camera and diagnostics.  She’s probably just about got CPU space for the compass and Garmin LiDAR-Lite over I2C, but I think that’s it until the A3 comes to market.  My hope for the A3 is that it uses the same 4-core CPU as the B2, with built-in Bluetooth and WiFi as per the B3, but no USB / ethernet hub to save power.  Fingers crossed.

 

yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in preference for yaw.  There are a few aspects to this:

  1. During flight, the quadcopter currently stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  It should be trivial to change this yaw angle target so that the quadcopter faces the direction of travel; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter is or should be flying.  It’s a little trickier than it sounds, for two reasons (see the sketch after this list):
    • The arctan of the velocity components gives 2 possible angles, and only by considering the signs of both components is the absolute direction defined, e.g. (1, 1) and (-1, -1) need to be 45° and -135° respectively.
    • Care needs to be taken that the transition from one angle to the next goes the shortest way round, and that when flight transitions from movement to hover, the quad doesn’t rotate back to the takeoff orientation just because the velocity is (0, 0) – the angle needs to be held at whatever it was before she stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quadframe and partially in the earth frame.  I need to rotate the X and Y values totally into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this, and I have 2 further concerns:
    • the compass needs calibration each time it boots up, just as is necessary with phones.  You can see from my previous post the offsets of the X and Y values as I span Zoe on my vinyl record player – the circle is not centered on 0, 0.
    • I’m not sure how much iron in the flight zone will affect the calibration, depending on how close the compass is to it; in my test areas that iron may be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
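
For point 1, math.atan2 resolves the quadrant from the signs of both velocity components directly, and wrapping the error into ±180° keeps the turn going the shortest way round – a sketch only; the zero-yaw axis and the hover threshold here are assumptions, not the flight code’s conventions:

import math

YAW_HOLD_SPEED = 0.3                               # m/s - assumed hover threshold

def yaw_target(vx, vy, previous_target):
    # atan2 uses the signs of both components: (1, 1) -> 45 degrees,
    # (-1, -1) -> -135 degrees.
    if math.pow(vx * vx + vy * vy, 0.5) < YAW_HOLD_SPEED:
        # Hovering: hold the previous direction of travel rather than snapping
        # back to the takeoff orientation just because velocity is (0, 0).
        return previous_target
    return math.degrees(math.atan2(vy, vx))

def yaw_error(target, current):
    # Wrap into -180..+180 degrees so the rotation always goes the short way.
    return (target - current + 180) % 360 - 180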

The first step is to include yaw in the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.

FYI the rotation matrix from the Earth to the Quadcopter frame is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
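
Implementing that for point 2 is then a couple of lines – a sketch applying the matrix exactly as written above to the camera’s X / Y increments, with ψ being the integrated gyro yaw in radians:

import math

def rotate_by_yaw(xq, yq, psi):
    # The matrix above applied to the quad-frame camera increments.
    xe =  math.cos(psi) * xq + math.sin(psi) * yq
    ye = -math.sin(psi) * xq + math.cos(psi) * yq
    return xe, ye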

Compass and ViDAR

First, ViDAR: I’ve got fed up with calling it “video macro-block something-or-other”, so now it’s Video Detection And Ranging.  I think it’s now working as well as it can without a height detector; it’s a long way from perfect, but until I can rely on a stable height, I can’t identify further bugs.  I’m almost certainly going to wait for the Garmin LiDAR-Lite (GLL) to provide the height information.

There’s one bug I’ve fixed which explains why the PX4FLOW only uses a gyro – the angles involved are increments not absolutes.  The picture below tries to show that it’s not the absolute tilt of the quadcopter that needs to be accounted for, but the increment:

ViDAR angles
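
Roughly, the correction looks like this: the shift that the tilt increment alone would cause across the frame is subtracted before the remainder is treated as lateral motion.  The field of view and frame width below are illustrative numbers, not the PiCamera’s actual values:

import math

FRAME_WIDTH = 640                                  # pixels - illustrative
FOV = math.radians(50)                             # horizontal field of view - illustrative

def translation_shift(mb_shift, tilt_increment):
    # A tilt increment between two frames shifts the whole image by roughly
    # tilt_increment / FOV of the frame width, regardless of any lateral
    # movement; only what's left over is down to translation.
    return mb_shift - tilt_increment * FRAME_WIDTH / FOV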

I’ve also got the compass working – below, I’ve just pulled out the critical bits.  Most of the samples I found use the MPU9250’s master I2C bus to attach to the compass; compass data is then picked out of MPU9250 registers.  But I found this version which exposes the compass registers for direct access, which is much cleaner and clearer.

####################################################################################################
#
#  Gyroscope / Accelerometer class for reading position / movement.  Works with the Invensense IMUs:
#
#  - MPU-6050
#  - MPU-9150
#  - MPU-9250
#
#  The compass / magnetometer of the MPU-9250 is accessed directly via I2C bypass - see initCompass()
#
####################################################################################################
class MPU6050:
    .
    .
    .
    .
    .
    .
    .
    def initCompass(self):    
        #-------------------------------------------------------------------------------------------
        # Set up the I2C master pass through.
        #-------------------------------------------------------------------------------------------
        int_bypass = self.i2c.readU8(self.__MPU6050_RA_INT_PIN_CFG)
        self.i2c.write8(self.__MPU6050_RA_INT_PIN_CFG, int_bypass | 0x02)

        #-------------------------------------------------------------------------------------------
        # Connect directly to the bypassed magnetometer, and configure it for 16-bit continuous data
        #-------------------------------------------------------------------------------------------
        self.i2c_compass = I2C(0x0C)
        self.i2c_compass.write8(self.__AK893_RA_CNTL1, 0x16)
   
    def readCompass(self):
        compass_bytes = self.i2c_compass.readList(self.__AK893_RA_X_LO, 7)

        #-------------------------------------------------------------------------------------------
        # Convert the array of 6 bytes to 3 shorts - 7th byte kicks off another read
        #-------------------------------------------------------------------------------------------
        compass_data = []
        for ii in range(0, 6, 2):
            lobyte = compass_bytes[ii]
            hibyte = compass_bytes[ii + 1]
            if (hibyte > 127):
                hibyte -= 256

            compass_data.append((hibyte << 8) + lobyte)

        [mgx, mgy, mgz] = compass_data
        return mgx, mgy, mgz


import time

mpu6050 = MPU6050()
mpu6050.initCompass()

try:
    while True:
        mx, my, mz = mpu6050.readCompass()
        print "%d, %d, %d" % (mx, my, mz)
        time.sleep(0.5)


except KeyboardInterrupt, e:
    pass
finally:
    pass


I probably won’t include the compass data for yaw and orientation yet, but at least the main hurdle has been overcome.

MPU-9½50 compass

Just for the record, I’ve finally found out why my first attempt to access the compass had failed: it’s not on the MPU-9½50’s slave I2C bus directly, but is attached as slave 0 on the MPU-9½50’s own master I2C bus.  Thus it’s configured via the MPU-9½50’s slave-0 control registers, and the data is then read by my code from the external sensor data registers that the MPU-9½50 exposes.

There are a couple of bits of code here and here for Arduino usage which probably give me enough useful information to try it out.
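
For what it’s worth, the register-level flow I’m expecting looks something like this – a sketch against the MPU-9250 register map, not tested code, using the same I2C helper as the bypass version earlier on this page:

# Register addresses per the MPU-9250 register map (assumed here, not yet tested).
USER_CTRL        = 0x6A
I2C_MST_EN       = 0x20
I2C_MST_CTRL     = 0x24
I2C_SLV0_ADDR    = 0x25
I2C_SLV0_REG     = 0x26
I2C_SLV0_CTRL    = 0x27
I2C_SLV0_DO      = 0x63
EXT_SENS_DATA_00 = 0x49

AK8963_ADDRESS   = 0x0C
AK8963_HXL       = 0x03
AK8963_CNTL1     = 0x0A

def init_compass_via_slave0(i2c):
    i2c.write8(USER_CTRL, I2C_MST_EN)              # enable the internal I2C master
    i2c.write8(I2C_MST_CTRL, 0x0D)                 # 400kHz master clock

    # Write 0x16 (16-bit, continuous mode 2) to the AK8963's CNTL1 via slave 0.
    i2c.write8(I2C_SLV0_ADDR, AK8963_ADDRESS)
    i2c.write8(I2C_SLV0_REG, AK8963_CNTL1)
    i2c.write8(I2C_SLV0_DO, 0x16)
    i2c.write8(I2C_SLV0_CTRL, 0x81)                # enable, 1 byte

    # Now point slave 0 at the measurement registers; each IMU sample, the
    # results get copied into EXT_SENS_DATA_00 onwards.
    i2c.write8(I2C_SLV0_ADDR, AK8963_ADDRESS | 0x80)   # 0x80 = read
    i2c.write8(I2C_SLV0_REG, AK8963_HXL)
    i2c.write8(I2C_SLV0_CTRL, 0x87)                # enable, 7 bytes (data + status)

def read_compass_via_slave0(i2c):
    # 6 data bytes plus the status byte; same byte shuffling as readCompass() above.
    return i2c.readList(EXT_SENS_DATA_00, 7)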

I’ll probably tackle that next, once I’ve got the first phase of the camera motion tracking working well enough to demonstrate, which could well be later today.