Fecking rangefinders

  • SRF02 can’t run at the 400kbps I2C baudrate necessary for reading the maximum data rate from the MPU-9250 IMU.
  • TeraRanger needs a 12V power supply and provides 5V serial or I2C, meaning level shifting is needed to interface with the Raspberry Pi serial / I2C pins.
  • LEDDAR uses a quirky and slow Modbus protocol over serial, meaning the IMU FIFO overflows if I’m sampling it above 500Hz – 1kHz works perfectly without LEDDAR.  Essentially, it’s wasting time I’m going to need for Camera, GPS, and Scanse Sweep processing.
  • Garmin LiDAR-Lite supports the necessary 400kbps I2C baudrate at 3.3V, but requires low-level I2C access with a 20ms gap between sending the read request and reading the response data.  Arduino provides this low-level access; higher-level smbus I2C via Python does not.  There are also comments around suggesting no other I2C activity can take place during that 20ms, i.e. I can’t access the IMU during the 20ms!

All the sensors work; it’s just the API to access the data that’s non-standard in every one of them.  Did nobody on the design teams consider using a standard API for modern interface technology?  FFS!

P.S. Yes, I know there’s a URF that supports the 400kbps I2C baudrate at 3.3V, but it has a bloody great potentiometer on the underside, meaning it’s nigh-on impossible to attach it ground-facing under a quadcopter.

P.P.S.  I know I could use the PX4FLOW; I actually have three, but only one (the original) works – both clones do not.  And anyway, where’s the fun in that compared to a vertical rangefinder, the Raspberry Pi Camera and the MPU-9250 gyro, i.e. the three components that make up the PX4FLOW?

Compass and ViDAR

First, ViDAR: I’ve got fed up of calling it video macro-block something-or-other, so now it’s Video Detection and Ranging.  I think it’s now working as well as it can without a height detector; it’s a long way from perfect, but until I can rely on a stable height, I can’t identify further bugs.  I’m almost certainly going to wait for the Garmin LiDAR-Lite (GLL) to provide the height information.

There’s one bug I’ve fixed which explains why the PX4FLOW only uses a gyro – the angles involved are increments, not absolutes.  The picture below tries to show that it’s not the absolute tilt of the quadcopter that needs to be accounted for, but the increment:

ViDAR angles

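To make the fix concrete, here’s a minimal sketch (hypothetical names, and an assumed focal length in pixels) of subtracting the per-frame gyro angle *increment* – not the absolute tilt – from the macro-block shift:

```python
import math

def correct_shift(raw_shift_px, gyro_rate_rad_s, frame_dt_s, focal_length_px):
    # The camera pans by (gyro rate x frame interval) radians between
    # consecutive frames; that increment, not the absolute tilt, is what
    # shows up in the macro-block shift and must be subtracted.
    angle_increment = gyro_rate_rad_s * frame_dt_s
    rotation_px = focal_length_px * math.tan(angle_increment)
    return raw_shift_px - rotation_px

# A pure 10 deg/s pan at 10fps with a 320px focal length should cancel
# out to (near) zero translation once the increment is removed.
pan_only_shift = 320 * math.tan(math.radians(10) * 0.1)
residual = correct_shift(pan_only_shift, math.radians(10), 0.1, 320)
print(residual)
```

Using the absolute tilt here instead of the increment over-corrects every frame after the first, which is exactly the bug described above.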

I’ve also got the compass working – I’ve just pulled out the critical bits below.  Most of the samples I found use the MPU-9250 master I2C to attach to the compass; compass data is then picked out of MPU-9250 registers.  But I found this version, which exposes the compass registers for direct access – much cleaner and clearer.

#  Gyroscope / Accelerometer class for reading position / movement.  Works with the Invensense IMUs:
#  - MPU-6050
#  - MPU-9150
#  - MPU-9250
#  The compass / magnetometer of the MPU-9250 is accessed directly via the I2C bypass below
class MPU6050:
    def initCompass(self):    
        # Set up the I2C master pass through.
        int_bypass = self.i2c.readU8(self.__MPU6050_RA_INT_PIN_CFG)
        self.i2c.write8(self.__MPU6050_RA_INT_PIN_CFG , int_bypass | 0x02)

        # Connect directly to the bypassed magnetometer, and configure it for 16-bit continuous data
        self.i2c_compass = I2C(0x0C)
        self.i2c_compass.write8(self.__AK893_RA_CNTL1, 0x16)
    def readCompass(self):
        compass_bytes = self.i2c_compass.readList(self.__AK893_RA_X_LO, 7)

        # Convert the array of 6 bytes to 3 shorts - 7th byte kicks off another read
        compass_data = []
        for ii in range(0, 6, 2):
            lobyte = compass_bytes[ii]
            hibyte = compass_bytes[ii + 1]
            if (hibyte > 127):
                hibyte -= 256

            compass_data.append((hibyte << 8) + lobyte)

        [mgx, mgy, mgz] = compass_data
        return mgx, mgy, mgz

mpu6050 = MPU6050()

try:
    while True:
        mx, my, mz = mpu6050.readCompass()
        print "%d, %d, %d" % (mx, my, mz)

except KeyboardInterrupt, e:
    pass

I probably won’t include the compass data for yaw and orientation yet, but at least the main hurdle has been overcome.
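When I do get round to using it for yaw, a magnetic heading falls out of the compass readings with one atan2() call.  A hedged sketch – hypothetical function name, and no tilt compensation, which real flight would need:

```python
import math

def heading_degrees(mgx, mgy):
    # Yaw relative to magnetic north from the horizontal magnetometer
    # components, normalised to 0..360 degrees.  Tilt compensation via
    # the accelerometer is still needed for anything but level flight.
    return math.degrees(math.atan2(mgy, mgx)) % 360

print(heading_degrees(1.0, 0.0))   # X axis at magnetic north -> 0.0
print(heading_degrees(0.0, 1.0))
```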

What’s better than a PX4FLOW?

The Raspberry Pi camera of course:

RaspiCam Video motion blocks


A very similar walk around the garden as before, but this time running the ground-facing Raspberry Pi camera, videoing at 10 frames per second at 320 x 320 resolution, producing 16 x 16 macro-blocks per frame, which are averaged per frame and logged.

The macro-blocks give the pixel shift between one frame and the next to help with frame compression; I’m not sure whether the units are pixels or macro-blocks, but that’s simple to resolve.  Combined with the height from the LEDDAR and the focal length of the lens, it’ll be trivial to convert these readings to a distance in meters.
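Assuming the units do turn out to be pixels, the conversion is just the pinhole camera model – a sketch with a hypothetical function name and an assumed focal length in pixels:

```python
def shift_to_meters(shift_px, height_m, focal_length_px):
    # Pinhole camera model: a ground-plane translation of d meters seen
    # from height h appears as d * f / h pixels of image shift, so the
    # inverse gives ground distance per frame.
    return shift_px * height_m / focal_length_px

# e.g. a 16 pixel inter-frame shift at 1m height with a 320px focal length
m_per_frame = shift_to_meters(16, 1.0, 320.0)
print(m_per_frame)  # -> 0.05, i.e. 5cm of ground travel per frame
```

If the units are macro-blocks rather than pixels, it’s just one extra multiply by the 16-pixel block size.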

The results here are at least as good as the PX4FLOW, if not better, and the processing of the macro-blocks to distance is very lightweight.

This is definitely worth pursuing as it’s much more in keeping with how I want this to work.  The PX4FLOW has served its purpose well in that, by understanding how it worked, it opened up how it could be replaced with the RPi Camera.

There are further bonuses too.  Because of the video’s fixed frame rate, the macro-blocks produce distance increments, whereas the PX4FLOW only produced velocities; that means I can add horizontal distance PIDs to kill drift and ensure the quad always hovers over the same spot.  Even better, I’m no longer gated on the arrival of the new PCBs: those were required for X8 and I2C for the PX4FLOW; I’ll need them eventually for X8, but for now the current PCB provides everything I need.

We are most definitely on a roll again!

Raw PX4FLOW data

Raw PX4FLOW stats


Another walk around the garden, with nearly raw data – raw data is in pixels per second, so requires knowledge of the camera’s resolution and its focal length; the next step up from there comes from the PX4FLOW knowing the camera stats, and so being able to convert pixels per second to radians per second.  To then convert to velocities requires the height, hence the URF.

In this test, I’d actually removed the URF because it’s still not working, and instead used a fixed height of 1m as a reasonable approximation of the height I was holding the test rig off the ground.  The units on the graph are meters, and the results look mostly accurate up to a point: from 0,0 there are some squiggles getting the PX4FLOW from the office to the garden, then I walked along the side of the house for ~10m, then across the garden for about 7.5m, before returning back to the office.  It’s that last section that makes no sense: the direction of the walk is accurate, and you can see the corresponding squiggles at the end as I took the PX4FLOW back into the office, but the graph clearly shows the start and end “offices” separated by 5m x 7.5m.

There’s absolutely no way I can deploy the PX4FLOW in Chloe / Hermione while there’s a 9m (the hypotenuse of a 5 x 7.5 triangle) inaccuracy in the integration of the horizontal velocities over the course of a 98.5s stroll.


Progress report

Here’s Chloe’s HoG in Hermione’s frame.  You can see I’ve put a large WiFi antenna on one of the side platforms; the other is for GPS in the future.  The frame itself is not quite complete – I still need to install a platform on the underside to hang the sensors off.  In addition, the LID (LiDAR Installation Desktop) needs assembling – it’s just here for show currently.

Chloe's HoG in Hermione's frame


Here’s a flight with just accelerometer and gyro in control for basic stability testing.

With these 1340 high-pitch Chinese CF props, there’s no shortage of lift power despite the fact she weighs 2.8kg, so I’m going to defer the X8 format for a while on financial grounds – 4 new T-motor U3 motors and 40A ESCs cost nearly £400.

The PCBs are on order, and first setup will be for LEDDAR and PX4FLOW.

Oddly, only one of my PX4FLOWs works properly – for some reason the newer one can’t see its URF, so it can’t provide velocities, only angular shift rates; however, LEDDAR will give me the height, allowing me to convert the angular rates to horizontal velocities.  If that works, it also opens up the possibility of replacing the PX4FLOW with a RasPi Camera, using the H.264 video macro-block increments to do the equivalent of the PX4FLOW’s motion processing myself, which, if possible, would please me greatly.

Still lurking in the background is getting the compass working to overcome the huge yaw you can see in the video.


Hermione development calendar



  • New legs (awaiting arrival)
  • New PX4FLOW 1.3.1 (awaiting arrival of a slow boat from China)
  • New PCBs – X8 compatible but testing first in quad mode



  • URF (and perhaps LEDDAR) height fused with double integrated accelerometer Z axis for qdz_input distance PID
  • PX4FLOW x and y velocities fused with single integrated accelerometer X, Y axes for qvx_input, qvy_input velocity PIDs
  • ignore the PX4FLOW gyro – stick with the MPU9250
  • ignore the PX4FLOW timings – stick to the MPU9250 sampling clock for integration etc
  • working compass
  • look after the kids during their summer holidays


  • get X8 motors and ESCs but hold off installation depending on progress of the above.



  • GPS via USB (or 2 port hub if A3 wire built in WiFi has not been announced by then)
  • external antenna for A3 if arrived
  • test flight to GPS defined target at fixed speed / height
  • Cotswold Jam 24th September 1pm – 4pm



  • A3 (speculative arrival)



  • Remove LEDDAR and instead use UART for Scanse Sweep
  • test obstacle avoidance on a GPS target flight

A walk in the park

I wired the PX4FLOW up to my test rig, knocked together some test code, and set the I2C baudrate to 400kbps to make sure it worked at the same rate the IMU needs.

PX4FLOW test rig


I took it for a walk around the garden: from the office to the garden via the hallway, then an anticlockwise 6m diameter circle around the garden before returning back to the office.  The code was sampling at about 20Hz, with the test rig about 60cm from the ground with the Y axis always pointing away from me.  The walk took about 80s.

Here’s the X, Y distance graph based upon integrating the velocities the PX4FLOW gives.

Garden plot


A quick walk through:

  • 0,0 is in the office
  • throughout the test the Y axis pointed away from me
  • beyond the 4m point, I walked in an anti-clockwise circle
  • once complete I doubled back and headed back to the office.

I’m delighted with the garden section: because the Y axis was always facing forwards, from the PX4FLOW’s point of view it always went forwards, but when transformed to the earth frame using the gyro readings, it shows a really accurate plot shape.  Given this was just a green grass surface, I’m amazed the down-facing camera did such a good job.
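That body-to-earth frame transform can be sketched like this – hypothetical names, with yaw coming from the integrated gyro:

```python
import math

def integrate_track(samples):
    # samples are (vx_body, vy_body, yaw_rad, dt) tuples.  Rotate each
    # body-frame velocity into the earth frame by the current yaw angle,
    # then integrate to build up the X, Y track.
    x = y = 0.0
    track = [(x, y)]
    for vx, vy, yaw, dt in samples:
        x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
        y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
        track.append((x, y))
    return track

# Sanity check: walking "forwards" at 1m/s while yaw sweeps one full
# circle should trace a loop that closes back at the start point.
n = 100
loop = integrate_track([(0.0, 1.0, 2 * math.pi * i / n, 0.05) for i in range(n)])
print(loop[-1])
```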

Here’s the height graph from the inbuilt URF:


It’s good except for the spikes – I may need LEDDAR to make this better.  On the plus side, the height is not integrated, so the spikes do not have a persistent effect.

There were a few problems or inaccuracies:

  • the sensor should timestamp each read, but the register value never changed, so I had to do this myself with time.time() – I have a second sensor on the way to check whether it’s this sensor’s fault (search eBay for PX4FLOW to find them)
  • the scale of the circle is wrong: the graph shows the circle to be about 3m diameter, but it should be more like 6m – this may just be a problem in my maths
  • the end of the walk should return to the start, yet it’s 6m out – the shape of the walk out of and back to the office matches, but there’s a 30° error by the end of the flight.  I suspect only the compass will fix this.

One unexpected positive was that although I’ve heard the I2C SMBus only supports 32 byte buffers, it seemed fine reading the 77 byte registers in a single sequence.
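If that 77-byte read ever does hit the 32-byte SMBus limit, chunking it is straightforward – a sketch with a stand-in read function rather than real smbus calls:

```python
def read_registers(read_block, start_reg, length, chunk=32):
    # Split one long register read into SMBus-sized chunks; assumes the
    # device lets you address each register directly so the read can be
    # resumed from any offset.
    data = []
    offset = 0
    while offset < length:
        n = min(chunk, length - offset)
        data.extend(read_block(start_reg + offset, n))
        offset += n
    return data

# Fake 77-register device purely for illustration
registers = list(range(77))
fake_read = lambda reg, n: registers[reg:reg + n]
print(len(read_registers(fake_read, 0, 77)))  # -> 77
```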

Overall then, as a first trial, I’m delighted – to the extent that it’s now worth getting the new PCB for Chloe / Hermione.

Flight of the body snatcher

Body snatcher flight from Andy Baker on Vimeo.

So this is the first flight of Chloe’s HoG running on Hermione’s frame.

She drifts right as Chloe always has, but more so, and she doesn’t have LEDDAR installed, so the hover height is wrong.  She’s seriously underpowered with the T-motor 1240 props; the X8 configuration will fix that, but will require new PCBs.

Due to the lack of LEDDAR, she’s running the velocity rather than distance as the outer PID; combined with the lower power, that means she doesn’t get as high as she should.

Finally, you’ll notice significant yaw that eventually corrects as the yaw rate PID I-gain kicks in; again, that’s due to the size of the props, and X8 will resolve it.

Next steps are to

  • add code for PX4FLOW
  • get legs to allow underside sensors
  • add the PX4FLOW – LEDDAR is going into (permanent?) retirement as long as the PX4FLOW provides good vertical height data with its URF
  • bodge up the current PCB to add another 5V output pin for the PX4FLOW
  • bodge up the PX4FLOW I2C wiring for 0.1″ pitch connection to the PCB
  • get the PCBs for X8.

Plenty to keep me busy for a while!


Basic Installation

  • sudo apt-get install gpsd gpsd-clients python-gps
  • Edit /etc/default/gpsd and add the GPS device into the boot configuration of gpsd
  • Reboot or restart the GPS daemon gpsd.
  • Try these various tests to make sure it’s working.

Python Script

I’ve written an extension of an Adafruit python script which captures GPS data and writes it to screen and file:

from __future__ import division
import gps
import os
import math

# Listen on port 2947 (gpsd) of localhost
session = gps.gps("localhost", "2947")
session.stream(gps.WATCH_ENABLE | gps.WATCH_NEWSTYLE)

num_sats = 0
latitude = 0.0
longitude = 0.0
time = ""
epx = 0.0
epy = 0.0
epv = 0.0
ept = 0.0
eps = 0.0
climb = 0.0
altitude = 0.0
speed = 0.0
direction = 0.0

lat1 = 0.0
lon1 = 0.0

R = 6371000 # radius of the earth in meters

fp_name = "gpstrack.csv"
header = "time, latitude, longitude, satellites, climb, altitude, speed, direction, dx, dy"


print header

with open(fp_name, "wb") as fp:
    fp.write(header + "\n")

    # With a base-level longitude and latitude in degrees, we can derive the current X and Y
    # coordinates relative to the takeoff position thus:
    # phi = latitude => p below
    # lambda = longitude => l below
    # Using the equirectangular approximation:
    # x = (l2 - l1) * cos ((p1 + p2) / 2)
    # y = (p2 - p1)
    # d = R * ((x*x + y*y) ^ 0.5)
    # More at http://www.movable-type.co.uk/scripts/latlong.html

    while True:
        try:
            report = session.next()
#            print report
#            os.system("clear")
            if report['class'] == 'TPV':
                if hasattr(report, 'time'):  # Time 
                    time = report.time
                if hasattr(report, 'ept'):   # Estimated timestamp error - seconds 
                    ept = report.ept

                if hasattr(report, 'lon'):   # Longitude in degrees
                    longitude = report.lon
                if hasattr(report, 'epx'):   # Estimated longitude error - meters 
                    epx = report.epx

                if hasattr(report, 'lat'):   # Latitude in degrees
                    latitude = report.lat
                if hasattr(report, 'epy'):   # Estimated latitude error - meters 
                    epy = report.epy

                if hasattr(report, 'alt'):   # Altitude - meters
                    altitude = report.alt
                if hasattr(report, 'epv'):   # Estimated altitude error - meters
                    epv = report.epv

                if hasattr(report, 'track'): # Direction - degrees from true north
                    direction = report.track     
                if hasattr(report, 'epd'):   # Estimated direction error - degrees 
                    epd = report.epd

                if hasattr(report, 'climb'): # Climb velocity - meters per second
                    climb = report.climb
                if hasattr(report, 'epc'):   # Estimated climb error - meters per second
                    epc = report.epc

                if hasattr(report, 'speed'): # Speed over ground - meters per second
                    speed = report.speed     
                if hasattr(report, 'eps'):   # Estimated speed error - meters per second
                    eps = report.eps

            if report['class'] == 'SKY':
                if hasattr(report, 'satellites'):
                    num_sats = 0
                    for satellite in report.satellites:
                        if hasattr(satellite, 'used') and satellite.used:
                            num_sats += 1

            # Calculate the X,Y coordinate increments in meters: x is the
            # longitude difference scaled by the cosine of the mean latitude
            lat2 = latitude * math.pi / 180
            lon2 = longitude * math.pi / 180

            dx = (lon2 - lon1) * math.cos((lat1 + lat2) / 2) * R
            dy = (lat2 - lat1) * R
            lon1 = lon2
            lat1 = lat2

            output = "%s, %f, %f, %d, %f, %f, %f, %f, %f, %f" % (time, 

            print output
            fp.write(output + "\n")
        except KeyError:
        except KeyboardInterrupt:
        except StopIteration:
            session = None
            print "GPSD has terminated"

Here’s what I get plotting a walk around the garden carrying her – it’s not a great match.

Garden track


0,0 is my office window; everything beyond -10m, -20m is the garden.  The walk in the garden was

  • 15m SSW
  • 9m ESE
  • 18m N

That’s an approximate right-angle triangle, but the scan doesn’t show that without a lot of imagination.  The scaling is right, but the direction is very noisy.  I’m not sure whether this is a code bug or just pushing the limits of GPS accuracy.  I’ll have a few more goes to see if I can get any better.  GPS samples are at roughly 1Hz.
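For reference, the equirectangular conversion the script uses, pulled out as a standalone sketch with a hypothetical function name:

```python
import math

R = 6371000.0  # mean earth radius in meters

def latlon_to_xy(lat1, lon1, lat2, lon2):
    # Equirectangular approximation - plenty accurate over garden-sized
    # distances: x is east and y is north, both in meters.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    l1, l2 = math.radians(lon1), math.radians(lon2)
    x = (l2 - l1) * math.cos((p1 + p2) / 2) * R
    y = (p2 - p1) * R
    return x, y

# One arc-second of latitude is ~30.9m of northing anywhere on earth.
x, y = latlon_to_xy(51.0, -2.0, 51.0 + 1.0 / 3600, -2.0)
print(x, y)
```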

Code update plans for Chloe

  • ultimately, that output file will be a FIFO read by Chloe’s code.  She’ll use select.select() to monitor the file contents and sleep, whereas now she just sleeps.
  • I’ll add command line parameters --indoors or --outdoors to select the range of sensors to be used – primarily GPS; PX4FLOW / Scanse Sweep and LEDDAR will work in both environments.
  • If using the GPS, add startup code to ensure at least 4 satellites are feeding the GPS for accurate triangulation – during the walk, I had 4 indoors and 6 outdoors; obviously the more satellites, the greater the accuracy.
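The select.select() idea in the first bullet can be sketched like this – here an anonymous pipe stands in for the real named FIFO the GPS script will write to:

```python
import os
import select

# A pipe stands in for the GPS FIFO here; on the quad it would be
# os.open() on the named FIFO instead.
r, w = os.pipe()
os.write(w, b"time, lat, lon\n")

# Block for up to half a second waiting for fresh GPS output, rather
# than sleeping a fixed interval and hoping data has arrived.
readable, _, _ = select.select([r], [], [], 0.5)
line = os.read(r, 1024).decode() if readable else ""
print(line.strip())
os.close(r)
os.close(w)
```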


Chloe, LEDDAR and distance PIDs from Andy Baker on Vimeo.

My distance / direction PIDs are now at nearly perfect tuning, and along the way, I’ve also learned a lot about how to tune the three tiers of PIDs in the horizontal motion hierarchy; it’s been slow, hence the lack of posts.

As far as the PIDs are concerned, the key is that the I-gain on all but the vertical velocity and yaw PIDs is now set to 1/100th of the P-gain; judging by the still-existing wobbles, it could probably go lower yet, though I think I still need some to manage windier conditions.
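For illustration only – this is not the flight code – here’s a minimal PID showing that gain relationship:

```python
class PID:
    # Textbook PID update, just to make the gain ratio concrete.
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def compute(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        d = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * d

# Horizontal PIDs run with the I-gain at 1/100th of the P-gain:
kp = 1.0
pid = PID(kp, kp / 100, 0.0)
demand = pid.compute(1.0, 0.0, 0.01)
print(demand)  # almost all P term, with just a whisker of I
```

With the ratio that small, the I term only matters once an error persists for many seconds – exactly the slow windup wanted for leaning into a steady breeze without amplifying the wobbles.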

The remaining horizontal drift is likely to be unfixable without the Scanse Sweep or PX4FLOW; here it’s just using double integration of (acceleration – gravity), which, frankly, I’m amazed performs as well as it does!

The yaw still needs some attention; currently, it too has two PIDs (angle and rotation), and I think that may be one too many – the latter can probably be removed.  At the same time, since the input is the integration of the gyro, it’s never going to be much better until I get the compass working – and it’s continuing to not play ball.

So for now, focus is on minor PID tuning, sorting out the yaw PIDs and getting the compass working, but before then, I think she’s ready to finally try some intentional horizontal motion; I have a flight plan put together that has her tracing a horizontal square at fixed height – 2 second segments each of take-off, forward, left, back, right and finally landing.  The forecast for today is good, so I’ll try that later and post the results unless they’re shamefully embarrassing!

Oh, and the reason behind the title of this post?  It’s now confirmed that the distance PIDs are working, so a combination of the PX4FLOW for 3-axis velocities, along with Scanse Sweep object avoidance, could be used to wander around a simple maze successfully.  The premature but cunning plan to build a back-garden maze is already underway.