Compass and ViDAR

First, ViDAR: I’ve got fed up with calling it video macro-block something-or-other, so now it’s Video Detection and Ranging.  I think it’s now working as well as it can without a height detector; it’s a long way from perfect, but until I can rely on stable height readings, I can’t identify further bugs.  I’m almost certainly going to wait for the Garmin LiDAR-Lite (GLL) to provide the height information.

There’s one bug I’ve fixed which explains why the PX4FLOW only uses a gyro – the angles involved are increments, not absolutes.  The picture below tries to show that it’s not the absolute tilt of the quadcopter that needs to be accounted for, but the increment:

ViDAR angles
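
The gist of the fix in code is something like this – a rough sketch with made-up names, not the real flight code – accumulate the gyro increments between video frames, and it’s those increments that go into the h·tan() correction covered in “Notes on ‘Fusion Thoughts’” further down:

class TiltIncrementTracker:
    # Accumulate the quad-frame gyro rates between video frames; it's these increments, not the
    # absolute pitch / roll angles, that feed the ViDAR tilt compensation.
    def __init__(self):
        self.delta_pitch = 0.0
        self.delta_roll = 0.0

    def track(self, pitch_rate, roll_rate, dt):
        # Called per IMU sample: integrate the rotation rates over the sample interval.
        self.delta_pitch += pitch_rate * dt
        self.delta_roll += roll_rate * dt

    def collect(self):
        # Called per video frame: hand back the angle increments for this frame pair and reset.
        dp, dr = self.delta_pitch, self.delta_roll
        self.delta_pitch = 0.0
        self.delta_roll = 0.0
        return dp, dr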

I’ve also got the compass working – I’ve just pulled out the critical bits below.  Most of the samples I found use the MPU-9250’s I2C master to attach to the compass; the compass data is then picked up from MPU-9250 registers.  But I found this version, which exposes the compass registers for direct access – much cleaner and clearer.

####################################################################################################
#
#  Gyroscope / Accelerometer class for reading position / movement.  Works with the Invensense IMUs:
#
#  - MPU-6050
#  - MPU-9150
#  - MPU-9250
#
#  The compass / magnetometer of the MPU-9250 is handled by initCompass() / readCompass() below
#
####################################################################################################
class MPU6050:
    .
    .
    .
    .
    .
    .
    .
    def initCompass(self):    
        #-------------------------------------------------------------------------------------------
        # Set up the I2C master pass through.
        #-------------------------------------------------------------------------------------------
        int_bypass = self.i2c.readU8(self.__MPU6050_RA_INT_PIN_CFG)
        self.i2c.write8(self.__MPU6050_RA_INT_PIN_CFG, int_bypass | 0x02)

        #-------------------------------------------------------------------------------------------
        # Connect directly to the bypassed magnetometer, and configure it for 16-bit continuous data
        #-------------------------------------------------------------------------------------------
        self.i2c_compass = I2C(0x0C)
        self.i2c_compass.write8(self.__AK893_RA_CNTL1, 0x16)
   
    def readCompass(self):
        compass_bytes = self.i2c_compass.readList(self.__AK893_RA_X_LO, 7)

        #-------------------------------------------------------------------------------------------
        # Convert the array of 6 bytes to 3 signed shorts; the 7th byte is the status register, and
        # reading it is what frees the data registers for the next sample.
        #-------------------------------------------------------------------------------------------
        compass_data = []
        for ii in range(0, 6, 2):
            lobyte = compass_bytes[ii]
            hibyte = compass_bytes[ii + 1]
            if (hibyte > 127):
                hibyte -= 256

            compass_data.append((hibyte << 8) + lobyte)

        [mgx, mgy, mgz] = compass_data
        return mgx, mgy, mgz


mpu6050 = MPU6050()
mpu6050.initCompass()

try:
    while True:
        mx, my, mz = mpu6050.readCompass()
        print "%d, %d, %d" % (mx, my, mz)
        time.sleep(0.5)


except KeyboardInterrupt, e:
    pass
finally:
    pass


I probably won’t include the compass data for yaw and orientation yet, but at least the main hurdle has been overcome.
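
When I do get round to it, the first cut of a heading will probably be something along these lines – a minimal sketch that assumes the quad is level and ignores hard / soft iron calibration:

import math

def heading_degrees(mgx, mgy):
    # Yaw / heading from the raw X / Y magnetometer readings, assuming the quad is level.  The sign
    # and zero point depend on how the magnetometer axes are mounted relative to the quad frame.
    heading = math.degrees(math.atan2(mgy, mgx))
    if heading < 0:
        heading += 360
    return heading

# e.g. with the readings from the loop above:
#     print "heading: %f degrees" % heading_degrees(mx, my)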

Huzzah!

First test flight with the new code fusing camera motion with accelerometer motion, and it all just worked; despite a breeze she drifted just a few centimeters during an eight-second flight.  The flight would have been longer, but without a height sensor Zoe descended ever so slowly during flight, and I had to kill the flight before she clipped her take-off cube and damaged her props!

I’ve double-checked the stats, and they clearly show the camera was feeding into the horizontal distance mix, as shown by these spikes – just a bit of tuning required here to level them out.

Spikey!

Next step is to take a video so you can see for yourselves!  But that’s for another day, I’m too high on adrenaline to risk another!

Here she is with her new rainbow ‘runway’:

Post-flight pride

Rainbow runway

I need to do a bit more testing, then I’ll update the code on GitHub.

Notes on “Fusion Thoughts”

A couple of things on the second half of the previous post, about video macro-block fusion.  First, it’s

δx2 = h2 tan(θ)

not

δx2 = atan(δx2 / h2)

The latter is working out θ, which is known anyway.  Just a bit of carelessness in writing down my thoughts.

More interestingly, the picture represents positive forward movement from left to right across the page.  However, the pitch angle is negative following the right-hand rule, which my code adheres to.  Therefore, the full conversion equation is:

δx = δx1 + h2 tan(θ)

The situation on the Y / roll axis is different.  If you mentally flip the diagram left / right, movement from right to left represents a positive move on the Y axis.  The roll angle in this case is also positive according to the right-hand rule, so the equation looks like this:

δy = δy1 - h2 tan(φ)

There are other places in the code which already follow this convention.
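
Pulled together, the horizontal conversion is just this – a sketch with my own names, where theta is pitch, phi is roll and h is the height:

import math

def earth_frame_shift(dx1, dy1, h, theta, phi):
    # Apply the sign conventions above: pitch (theta) is negative for forward motion under the
    # right-hand rule, while roll (phi) is positive for the equivalent Y-axis move.  dx1 / dy1 are
    # the camera-derived shifts and h is the height.
    dx = dx1 + h * math.tan(theta)
    dy = dy1 - h * math.tan(phi)
    return dx, dy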

Fusion thoughts

Here’s my approach to handling the tilt for downward facing vertical and horizontal motion tracking.

Vertical compensation

This is what I’d already worked out to take into account any tilt of the quadcopter frame, and therefore of the LEDDAR sensor readings.

With this in place, this is the equivalent processing for the video motion tracking:

Horizontal compensation

The results of both are in the earth frame, as are the flight plan targets, so I think I’ll swap to earth-frame processing for everything up to the last step.
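
For the vertical half, the gist is something like this – my shorthand for the diagram, assuming flat ground under the sensor:

import math

def earth_frame_height(leddar_distance, pitch, roll):
    # The LEDDAR measures along the quad's tilted Z axis; project that reading back onto the
    # earth-frame vertical.
    return leddar_distance * math.cos(pitch) * math.cos(roll)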

One problem: I’m now pushing the limit of the code keeping up with the sensors.  With diagnostics on and 1kHz IMU sampling, the IMU FIFO overflows because the code can’t keep up.  This is with Zoe (1GHz CPU speed) and without LEDDAR.

LEDDAR has already forced me to drop the IMU sample rate to 500Hz on Chloe; I really hope this is enough to also allow LEDDAR, macro-block processing and diagnostics to work without FIFO overflow.  I really don’t want to drop to the next level of 333Hz if I can help it.

Coding is already in progress.

Video motion working perfectly…

but I’m not sure how to fix it.

Imagine a video camera facing the ground over a high-colour / high-contrast surface.  If the camera moves forward, the macro-blocks show forward movement.  But if the camera stays stationary and pitches backwards, the macro-blocks show a similar forward movement, because the rearward tilt means the camera is now pointing forward of where it was.

The motion processing code sees this rearward tilt as forward motion, and so applies more rearward tilt to compensate – I think you can see where this leads!

I’m obviously going to have to remove the perceived forward motion due to rearward pitch by applying some maths using the pitch angle, but I’m not quite sure what that maths is yet.  The need for this is backed up by the fact that the PX4FLOW has a gyro built in, so it too must be using it for this type of compensation.  The fact it’s a gyro suggests it may not be rotation matrix maths, but something simpler.  Thinking hat on.


Just realized I’d solved exactly this problem for height: knowing the height (the PX4FLOW has a URF) and the incremental angle change in the quadcopter frame (the PX4FLOW has a gyroscope), you can calculate the distance shift due to the tilt, subtract it from the camera values, and thus come up with the real distance.
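
To put rough numbers on it: at 1m height, a pitch increment of just 1° between frames shifts the image by about 1m × tan(1°) ≈ 17mm, which at a 10Hz frame rate would look like roughly 0.17m/s of spurious horizontal velocity if it weren’t subtracted out.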

Everything but the kitchen sink from macro blocks?

Currently the macro-blocks are just averaged to extract lateral distance increment vectors between frames.  Because the frame rate is fixed, these also yield velocity increments.  Both can then be integrated to produce absolutes.  But I suspect there’s even more information available.
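
The averaging itself is nothing clever – roughly this, where the scaling from macro-block units to metres is illustrative and depends on height and lens geometry:

def frame_increment(macro_blocks, frame_interval, scale):
    # macro_blocks is a list of (dx, dy, sad) vectors, one per 16 x 16 pixel macro-block, for a
    # single frame; frame_interval is the (fixed) time between frames; scale converts macro-block
    # units to metres at the current height.  Returns the distance and velocity increments.
    sum_x = 0
    sum_y = 0
    count = 0
    for dx, dy, sad in macro_blocks:
        sum_x += dx
        sum_y += dy
        count += 1

    if count == 0:
        return 0.0, 0.0, 0.0, 0.0

    distance_x = scale * sum_x / count
    distance_y = scale * sum_y / count
    return distance_x, distance_y, distance_x / frame_interval, distance_y / frame_interval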

Imagine videoing an image like this multi-coloured circle:

Newton Disc

It’s easy to see that if the camera moved laterally between two frames, it’d be pretty straightforward for the video compression algorithm to break the change down into a set of identical macro-block vectors, each showing the direction and distance the camera had shifted between frames.  This is what I’m doing now by simply averaging the macro-blocks.

But imagine the camera doesn’t move laterally; instead it rotates, and the distance between camera and circle increases.  By rotating each macro-block vector according to its position in the frame relative to the center point and then averaging the results, you get a circumferential component representing yaw change, and a radial component representing height change.

I think the way to approach this is first to get the lateral movement by simply averaging the macro-block vectors as now; the yaw and height components will cancel themselves out.

Then by shifting the contents of the macro-block frame by the averaged lateral movement, the axis is brought to the center of the frame – some macro-blocks will be discarded to ensure the revised macro-block frame is square around the new center point.

Each of the macro-block vectors is then rotated according to its position in the new square frame.  The angle of each macro-block in the frame is pretty easy to work out (e.g. a 2×2 square of blocks has rotation angles of 45, 135, 225 and 315 degrees; a 3×3 square has blocks rotated by 0, 45, 90, 135, 180, 225, 270 and 315 degrees), so averaging the X and Y components of these rotated macro-block vectors gives a measure of yaw and size change (height).  I’d need to include each block’s distance from the center when averaging out these rotated blocks.
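
Roughly what I have in mind for the rotation and averaging – my own sketch, with calibration factors deliberately left out:

import math

def yaw_and_height_change(blocks):
    # blocks is a list of (col, row, dx, dy): col / row are each macro-block's position relative to
    # the center of the re-centered square frame, and (dx, dy) is its vector with the averaged
    # lateral shift already removed.  A pure yaw change moves a block at radius r by r * dyaw along
    # the tangent; a pure height / scale change moves it by r * dscale along the radius - hence the
    # divide by r when averaging.
    yaw_sum = 0.0
    scale_sum = 0.0
    count = 0
    for col, row, dx, dy in blocks:
        r = math.sqrt(col * col + row * row)
        if r == 0:
            continue
        angle = math.atan2(row, col)
        scale_sum += (dx * math.cos(angle) + dy * math.sin(angle)) / r
        yaw_sum += (dy * math.cos(angle) - dx * math.sin(angle)) / r
        count += 1

    if count == 0:
        return 0.0, 0.0
    return yaw_sum / count, scale_sum / count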

At a push, even pitch and roll could be obtained because they would distort the circle into an oval.

Yes, there’s calibration to do, there’s a dependency on textured, multicoloured surfaces, and the accuracy will be very dependent on frame size and rate.  Nevertheless, in a perfect world, it should all just work(TM).  How cool would it be to have the Raspberry Pi camera providing this level of information!  No other sensors would be required except for a compass for orientation, GPS for location, and LiDAR for obstacle detection and avoidance.  How cool would that be!

Anyway, enough dreaming for now, back to the real world!

Goal!

The paint had hardly dried on the raspivid solution for unbuffered macro-block streaming before a conversation on the Raspberry Pi forums yielded the way to get picamera to stream the video macro-blocks without buffering: explicitly open an unbuffered output file rather than leave it to picamera to second-guess what’s needed.
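
For the record, the change boils down to something like this – illustrative settings, not my actual flight code:

import io
import picamera

with picamera.PiCamera() as camera:
    camera.resolution = (320, 320)
    camera.framerate = 10

    # The fix: open the motion-vector output ourselves, unbuffered, rather than passing picamera a
    # filename and letting it choose the buffering.
    motion_stream = io.open("motion.dat", mode="wb", buffering=0)

    camera.start_recording("/dev/null", format="h264", motion_output=motion_stream)
    camera.wait_recording(20)
    camera.stop_recording()
    motion_stream.close()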

Cutting to the chase, here’s the result.

Down-facing RaspiCam stabilized Zoe flight from Andy Baker on Vimeo.

Yes, she drifted forward into the football goal over the course of the 20 seconds, but that’s 4 times longer to drift that far, so frankly I’m stunned how good the flight was!  I have ideas as to why, and they’ll be in my next post.

And the best bit? No damage was done to my son’s goal net!

 

Motion tracking is now working

Courtesy of a discussion with 6by9 on the Raspberry Pi forum, the motion tracking for Zoe is now working using raspivid, which has an option not to buffer the macro-block output.  Here’s a plot of a passive flight where she moves forward and back (X) a couple of times, and then left and right (Y).

Macro-block vs accelerometer

The plot clearly shows the macro-blocks and accelerometer X and Y readings are in sync (if not quite at the same scale), so tomorrow’s the day to set Zoe loose in the garden with the motors powered up – fingers crossed no cartwheels across the lawn this time!

Video buffering

I flew Zoe over the weekend with the camera motion running, and it was pretty exciting watching her cartwheel across the lawn – clearly there’s more to do in the testing before I try again!

So I did a passive flight in the ‘lab’ just now and got these stats; I had hoped to show a comparison of the accelerometer measured distances vs the camera video distances, but that’s not what I got:

Motion stats

The lower two graphs are the interesting ones: the left one shows how bad the integration error is with the accelerometer – all the detail in the accelerometer readings is swamped by integrated offset errors.  It also shows that we are only getting data from the video every four seconds.

So I did a test with my raw motion code (i.e. no overheads from the other sensors in the quadcopter code), and it showed those four-second batches contain 39 samples – almost exactly four seconds’ worth at the configured 10Hz frame rate – so clearly something is buffering the video output.

So the next step is to work out whether it’s the FIFO or the video pipeline that’s doing the buffering, and how to stop it!