Hermione’s proof of the pudding

Here finally is her flying in a stable hover for a long time without rocketing off into space.  Yes, she’s wobbly, but that’s a simple pitch / roll rotation rate PID tune much like I had to do with Zoe.  She’s running the video at 560 x 560 pixels at 10 fps, hence no need for the IKEA play mat.

Finally I can move on to adding the compass and GPS into the mix.

First lateral flight plan.

For the first time in 4 years, I tried a lateral flight plan, fairly confident in the belief that stable hover in a headwind is no different to intentional movement forward in no wind.  The flight plan was:

  • Climb at 30cm/s for 2s
  • Hover for 1s
  • Move forwards at 30cm/s for 2s
  • Hover for 1s
  • Move backwards at 30cm/s for 2s
  • Hover for 1s
  • Descend at 30cm/s for 2s
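
For illustration, a flight plan like this boils down to a list of velocity targets and durations in the earth frame.  Here's a minimal sketch of that idea; the (vx, vy, vz, duration) tuple layout and the velocity_target() helper are purely illustrative, not the flight-plan format the quadcopter code actually parses:

#---------------------------------------------------------------------------------------------------
# Illustrative only: the lateral flight plan as (vx, vy, vz, duration) legs in m/s and seconds.
#---------------------------------------------------------------------------------------------------
FLIGHT_PLAN = [
    (0.0,  0.0,  0.3, 2.0),   # climb at 30cm/s for 2s
    (0.0,  0.0,  0.0, 1.0),   # hover for 1s
    (0.3,  0.0,  0.0, 2.0),   # move forwards at 30cm/s for 2s
    (0.0,  0.0,  0.0, 1.0),   # hover for 1s
    (-0.3, 0.0,  0.0, 2.0),   # move backwards at 30cm/s for 2s
    (0.0,  0.0,  0.0, 1.0),   # hover for 1s
    (0.0,  0.0, -0.3, 2.0),   # descend at 30cm/s for 2s
]

def velocity_target(flight_plan, elapsed_time):
    # Walk the legs to find the velocity target for the current elapsed flight time.
    leg_start = 0.0
    for vx, vy, vz, duration in flight_plan:
        if elapsed_time < leg_start + duration:
            return vx, vy, vz
        leg_start += duration
    return 0.0, 0.0, 0.0      # plan complete - hover

The point being that the velocity PIDs neither know nor care whether a non-zero target comes from the flight plan or from fighting a headwind; the processing is identical either way.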

Was I right?  You tell me:

Zoe++

Zoe is now running my split cluster gather + process code for the RaspiCam video macro-blocks.  She has super-bright LEDs from Broadcom with ceramic heatsinks so the frame doesn’t melt and she’s running the video at 400 x 400 px at 10fps.

The results?:

And this, peeps, is nearly as good as it can be without more CPU cores or (heaven forbid) moving away from interpreted CPython to pre-compiled C*.  Don’t get me wrong, I can (will?) probably add minor tweaks to process the compass data – the code is already collecting it; adding intentional lateral motion to the flight plan costs absolutely nothing – hovering stably in a steady headwind is identical processing to intentional forwards movement in no wind.  But beyond that, I need more CPU cores, without significant additional power requirements, to support GPS and Scanse Sweep.  I hope that’s what the A3 eventually brings.

I’ve updated everything I can on GitHub to represent the current (and perhaps final) state of play.


* That’s not quite true; PyPy is Python with a just-in-time (JIT) compiler.  Apparently, it’s the dogs’ bollocks, the mutts’ nuts, the puppies’ plums.  Yet when I last tried it, it was slower, probably due to the RPi.GPIO and RPIO libraries it needs.  Integrating those with PyPy requires a lot of work which, up until now, simply hasn’t been necessary.

On a cold winter’s morning…


Both flights use identical code.  There are two tweaks compared to the previous videos:

  1. I’ve reduced the gyro rate PID P gain from 25 to 20 which has hugely increased the stability
  2. Zoe is using my refined algorithm for picking out the peaks in the macro-block frames – I think this is working better but there’s one further refinement I can make which should make it better yet.

I’d have liked to show Hermione doing the same, but for some reason she’s getting FIFO overflows.  My best guess is that her A+ overclocked to turbo (1GHz CPU) isn’t as fast as a Zero’s default setting of 1GHz – no idea why.  My first attempt at a fix has been improved scheduling, splitting the macro-block vector processing into two phases:

  1. build up the dictionary of the set of macro-blocks
  2. process the dictionary to identify the peaks.

Zoe does this in one fell swoop; Hermione schedules each phase independently, checking in between that the FIFO hasn’t filled up to a significant level and, if it has, dealing with that first.  This isn’t quite working yet in passive testing, even on Zoe, and I can’t work out why!  More anon.
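
To make the split concrete, here's a rough sketch of the two phases with a check on the IMU FIFO in between.  The names (gather_vectors, process_vectors, fifo_fill_level, process_imu_fifo) are hypothetical and for illustration only – this is the shape of the idea, not the refined algorithm in the real code:

#---------------------------------------------------------------------------------------------------
# Illustrative sketch of the two-phase macro-block scheduling; function names are hypothetical.
#---------------------------------------------------------------------------------------------------
def gather_vectors(iframe):
    # Phase 1: build a dictionary counting how often each (x, y) vector appears in the frame.
    vector_dict = {}
    for ii in range(0, len(iframe), 3):
        vector = (iframe[ii], iframe[ii + 1])
        vector_dict[vector] = vector_dict.get(vector, 0) + 1
    return vector_dict

def process_vectors(vector_dict):
    # Phase 2: pick the peak - the most frequently occurring vector - as the frame's motion estimate.
    return max(vector_dict, key=vector_dict.get)

# Zoe runs both phases back to back per frame; Hermione runs phase 1, checks whether the IMU FIFO
# needs draining first, and only then runs phase 2, e.g.
#
#     vector_dict = gather_vectors(iframe)
#     if fifo_fill_level() > FIFO_THRESHOLD:    # hypothetical check
#         process_imu_fifo()                    # drain the FIFO before it overflows
#     peak_x, peak_y = process_vectors(vector_dict)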

Hermione’s I/O errors

Rather than spending ages trying to work out how to get rid of Hermione’s electronic errors, I’ve opted for a much simpler solution: powering her HoG from a 5V 2.1A battery bank.  I used to do this in the olden days, and only swapped to LiPo + regulator when I needed space for the LiDAR and camera.  But Hermione has plenty of spare space for both.  Here’s what happened:

She was doing pretty well tracking against the lawn in windy conditions, right up to the point she went nuts and zoomed off into the sky.  Shortly after, the LiDAR detected she was higher than the flight plan said she should be, and that killed the flight.  Once she vanishes from the shot, watch for her reflection in the window as she falls down to earth.  She landed in a soft flower bed, but that still resulted in two cut motor wires.  No I/O error happened (good), but this type of radical error normally accompanies a FIFO overflow, and one wasn’t reported either.  That leaves me stuck having to keep risking these dodgy flights until I can track down the cause.

30s with, <10s without

A few test runs.  In summary, with the LiDAR and camera fused with the IMU, Zoe stays over her play mat at a controlled height for the length of the 30s flight.  Without the fusion, she lasted just a few seconds before she drifted off the mat, lost her height, or headed towards me with menace (a kill ensued).  I think that’s pretty conclusive proof the fusion code works!

With Fusion:

Without Fusion:

Strutting her stuff

Finally, fusion worth showing.

Yes, the height’s a bit variable, as she doesn’t get accurate height readings below about 20cm.

Yes, it’s a bit jiggery because the scales of the IMU and the other sensors aren’t quite in sync.

But fundamentally, it works – nigh on zero drift for 20s.  With just the IMU, I couldn’t get this minimal level of drift for more than a few seconds.

Next steps: take her out for a longer, higher flight to really prove how well this is working.

Goal!

The paint had hardly dried on the raspivid solution for unbuffered macro-block streaming before a conversation on the Raspberry Pi forums yielded the way to get picamera to stream the video macro-blocks without buffering: explicitly open an unbuffered output file rather than leaving it to picamera to second-guess what’s needed.
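
Concretely, instead of handing picamera a filename for the motion output and letting it wrap that in its own buffering, you open the FIFO yourself with buffering disabled and pass the file object in.  A sketch of the approach (the resolution, frame rate and /dev/shm/motion_stream path mirror the motion-tracking code later on this page; the 20s wait_recording is arbitrary):

import io
import picamera

with picamera.PiCamera() as camera:
    camera.resolution = (320, 320)
    camera.framerate = 10

    # Open the motion-vector FIFO with buffering explicitly disabled so each frame's macro-blocks
    # are written out immediately rather than sitting in a buffer.
    with io.open("/dev/shm/motion_stream", mode="wb", buffering=0) as motion_out:
        camera.start_recording('/dev/null', format='h264', motion_output=motion_out, quality=23)
        camera.wait_recording(20)
        camera.stop_recording()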

Cutting to the chase, here’s the result.

Down-facing RaspiCam stabilized Zoe flight from Andy Baker on Vimeo.

Yes, she drifted forward into the football goal over the course of the 20 seconds, but that’s 4 times longer to drift that far, so frankly I’m stunned how good the flight was!  I have ideas as to why, and those will be in my next post.

And the best bit? No damage was done to my son’s goal net!

 

Camera motion tracking code

I’ve reworked the code so that the video is collected in a daemonized process and fed into a shared-memory FIFO.  The main code for processing the camera output is now organised so it can easily be integrated into the quadcopter code.  This’ll happen for Hermione once I’ve built her with a new PCB and a 512MB A+.  I suspect the processing overhead of this is very light, given that most of the video processing happens in the GPU, and the motion processing is some very simple averaging.

#!/usr/bin/python
from __future__ import division
import os
import sys
import picamera
import select
import struct
import subprocess
import signal
import time

####################################################################################################
#
# motion.py - motion tracker based upon video-frame macro-blocks.  Note this happens per flight - 
#             nothing is carried between flights when this is merged with the quadcopter code.
#
####################################################################################################

#--------------------------------------------------------------------------------------------------
# Video at 10fps. Each frame is 320 x 320 pixels.  Each macro-block is 16 x 16 pixels.  Due to an 
# extra column of macro-blocks (dunno why), that means each frame breaks down into 21 columns by 
# 20 rows = 420 macro-blocks, each of which is 4 bytes - 1 signed byte X, 1 signed byte Y and 2 unsigned
# bytes SAD (sum of absolute differences). 
#--------------------------------------------------------------------------------------------------
def RecordVideo():
    print "Video process: started"
    with picamera.PiCamera() as camera:
        camera.resolution = (320, 320)
        camera.framerate = 10

        camera.start_recording('/dev/null', format='h264', motion_output="/dev/shm/motion_stream", quality=23)

        try:
            while True:
                camera.wait_recording(1.0)
        except KeyboardInterrupt:
            pass
        finally:            
            try:
                camera.stop_recording()
            except IOError:
                pass
    print "Video process: stopped"

#---------------------------------------------------------------------------------------------------
# Check if I am the video process            
#---------------------------------------------------------------------------------------------------
if len(sys.argv) > 1 and sys.argv[1] == "video":
    RecordVideo()
    sys.exit()

#---------------------------------------------------------------------------------------------------
# Setup a shared memory based data stream for the PiCamera video motion output
#---------------------------------------------------------------------------------------------------
os.mkfifo("/dev/shm/motion_stream")

#---------------------------------------------------------------------------------------------------
# Start up the video camera as a new process. Run it in its own process group so that Ctrl-C doesn't
# get through.
#---------------------------------------------------------------------------------------------------
def Daemonize():
    os.setpgrp()
video = subprocess.Popen(["python", "motion.py", "video"], preexec_fn=Daemonize)

#---------------------------------------------------------------------------------------------------
# Off we go; set up the format for parsing a frame of macro blocks
#---------------------------------------------------------------------------------------------------
format = '=' + 'bbH' * int(1680 / 4)

#---------------------------------------------------------------------------------------------------
# Wait until we can open the FIFO from the video process
#---------------------------------------------------------------------------------------------------
camera_installed = True
read_list = []
write_list = []
exception_list = []

if camera_installed:
    while True:
        try:
            py_fifo = open("/dev/shm/motion_stream", "rb")
        except:
            continue
        else:
            break
    print "Main process: fifo opened"
    read_list = [py_fifo]


motion_data = open("motion_data.csv", "w")
motion_data.write("idx, idy, adx, ady, sad\n")

total_bytes = 0
frame_rate = 10
scale = 10000

#---------------------------------------------------------------------------------------------------
# Per frame distance and velocity increments
#---------------------------------------------------------------------------------------------------
ivx = 0.0
ivy = 0.0
idx = 0.0
idy = 0.0

#---------------------------------------------------------------------------------------------------
# Per flight absolute distance and velocity integrals
#---------------------------------------------------------------------------------------------------
avx = 0.0
avy = 0.0
adx = 0.0
ady = 0.0

start_time = time.time()

try:
    while True:
        #-------------------------------------------------------------------------------------------
        # The sleep time for select is defined by how many batches of data are sitting in the FIFO
        # compared to how many we want to process per motion processing (samples per motion)
        #
        # timeout = (samples_per_motion - mpu6050.numFIFOSamples()) / sampling_rate
        # check for negative timeout.
        #-------------------------------------------------------------------------------------------

        #-------------------------------------------------------------------------------------------
        # Wait for the next whole frame
        #-------------------------------------------------------------------------------------------
        read_out, write_out, exception_out = select.select(read_list, write_list, exception_list)

        if camera_installed and len(read_out) != 0:
            #---------------------------------------------------------------------------------------
            # We have new data on the video FIFO; get it, and make sure it really is a whole frame of 
            # 21 columns x 20 rows x 4 bytes for macro block.
            #---------------------------------------------------------------------------------------
            frame = py_fifo.read(1680)
            if len(frame) != 1680:
                print "ERROR: incomplete frame received"
                break

            #---------------------------------------------------------------------------------------
            # Convert to byte, byte, ushort of x, y, sad
            #---------------------------------------------------------------------------------------
            iframe = struct.unpack(format, frame)

            #---------------------------------------------------------------------------------------
            # Iterate through the 21 x 20 macro-blocks, summing the X and Y vectors across the frame
            # along with the SAD (sum of absolute differences, lower is better).
            #---------------------------------------------------------------------------------------
            ivx = 0.0
            ivy = 0.0
            sad = 0

            for ii in range(0, 420 * 3, 3):
                ivy += iframe[ii]
                ivx += iframe[ii + 1]
                sad += iframe[ii + 2]

            #---------------------------------------------------------------------------------------
            # Scale the macro block values to the speed increment in meters per second
            #---------------------------------------------------------------------------------------
            ivx /= scale
            ivy /= scale     

            #---------------------------------------------------------------------------------------
            # Use the frame rate to convert velocity increment to distance increment
            #---------------------------------------------------------------------------------------
            idt = 1 / frame_rate
            idx = ivx * idt
            idy = ivy * idt

            #---------------------------------------------------------------------------------------
            # Integrate (sum due to fixed frame rate) the increments to produce total distance and velocity.
            # Note that when producing the diagnostic earth frame distance, it's the increment in distance
            # that's rotated and added to the total ed*
            #---------------------------------------------------------------------------------------
            avx += ivx
            avy += ivy
            adx += idx
            ady += idy

            time_stamp = time.time() - start_time

            motion_data.write("%f, %f, %f, %f, %f, %d\n" % (time_stamp, idx, idy, adx, ady, sad))

except:
    #-----------------------------------------------------------------------------------------------
    # A bare except deliberately catches Ctrl-C (and anything else) so the cleanup below always runs:
    # stop the video process, close the files and remove the FIFO.
    #-----------------------------------------------------------------------------------------------
    pass


#---------------------------------------------------------------------------------------------------
# Stop the video process
#---------------------------------------------------------------------------------------------------
video.send_signal(signal.SIGINT)

motion_data.flush()
motion_data.close()

py_fifo.close()
os.unlink("/dev/shm/motion_stream")