Stats analysis

From the same run as the last two posts, I’ve been analysing the stats.

First the integrated gyro showing the yaw it detected:

yaw

This shows the unintentional clockwise (negative) yaw of about -10° shortly after take-off followed by the intentional anticlockwise yaw of +90° as Hermione headed off left.  I still need to have a play with the yaw rate PID to try to kill that initial -10° yaw.

More interesting, to me at least, is the motion processing:

FIFO stats

The graph shows how often the FIFO was emptied.  In an unhindered world, the FIFO is emptied every 10ms (0.01s) and the PWM to the ESCs updated.  Peaks above that mean something else was taking up the spare time.  The FIFO overflows at a little over 85ms (512 bytes total FIFO size / 12 bytes per IMU sample / 500Hz IMU sampling rate = 85.3ms), and the highest shown here is 38ms, well within safety limits.  I’m particularly delighted that the majority of the spikes are within the 10 to 20ms range – that strongly suggests the split phases of macro-block processing are working like a dream.

The 20ms norm means the PWM is updated at 50Hz.  Were the PWM consistently updated at less than 50Hz, it would really start to show in the stability of the flight.  But luckily it isn’t, meaning there’s probably just enough room to finally squeeze in compass and GPS processing.
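
As a sanity check of those margins, here’s the arithmetic as a trivial sketch; the numbers are just the ones quoted above, not read from the flight code:

FIFO_SIZE = 512          # bytes of FIFO on the IMU
SAMPLE_SIZE = 12         # bytes per IMU sample (3 x 16-bit gyro + 3 x 16-bit accelerometer)
SAMPLING_RATE = 500      # Hz - IMU sampling rate

fifo_overflow_time = FIFO_SIZE / float(SAMPLE_SIZE) / SAMPLING_RATE    # ~0.085s
pwm_update_rate = 1 / 0.02                                             # 20ms norm -> 50Hz

print "FIFO overflows after %.1fms; PWM updated at %.0fHz" % (fifo_overflow_time * 1000, pwm_update_rate)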

In passing, it’s worth saying that such levels of stats would be impossible if I were using a microcontroller (Arduino etc) – this 11s flight logged 1.46MB of data to shared memory, and ultimately to SD card.  It logs both the initial hard-coded constants and every dynamic variable for every cycle of motion processing – that means nothing is missing and it’s possible to diagnose any problem, as long as the reader knows the code intimately.  Without these logs, it would have been nigh on impossible for the ignorant me of 4+ years ago to achieve what I have now.


* I rate myself as experienced having spent over 4 years on this.

The GPS + compass plan

My intent with GPS and compass is that Hermione flies from an arbitrary take-off location to a predetermined target GPS location, oriented in the direction she’s flying.

Breaking that down into a little more detail:

  • Turn Hermione on, calibrate the compass, and wait for enough GPS satellites to be acquired.
  • Carry her to the destination landing point and capture the GPS coordinates, saving them to file.
  • Move to a random place in the open flying area and kick off the flight.
  • Before take-off, acquire the GPS coordinates of the starting point, and from those and the target coordinates, get the 3D flight direction vector (see the sketch below).
  • On take-off, climb to 1m, and while hovering, yaw to point in the direction of the destination target vector, using the compass as the only tool to give a N(X), W(Y), Up(Z) orientation vector – some account needs to be taken of magnetic north (compass) vs. true north (GPS).
  • Once done, fly towards the target, always pointing the way she’s flying (i.e. the yaw target is linked to the velocity sensor input), with the current GPS position updating during the flight so the direction vector to the destination is continually realigned.
  • On arrival at the target GPS location, she hovers for a second (i.e. braking) and descends.
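
As a sketch of that ‘direction vector’ step – the helper below is hypothetical, using the simple equirectangular approximation, which is plenty accurate over a garden-sized flying area:

import math

def flight_direction(start, target):
    # start and target are (latitude, longitude) tuples in degrees; returns the
    # horizontal distance vector in Hermione's earth frame of N(X), W(Y).  The
    # Up(Z) element comes from the flight plan's height, not from GPS.
    EARTH_RADIUS = 6371000.0    # metres

    lat1, lon1 = [math.radians(ll) for ll in start]
    lat2, lon2 = [math.radians(ll) for ll in target]

    north = (lat2 - lat1) * EARTH_RADIUS
    east = (lon2 - lon1) * EARTH_RADIUS * math.cos((lat1 + lat2) / 2)

    # West is the positive Y axis, hence the sign flip on east.
    return north, -east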

There’s a lot of detail hidden in the summary above, not least the fact that GPS provides yet another feed for 3D distance and velocity vectors to be fused with the accelerometer / PiCamera / LiDAR, so I’m going to have to go through it step by step.

The first step is to fly a square again, but with her orienting towards the next corner while at the hover, and, once moving to that corner, having yaw follow the direction of movement.  Next comes compass calibration, and a flight plan based upon magnetic north, west and up.

However, someone’s invoked Murphy’s / Sod’s law on me again: Hermione is seeing the I2C errors again despite no hardware or software changes in this area.  Zoe is behaving better, and I’m trying to double the motion tracking rate by doubling the video frame rate and the sampling rate for the Garmin LiDAR-Lite; the rate change is working for both, but the LiDAR readings seem to be duff, reading 60cm when the flight height is less than 10cm.  Grr 🙁

Hermione’s proof of the pudding

Here finally is her flying in a stable hover for a long time without rocketing off into space.  Yes, she’s wobbly, but that’s a simple pitch / roll rotation rate PID tune much like I had to do with Zoe.  She’s running the video at 560 x 560 pixels at 10 fps, hence no need for the IKEA play mat.

Finally I can move on to adding the compass and GPS into the mix.

Hermione’s progress

Here’s Hermione with her new PCB.  It’s passed the passive tests; the next step is to make sure each of the 8 motor ESCs is connected the right way round to its respective PWM output on the PCB, and finally, I’ll do a quick flight with the MPU-9250 as the only sensor to tune the X8 PID gains.  Then she’s getting shelved.

Hermione’s progress

Zoe’s getting a new PCB so I can run the camera and Garmin LiDAR-Lite V3 on her too.  Hermione is huge compared to Zoe, and with the winter weather setting in, I’m going to need a system that’s small enough to test indoors.

Hermione will still be built – I need her extra size to incorporate the Scanse Sweep and GPS – but I suspect that will only happen once an A3 arrives on the market: Hermione’s processing with a new 512MB A+ overclocked to 1GHz is nearly maxed out with the camera and diagnostics.  She’s probably just about got CPU space for the compass and Garmin LiDAR-Lite over I2C, but I think that’s it until the A3 comes to market.  My hope for the A3 is that it uses the same quad-core CPU as the B2, with built-in Bluetooth and WiFi as per the B3, but no USB / Ethernet hub to save power.  Fingers crossed.

 

Yaw

I know I said I’d use this post to investigate that value of 1.2, but I’m just going to sit on that for now in favour of yaw.  There are a few aspects to this:

  1. During flight, currently the quadcopter stays facing whichever way it was facing on the ground; there’s a yaw angle PID and its target is 0.  But it should be trivial to change this yaw angle target so that the quadcopter faces the direction of travel; the target for the yaw angle is derived from the velocity – either the input or the target of the velocity PID, i.e. the way the quadcopter should be or is flying.  It’s a little bit trickier than it sounds for two reasons:
    • The tan of the velocity vector gives 2 possible angles, and only consideration of the signs of both components actually defines the absolute direction e.g. (1,1) and (-1,-1) need to be 45° and -135° respectively (see the sketch after this list).
    • Care needs to be taken that the transition from one angle to the next goes the shortest way, and that when flight transitions from movement to hover, the quad doesn’t rotate back to the take-off orientation due to the velocity being 0,0 – the angle needs to be derived from what it was doing before it stopped.

    It’s also worth noting this is only for looks and there are no flight benefits from doing this.

  2. The camera X and Y values are operating partially in the quad frame and partially in the earth frame.  I need to rotate the X and Y values totally into the earth frame by accounting for yaw.
  3. Yaw tracking by the integrated gyro Z axis seems to be very stable, but I do need to include the compass readings for even longer-term stability.  I think I can get away with just using the compass X and Y values to determine the yaw angle, though I’ll need to test this; I have 2 further concerns:
    • the first is that the compass needs calibration each time it boots up, just as is necessary with your phone.  You can see from my previous post the offsets of the X and Y values as I spun Zoe on my vinyl record player – see how the circle is not centered on 0, 0 (extracting those offsets is sketched a little further below).
    • I’m not sure how much iron in the flight zone will affect the calibrations based on the distance of the compass from the iron; iron in my test area may be the structural beams inside the walls of a building indoors, or garden railings outside, for example.
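
Here’s a minimal sketch of the first of those aspects – the yaw target derived from velocity, with the shortest-way wrap; the function name, the argument names and the 0.3m/s hover threshold are all my own placeholders rather than anything in the current code:

import math

def yaw_target(vx, vy, previous_target, current_yaw):
    # vx, vy: earth-frame velocity - either the flight plan target or the fused
    # sensor input, i.e. the way she should be or is flying.
    if math.hypot(vx, vy) < 0.3:
        # Hovering (or near enough): keep pointing the way she was already going
        # rather than unwinding back to the take-off orientation.
        target = previous_target
    else:
        # atan2 considers the signs of both components, so (1, 1) and (-1, -1)
        # resolve to +45 degrees and -135 degrees rather than both being 45.
        target = math.atan2(vy, vx)

    # Go the shortest way round: wrap the error into the +/-180 degree range
    # before it's handed to the yaw angle PID.
    error = (target - current_yaw + math.pi) % (2 * math.pi) - math.pi
    return target, error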

The first step is to include yaw in the camera processing as a matter of urgency.  The magnetometer stuff can once more wait until it becomes absolutely necessary.
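
For when it does become necessary, the hard-iron part of the calibration boils down to spinning her through a full circle, recording the extremes on each axis, and subtracting the mid-point from every subsequent reading.  A minimal sketch, assuming a list of (mgx, mgy) samples captured during the spin (the helper is hypothetical, not the real code):

def hard_iron_offsets(samples):
    # samples: list of (mgx, mgy) magnetometer readings covering at least one
    # full circle; the centre of the resulting circle is the per-axis offset
    # to subtract from later readings.
    mgx = [s[0] for s in samples]
    mgy = [s[1] for s in samples]
    return (max(mgx) + min(mgx)) / 2.0, (max(mgy) + min(mgy)) / 2.0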

FYI the rotation matrix from Earth to Quadcopter frame is as follows:

|xe|   | cos ψ,  sin ψ,   0|  |xq|
|ye| = |-sin ψ,  cos ψ,   0|  |yq|
|ze|   |   0,      0,     1|  |zq|
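
Applied to the camera’s X and Y increments, that’s only a couple of lines – a sketch with a hypothetical function name, and with the signs following the matrix exactly as written above:

import math

def rotate_by_yaw(xq, yq, psi):
    # Rotate the X/Y increments by the yaw angle psi as per the matrix above;
    # the Z axis is untouched.
    xe = math.cos(psi) * xq + math.sin(psi) * yq
    ye = -math.sin(psi) * xq + math.cos(psi) * yq
    return xe, ye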

Compass and ViDAR

First, ViDAR: I’ve got fed up with calling it video macro-block something-or-other, so now it’s Video Detection and Ranging.  I think it’s now working as well as it can without a height detector; it’s a long way from perfect, but until I can rely on stable height, I can’t identify further bugs.  I’m almost certainly going to wait for the Garmin LiDAR-Lite (GLL) to provide the height information.

There’s one bug I’ve fixed which explains why the PX4FLOW only uses a gyro – the angles involved are increments not absolutes.  The picture below tries to show that it’s not the absolute tilt of the quadcopter that needs to be accounted for, but the increment:

ViDAR angles
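
To give a feel for the scale of the fix, here’s a sketch; the pixels-per-radian figure comes from an assumed horizontal field of view rather than a measured camera constant, and the sign depends on the axis conventions:

import math

FRAME_WIDTH = 560                      # pixels across each video frame
FOV = math.radians(49)                 # assumed horizontal field of view
PIXELS_PER_RADIAN = FRAME_WIDTH / FOV

def remove_rotation(dx_pixels, d_theta):
    # dx_pixels: raw macro-block shift between two consecutive frames
    # d_theta:   the *increment* in tilt over that same interval (from the
    #            gyro), not the absolute tilt of the quadcopter
    return dx_pixels - d_theta * PIXELS_PER_RADIAN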

I’ve also got the compass working – below I’ve just pulled out the critical bits.  Most of the samples I found use the MPU9250 master I2C bus to attach to the compass; compass data is then picked out of MPU9250 registers.  But I found this version which exposes the compass registers for direct access, which is much cleaner and clearer.

import time                            # needed by the sample loop at the bottom

####################################################################################################
#
#  Gyroscope / Accelerometer class for reading position / movement.  Works with the Invensense IMUs:
#
#  - MPU-6050
#  - MPU-9150
#  - MPU-9250
#
#  The compass / magnetometer (MPU-9250 only) is accessed below via the I2C bypass.  The I2C helper
#  class and the register constants are defined elsewhere in the full source.
#
####################################################################################################
class MPU6050:
    .
    .
    .
    .
    .
    .
    .
    def initCompass(self):
        #-------------------------------------------------------------------------------------------
        # Set up the I2C master pass through.
        #-------------------------------------------------------------------------------------------
        int_bypass = self.i2c.readU8(self.__MPU6050_RA_INT_PIN_CFG)
        self.i2c.write8(self.__MPU6050_RA_INT_PIN_CFG, int_bypass | 0x02)

        #-------------------------------------------------------------------------------------------
        # Connect directly to the bypassed magnetometer, and configure it for 16 bit continuous data
        #-------------------------------------------------------------------------------------------
        self.i2c_compass = I2C(0x0C)
        self.i2c_compass.write8(self.__AK893_RA_CNTL1, 0x16)
   
    def readCompass(self):
        compass_bytes = self.i2c_compass.readList(self.__AK893_RA_X_LO, 7)

        #-------------------------------------------------------------------------------------------
        # Convert the array of 6 bytes to 3 shorts - 7th byte kicks off another read
        #-------------------------------------------------------------------------------------------
        compass_data = []
        for ii in range(0, 6, 2):
            lobyte = compass_bytes[ii]
            hibyte = compass_bytes[ii + 1]
            if (hibyte > 127):
                hibyte -= 256

            compass_data.append((hibyte << 8) + lobyte)

        [mgx, mgy, mgz] = compass_data
        return mgx, mgy, mgz


mpu6050 = MPU6050()
mpu6050.initCompass()

try:
    while True:
        mx, my, mz = mpu6050.readCompass()
        print "%d, %d, %d" % (mx, my, mz)
        time.sleep(0.5)


except KeyboardInterrupt, e:
    pass
finally:
    pass


I probably won’t include the compass data for yaw and orientation yet, but at least the main hurdle has been overcome.

MPU-9½50 compass

Just for the record, I’ve finally found out why my first attempt to access the compass had failed; it’s not on the MPU-9½50 slave I2C bus directly, but is attached as slave0 on the MPU-9½50’s master I2C bus.  Thus it’s configured via the MPU-9½50’s slave0 registers, and the compass data is then read by my code from the slave0 data registers that the MPU-9½50 exposes.

There are a couple of bits of code here and here for Arduino usage which probably give me enough useful information to try it out.

I’ll probably tackle that next once I’ve got the first phase of the camera motion working well enough to demonstrate, which could well be later today.

Left (a)drift

I flew Phoebe earlier with the LEDDAR working well.  As can be seen in the graph below, she repeatedly drifted left and then self-corrected: the drift is stopped, but not reversed.  This is the expected behaviour with the top level PID controlling velocity, not distance.  On that basis, I’ve updated the code on GitHub.

Left (a)drift

To get her back to where she took off from, I need another set of ‘pseudo-PIDs’ to recognise and correct the distance drifted.  I’m going to keep this as simple as possible:

  • I’ll continue to use my velocity flight plan – integrating this over time will provide the ‘target’ for the distance PIDs in the earth reference frame.
  • I’ll integrate the velocity (rotated back to the earth frame) over time to get the distance PID ‘input’ – although this is double integration of the accelerometer, it should be good enough short-term based upon the close match between the graph and the flight I watched.
  • critically, I’ll be using a fixed velocity output from the distance ‘pseudo-PID’, rotated back to the quad frame, as the input to the existing velocity PIDs – the input and target of the distance ‘pseudo-PID’ only provide the direction, not the speed, of the correction (see the sketch below).
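
As a sketch of what I mean by a ‘pseudo-PID’ – proportional in direction only, with the output clamped to a fixed speed; the names and the numbers below are placeholders, not the real code:

import math

CORRECTION_SPEED = 0.3    # m/s - fixed speed of any drift correction

def distance_pseudo_pid(target_x, target_y, input_x, input_y):
    # target_*: the flight plan velocity integrated over time (earth frame)
    # input_*:  the sensor velocity integrated over time (earth frame)
    error_x = target_x - input_x
    error_y = target_y - input_y

    distance = math.hypot(error_x, error_y)
    if distance < 0.05:
        # Close enough: no correction, leave the velocity PIDs to hold station.
        return 0.0, 0.0

    # Only the direction of the error matters; the speed of the correction is
    # fixed.  The result is rotated back into the quad frame before it feeds
    # the existing velocity PIDs.
    return (CORRECTION_SPEED * error_x / distance,
            CORRECTION_SPEED * error_y / distance)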

This should be a relatively simple change which will have a visible effect on killing hover drift, or allowing intentional horizontal movement in the flight plan.

After that, I’ll add the compass for yaw control so Phoebe is always facing the way she’s going, but that’s for another day.

Mothballed

The new Pi Zero with camera requires the latest Raspbian version from the 10th May.  The RPIO.PWM code throws an obscure error with this kernel.  The RPIO.PWM library appears to no longer be maintained by the original author, and currently, I don’t understand this error so can’t fix it.  Deadlock.

Using Zoe’s SD card on Phoebe’s A+ also shows the same problem, so it’s a change in the kernel, meaning further development on Phoebe can no longer pick up the latest code.  I’m worried that when the A3 is released later this year, I’ll be forced to use the latest kernel for it, and the same RPIO.PWM problem will rear its ugly head and prevent further kernel upgrades on Phoebe too.

The only plus for the moment is that I can continue current development on Phoebe based on the January Jessie to add the LEDDAR and compass function; in addition, by disconnecting the motors from the ESCs, Phoebe stops whining when there’s no PWM signal, meaning I can do development / testing indoors.  Next stop is to replace the ground-facing camera and URF with the LEDDAR.