Amazonians fly!

Only just stumbled upon this:

Cool, but is it safe and financially viable?  It can only work in low population density areas outside of CAA restrictions (that counts me out); each base station covers only a 10 mile radius; and the luggage is limited in size and can weigh at most 2.3kg (5lb).

GPS + compass plan

Here’s what I’ll be attempting to achieve with GPS and compass data integrated into the code – the data’s available now, but it’s just logged.

  • In a large, open field, I’ll take Hermione to a random point, and she’ll record the GPS location; this will be the destination for the flight.
  • I’ll then carry her to any other position in that field and pop her down on the ground pointing in any direction, i.e. not necessarily towards the destination.
  • As part of the current take-off procedure, she already records her GPS take-off location; from this and the destination GPS location, she’ll build up a flight plan in much the same format as she reads from file now: she’ll convert the calculated distance and direction between the GPS points to a fixed velocity vector and flight period, wrapped in the standard take-off, hover and landing flight plan sequence (see the sketch after this list).
  • During the initial hover of the flight, she’ll rotate so she is pointing towards the target destination based upon the compass input.
  • Once actually heading towards this destination, compass and integrated yaw rate fuse to ensure she’s always pointing in the direction she’s flying.
  • At the same time, the fused combination of the double-integrated acceleration, the down-facing video and GPS tracks the current location, comparing it to the predetermined destination position and thus tightly tracking her way to the destination.
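
To make the conversion in the third bullet concrete, here’s a minimal sketch (function and variable names are mine, for illustration only, as is the assumed fixed speed): it turns the take-off and destination GPS fixes into an earth-frame velocity vector and flight period, using the same equirectangular approximation as my GPS tracking code further down this page.

import math

R = 6371000 # radius of the earth in meters

def gps_leg(takeoff, destination, speed=1.0):
    # takeoff, destination: (latitude, longitude) tuples in degrees.
    # speed: assumed fixed flight speed in meters per second.
    p1, l1 = [math.radians(d) for d in takeoff]
    p2, l2 = [math.radians(d) for d in destination]

    # Equirectangular approximation: x is meters east, y is meters north.
    x = (l2 - l1) * math.cos((p1 + p2) / 2) * R
    y = (p2 - p1) * R
    distance = math.sqrt(x * x + y * y)

    # Fixed velocity vector plus the flight period needed to get there.
    period = distance / speed
    return x / period, y / period, period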

All the above isn’t difficult unless I’m missing something hidden in plain sight; a large part is in the preflight checks: getting the destination and take-off GPS points and the compass orientation.  The changes to the flight based on this target are pretty much in place already.  Just minor changes are required to fuse the compass into the integrated yaw rate, both for the post-take-off hover to point her towards the destination, and to maintain that during the flight to that destination.
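
The fusing itself should just be a complementary filter along these lines – a sketch with illustrative names, where atan2 takes care of the ±180° wrap-around:

import math

def fuse_yaw(yaw, gyro_z, compass_yaw, dt, fraction=0.02):
    # Integrate the gyro yaw rate for responsiveness, then nudge the
    # result towards the absolute compass heading to kill gyro drift.
    yaw += gyro_z * dt
    error = math.atan2(math.sin(compass_yaw - yaw), math.cos(compass_yaw - yaw))
    return yaw + fraction * error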

This breaks down into several easy safe stages:

  • the automated flight plan file creation can be done in the back garden with the motors not even powered up.
  • initially, the flight plan will only cover take-off, hover with reorientation, and landing, just to check she’s pointing the right way.
  • then the full flight plan maintains further incremental yaw at zero while she flies forwards towards the target for the period and at the speed defined in the flight plan.

I’d better stop blogging and get on with it.


After a little more thought, I’m going to break it down further.

  1. I’ll still take Hermione around the park collecting a destination GPS location along with possible intermediate way points.
  2. Once at the arbitrary starting point, her initial GPS location is also recorded as part of her take-off routine.
  3. Together these GPS locations are passed to a sub-class of the FlightPlan() object to produce a flight plan structure identical in looks to the file-based flight plan used now.  The only difference is that the coordinate system is GPS North, South, East and West in meters rather than the piDrone’s take-off orientation of forward, backward, left and right.
  4. The compass will be used to determine the initial orientation WRT NSEW (magnetic) and set the base level yaw angle, thus aligning it with the earth frame GPS flight plan – see the sketch after this list.  (Note to self – consider adding gain as well as the current offsets to the compass calibration; also, what’s the angular error between compass and GPS north here?)
  5. The flight then runs using only the existing yaw control code – compass and GPS information are just logged as now.
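
For step 4, a minimal sketch of what the alignment amounts to, with illustrative names – the compass heading at take-off becomes the base yaw angle, and each earth-frame flight plan increment is rotated through it into the take-off frame:

import math

def earth_to_takeoff_frame(north, east, base_yaw):
    # base_yaw: compass heading at take-off in radians, clockwise from
    # (magnetic) north.  Rotates an earth-frame (north, east) flight plan
    # increment into the piDrone's take-off frame of (forward, right).
    forward = north * math.cos(base_yaw) + east * math.sin(base_yaw)
    right = east * math.cos(base_yaw) - north * math.sin(base_yaw)
    return forward, right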

These intermediate steps have clarified a few uncertainties, and I definitely do now need to get on with it.

A busy week…

First, the result: an autonomous 10m linear flight forwards:

You can see her stability degrade as she leaves the contrasting shadow area cast by the tree branches in the sunshine.  At the point chaos broke loose, she believed she had reached her 10m target and thus she was descending; she’s not far wrong – the start and end points are the two round stones placed a measured 10m apart to within a few centimetres.

So here’s what’s changed in the last week:

  • I’ve added a heatsink to my B3 and changed the base layer of my Pimoroni Tangerine case as the newer one has extra vents underneath to allow better airflow.  The reason here is an attempt to keep a stable temperature, partly to stop the CPU overheating and slowing down, but mostly to avoid IMU drift over temperature.  Note that normally the Pi3 has a shelf above carrying the large LiPo and acting as a sun-shield – you can just about see the poles that support it at the corners of the Pi3 Tangerine case.

    Hermione’s brain exposed

  • I’ve moved the down-facing Raspberry Pi V2.1 camera and Garmin LiDAR-Lite sensor to allow space for the Scanse Sweep sensor which should arrive in the next week or two courtesy of Kickstarter funding.
    Hermione’s delicate underbelly

    The black disk in the middle will seat the Scanse Sweep perfectly; it lifts it above the Garmin LiDAR Lite V3 so their lasers don’t interfere with each other; finally the camera has been moved far away from both to make sure they don’t feature in its video.

  • I’ve changed the fusion code so vertical and lateral values are fused independently; previously, if the camera struggled to spot motion in dim light, the LiDAR height wasn’t fused either, and on some flights Hermione would climb during hover (see the sketch after this list).
  • I’ve sorted out the compass calibration code so Hermione knows which way she’s pointing.  The code just logs the output currently, but soon it will be both fusing with the gyro yaw rate, and interacting with the below….
  • I’ve added a new process tracking GPS position and feeding the results over a shared-memory OS FIFO file in the same way the video macro-blocks are passed across now.  The reason both run in their own process is that each blocks reading its sensor – one second for GPS and 0.1 seconds for video – and that degree of blocking must be completely isolated from the core motion processing.  As with the compass, the GPS data is just logged currently, but soon the GPS will be used to set the end-point of a flight; then, when launched from somewhere away from the target end-point, the combination of compass and GPS together will provide the sensor inputs to ensure the flight heads in the right direction, and recognises when it’s reached its goal.
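
Here’s a minimal sketch of the independent fusion mentioned two bullets up (illustrative names only) – the point is simply that each axis now has its own ‘new sample available’ gate:

def fuse(integrated, measured, fraction=0.1):
    # Complementary-style fusion: nudge the double-integrated IMU value
    # towards the slower but drift-free sensor measurement.
    return integrated * (1 - fraction) + measured * fraction

def update_position(x, y, z, video_xy=None, lidar_z=None):
    # video_xy: (x, y) from the down-facing video, or None in dim light.
    # lidar_z: height from the LiDAR, or None if no new sample arrived.
    if lidar_z is not None:
        z = fuse(z, lidar_z)
    if video_xy is not None:
        x = fuse(x, video_xy[0])
        y = fuse(y, video_xy[1])
    return x, y, z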

As a result of all the above, I’ve updated GitHub.

Cloudy conditions

The sun wasn’t shining brightly, so there was no high contrast on the lawn; the lawn had been mowed too, removing the contrasting grass clumps.  Yet she still made a great attempt at a 1m square.  I think this is about as good as she can get – it’s time for me to move on to adding compass, GPS and object avoidance.  The code has been updated on GitHub.

GPS, Compass and Object Avoidance

It’s time to down tools on

  • Ö – she just can’t run fast enough to test on grass
  • yaw control – I just can’t work out why something goes very wrong three seconds after the yaw control kicks in

and move on to the next steps.

Compass

The compass data is already available, but needs calibrating to allow for magnetic materials in the area.  I’ve even got the calibration code written, but it fails: it uses the gyro to track that the compass has turned through more than 360° to find the maximum and minimum readings, and hence the offset due to the local magnetic environment.  The angles from the gyro were utter rubbish, and I have no idea why – I just need to try harder here.
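
For the record, here’s the shape of that calibration in sketch form (illustrative names): spin the piDrone flat through more than 360° on the spot – which is what the gyro is meant to confirm – while collecting the compass extremes; the hard-iron offsets are then the midpoints.

def calibrate_compass(samples):
    # samples: (mx, my) compass readings collected while the piDrone is
    # rotated flat through more than 360 degrees on the spot.
    max_mx = max(s[0] for s in samples)
    min_mx = min(s[0] for s in samples)
    max_my = max(s[1] for s in samples)
    min_my = min(s[1] for s in samples)

    # Hard-iron offsets: the midpoints of each axis' extremes.  The gain
    # idea from the note-to-self above would be (max - min) / 2 per axis.
    mx_offset = (max_mx + min_mx) / 2.0
    my_offset = (max_my + min_my) / 2.0
    return mx_offset, my_offset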

GPS

I put together a hand-held GPS tracker, and took it for a walk outside my house.  I also took a photo of our house with my Mavic and overlaid the results (it’s worth clicking on the image to see it full size):

GPS tracking

It’s not perfect, but the shape tracked by the GPS is reasonable enough once the GPS has settled; note that both green splodges are at the same point, but only the end one is in the right position, due to the first few samples from the GPS – I’ll have to remember this when incorporating this into Hermione’s flight plan.  Ignoring the initial few samples, I think the errors were mostly less than a meter once away from the house.  The GPS code for Hermione will be closely based on what I used for my handheld version:

from __future__ import division
import gps
import os
import math

###################################################################################################
# Set up the tty to use in /etc/default/gpsd - if the GPS is via USB, then this is /dev/ttyUSB0.  #
# The python script listens on port 2947 (gpsd) of localhost.  This can be reconfigured in        # 
# /etc/default/gpsd also.                                                                         #
###################################################################################################

session = gps.gps()
session.stream(gps.WATCH_ENABLE | gps.WATCH_NEWSTYLE)

num_sats = 0
latitude = 0.0
longitude = 0.0
time = ""
epx = 0.0
epy = 0.0
epv = 0.0
ept = 0.0
eps = 0.0
climb = 0.0
altitude = 0.0
speed = 0.0
direction = 0.0

lat = 0.0
lon = 0.0
alt = 0.0
new_lat = False
new_lon = False
new_alt = False

base_station_set = False

dx = 0.0
dy = 0.0
dz = 0.0

R = 6371000 # radius of the earth in meters

fp_name = "gpstrack.csv"
header = "time, latitude, longitude, satellites, climb, altitude, speed, direction, dx, dy, dz, epx, epy"

os.system("clear")

print header

with open(fp_name, "wb") as fp:
    fp.write(header + "\n")

    #---------------------------------------------------------------------------------
    # With a base level longitude and latitude in degrees, we can derive the current X and Y coordinates
    # relative to the takeoff position thus:
    # psi = latitude => p below
    # lambda = longitude => l below
    # Using equirectangular approximation:
    #
    # x = (l2 - l1) * cos ((p1 + p2) / 2)
    # y = (p2 - p1)
    # d = R * (x*x + y*y) ^ 0.5
    #
    # More at http://www.movable-type.co.uk/scripts/latlong.html
    #---------------------------------------------------------------------------------

    while True:
        try:
            report = session.next()
#            print report
#            os.system("clear")
            if report['class'] == 'TPV':
                if hasattr(report, 'time'):  # Time
                    time = report.time

                if hasattr(report, 'ept'):   # Estimated timestamp error - seconds
                    ept = report.ept

                if hasattr(report, 'lon'):   # Longitude in degrees
                    longitude = report.lon
                    new_lon = True

                if hasattr(report, 'epx'):   # Estimated longitude error - meters
                    epx = report.epx

                if hasattr(report, 'lat'):   # Latitude in degrees
                    latitude = report.lat
                    new_lat = True

                if hasattr(report, 'epy'):   # Estimated latitude error - meters
                    epy = report.epy

                if hasattr(report, 'alt'):   # Altitude - meters
                    altitude = report.alt
                    new_alt = True

                if hasattr(report, 'epv'):   # Estimated altitude error - meters
                    epv = report.epv

                if hasattr(report, 'track'): # Direction - degrees from true north
                    direction = report.track

                if hasattr(report, 'epd'):   # Estimated direction error - degrees
                    epd = report.epd

                if hasattr(report, 'climb'): # Climb velocity - meters per second
                    climb = report.climb

                if hasattr(report, 'epc'):   # Estimated climb error - meters per seconds
                    epc = report.epc

                if hasattr(report, 'speed'): # Speed over ground - meters per second
                    speed = report.speed

                if hasattr(report, 'eps'):   # Estimated speed error - meters per second
                    eps = report.eps


            if report['class'] == 'SKY':
                if hasattr(report, 'satellites'):
                    num_sats = 0
                    for satellite in report.satellites:
                        if hasattr(satellite, 'used') and satellite.used:
                            num_sats += 1

            #-----------------------------------------------------------------------------
            # Calculate the X,Y coordinates in meters
            #-----------------------------------------------------------------------------
            if new_lon and new_lat and new_alt and num_sats > 6:

                new_lon = False
                new_lat = False
                new_alt = False

                lat = latitude * math.pi / 180
                lon = longitude * math.pi / 180
                alt = altitude


                if not base_station_set:
                    base_station_set = True

                    base_lat = lat
                    base_lon = lon
                    base_alt = alt

                dx = (lon - base_lon) * math.cos((lat + base_lat) / 2) * R
                dy = (lat - base_lat) * R
                dz = (alt - base_alt)

            else:
                continue


            output = "%s, %f, %f, %d, %f, %f, %f, %f, %f, %f, %f, %f, %f" % (time,
                                                                             latitude,
                                                                             longitude,
                                                                             num_sats,
                                                                             climb,
                                                                             altitude,
                                                                             speed,
                                                                             direction,
                                                                             dx,
                                                                             dy,
                                                                             dz,
                                                                             epx,
                                                                             epy)




            print output
            fp.write(output + "\n")
        except KeyError:
            pass
        except KeyboardInterrupt:
            break
        except StopIteration:
            session = None
            print "GPSD has terminated"
            break

The main difference will be that while this code writes to file, the final version will write to a shared memory pipe / FIFO much like the camera video macro-blocks do now.  The GPS will run in a separate process, posting new results as ASCII lines into the FIFO; Hermione picks up these new results with the select() she already uses.  The advantage of the two processes is both that they can run on different cores of Hermione’s CPU, and that the 9600 baud GPS UART won’t affect the running speed of the main motion processing getting the data from the pipe.
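
A minimal sketch of that split, with hypothetical names – the child blocks happily on the GPS while the parent’s select() only ever sees complete, ready-to-read lines:

import os
import select
import time

def get_gps_fix():
    # Stand-in for the gpsd polling code above - blocks for about a second.
    time.sleep(1)
    return 0.0, 0.0, 0.0

read_fd, write_fd = os.pipe()

if os.fork() == 0:
    # Child: post one ASCII line per GPS fix into the pipe.
    os.close(read_fd)
    while True:
        dx, dy, dz = get_gps_fix()
        os.write(write_fd, "%f, %f, %f\n" % (dx, dy, dz))
else:
    # Parent: poll the pipe alongside the other sensor FIFOs without
    # ever blocking the motion processing on the 9600 baud UART.
    os.close(write_fd)
    while True:
        readable, _, _ = select.select([read_fd], [], [], 0.01)
        if read_fd in readable:
            print os.read(read_fd, 4096).strip()
        # ... the rest of the motion processing loop ...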

Lateral object avoidance

My Scanse Sweep is arriving imminently, and based on her specs, I plan to attach her to Hermione’s underside – she’ll have 4 blind spots due to Hermione’s legs, but otherwise a clear view to detect objects up to 40m away.  Her data comes in over a UART like the GPS, and like the GPS, the data is ASCII text, which makes it easy to parse.  The LOA does churn out data at 115,200 bps, so it too will run in a separate process.  Only proximity alerts will be passed to Hermione on yet another pipe, again listened to by Hermione’s select(); the LOA code will just log the rest, providing a scan of the boundaries around her (see the sketch below).
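
In sketch form, that filtering process might look like this – I won’t know the real line format until she arrives, so the "angle, distance" ASCII pair per line, the serial port, the threshold and all the names here are pure assumption:

import serial  # pyserial

PROXIMITY_M = 1.5  # hypothetical alert threshold in meters

def sweep_process(alert_pipe):
    # Read the Sweep's 115,200 bps UART, log everything, but only pass
    # proximity alerts down the pipe to the motion processor's select().
    sweep = serial.Serial("/dev/serial0", 115200)
    with open("sweep.log", "w") as log:
        while True:
            line = sweep.readline()
            log.write(line)
            try:
                angle, distance = [float(field) for field in line.split(",")]
            except ValueError:
                continue
            if distance < PROXIMITY_M:
                alert_pipe.write(line)
                alert_pipe.flush()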

Still stuck

Hermione is still causing trouble with yaw control flights despite lots of refinements.  Here’s the latest.

Hermione’s troubles

@3s she’s climbed to about a meter high and then hovered for a second.  All the X, Y, and Z flight plan targets and sensor inputs are nicely aligned.

The ‘fun’ starts at 4 seconds.  The flight plan, written from my point of view, says move left by 1m over 4 seconds.  From Hermione’s point of view, with the yaw code in use, this translates to rotate anti-clockwise by 90° while moving forwards by 1m over 4 seconds.  The yaw graph from the sensors shows the ACW rotation is happening correctly.  The amber line in the Y graph shows the left / right distance target from H’s POV is correctly zero.  Similarly, the amber line in the X graph correctly shows she should move forwards by 1m over 4s.  All’s good as far as the targets are concerned, from both her POV and mine.

But there’s some severe discrepancy from the sensor inputs’ POV.  From my POV, she rotated ACW 90° as expected, but then she moved forwards away from me, instead of left.  The blue line on the Y graph (the LiDAR and ground-facing video inputs) confirms this; it shows she moves right by about 0.8m from her POV.  But the rusty terracotta line in the Y graph (the double-integrated accelerometer-minus-gravity readings) shows exactly the opposite.  The grey fusion of the blue and terracotta lines cancels them out, thus following the target perfectly but for completely the wrong reasons.

There are similar discrepancies in the X graph, where the LiDAR + Video blue line is the best match to what I saw: virtually no forward movement from H’s POV except for some slight forward movement after 8s when she should be hovering.

So the net of this?  The LiDAR / video processing is working perfectly, the double-integrated IMU accelerometer results are wrong, and I need to work out why.  The results shown are taken directly from the accelerometer and double integrated in Excel (much like the code does too), and I’m pretty convinced I’ve got this right.  Yet more digging to be done.
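
For reference, the double integration itself is this simple, which is why I’m fairly confident the spreadsheet matches the code (illustrative names, plain rectangular integration):

def double_integrate(accelerations, dt):
    # accelerations: earth-frame samples with gravity subtracted (m/s^2).
    # dt: the sample interval in seconds.
    velocity = 0.0
    distance = 0.0
    distances = []
    for a in accelerations:
        velocity += a * dt
        distance += velocity * dt
        distances.append(distance)
    return distances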

In other news…

  • Ö has ground-facing lights much like Zoe had.  Currently they are always on, but ultimately I intend to use them in various ways, such as flashing during calibration etc. – this requires a new PCB, however, to plug a MOSFET gate into a GPIO pin.
  • piNet has changed direction somewhat: I’m testing within the bounds of my garden whether I can define a target destination with GPS, and have enough accuracy for the subsequent flight from elsewhere to get to that target accurately.  This is step one in taking the GPS coordinates of the centre of a maze, and then starting a flight from the edge to get back there.

That’s all for now, folks.  Thanks for sticking with me during these quiet times.


P.S. I’ve got better things to do than worry about why everything goes astray @ 7s, 3s after the yaw to move left started; it’s officially on hold as I’ve other stuff lurking in the background that’s about to flower.

Ö

I chose to name my piDrones Phoebe, Chloe and Zoe as they can all be spelt with an umlaut – don’t ask me why, I have no idea.  I ran out of umlaut names (despite research), so I opted for Hermione for the latest, greatest model as it sounds similar, although she can never bear an umlaut as she lacks the critical ‘oe’.

Anyway, Phoebe, Zoe and the always short-lived* Chloe have all been merged into the best of each; the result is ‘Ö’, pronounced like the French for ‘yes’.  She has Phoebe’s ESCs, motors and props, Chloe’s amazing frame, and Zoe’s Pi0W and PCB.

Ö’s build is virtually indestructible as she weighs just 1kg fully loaded.  Because she’s so light, crash torques are tiny compared to the strength of the frame; the only perceivable damage is broken props, and these are cheap from eBay – I already have a vast stock of them.  In comparison, Hermione weighs 4kg; this, and the fact she’s so large, means crash torque is huge in comparison, damage always occurs for anything but a perfect landing, and replacement frame parts and props are expensive.  Ultimately I still want Hermione as queen piDrone because of her X8 format and her 4-core B3 allowing further sensors**, but while I’m still diagnosing the current problems, I think little miss indestructible is better suited financially to the task in hand.

Sadly, there’s one problem: Ö’s Pi0W isn’t fast enough to cope with video resolution higher than about 400² pixels, ruling out lawn / gravel etc.  This is what she can do successfully:

On the plus side, I think that’s just enough to sort out my understanding of Hermione’s yaw flaws.


*Chloe got retired (again) as the 1.5A regulator on the PCB was insufficient to drive the A+, IMU, camera and LiDAR-Lite; the same I2C errors I’d seen before returned.  Swapping Chloe’s A+ for Ö’s Pi0W resolved this.

** i.e. GPS, compass and Scanse Sweep

Hermione draws a square, kinda.

She would have drawn a better square had I got the flight plan right; in the event, the plan said to…

  • climb to 90cm over 3s
  • hover for a second
  • move forward by 1m over 4s
  • hover for a second
  • move left by 1m over 4s
  • hover for a second
  • move back by 2m over 8s
  • hover for a second
  • move right by 2m over 8s
  • hover for a second
  • land over 4s…

making a total of 36 seconds in all.
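
As a quick cross-check of that arithmetic, here’s the plan as velocity / time legs (my illustrative representation, not the real flight plan file format):

flight_plan = [
    # (forward m/s, left m/s, up m/s, seconds)
    ( 0.0 ,  0.0 ,  0.3  , 3.0),  # climb to 90cm over 3s
    ( 0.0 ,  0.0 ,  0.0  , 1.0),  # hover for a second
    ( 0.25,  0.0 ,  0.0  , 4.0),  # move forward by 1m over 4s
    ( 0.0 ,  0.0 ,  0.0  , 1.0),  # hover for a second
    ( 0.0 ,  0.25,  0.0  , 4.0),  # move left by 1m over 4s
    ( 0.0 ,  0.0 ,  0.0  , 1.0),  # hover for a second
    (-0.25,  0.0 ,  0.0  , 8.0),  # move back by 2m over 8s
    ( 0.0 ,  0.0 ,  0.0  , 1.0),  # hover for a second
    ( 0.0 , -0.25,  0.0  , 8.0),  # move right by 2m over 8s
    ( 0.0 ,  0.0 ,  0.0  , 1.0),  # hover for a second
    ( 0.0 ,  0.0 , -0.225, 4.0),  # land over 4s
]

print "total: %.0fs" % sum(leg[3] for leg in flight_plan)           # 36s
print "net x: %.2fm" % sum(leg[0] * leg[3] for leg in flight_plan)  # -1m, i.e. 1m back
print "net y: %.2fm" % sum(leg[1] * leg[3] for leg in flight_plan)  # -1m, i.e. 1m right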

These last two sections meant she should land about a meter back and right from where she took off.  How well did she follow the flawed flight plan?

To me, this is working amazingly well, especially as the camera lateral tracking doesn’t have any significant markers, just grass blades.  I was lucky there was bright sunshine.

What I’d really like to have shown was her actually ‘turning’ each corner, always facing the direction she’s flying; this is completely unnecessary but would look good – the only point of doing it is if there’s a camera taking pics and streaming video live back to the RC as per my Mavic.  But currently my yaw code is still lacking something and I don’t know what yet.

And in other news…

the hedgehogs are back in our garden:

Welcome back!

This is a combination of the PiNoIR camera (v.1) along with this IR motion detector (the sensor is expensive because it’s digital), and some near-IR LEDs.  Below is a picture of the older model – the newer one runs on an A+ (for size and power reasons) and has been updated to Jessie, and the LEDs are now switchable by GPIO via a MOSFET to provide lighting for the camera only when needed.  The upgrade to Jessie means she boots without WiFi at night, and the next morning I can add the WiFi dongle to connect and check the results.

HogCam

Code

#!/usr/bin/env python
# HogCam - a python daemon process started at system boot, and stopped on shutdown
#        - it blocks waiting for PIR motion detection, then lights the IR LEDs
#          and takes a photo before waiting 30s to watch for motion again
#
# Please see our GitHub repository for more information: https://github.com/pistuffing/nitelite/piglow
#

import signal
import time
import RPi.GPIO as GPIO
import os
from datetime import datetime
import subprocess

#------------------------------------------------------------
# Set up the PIR movement detection
#------------------------------------------------------------
GPIO_PIR = 18
GPIO_IR_LED = 12
GPIO.setmode(GPIO.BOARD)
GPIO.setup(GPIO_PIR, GPIO.IN, GPIO.PUD_DOWN)
GPIO.setup(GPIO_IR_LED, GPIO.OUT)
GPIO.output(GPIO_IR_LED, GPIO.LOW)

#------------------------------------------------------------
# Final steps of setup
#------------------------------------------------------------
keep_looping = True

def Daemonize():
    #--------------------------------------------------------
    # Run the raspistill child in its own process group so it
    # isn't stopped along with this daemon.
    #--------------------------------------------------------
    os.setpgrp()

#------------------------------------------------------------
# Once booted, give the user a couple of minutes to place the camera
#------------------------------------------------------------
time.sleep(2 * 60.0)

try:
    while keep_looping:
        #----------------------------------------------------
        # Block waiting for motion detection
        #----------------------------------------------------
        GPIO.wait_for_edge(GPIO_PIR, GPIO.RISING)

        #----------------------------------------------------
        # Turn on the IR LED
        #----------------------------------------------------
        GPIO.output(GPIO_IR_LED, GPIO.HIGH)

        #----------------------------------------------------
        # Take a snap
        #----------------------------------------------------
        now = datetime.now()
        now_string = now.strftime("%y%m%d-%H%M%S")
        camera = subprocess.Popen(["raspistill", "-rot", "180", "-o", "/home/pi/Pictures/img_" + now_string + ".jpg", "-n", "-ISO", "800", "-ex", "night", "-ifx", "none"], preexec_fn = Daemonize)

        #----------------------------------------------------
        # Turn off the IR LED after 5s
        #----------------------------------------------------
        time.sleep(5)
        GPIO.output(GPIO_IR_LED, GPIO.LOW)

        #----------------------------------------------------
        # Wait 30s before checking for motion again
        #----------------------------------------------------
        time.sleep(30.0)
except KeyboardInterrupt, e:
        pass

GPIO.cleanup()

Save it as /home/pi/hogcam.py and make it executable:

chmod 775 /home/pi/hogcam.py

Load on boot

Raspbian uses systemd for running things on boot.  In /etc/systemd/system, create a new file called hogcam.service:

[Unit]
Description=HogCam

[Service]
ExecStart=/home/pi/hogcam.py

[Install]
WantedBy=multi-user.target

Save it off.  You can then enable, start and stop it thus.  It will also start on boot.

sudo systemctl enable hogcam.service
sudo systemctl start hogcam.service
sudo systemctl stop hogcam.service