Progress report

Here’s Chloe’s HoG in Hermione’s frame.  You can see I’ve put a large WiFi antenna on one of the side platforms; the other is for GPS in the future.  The frame itself is not quite complete – I still need to install a platform on the underside to hang the sensors off.  In addition, the LID (LiDAR Installation Desktop) needs assembling – it’s just here for show currently.

Chloe’s HoG in Hermione’s frame

Here’s a flight with just accelerometer and gyro in control for basic stability testing.

With these 1340 high-pitch Chinese CF props, there’s no shortage of lift power despite the fact she weighs 2.8kg, so I’m going to defer the X8 format for a while on financial grounds – 4 new T-motor U3 motors and 40A ESCs cost nearly £400.

The PCBs are on order, and first setup will be for LEDDAR and PX4FLOW.

Oddly, only one of my PX4FLOWs works properly – for some reason, the newer one can’t see the URF, so it can’t provide velocities, only angular shift rates.  However, LEDDAR will give me the height, allowing me to convert those angular rates to horizontal velocities.  If that works, it also opens up the possibility of replacing the PX4FLOW with a Raspi Camera, using the H.264 video macro-block increments to do the equivalent of the PX4FLOW motion processing myself – which, if possible, would please me greatly.
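The angular-rate-to-velocity conversion is just the small-angle geometry of looking straight down – a minimal sketch, assuming the flow sensor reports rates in radians/s (the function name and units here are illustrative, not the actual flight code):

```python
# Sketch: turning optical-flow angular rates into horizontal velocities
# using a separately measured height (e.g. from LEDDAR).
# Small-angle approximation: ground speed = height * angular rate.

def flow_to_velocity(flow_rate_x, flow_rate_y, height):
    """
    flow_rate_x/y: apparent angular shift rates from the flow sensor (rad/s)
    height: distance to the ground (metres)
    Returns (vx, vy) horizontal velocities in m/s.
    """
    return flow_rate_x * height, flow_rate_y * height

# 0.1 rad/s of apparent ground motion at 2m height -> 0.2 m/s drift
vx, vy = flow_to_velocity(0.1, -0.05, 2.0)
```

The same relation is what would let a RaspiCam’s macro-block shifts stand in for the PX4FLOW, given a height source.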

Still lurking in the background is getting the compass working to overcome the huge yaw you can see in the video.


Feeding frenzy

My hunger to shop has just been sated: 2 new Pi Zeros + the new 8 megapixel camera module + the Zero camera cable + Unicorn HAT and diffuser plate.

Yes, you heard me right, the latest Pi Zeros now have a camera slot.

One Zero is for showing off at the next Cotswold Jam, along with the new higher-resolution 8 megapixel camera module.  Perhaps set up as a tiny onesie cam?

The other is for Zoe – and maybe another camera too, once I’ve overcome the problem with WAPping the latest Jessie, the use of which is mandatory if I want to use the new higher-res camera with her, purely for FPV videos.  The new camera slot points horizontally, which is great for retaining the low profile of the Pi Zero.

The Unicorn HAT and diffuser are mostly just for playing with on my main Pi.

Next frenzy will start when they launch the A3 which I’ll be transferring Phoebe to for the extra CPU cores and processor speed needed for the two LiDAR units I’ll eventually be installing.

Until then, I’ll be working on repaying my overdraft!

3 point laser tracking

This is a long detailed post – grab a cuppa if you intend to plough through.

Here’s the plan for Phoebe.

There are 3 downward facing red lasers. Two are attached to the underside of her rear arms both pointing in parallel along Phoebe’s Z axis i.e. if Phoebe is horizontal then the laser beams are pointing vertically downwards.  The third laser is handheld by a human. All are probably 5mW / Class 2 – although the power rating may need to be reduced to conform with legislation which is unclear.  5mW is safe due to the human blink reaction; 1mW is safe as long as it’s not viewed through a focusing device such as a lens.

The RaspiCam with NoIR filter is fitted in the center of Phoebe’s lower plate, also facing along her Z axis.  A red camera-style gel filter is fitted over it in the expectation that this will increase the contrast between the laser dots and the rest of the background.  The camera is set to ISO 800 – its maximum sensitivity.  The photos are low resolution to reduce the processing required.  Each shot is taken in YUV mode, meaning the first half of the photo data is luminance / brightness / contrast information.  Photos are taken as fast as possible, which may actually be only a few per second due to the lighting conditions.  The camera code runs on a separate thread from Phoebe’s main flight control code.
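Because the luminance bytes lead the YUV420 buffer, picking out the Y plane is just a slice – a minimal sketch, assuming a raw capture buffer (the frame size here is illustrative; the real camera pads dimensions, and this isn’t the actual flight code):

```python
# Sketch: extracting the Y (luminance) plane from a raw YUV420 frame.
# In YUV420 the first width*height bytes are luminance; the U and V
# planes that follow can be ignored for dot detection.

def y_plane(raw, width, height):
    """Split the leading luminance bytes of a YUV420 buffer into rows."""
    return [raw[r * width:(r + 1) * width] for r in range(height)]

# Fake 64x48 frame: YUV420 carries 1.5 bytes per pixel in total
w, h = 64, 48
raw = bytes(w * h * 3 // 2)      # all-zero stand-in for a camera capture
rows = y_plane(raw, w, h)        # 48 rows of 64 luminance bytes each
```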

A typical flight takes place as follows:

Immediately prior to each flight, the camera is turned on and feeds its output into an OS FIFO.

Quadcopter take-off code is unchanged using the standard flight plan configuration to attain an approximation of the desired hover height (e.g. 3s at 0.3m/s gives roughly 90cm hover height).

Once at hover, on each motion processing loop the motion processing code checks whether the FIFO is empty; if it isn’t, it is emptied and the last batch of camera data (i.e. the most recent shot taken) is processed.

It is scanned for bright dots and their positions in the frame are stored.  By using the red filter and the Y channel (brightness / contrast / luminance of YUV) from the camera, and because the lasers are fixed in Phoebe’s frame with respect to the camera, the dots should stand out in the photo and lie between the center and the bottom corners of the photo.  If bright spots are detected in this area, there is a very high level of confidence that these are the red dots from the frame lasers.  The distance between the dots in pixels is inversely proportional to the actual height in meters, based upon the camera lens focal length.
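The pinhole-camera geometry behind this is worth spelling out: with two parallel lasers a fixed baseline apart, the on-screen separation shrinks as height grows.  A minimal sketch, where the baseline and the focal length in pixels are illustrative values, not Phoebe’s actual numbers:

```python
# Sketch: height from frame-laser dot separation.
# Pinhole model: separation_px = focal_px * baseline_m / height_m,
# so height_m = focal_px * baseline_m / separation_px.

def height_from_dots(separation_px, baseline_m=0.3, focal_px=250.0):
    """Estimate height (m) from the pixel separation of the two laser dots."""
    return focal_px * baseline_m / separation_px

# Dots 75px apart with a 0.3m laser baseline -> 1.0m height
h = height_from_dots(75.0)
```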

This pixel-separation height is saved on the first pass in the hover phase and used thereafter as the target height; any deviation in the separation of the dots compared to the target dot separation means a height error, which is fed as a target change to the vertical velocity PID.

Once the height is processed as above, any third similarly bright dot is assumed to be from the human laser.  If such a dot is not found within 5 seconds, the code moves irreversibly to descent mode.

However, if a 3rd dot is found in that 5s period, then its position relative to the frame laser dots provides targets to

  • the yaw PID, so that the 3 dots form an isosceles triangle with the quad laser dots at the base and the human dot at the peak
  • the horizontal velocity PID so that the 3 dots form an equilateral triangle.
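The two PID targets above fall out of simple triangle geometry – a minimal sketch, where the coordinate frame, function names and sign conventions are all assumptions rather than the real flight code:

```python
# Sketch: yaw and horizontal targets from the three dot positions.
# left/right are the frame-laser dots, human is the handheld dot,
# all as (x, y) pixel coordinates in the camera frame.
import math

def dot_targets(left, right, human):
    mid = ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
    # Yaw error: angle of the human dot off the base midpoint's
    # "straight ahead" direction (zero when the triangle is isosceles)
    yaw_error = math.atan2(human[0] - mid[0], mid[1] - human[1])
    # Horizontal error: for an equilateral triangle, the apex sits
    # sqrt(3)/2 times the base length from the base midpoint
    base = math.dist(left, right)
    apex = math.dist(mid, human)
    distance_error = apex - base * math.sqrt(3) / 2
    return yaw_error, distance_error

# A perfect equilateral triangle, apex straight ahead: both errors are zero
yaw, dist = dot_targets((-10, 0), (10, 0), (0, -10 * math.sqrt(3)))
```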

Loss of the human dot returns to frame-laser-dot mode for 5 seconds to reacquire the lost human dot; if it is not found, the irreversible standard descent mode based upon the flight plan alone is triggered.

Similarly, loss of either of the frame laser dots triggers irreversible standard descent mode but without any wait for reacquisition of the missing frame dot.

This should provide stable hover flight for as long as the battery lasts, with the option of following the human dot to take the Quad for a “walk” on a “laser leash”.

Sounds like a plan, doesn’t it?  Some details and concerns, primarily so I don’t forget:

  • -l per-flight command line control of laser tracking mode
  • MOSFETs to switch on the lasers based on the above?  Depends on whether GPIO pins can drive two 10mA lasers
  • Is a new PCB needed to expose GPIO switch pin for lasers? If so, don’t forget the pull down resistor!
  • Prototype can be done in complete isolation from Phoebe, using one of my many spare RPi’s along with some LEGO and my as yet unused PaPiRus e-paper screen to show dot location.  This could then be easily battery powered for testing.


  • Merging a successful prototype into Phoebe requires an ‘A3’ and a rework of the PSU – currently, direct feeding 5V into the RPi backfeeds the regulator (which would normally be taking input from the LiPo), causing it to heat up significantly
  • None of this can take place before I’ve finished and bagged up the GPIO tutorial for the next Cotswold Jam on 30th April.

The future’s bright; the future’s orange!

Well, tangerine actually!

With the IMU FIFO code taking the pressure off critical timings for reading the IMU, a huge range of options has opened up for what to do next with the spare time the FIFO has created:

  • A simple keyboard remote control is tempting where the QC code polls stdin periodically during a hover phase and amends the flight plan dynamically; ultimately, I’d like to do this via a touch screen app on my Raspberry Pi providing joystick buttons.  However for this to work well, I really need to add further sensor input providing feedback on longer-term horizontal and vertical motion…
  • A downward facing Ultrasonic Range Finder (URF) would provide the vertical motion tracking when combined with angles from the IMU.  I’d looked at this a long time ago but it stalled as I’d read the sensors could only run at up to 100kbps I2C baudrate which would prevent use of the higher 400kbps required for reading the FIFO.  However a quick test just now shows the URF working perfectly at 400kbps.
  • A downward facing RPi camera when combined with the URF would provide horizontal motion tracking.  Again I’d written this off due to the URF, but now it’s worth progressing with.  This is the Kitty++ code I started looking at during the summer and abandoned almost immediately due both to the lack of spare time in the code, and also the need for extra CPU cores to do the camera motion processing; Chloe with her Raspberry Pi B2 and her tangerine PiBow case more than satisfy that requirement now.
  • The magnetometer / compass on the IMU can provide longer term yaw stability which currently relies on just the integrated Z-axis gyro.
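The stdin polling in the first bullet can be done without ever stalling the motion loop – a minimal sketch using select() with a zero timeout (the function name and command handling are illustrative, not the QC code):

```python
# Sketch: non-blocking keyboard input for dynamic flight-plan updates.
# select() with a zero timeout reports whether input is waiting without
# blocking, so the motion processing loop never stalls on the keyboard.
import select
import sys

def poll_command(stream=sys.stdin, timeout=0.0):
    """Return one stripped line if input is waiting, else None; never blocks."""
    ready, _, _ = select.select([stream], [], [], timeout)
    return stream.readline().strip() if ready else None
```

Each pass of the loop would call this once and, if it returns e.g. “left” or “land”, amend the flight plan accordingly.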

I do also have barometer and GPS sensors ‘in-stock’ but their application is primarily for long distance flights over variable terrain at heights above the range of the URF.  This is well out of scope for my current aims, so for the moment then, I’ll leave the parts shelved.

I have a lot of work to do, so I’d better get on with it.


Blind, deaf, disorientated, lost and scared of heights…

and yet, amazing.

So in that last video of a ~10s flight, Phoebe drifted about 1m when she shouldn’t have.  To me, that’s amazing; to you, that may be underwhelming.  So here’s why you should be amazed.  Basic school math(s):

distance = ½ x acceleration x time² ⇒ plugging in the numbers above says she was accelerating at 0.02m/s² (2cm/s²) instead of zero to travel 1 meter in 10 seconds.

Gravity is roughly 10m/s².  The accelerometer is set to read up to ±4g.  So that’s a range of 8g, or roughly 80m/s².

So 0.02/80 * 100 = 0.025% error or a ±8 error value in the sensor range of ±32768.

Now the critical part – the sensors are only rated at 2% (±655) accuracy, and yet I’m getting 0.025% – 80 times better than their rated accuracy.
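The chain of figures above can be checked in a few lines (following the post in using ±32768 as the scale for the LSB figures):

```python
# Back-solve the acceleration error from 1m of drift in 10s,
# then express it against the ±4g accelerometer range.
drift_m, t_s = 1.0, 10.0
accel_error = 2 * drift_m / t_s ** 2          # d = ½at²  ->  0.02 m/s²
full_range = 8 * 10.0                         # ±4g span, roughly 80 m/s²
pct_error = accel_error / full_range * 100    # 0.025%
lsb_error = pct_error / 100 * 32768           # ~±8 counts of ±32768
rated_lsb = 0.02 * 32768                      # 2% rated accuracy ~±655 counts
```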

And that’s why I don’t think I can get things very much better, whatever I do.

There is a slight chance that when the A2 is released (sounds like that’s shifted to next year now), I may be able to run the sampling rate at 1kHz without missing samples (I’ve had to drop to 500Hz to ensure I don’t miss samples at the moment).

Other than that though, she needs more sensors:

  • camera for motion tracking (!blind)
  • ultrasonic sensors for range finding (!deaf)
  • compass (!disorientated)
  • GPS (!lost)
  • altimeter (!scared of heights).

but they’ll also need an A2 for the extra processing.  So this is most definitely it for now.

Drone Camera first shot

I’m taking a break from PID tuning today, as I need a rest.  Instead I had a little tinker with getting RaspiCam to work.  The drone is already flashed with the latest (June 2013?) image, and the wires connected so all I needed to do was to run raspi-config to enable the camera, and then do “raspistill -o test.jpg”.

All I got was error messages suggesting I check the wires.  The wires were good, but somehow I’d managed to detach the connector between the camera and its board – the little gold rectangular one – I didn’t even realize it was a connector, but there it was, flapping around loose.  A gentle push reattached it, and it must have worked, as another try with raspistill led to this – the box the drone was sitting on:

Drone camera’s first shot

I’m already half way through the code which can spawn off raspivid, so once I’ve got the PIDs sorted, hopefully I’ll be able to record videos too!

PiCam case

I bought a PiCam case the other day to protect the delicate cable and electronics.  I fiddled and faffed to get the camera in safely and securely without damaging the cable or camera itself, and then finally realized that it’s really simple and elegant once you know how.

There are two things you need to know: firstly, there is some flexibility in the case – be gentle, but it does bend; secondly, there are two slots in the back piece which the board clips into perfectly – this is the bit I missed at first.

Don’t faff like I did trying to ensure the camera poked through the hole in the front while attaching the back – that’s completely the wrong way round.  Instead, clip the board into the slots in the back, feeding the cable under the tiny slot in the base; once the board and cable are safely secured like this, just clip the front on and Bob’s your Mo’s Bro!

What do you get when you cross a Raspberry Pi, a PiBow and a Steamroller?

In preparation for getting a camera board to sling under my drone, I’m in the process of tweaking a Raspberry Pi Model A, and designing my own variant of the PiBow case to match – although in doing so, I’ve found much broader use…

For the RPi changes, I’ve removed the A/V socket, and replaced the GPIO pins with an IDC connector so I can lose a couple of layers in the Model B PiBow. That’s now complete – removing the A/V connector wasn’t too bad, but removing the GPIO pins was a right PITA – more on that in another post. Here’s the end result.

For the PiBow, I wanted a cross between the Model B and Model A PiBows: thinner like the model A, but with bolts in the corner like the model B so it could be a direct but thinner replacement for the Model B Ninja PiBows I’m using at the mo on the drone and its RC. Primarily this is to reduce the weight, centre of gravity, profile and power consumption of both.

In the end, I had to compromise, and took 2 layers out of the model B, and sealed up the A/V socket, and opened up the GPIO hole to allow the larger GPIO connectors. I got a local laser cutter company to produce this from my Pibow001-AB3. Here’s what it looks like….

PiBow A + PiBow B cross

Note the addition of the IDC connector isn’t mandatory – there are 90 degree sets of pins available, and they were my first choice, but for the connectors on the ribbon cable, the 90 degree pins weren’t quite long enough.  They would still take the single wire pin connectors though if that’s what you’re after.