Kitty’s measuring up

Did some simple testing to determine the camera angle, resolution and suitable dual laser dot separation.  Kitty was sat on a stool, looking up at the ceiling, and I was waving the laser around the ceiling watching the curses display.  The picture below is upside down!

           ∧      ^      ^            
          /|\   21cm     |
         / | \____V      |
        /  |  \        205cm 
       / α |   \         |
      /    |    \        |
     /     |     \       |

∧ = camera position
Span of kitty laser dot detection = 170 x 170cm
Test height (camera to ceiling) = 205cm
Alpha = atan(170 / (2 x 205)) = 22.5°

In addition…

Camera height at takeoff = 21cm
∴Camera span at takeoff = 17.4 x 17.4cm
Image size = 32 x 32 pixel
∴Image resolution at takeoff = 0.5 x 0.5cm
Landing leg separation = 23cm

Based on the above, 2 dots spaced by 15cm would make a good laser leash.

Image span at 1m hover = 82 x 82cm
Double dot separation viewed from 1m = 15cm / 82cm x 32 pixels ≅ 6 pixels

So a minimum capture resolution of 32 x 32 pixels should be fine for a 1m hover, but by 2m the risk of merged dots is too high as they’ll only be 3 pixels apart.
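The same sums, wrapped up as a throwaway Python sketch of mine (not part of Kitty’s code), make it easy to re-run the numbers for other heights:

    # Throwaway sketch of the dot separation geometry above.
    import math

    SPAN_AT_CEILING = 170.0   # cm - detected span at the ceiling test height
    TEST_HEIGHT = 205.0       # cm - camera to ceiling
    PIXELS = 32               # capture is 32 x 32 pixels
    DOT_SPACING = 15.0        # cm - proposed laser leash spacing

    # Camera half-angle derived from the test measurements
    alpha = math.degrees(math.atan(SPAN_AT_CEILING / (2 * TEST_HEIGHT)))
    print("alpha = %.1f degrees" % alpha)                  # ~22.5

    def span(height_cm):
        """Width of the camera's view at a given height above the ground."""
        return SPAN_AT_CEILING * height_cm / TEST_HEIGHT

    def dot_separation_pixels(height_cm):
        """How many pixels apart the two leash dots appear at a given height."""
        return DOT_SPACING / span(height_cm) * PIXELS

    for h in (21, 100, 200):  # take-off, 1m hover, 2m
        print("height %3dcm: span %5.1fcm, dots %.1f pixels apart"
              % (h, span(h), dot_separation_pixels(h)))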

Kitty’s monocle…

or as the lovely chaps at Pimoroni would say, a pirate’s eye patch, ooh arr Jim lad!

With a spare (presumed dead but just sleeping) A+, a Pimoroni Coupe Royale case (also spare), a Pi Camera (yet again spare!) and a fantastic camera case I saw a few days ago, Kitty has a home for these testing times.

Kitty’s new home

The camera housing is great: everything clips firmly into place, the cable routing has been intelligently thought through, and it’s hinged, a bit like a classic pocket watch.  From the girls’ point of view, this means that with the camera attached to their underside, they can shoot POV video of a flight, or engage their Kitty guidance system.

Further software design thoughts for Kitty’s code:

  • she runs as a separate process with a localhost socket connection to the HoG code or, initially, the test simulator
  • she sends interpreted FlightPlan() information to HoG – the HoG FlightPlan() code listens on a non-blocking socket for updates (see the sketch after this list)
  • HoG is started with -k to spawn Kitty as a new process – HoG postpones action until the Kitty TCP connection is established
  • it’s Kitty who manages the state changes from “take-off” to “hover” to “tracking” – HoG doesn’t know it’s tracking; it just knows what the required velocity PID inputs are, as received from Kitty
  • When the RPi A2 is released, this all means Kitty can run on a separate core to the HoG code.
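As a rough illustration of that split – the port number and the comma-separated message format below are my own assumptions, not anything in HoG yet – the HoG side of the connection might look something like this:

    # Sketch of the HoG side: accept Kitty's TCP connection, then poll the
    # non-blocking socket on every motion processing loop.
    import socket

    KITTY_PORT = 31415                     # made-up port number

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("localhost", KITTY_PORT))
    server.listen(1)

    # HoG postpones flight until Kitty has connected...
    kitty, _ = server.accept()
    # ...and then never lets the socket stall the motion processing.
    kitty.setblocking(False)

    def poll_kitty_targets():
        """Return (vx, vy, vz, yaw) targets from Kitty, or None if no update yet."""
        try:
            data = kitty.recv(1024)
        except BlockingIOError:
            return None
        if not data:
            return None
        # Take the most recent newline-terminated update in the buffer.
        latest = data.decode().strip().split("\n")[-1]
        vx, vy, vz, yaw = (float(f) for f in latest.split(","))
        return vx, vy, vz, yaw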

By giving Kitty a home, testing gets much easier, right up to the point of simulated flights: I can check that the velocity targets she produces match what Phoebe and Chloe’s HoGs are expecting, and I can write, run and test all the FlightPlan() code changes without having to risk Phoebe and Chloe.  That means once everything’s working in the simulated Phoebe / Chloe environment, it’ll be a quick and easy move into their real-world HoG code with confidence.

Pen and paper!

Sometimes, it’s just easier to draw a picture in the old-fashioned way:


The top line of text reads “h derived…

I think this sketch combined with yesterday’s outline of how the code will use this data has convinced me this can be made to work. Onwards and upwards!

How Kitty will work

Backgrounder on Kitty

Here are my initial thoughts about how to implement a reference point for Phoebe to hover over and track:

  • She does pre-flight checks and warms up the sensors
  • The motors remain unpowered until she spots the laser underneath her
  • Once the laser lock is acquired she takes off to 1m hover.
  • Moving the laser at hover causes her to follow.
  • If laser lock is lost, immediate controlled descent.

In more detail…

  • 2 laser pointers in parallel at fixed spacing – hereafter known as the laser leash
  • 1 would give her a point to hover over and this is the first step in the development
  • 2 gives her orientation and height
    • the alignment of the two dots compared to the camera alignment gives yaw
    • the spacing of the two dots shrinks as she gains height – the ground-level separation is measured when she’s locked on to the laser.
  • loss of lock on one dot changes behaviour to drift towards the remaining one, with the aim of centering the single dot and thereby reacquiring the second dot, at the risk of height / yaw errors in the meantime.
  • loss of both dots leads to immediate horizontal descent to ground.  If a single lock is reacquired during descent, then the second dot acquisition procedure runs as above; if both dots are reacquired, then normal flight resumes.
  • Various beep sequences to indicate no lock, single lock and double lock
  • double dot analysis produces quad-frame velocity targets along the X, Y and Z axes, plus a yaw correction target around the Z axis (see the sketch below).

Together that should mean she can be taken for a walk on a laser leash!
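As a first stab at the double-dot maths – my own sketch, using the numbers from the earlier camera measurements rather than anything written for the flight code – the yaw and height estimates fall out like this:

    # Sketch: turn the two laser dot pixel positions into yaw and height.
    import math

    PIXELS = 32               # 32 x 32 capture
    FOV_RATIO = 170.0 / 205.0 # camera span / height, from the ceiling tests

    def yaw_and_height(dot_a, dot_b, ground_separation_cm):
        """dot_a / dot_b are (row, col) pixel positions of the two leash dots;
        ground_separation_cm is the dot spacing measured at laser lock-on."""
        d_row = dot_b[0] - dot_a[0]
        d_col = dot_b[1] - dot_a[1]

        # Yaw error: angle of the dot pair relative to the camera's column axis.
        yaw = math.degrees(math.atan2(d_row, d_col))

        # Height: the dots' pixel separation shrinks in proportion to height.
        pixel_separation = math.hypot(d_row, d_col)
        span_cm = ground_separation_cm * PIXELS / pixel_separation
        height_cm = span_cm / FOV_RATIO
        return yaw, height_cm

    # e.g. dots 6 pixels apart on a 15cm leash comes out at roughly a 1m hover
    print(yaw_and_height((16, 13), (16, 19), 15.0))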

Primary concerns:

  • test area – the garden is too bright, and indoors at home has too many bits of furniture – there are redundant farm buildings within 5 minutes’ walk for testing, but only after a significant series of passive tests has been successful
  • feeding the periodic camera results through as updates to velocity / yaw targets – earlier experiments show 1 per second is about the maximum rate
  • Kitty running as a separate thread or process, probably – assuming PiCamera blocks while taking a shot, how much will this affect performance, and will the GIL block the quad code completely while Kitty takes a photo?  If so, it’ll need to be a separate process with an input queue that Phoebe checks periodically for target updates (see the sketch after this list)
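If it does end up as a separate process, the input-queue idea could look roughly like this – a sketch under my own assumptions about the message contents, not anything committed:

    # Sketch: Kitty as a separate process feeding velocity targets through a
    # queue that the motion processing loop polls without ever blocking.
    import time
    from multiprocessing import Process, Queue

    def kitty_process(targets):
        """Stand-in for Kitty: take a photo roughly once a second, push a target."""
        while True:
            time.sleep(1.0)                  # placeholder for the camera capture
            targets.put((0.0, 0.0, 0.0))     # (vx, vy, vz) velocity targets

    targets = Queue()
    kitty = Process(target=kitty_process, args=(targets,))
    kitty.daemon = True
    kitty.start()

    # Phoebe's motion loop checks for updates each pass without waiting on Kitty.
    while True:
        while not targets.empty():
            vx, vy, vz = targets.get_nowait()
            # ...feed the new velocity targets into the PIDs here...
        # ...rest of the motion processing...
        time.sleep(0.01)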

First steps

  • add the beeper onto Phoebe and Chloe
  • build the double parallel laser leash
  • measure the dot spacing seen by the camera at take-off and at the 1m hover point


FIFO dodo

I took Phoebe out to fly this morning using her normal mode of reading the data registers triggered by the hardware data ready interrupt.  Well, that’s what I thought I was doing, but I’d forgotten to flip the config for which script to run, so in fact Phoebe flew the FIFO…for a second before she flipped over and mowed the lawn instead.  She did this 3 times before I considered pilot error as the cause for flipping Phoebe.

Anyway, with the config fixed, she flew well within the bounds of what I expect from her, and to be honest, well within the bounds of what I’m starting to believe is achievable for autonomous flight with only an accelerometer and gyro.

The fallout is that the FIFO is a proven no-go until the problem with Python SMBus / I2C is found, and I’m not smart enough to be the person who does that.

But the flights from the last few days of Phoebe and Chloe reading the data registers do show that with a fixed reference point, they could be made to behave.  And that fixed reference point is Kitty.


My get up and go just got up and went!

You might have noticed the tag-line in the blog header has just changed from “Stuff to fulfil your Raspberry Pi!” to “Raspberry Pi and other Stuffing!” – I’m broadening the scope of this blog outside of Raspberry Pi land.

Things are grinding to a halt with Phoebe and Chloe; I have a small rebuild to do on Phoebe to add her new grommets and ESCs, but that’s only going to bring her up to Chloe’s standards at best.  I have some testing to try with the 0g offsets being zero at ambient / boot temperature, but I’m not convinced that’ll come to much.  And until I can get the horizontal drift reduced, there’s no way I can even think about implementing Kitty, the laser tracker.

I’ve always struggled to hang on to a full-time hobby for more than a couple of years before my interest drifts.  This quadcopter project has done well at 30 months, and I would keep going if I had any ideas left of where to go, but I’ve been going round and round in ever decreasing circles for a while now, and I have to stop before I vanish up my own sphincter!

I’m sure I’ll never be far away from this project, and will be back periodically with a new idea, or perhaps just a new video after a sunny day (i.e. few seconds) flying the girls in the park.  But for now, time to let the girls rest, and enjoy the sunshine.

Me? Well I’d previously mentioned my interest in DIY HiFi and something over the weekend set that Rolling Stone off again.  More on that anon.

Once more, where next?

Just a list so I don’t forget:

  • add silicone grommets to Phoebe – probably a month before they arrive so on hold for the moment
  • resolve vertical height hover – the hover itself is fine, but the height varies due to the difference between the Z-axis 0g calibration offset and the actual gravity value of 1g.
  • resolve horizontal drift – harder, because earth-frame gravity is inherently 0 horizontally, so unlike the vertical axis there’s no gravity value against which errors can be subtracted; any difference between the calibrated and real X / Y 0g offsets shows up directly as drift.
  • implement Kitty as the solution for the above – slow due to lack of time – more on this later, breaking it into steps

Kitty squared

I’ve tidied up the code, added comments, and accounted for the fact that console characters are rectangular, so the area showing the brightest dot is twice as many characters wide as it is high.

You need to run this on a console screen that’s at least 68 characters wide by 36 characters high, otherwise you’ll just get a curses error on startup.
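For anyone curious what that size check amounts to, here’s a cut-down sketch of mine (not the code as it stands on GitHub) of the doubled-width curses display:

    # Cut-down sketch of the size check and the doubled-width pixel drawing.
    import curses

    PIXELS = 32    # image is 32 x 32; each pixel is drawn 2 characters wide

    def main(screen):
        rows, cols = screen.getmaxyx()
        if cols < 68 or rows < 36:
            raise RuntimeError("console must be at least 68 x 36 characters")

        # Mark a hypothetical brightest pixel at image (row, col) = (10, 20),
        # doubling the column to allow for rectangular console characters.
        row, col = 10, 20
        screen.addstr(row + 2, col * 2 + 2, "##")
        screen.refresh()
        screen.getch()

    curses.wrapper(main)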

Here’s the result:

Kitty squared

For it to work with a laser pointer, it needs a pretty evenly lit, dark environment – you’d be surprised how flexibly our eyes dynamically adjust the brightness of what we see across a scene; in a normally lit room or outside, ambient light is almost certain to dominate.

As a proof of concept, this mini-project is done, but I’ll probably tweak it for performance reasons when I next need a break.

Have a break…

Have a Kit(ty)-Kat*.

I needed a break today from I2C problems**, so I’ve spent some time with Kitty.  She’s the laser pointer dot tracking code which eventually will make its way onto HoG so I can direct her flights with a laser pointer.

She uses the RaspiCam and the picamera Python library, along with the (aptly named) curses library, to show on screen the brightest point in a 32 x 32 pixel YUV image – the first 1024 bytes (32 x 32) are Y / luminance / brightness.  The other 512 bytes are the UV values, which the code has no interest in.
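For the curious, the capture side boils down to something like this – a minimal sketch of my own rather than the code that’s on GitHub:

    # Minimal sketch: grab a 32 x 32 YUV frame with picamera and find the
    # brightest pixel in the Y (luminance) plane.
    import io
    import numpy as np
    import picamera

    WIDTH, HEIGHT = 32, 32

    with picamera.PiCamera() as camera:
        camera.resolution = (WIDTH, HEIGHT)
        stream = io.BytesIO()
        camera.capture(stream, format='yuv', use_video_port=True)

        # The first WIDTH * HEIGHT bytes are the Y plane; the remaining bytes
        # are the subsampled U and V planes, which we ignore.
        y_plane = np.frombuffer(stream.getvalue()[:WIDTH * HEIGHT],
                                dtype=np.uint8).reshape(HEIGHT, WIDTH)

        # Row / column of the brightest pixel - the laser dot candidate.
        row, col = np.unravel_index(np.argmax(y_plane), y_plane.shape)
        print("brightest pixel at row %d, col %d (Y = %d)"
              % (row, col, y_plane[row, col]))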

Kitty curses display

Currently she’s taking a photo every 0.6s or so, and there are no sleeps in the code, so there’s a bit of work to do to set exposure times to perhaps 0.1s or thereabouts so that the peak brightness can be polled periodically by HoG and used to set velocity targets for its PIDs.
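Pinning the exposure down with picamera should just be a matter of something like this – untested against Kitty herself, but these are standard picamera attributes:

    # Fix the exposure at roughly 0.1s; shutter_speed is in microseconds and
    # is capped by what the framerate allows.
    import picamera

    with picamera.PiCamera() as camera:
        camera.resolution = (32, 32)
        camera.framerate = 10            # 100ms frame period
        camera.shutter_speed = 100000    # 0.1s exposure
        camera.exposure_mode = 'off'     # stop auto-exposure overriding it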

Code’s up on GitHub

Normal service will resume tomorrow

*An ancient tag-line for Kit-Kat chocolate biscuit adverts for those of you too young to get the pun.

** I have done a little, checking the interrupt pin from the MPU-9250 with a ‘scope – it looks untidy, but that may be down to my usage of the ‘scope itself, so I need to dig further.

News update

Sorry it’s been quiet here; I thought I’d better update you on what’s going on.  I’ve been doing test flights as the weather allows, trying to track down a problem.  The symptoms have been inconsistency between flights, where a few are fantastic, but most end with a prop digging into the ground and an alloy arm bent at the wrist.

Ever since adding the hardware interrupt to announce to the code that new data was ready from the sensors, I’d always received data I could trust from the MPU-6050.  With the build of HoG and the move to the MPU-9250, that’s no longer the case – I’m getting I2C read exceptions, yet the code in that area is untouched.

This might be related to the MPU-9250 registers, or it might be related to the latest distribution of Raspbian I installed on HoG.  Not clear yet.

I’ve also started looking at ‘Kitty’ – code using the RaspiCam and picamera to identify the position of a laser pointer dot on the ground, so that the direction / distance of that dot can be turned into flight plan targets for HoG to follow.  Not a difficult piece of fun to add to HoG, but while she’s not behaving, testing is restricted.

So it’s going to continue to be quiet here for a while until I’ve got the I2C / GPIO hardware interrupts back to the standard they were.