Fecking rangefinders

  • SRF02 can’t run at the 400kbps I2C baudrate necessary for reading the maximum data rate from the MPU-9250 IMU.
  • TeraRanger needs a 12V power supply and provides 5V serial or I2C, meaning level shifting is needed to interface with the Raspberry Pi’s serial / I2C pins.
  • LEDDAR uses a weird, slow Modbus protocol over serial, meaning the IMU FIFO overflows if I sample it above 500Hz – 1kHz works perfectly without the LEDDAR.  Essentially, it’s wasting time I’m going to need for camera, GPS, and Scanse Sweep processing.
  • Garmin LiDAR-Lite supports the necessary 400kbps I2C baudrate at 3.3V, but requires low-level I2C access with a 20ms gap between sending the read request and reading the response data.  Arduino provides this low-level access; higher-level smbus I2C via Python does not.  There are also comments around suggesting no other I2C activity can take place during that 20ms, i.e. I can’t access the IMU during the 20ms!
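For reference, the sequence the Garmin wants looks roughly like this – a hedged sketch assuming the register map from the LiDAR-Lite datasheet (command register 0x00, measure command 0x04, distance register pair read via 0x8f); the bus is passed in rather than imported so the sketch isn’t tied to the Pi.  The killer isn’t the code, it’s guaranteeing nothing else (i.e. the IMU) touches the bus during that sleep:

```python
import time

LIDAR_ADDR = 0x62     # default LiDAR-Lite I2C address
ACQ_COMMAND = 0x00    # command register
MEASURE = 0x04        # take measurement with receiver bias correction
DISTANCE = 0x8f       # distance register pair, MSB set for auto-increment

def read_range_cm(bus, sleep=time.sleep):
    """bus is an smbus.SMBus(1)-style object on the Pi."""
    bus.write_byte_data(LIDAR_ADDR, ACQ_COMMAND, MEASURE)
    sleep(0.02)       # the 20ms gap - and no other I2C traffic allowed here
    hi, lo = bus.read_i2c_block_data(LIDAR_ADDR, DISTANCE, 2)
    return (hi << 8) | lo
```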

All the sensors work, it’s just the API to access the data that’s non-standard in every one of these.  Did nobody on the design teams consider using a standard API for modern interface technology?  FFS!

P.S. Yes, I know there’s a URF that supports 400kbps I2C baudrate at 3.3V, but it has a bloody great potentiometer on the underside meaning it’s nigh on impossible to attach it ground-facing under a quadcopter.

P.P.S.  I know I could use the PX4FLOW; I actually have 3 but only one of these (the original) works; the clones both do not.  And anyway, where’s the fun in that compared to a vertical rangefinder, the Raspberry Pi Camera and the MPU-9250 gyro i.e. the three components that make up the PX4FLOW?

Phoebe’s sensitive underbelly*

Phoebe’s delicate underside

Phoebe has got the PiCamera and the SRF02 Ultrasonic Range Finder installed on her underside; legs are back in place to achieve both camera focus and URF minimum range on the ground.

Camera’s working fine, though the software for using it to do laser dot following or motion tracking via the MP4 encoding is proving tricky.

URF isn’t working; i2cdetect -y 1 sees the sensor, but a write to trigger an ultrasonic ping just blocks.  There are a couple of possible causes: the URF is running at 3.3V rather than the 5V defined by the spec, and the I2C bus is running at 400kbps instead of the URF’s supported 100kbps.

I can’t drop the I2C baudrate – I have 1000 batches of 12-byte samples to read from the IMU FIFO each second, and even ignoring the I2C protocol overhead that requires 96kbps – not a cat in hell’s chance of running the IMU and the URF together at 100kbps.
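The arithmetic behind that 96kbps figure:

```python
# IMU FIFO throughput: 1kHz batches of 12 bytes each, 8 bits per byte.
SAMPLE_RATE_HZ = 1000
BYTES_PER_BATCH = 12
bits_per_second = SAMPLE_RATE_HZ * BYTES_PER_BATCH * 8
print(bits_per_second)  # 96000 - 96% of a 100kbps bus before any I2C overhead
```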

The voltage is fixable with some 2-way level shifters like these, but that’s going to need a PCB rework.

Stuck again for the moment.

*much like a hedgehog

URF prototype

I hooked one of the SRF02 ultrasonic range finders up to my prototyping Pi:

Prototype range finder

Knocked together some simple code…
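Something along these lines – a sketch rather than the actual code, assuming the SRF02’s documented register map (command register 0, command 0x51 for a ranging in centimetres, result in registers 2 and 3); the bus object is whatever smbus.SMBus(1) gives you on the Pi:

```python
import time

SRF02_ADDR = 0x70     # default 7-bit I2C address (0xE0 in 8-bit form)
CMD_REG = 0x00        # command register
RANGE_CM = 0x51       # real ranging mode, result in centimetres
RESULT_HI = 0x02      # range high byte; low byte is register 3

def read_range_cm(bus, sleep=time.sleep):
    bus.write_byte_data(SRF02_ADDR, CMD_REG, RANGE_CM)
    sleep(0.07)       # a ranging takes up to ~66ms to complete
    hi = bus.read_byte_data(SRF02_ADDR, RESULT_HI)
    lo = bus.read_byte_data(SRF02_ADDR, RESULT_HI + 1)
    return (hi << 8) | lo
```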

And it all just worked!  Ranges to large objects were detected accurately, while little things in the way were ignored.

Next step is to install this on Chloe once her new arms turn up.

Ultrasonic range finder

I’ve bought 6 of these from here.

SRF02 Ultrasonic Range Finder

They are I2C with configurable addresses, allowing up to 16 to be used simultaneously.  They can read proximity about 15 times a second, up to a range of 6 meters with a resolution of 1 centimeter.  Essentially, you trigger them to send a pulse, then poll them until they return a valid value.  The ‘slow’ response is required to allow the previous pulse to decay before a new one can be sent.
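That trigger-then-poll pattern can be sketched like so – assuming, per the SRF02 docs, that the command register reads back as the software revision, returning 255 while a ranging is still in progress; register numbers are from the datasheet, and the bus object is an smbus.SMBus(1) on the Pi:

```python
import time

SRF02_ADDR = 0x70     # default 7-bit I2C address
CMD_REG = 0x00        # command register; reads back software revision
RANGE_CM = 0x51       # real ranging mode, result in centimetres
RESULT_HI = 0x02      # range high byte; low byte in register 3

def trigger_and_poll(bus, sleep=time.sleep):
    bus.write_byte_data(SRF02_ADDR, CMD_REG, RANGE_CM)
    while bus.read_byte_data(SRF02_ADDR, CMD_REG) == 0xFF:  # 255 => still ranging
        sleep(0.001)
    hi = bus.read_byte_data(SRF02_ADDR, RESULT_HI)
    lo = bus.read_byte_data(SRF02_ADDR, RESULT_HI + 1)
    return (hi << 8) | lo
```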

The first stage is to knock together a basic setup to write and test the code on my prototyping Pi.  Once that’s done, the rough plan is as follows.

Attach one of these to Chloe’s underside, and use it to get her to fly at a fixed height above the ground.  Set a target height, and use the sensor as the input to a PID, the output of which defines the desired vertical velocity, but with a cap on how fast that velocity can be.  Horizontally, the existing PIDs are still used to constrain drift.
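As a sketch of that height-hold loop – the class name, gains and velocity cap are all mine and untuned, not flight code – the PID output becomes the vertical velocity target, clamped to the maximum rate:

```python
class HeightPID:
    """PID on height error; capped output is the vertical velocity target."""
    def __init__(self, kp, ki, kd, v_cap):
        self.kp, self.ki, self.kd, self.v_cap = kp, ki, kd, v_cap
        self.integral = 0.0
        self.prev_error = None

    def update(self, target_m, height_m, dt):
        error = target_m - height_m
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        v = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.v_cap, min(self.v_cap, v))  # cap the climb / descent rate
```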

Next I can play for a bit with horizontal motion: linear, turning corners with yaw, and circling with fixed inward tilt and fixed forward velocity targets.

Next is to add 5 more sensors and set up the horizontal PIDs, with minimum and maximum boundaries for object proximity: if she senses an object within the maximum boundary, she moves away until it’s outside that boundary.  If in doing so she detects (or is already aware of) another object in the opposite direction, then the aim is to position herself halfway between the two.  If the distance between the two boundaries is less than twice the minimum boundary, she just lands.  Again a maximum velocity is imposed.  I have lots of 4m x 2m x 0.2m polystyrene sheets left over from a previous hobby that can be used to test this by building walls around her.  This will allow her to drift aimlessly around a space without hitting anything.  If no boundaries are detected within the maximum, her horizontal speed targets are set to 0 so she just hovers.  Moving a ‘wall’ towards her should then push her away.
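Per horizontal axis, those wall rules might sketch out something like this – the function name, the return tags and the velocity shaping are all mine, purely illustrative:

```python
def axis_response(front_m, back_m, min_b, max_b, v_max):
    """front_m / back_m: distances to the opposing boundaries, None if unseen.
    Returns (action, velocity target); positive velocity is towards 'front'."""
    front_near = front_m is not None and front_m <= max_b
    back_near = back_m is not None and back_m <= max_b
    if front_near and back_near:
        if front_m + back_m < 2 * min_b:   # corridor too narrow: land
            return "land", 0.0
        # drift towards the midpoint between the two boundaries
        return "centre", v_max * (front_m - back_m) / (front_m + back_m)
    if front_near:
        return "avoid", -v_max             # back away from the front wall
    if back_near:
        return "avoid", v_max
    return "hover", 0.0                    # nothing within the maximum boundary
```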

After that, I need to add a maze algorithm: drift towards the most distant boundary while maintaining the minimum boundary distance, in order to search for the next path – perhaps navigating an L-shaped path initially, then moving on to a Π shape.  There needs to be a constraint that, when there is a choice of directions, exploring new parts of a path overrides back-tracking, even if the backtrack path is longer.

At some point I’ll probably need a compass to control yaw, but hopefully the integrated gyro can do that for long enough for me to carry out the majority of the above without one, in shortish flights.