Why did I choose autonomous flight control?

Right at the beginning of this project, 18 months ago, I assumed autonomous control was a necessary precursor to introducing a human into the mix, for safety's sake.  18 months later, I have just realized it's the replacement for the human.  As I mentioned in yesterday's post, none of the PID diagrams shown in a Google image search has an integrated accelerometer velocity PID as the outermost loop – they use a human with an RC transmitter and a pair of eyes to provide that outermost feedback loop.
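
To make that concrete, here's a minimal sketch, assuming a hypothetical PID class and untuned gains, of how an integrated accelerometer velocity PID could take the human's place as the outermost loop, its output feeding the attitude PID's target:

```python
class PID(object):
    """Minimal PID controller; gains here are illustrative, not tuned values."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = 0.0

    def update(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Outer loop: a velocity PID (fed by velocity integrated from the
# accelerometer) replaces the human; its output becomes the attitude
# PID's target, exactly where the RC stick input used to go.
velocity_pid = PID(kp=1.0, ki=0.1, kd=0.01)
attitude_pid = PID(kp=2.0, ki=0.2, kd=0.02)

dt = 0.01                # 100 Hz loop, purely for illustration
velocity = 0.0           # integrated from the accelerometer
angle = 0.0              # from the gyro / sensor fusion

angle_target = velocity_pid.update(0.0, velocity, dt)    # hold zero velocity
motor_out = attitude_pid.update(angle_target, angle, dt)
```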

The replacement for the human needs a motion sensor which can also measure distance travelled.  There are off-the-shelf solutions here and here, and projects based on some Google phone technology (cheers, Jeremy).

What they share, if you look hard enough, is that they are the subject of PhD-level (and above) papers at some of the best universities in the world.

I'd looked at motion detection previously, and the Raspberry Pi forums did offer DIY ideas and commercial solutions, based upon static or video image processing that detects motion from frame-to-frame image changes.  But again, it all seemed too tricky, based upon the fact that off-the-shelf drones don't do it.

I'd also tried accelerometer calibration using temperature trends, to ensure that regardless of temperature, the accelerometer offset / gain was known.  But I abandoned that for the same reason: nobody else bothers.
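
For reference, the temperature-trend calibration boils down to fitting a straight line of offset against temperature from bench readings, then subtracting the predicted offset in flight.  A minimal sketch, with invented sample figures:

```python
import numpy as np

# Bench calibration: accelerometer offset measured at rest across a range of
# temperatures.  These sample figures are invented purely for illustration.
temps   = np.array([15.0, 20.0, 25.0, 30.0, 35.0])       # degrees C
offsets = np.array([0.012, 0.015, 0.019, 0.022, 0.026])  # g

# Fit the linear trend line: offset ~= slope * temperature + intercept
slope, intercept = np.polyfit(temps, offsets, 1)

def corrected(raw_accel, temp):
    """Subtract the temperature-predicted offset from a raw reading."""
    return raw_accel - (slope * temp + intercept)
```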

But now I finally understand why nobody else bothers – they either have a human included in their feedback loop, or they use an optical positioning system with a much more powerful processor to control drift.

I really don’t want to introduce a human into the control loop – that was never part of my original plan.

So where to next? My short-term target is only a 20 – 30s flight with minimal drift.  I would hope that the combination of trend-line accelerometer offsets and gains, along with a few fixes applied since then, may be able to achieve it.

Once more, we shall see.


8 thoughts on “Why did I choose autonomous flight control?”

  1. Some slightly random / disorganised thoughts and comments.

    OK, let's look at this slightly differently.
    Ever tried crossing a very large field or the moors in dense fog?
    If the fog is bad enough, you may end up literally going around in circles. Especially on the moors.
    Now try with a compass. If you keep the compass in view at all times, you will probably travel in a straight line, but even then you might find there is some drift perpendicular to the course you are heading on.
    Now, add a reference like a bright light that you can see through the fog. Much better, but still not perfect.
    So back to the quadcopter. We are using a set of sensors to detect movement and direction, but don't have anything external. We might as well be in the fog. We could add GPS, which uses a number of reference points with known positions and movement vectors. The positioning is not entirely accurate – there is a randomisation factor added at the source – but it's quite close. As a result, my car GPS will happily tell me that I am travelling on a road parallel to the one I am actually on, if it is close by, until some change of course means that it no longer makes sense. It is possible to add a local reference point whose position is known to a high level of accuracy, and use this to correct the satellite data. I understand that one of my clients uses this in a 3D surveying system (a toy sketch of that correction follows at the end of this comment).
    So, we could set up one or more local reference beacons, e.g. point light sources, sound beacons or even radio beacons.
    There is the question of just how accurately we need to know the position of the quadcopter, which may have considerable bearing on the beacon(s).
    On my own version of this project, still in very early stages, I started by looking at GPS (a USB dongle is available from Maplin.co.uk which is quite easy to install on the Raspberry Pi). I always thought I would need the GPS to help fix the position of my ‘copter. I also selected an MPU which has accelerometer, gyroscope, magnetometer and barometric sensors. At this point, I haven’t done any of the maths but I am expecting to have to do a partial differential on the various equations to determine the effects on accuracy of the various error sources. Since I haven’t done that sort of thing for at least 25 years I might have to do a lot of revision 🙂
    If you don’t hear from me for a bit, you’ll know I’m doing my homework.
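
    Here's the toy sketch of the local-reference correction mentioned above: the shared GPS error is measured at a beacon whose true position is surveyed, then subtracted from the quad's fix. All coordinates are invented for illustration.

```python
# Differential correction: the beacon's true position is surveyed, so the
# difference between its GPS fix and its true position estimates the error
# shared by nearby receivers, which can then be subtracted from the quad's
# own fix.  All figures below are invented.
beacon_true = (51.50000, -0.12000)   # surveyed lat / lon of the beacon
beacon_gps  = (51.50003, -0.12004)   # what GPS reports at the beacon
quad_gps    = (51.50021, -0.11987)   # what GPS reports on the quad

# Shared error estimated at the beacon...
error = (beacon_gps[0] - beacon_true[0], beacon_gps[1] - beacon_true[1])

# ...and removed from the quad's fix.
quad_corrected = (quad_gps[0] - error[0], quad_gps[1] - error[1])
print(quad_corrected)
```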

    • My trigonometry and bits of matrix stuff all date back to the same time as yours – luckily it wasn't too painful digging it up again. An external reference would be nice – it's kind of like what Raffaello D'Andrea did with netting sensors and ping-pong balls, but swapping the computing power and sensors onto the quad, and the references outside.

      I guess a large cube (5m?) with PWM-driven LED references at the corners (for a unique ID per LED) would mean a quad with suitable sensors could fly inside and outside that cube, as long as at least 3 LEDs were always in sight. An excellent solution for indoor flights (demos and presentations), but less good for the park, I guess!

      I suspect I'll never get to the point where I'll need GPS: once she's drift-free, and I can get her to move autonomously in a circle of given radius and speed, that'll be enough to make me happy. An altimeter / barometer may well be necessary to maintain height for longer autonomous flights (order 1 minute max), but I suspect neither a compass / magnetometer nor GPS will be.
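
      For what it's worth, that circle reduces to feeding the velocity PIDs sinusoidal targets – a quick sketch, with all names and values hypothetical:

```python
import math

def circle_velocity_targets(radius, speed, t):
    """Velocity targets for a horizontal circle of given radius and speed.

    omega is the angular rate at which the tangential speed matches the
    requested speed; vx / vy are the derivative of the circular path.
    """
    omega = speed / radius
    vx = -radius * omega * math.sin(omega * t)
    vy = radius * omega * math.cos(omega * t)
    return vx, vy

# Example: a 2m radius circle at 1m/s, sampled half a second in.
print(circle_velocity_targets(2.0, 1.0, 0.5))
```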

      Anyway, the current plan is a change to the rate at which PID targets are updated – the velocity PIDs' outputs feed the attitude PIDs' targets at a slower rate than the attitude PIDs' outputs update the ESC PWM. The idea comes from another research paper on noise management: it allows stabilization to be prioritized, and gives time to react to a motion target before it next gets updated. I still hope I can get away without optical motion detection for 10s flights for demos / presentations indoors. A rough sketch of the multi-rate scheduling follows below.
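
      Here's a minimal sketch of that multi-rate scheduling, assuming illustrative rates and P-only controllers standing in for the real PIDs (all names and numbers are hypothetical, not the flight controller's actual values):

```python
# Multi-rate nested loops: the inner attitude loop drives the ESC PWM every
# cycle; the outer velocity loop refreshes the attitude target only every
# Nth cycle, so stabilization always runs at full rate.
# Rates, gains and sensor stubs are illustrative only.
INNER_HZ = 400                 # attitude loop / ESC PWM update rate
OUTER_DIVIDER = 8              # velocity loop at 400 / 8 = 50 Hz

def read_velocity():           # stub: velocity integrated from the accelerometer
    return 0.0

def read_angle():              # stub: fused gyro / accelerometer angle
    return 0.0

def set_esc_pwm(pwm):          # stub: push the pulse width out to the ESCs
    pass

angle_target = 0.0
for cycle in range(INNER_HZ):  # one simulated second of flight
    if cycle % OUTER_DIVIDER == 0:
        # Outer loop: velocity error steers the attitude target (P-only here).
        angle_target = 0.5 * (0.0 - read_velocity())
    # Inner loop: attitude error drives the motors every cycle (P-only here).
    set_esc_pwm(2.0 * (angle_target - read_angle()))
```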

  2. As you said, autonomous flight at this scale (i.e. with a high level of precision) is very hard, and is a favourite subject of many research labs and PhD studies…

    I think visual servoing / motion detection is the way to go though… either with a commercial flow-detection sensor or with the picam, although that would probably require a lot of work… A whole project of its own!

    A recent blog post on raspberrypi.org shows how the picam can be used for hardware motion detection:

    http://www.raspberrypi.org/vectors-from-coarse-motion-estimation/
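
    For anyone wanting to try it, here's a minimal sketch using the picamera library's motion-vector output, as that post describes – it needs a Raspberry Pi with the picamera package installed, and the resolution / framerate are just example values:

```python
import numpy as np
import picamera
from picamera.array import PiMotionAnalysis

class FlowDetector(PiMotionAnalysis):
    """Receives the H.264 encoder's per-macroblock motion vectors."""
    def analyse(self, a):
        # 'a' is a rows x cols record array with int8 'x' / 'y' vector
        # components and a uint16 'sad' (sum of absolute differences).
        x = a['x'].astype(np.float32).mean()
        y = a['y'].astype(np.float32).mean()
        print('mean flow: x=%+.2f y=%+.2f' % (x, y))

with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    camera.framerate = 30
    with FlowDetector(camera) as flow:
        # Record to /dev/null: only the motion vectors are of interest.
        camera.start_recording('/dev/null', format='h264', motion_output=flow)
        camera.wait_recording(10)
        camera.stop_recording()
```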

    • Thanks David,

      I’ve already sent an e-mail to Gordon – someone on the forum picked up a similar question I’d posted and pointed me at the post – my memory is terrible!

      If nothing else, I hope he can tell me how big and scary the project will be – it depends on how much he's put up on GitHub – if there isn't much, this may be the brick wall that finally kills her autonomous control.
