Right at the beginning of this project, 18 months ago, I assumed autonomous control was a necessary precursor to introducing a human into the mix for safety's sake. 18 months later I have just realized it's the replacement for the human. As I mentioned in yesterday's post, none of the PID diagrams shown in a Google image search include an integrated-accelerometer velocity PID – they use a human with an RC transmitter and eyes to provide the outermost feedback loop.
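To make the idea concrete, here's a minimal sketch of what such a cascade might look like: an outer velocity PID, fed by a velocity estimate integrated from the accelerometer, whose output becomes the target for an inner rotation-rate PID. The class, function names, and gains are all illustrative assumptions, not my actual code.

```python
# Illustrative cascaded PID sketch: the outer velocity loop replaces the
# human's eyes and RC transmitter; its output feeds the inner rate loop.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def control_step(velocity_pid, rate_pid, target_velocity,
                 accel_velocity, gyro_rate, dt):
    # Outer loop: velocity error -> desired rotation rate.
    target_rate = velocity_pid.update(target_velocity, accel_velocity, dt)
    # Inner loop: rate error -> motor power correction.
    return rate_pid.update(target_rate, gyro_rate, dt)
```

The point of the cascade is that the outer loop only needs a velocity estimate to aim for zero drift – which is exactly the measurement the human pilot normally supplies by eye.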
The replacement for the human needs a motion sensor which can also measure distance travelled. There are off-the-shelf solutions here and here, and projects based on some Google phone technology (cheers, Jeremy).
What they share, if you look hard enough, is that they are the subject of PhD-level (or higher) papers at some of the best universities in the world.
I'd looked at motion detection previously, and the Raspberry Pi forums did offer DIY ideas and commercial solutions, based upon static or video image processing using frame-to-frame changes to detect motion. But again it all seemed too tricky, given that off-the-shelf drones don't do it.
I'd also tried accelerometer calibration using temperature trends, to ensure the accelerometer offset and gain were known regardless of temperature. But I abandoned that for the same reason: nobody else bothers.
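For anyone curious what the temperature-trend idea amounts to, here's a rough sketch: fit a straight line of measured accelerometer offset against temperature during calibration, then predict the offset at whatever temperature the sensor reports in flight. The function names and sample numbers are hypothetical, not taken from my actual calibration code.

```python
# Sketch of temperature-trend accelerometer calibration: least-squares
# fit of offset = gain * temperature + intercept, so the offset can be
# predicted (and subtracted) at any in-flight temperature.
def fit_offset_trend(temps, offsets):
    n = len(temps)
    mean_t = sum(temps) / n
    mean_o = sum(offsets) / n
    cov = sum((t - mean_t) * (o - mean_o) for t, o in zip(temps, offsets))
    var = sum((t - mean_t) ** 2 for t in temps)
    gain = cov / var
    intercept = mean_o - gain * mean_t
    return gain, intercept


def predicted_offset(gain, intercept, temp):
    # Offset to subtract from the raw accelerometer reading at this temperature.
    return gain * temp + intercept
```

In use, you'd log (temperature, offset) pairs with the drone stationary across a range of temperatures, fit once, and then apply `predicted_offset` to every raw reading before integrating to velocity.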
But now I finally understand why nobody else bothers – they have a human included in their feedback loop, or they still use an optical positioning system, just with a much more powerful processor to control drift.
I really don’t want to introduce a human into the control loop – that was never part of my original plan.
So where to next? My short-term target is only a 20 – 30s flight with minimal drift. I would hope that the trend-line accelerometer offsets and gains, combined with a few fixes applied since then, may be able to achieve it.
Once more, we shall see.