Synchronicity

Just as I was about to shelve this project, the latest RPi update provided hardware video processing via an update to VLC.

It’s far from perfect, but infinitely better than before, and more than good enough to investigate further.  In particular, the camera and player are currently connected over our home WiFi, with a router, an extender and perhaps 10 other computers on it, the kids streaming videos among them!  Next step is to move onto Penelope’s private network and see if that’s better.


P.S. Just ran a test with Penelope (WAP host), Percy (VLC video player) and Pat (RPi video recorder). It’s better, but the video still stalls every few seconds. Not sure yet whether this is due to the network connectivity or VLC itself.

Penelope++

I’m still working on Penelope in the background, primarily on her physical frame, legs and body – subtle but better in my opinion; lighter yet longer legs combined with a longer overlapping frame make her lighter yet more stable in flight, and give better protection against winter weather conditions.

There are some refined code changes here too.

With regard to the on-board camera to be added to her, this is currently stalled on both servo accuracy and live video streaming to another RPi, specifically displaying the video live on its screen.  Until boredom overtakes frustration, progress will be slow!

 

IMU + Servo

The IMU gyro now controls the servo motion. The aim is for it to compensate for vibration and drift in flight so the attached camera always points stably at the target.  Ultimately, the RPi0W will be attached to the servo, with the servo movement reversed to achieve this.
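
As a rough sketch of the idea (not the flight code itself): integrate the gyro’s yaw rate into an angle, then drive the servo by the same angle in the opposite direction.  The read_gyro_z() stub below is a placeholder for the real IMU read over I2C, and the microseconds-per-degree scaling is an assumption for a typical 180° servo.

from __future__ import division

import time

from RPIO import PWM

RPIO_DMA_CHANNEL = 1
SERVO_PIN = 18

PWM.set_loglevel(PWM.LOG_LEVEL_ERRORS)
PWM.setup(1)                                    # 1us resolution pulses
PWM.init_channel(RPIO_DMA_CHANNEL, 20000)       # pulse every 20ms

def read_gyro_z():
    # Placeholder: return the IMU gyro yaw rate in degrees per second.
    # In reality this comes from the IMU over I2C.
    return 0.0

US_PER_DEGREE = 1000 / 180                      # 1000 - 2000us pulse spans roughly 180 degrees

angle = 0.0                                     # integrated yaw angle in degrees
prev_time = time.time()

try:
    while True:
        now = time.time()
        dt = now - prev_time
        prev_time = now

        # Integrate the gyro rate into an angle...
        angle += read_gyro_z() * dt

        # ...and rotate the servo the opposite way so the camera stays put.
        pulse_width = int(round(1500 - angle * US_PER_DEGREE))
        pulse_width = min(max(pulse_width, 1000), 2000)
        PWM.add_channel_pulse(RPIO_DMA_CHANNEL, SERVO_PIN, 0, pulse_width)

        time.sleep(0.02)

except KeyboardInterrupt:
    pass

finally:
    PWM.add_channel_pulse(RPIO_DMA_CHANNEL, SERVO_PIN, 0, 1500)
    PWM.cleanup()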

The problem at the moment is I can’t find a video player app on a second RPi to display the stability of the video output during piDrone flight instability.

Remote video feed

The RPi0W is now attached to the servo and rotating, while its camera sends live video to an RPi3 over WiFi, played via VLC. The result is rubbish, and I think this is due to VLC processing. Not sure what to do about this yet.

import socket
import time
import picamera

# Connect a client socket to the server at port 8000 (change the IP
# address below to the hostname or address of your server)
client_socket = socket.socket()
client_socket.connect(('192.168.1.211', 8000))

# Make a file-like object out of the connection
connection = client_socket.makefile('wb')
try:
    camera = picamera.PiCamera()
    camera.resolution = (640, 480)
    camera.framerate = 24
    camera.rotation = 90

    # Start a preview and let the camera warm up for 2 seconds
    camera.start_preview()
    time.sleep(2)
    # Start recording, sending the output to the connection for 60
    # seconds, then stop
    camera.start_recording(connection, format='h264')
    camera.wait_recording(60)
    camera.stop_recording()
finally:
    connection.close()
    client_socket.close()
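
The receiving end on the RPi3 isn’t shown above; it’s essentially the standard picamera streaming recipe – listen on port 8000, accept the connection and pipe the raw H.264 into VLC’s stdin – something along these lines (assuming VLC is installed on the RPi3):

import socket
import subprocess

# Feed the raw H.264 stream into VLC's stdin
player = subprocess.Popen(['vlc', '--demux', 'h264', '-'],
                          stdin=subprocess.PIPE)

# Listen for a single connection on port 8000
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 8000))
server_socket.listen(0)

# Accept one connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('rb')
try:
    while True:
        # Repeatedly read 1KB of H.264 data from the connection and pass it on
        data = connection.read(1024)
        if not data:
            break
        player.stdin.write(data)
finally:
    connection.close()
    server_socket.close()
    player.terminate()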

2D Servo

First part of the new forward-facing camera for the piDrones is a two-dimensional servo pair for left/right and up/down movement, shown in its basic test as a circular sweep:

from __future__ import division

import math
import time

from RPIO import PWM

RPIO_DMA_CHANNEL = 1

#-------------------------------------------------------------------------------------------
# Set up the globally shared single PWM channel
#-------------------------------------------------------------------------------------------
PWM.set_loglevel(PWM.LOG_LEVEL_ERRORS)
PWM.setup(1)                                    # 1us resolution pulses
PWM.init_channel(RPIO_DMA_CHANNEL, 20000)       # pulse every 20ms


####################################################################################################
#
#  Class for managing each servo via PWM.  The pulse range is 1 - 2ms every 20ms, specified in microseconds
#
####################################################################################################
class SERVO:

    def __init__(self, pin):
        #-------------------------------------------------------------------------------------------
        # The GPIO BCM-numbered pin providing the PWM signal for this servo
        #-------------------------------------------------------------------------------------------
        self.bcm_pin = pin

        #-------------------------------------------------------------------------------------------
        # Initialize the RPIO DMA PWM for this servo.
        #-------------------------------------------------------------------------------------------
        self.set(1500)

    def set(self, pulse_width):
        pulse_width = pulse_width if pulse_width >= 1000 else 1000
        pulse_width = pulse_width if pulse_width <= 2000 else 1999

        self.pulse_width = pulse_width

        PWM.add_channel_pulse(RPIO_DMA_CHANNEL, self.bcm_pin, 0, pulse_width)


lr = SERVO(18)
ud = SERVO(23)

try:
    lr.set(1500)
    ud.set(1500)

    while True:
        # Left / right sweep is 180°; up / down is 150°
        # Servo range is 1000us to 2000us every 20ms, with 1500us as the centre
        # Ultimately, accurate timing comes from the IMU
        for ii in range(-100, 101):
            lr.set(1500 + int(round(500 * math.sin(ii / 100 * math.pi))))
            ud.set(1500 + int(round(410 * math.cos(ii / 100 * math.pi))))
            time.sleep(0.02)

except:
    lr.set(1500)
    ud.set(1500)

    del lr
    del ud

    PWM.cleanup()

The only problem to fix was the lower servo, which needed dismantling and rebuilding so that the centre point of the PWM signal matched the servo’s physical centre point, rather than being 45 degrees or so out. Check out the difference between my video and the supplier’s above.

Penelope, Percy and Pat

When I was at the latest Cotswold Jam, one of the regulars suggested adding a camera to one of my piDrones to video its flight firsthand; that planted a seed which blossomed overnight:

  • Set up a live video stream from an RPi0W attached to one of my piDrones, sending the output over WiFi to an RPi3+ RC touch-screen and displaying the video in a screen app there
  • Add on-screen pseudo-buttons to the RPi3+ RC touch-screen and use those to record the video to disk if specified
  • Add 2 on-screen pseudo-joysticks to the RPi3+ touch-screen RC, sending their output to the piDrone much like the physical joysticks do now (see the sketch after this list)
  • Finally, add IMU / servos hardware / software to keep the camera stable when it’s attached to a flying piDrone – trivial compared to the items above.
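
Purely as an illustration of the pseudo-joystick idea (the GUI toolkit, screen size and update rate here are all assumptions, not decisions), a minimal pygame sketch: a touch or mouse drag inside a circle becomes two axes in the range -1 to +1, which is what would be sent to the piDrone in place of the physical joystick values:

import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
clock = pygame.time.Clock()

CENTRE = (160, 120)   # joystick centre on an assumed 320x240 touch screen
RADIUS = 100          # joystick travel in pixels

running = True
lr, fb = 0.0, 0.0     # left/right and forward/back axes, -1..+1

while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type in (pygame.MOUSEBUTTONDOWN, pygame.MOUSEMOTION):
            if event.type == pygame.MOUSEMOTION and not event.buttons[0]:
                continue
            x, y = event.pos
            # Normalize the touch position to -1..+1 on each axis
            lr = max(-1.0, min(1.0, (x - CENTRE[0]) / RADIUS))
            fb = max(-1.0, min(1.0, (CENTRE[1] - y) / RADIUS))
        elif event.type == pygame.MOUSEBUTTONUP:
            lr, fb = 0.0, 0.0   # spring back to centre on release

    # Draw the joystick ring and the current knob position
    screen.fill((0, 0, 0))
    pygame.draw.circle(screen, (64, 64, 64), CENTRE, RADIUS, 2)
    knob = (int(CENTRE[0] + lr * RADIUS), int(CENTRE[1] - fb * RADIUS))
    pygame.draw.circle(screen, (255, 255, 255), knob, 10)
    pygame.display.flip()
    clock.tick(30)

pygame.quit()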

I’m completely ignorant of how to implement all but the last item, much like the challenge of building the piDrones 6 years ago, and hence it’s a fab challenge!  And in comparison to the piDrone itself, it’ll be cheap:  the parts I either already own or are cheap to buy.  And I like the fact it gives Penelope a unique role – currently she’s just Hermione without the object avoidance.

First job though is to name the Raspberry Pis:

 

Blog stats and a new project?

Just a couple of stats about my blog history.

First, the bandwidth of my web site over the last 6 years.

Blog hits

Points worth mentioning…

Then there are the video hits from when the RPi blog post went up:

Raspberry Pi Video Stats

  • On Saturday 1st September, I sent my sample blog post to the RPi team
  • It was tweaked and polished during Saturday and Sunday ready for posting on…
  • Monday 3rd September: the Vimeo videos got 1373 hits that day!
  • The hits declined steadily until the post dropped off the raspberrypi.org front page on Friday 7th, but this was compensated for by the weekly RPi news e-mail going out the same day and running over the weekend.
  • Thereafter, a slow descent back to normality of <10 hits per day by the end of September.

Although I said the project is done, a colleague at the Cotswold Jam suggested I add a forward-facing camera; I’m considering this: with 2 servos and a gyro for stability, photos and video on an RPi0W would be interesting and challenging, especially feeding the video live to the remote control!  I’m investigating this right now!

 

Passion Flower

Zoë’s always been my simplest and best looking.

While Hermione and Penelope both have lids (a custom cropped 50mm dome and a cropped salad bowl respectively), Zoë and her predecessors never have.  This has now been fixed.  And finally, this one is more DIY.  Starting with clear acrylic (perspex) domes and tubes (10cm in diameter, cut to length), these are stuck together (fused, effectively), sawn in half, filed and painted.  I made a prototype and a final version; I prefer the final one to the prototype because of the unexpected slope of the frame.

At the same time, I’ve been upgrading her O/S to Stretch, and adding the Garmin LiDAR-Lite v3HP – a requirement in order to use the Stretch I2C implementation – which is also thinner, so a little better protected on landing.

I’ve been refining the hardware too: the PCB now accommodates the GLLv3HP properly along with an updated 2A voltage regulator, and the ESC wiring has been shortened so the ESCs fit snugly inside the frame, both for safety and for prettiness.

Finally, I’ve been refining the code at the I2C level to make it as efficient as possible; Zoë is running right on the brink of working due to the single-core CPU of the Pi0W.
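
Purely for illustration (this isn’t Zoë’s code, and the register addresses below are assumed MPU-9250-style values): the main efficiency gain at the I2C level comes from reading the sensor’s FIFO in block transfers rather than a register at a time, something like this with smbus:

import smbus

# Assumed MPU-9250-style addresses - check the datasheet for the IMU fitted
MPU9250_ADDRESS = 0x68
FIFO_COUNT_H = 0x72
FIFO_R_W = 0x74

bus = smbus.SMBus(1)

def read_fifo_batch():
    # One transaction to find out how many bytes are queued in the FIFO...
    high, low = bus.read_i2c_block_data(MPU9250_ADDRESS, FIFO_COUNT_H, 2)
    fifo_bytes = (high << 8) | low

    # ...then drain it in 32-byte block reads: far fewer I2C transactions
    # than reading the sensor registers individually.
    data = []
    while fifo_bytes >= 32:
        data += bus.read_i2c_block_data(MPU9250_ADDRESS, FIFO_R_W, 32)
        fifo_bytes -= 32
    return data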

Here’s the result:

She looks unstable: she’s top heavy and hence very sensitive to the slightest breeze.  I may put more effort into tuning, but since she’s primarily for indoors, I’ll test that first.

Why six years?

It probably took a year at the beginning to get the system working at a basic level, and a couple of years at the end adding cool features like GPS location tracking, object avoidance, human remote-control, custom cool lids etc.  So what happened in the intermediate three years?

The biggest time-killer was drift in a hover: the accelerometer measures gravity and real acceleration along three perpendicular axes.  At the start of a flight, it reads a value for gravity before takeoff; during the flight, integration of new readings against that initial gravity reading provides velocity.  On a calm summer’s day, all worked well, but in other conditions, there were often collisions with brick walls etc.  Essentially, the sensors said she wasn’t moving whereas reality said she was.  It took years to recognise the correlation between wind, weather and crashes: lots of iterative speculation spanning the seasons was required to recognise the link to temperature variations*.
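
As a toy illustration of why this matters (the numbers are made up, not measurements): a tiny constant error between the in-flight accelerometer reading and the pre-takeoff gravity value integrates into a phantom velocity, and that in turn integrates into metres of phantom drift within a minute.

# Toy example with made-up numbers: a 0.1% gravity misread caused by the
# sensor cooling in flight, sampled at 100Hz for one minute of "hover".
G = 9.80665                               # m/s^2
error = 0.001 * G                         # constant accelerometer bias, m/s^2
dt = 0.01                                 # 100Hz sampling

velocity = 0.0
distance = 0.0
for step in range(int(60 / dt)):
    velocity += error * dt                # bias integrates into phantom velocity...
    distance += velocity * dt             # ...which integrates into phantom drift

print("After 60s: %.2f m/s phantom velocity, %.1f m of drift" % (velocity, distance))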

There are two factors: first, the IMU has been optimised to operate above 20°C – below that, small temperature changes lead to significant drift in its values; secondly, once in flight, the weather and the prop-breeze cool the sensor compared to when the ground measurement of gravity was taken.  I tried lots of ways to handle the temperature changes; at one point, I even bought a beer-fridge for mapping accelerometer gravity values against temperature!  It’s back to its intended purpose now.

Digging back, it’s clear how others have coped:

  • DIY RC drones synchronise the IMU and RC at the start of each flight and shelter the IMU from the wind.  The pilot isn’t interested in a static hover and is part of the feedback loop deciding where it goes.
  • At the bleeding edge, the DJI Mavic has a dynamic cooling system embedded in the depths of the frame keeping the IMU at a fixed temperature, along with two ground-facing cameras and GPS for long-term fine tuning.
  • All videos I saw were from the California coastline!!!

But I did it my way, the DIY-experimentation way, resulting ultimately in passive temperature stability by wrapping the IMU in a case to suppress wind cooling, combined with a Butterworth low-pass filter to extract gravity long-term, and the LiDAR / RPi camera to track the middle ground.  I perfected reinvention of the wheel!
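
For illustration only (this isn’t the flight code, and the 100Hz sample rate and 0.1Hz cutoff are assumed values), extracting the gravity estimate with a Butterworth low-pass filter can be sketched with scipy like this:

from scipy.signal import butter, lfilter

# Assumed values: 100Hz IMU sampling, 0.1Hz cutoff.  Anything changing
# faster than the cutoff (real acceleration, vibration) is filtered out,
# leaving the slowly varying gravity component for the axis.
sample_rate = 100.0
cutoff = 0.1
b, a = butter(2, cutoff / (sample_rate / 2), btype='low')

def extract_gravity(accel_samples):
    # accel_samples: a list of raw accelerometer readings for one axis;
    # returns the low-pass filtered (gravity) estimate per sample.
    return lfilter(b, a, accel_samples)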

Hindsight is wonderful, isn’t it!


*My apologies for the complexity of this sentence; it reflects the frustration and complexity I encountered working this out over the years!