I mentioned a year ago using the Raspberry Pi Camera to provide motion data; this motion data is a by-product of the compression of H.264 video frames.

Each frame in a video is broken up into 16 x 16 pixel blocks (macro-blocks), each of which is compared against blocks from the previous frame to find the nearest match.  An X, Y vector is then produced for each macro-block pointing at the best-matching block in the previous frame, along with a sum of absolute differences (SAD) – essentially a score of how trustworthy the vector is: the lower the SAD, the better the match.  This happens at the video frame rate as part of the H.264 compression algorithm, and is produced by the GPU alone.
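To make the block-matching concrete, here's a minimal sketch of the idea in Python/numpy – an exhaustive search rather than whatever optimised search the GPU encoder actually uses, and the search radius is just an illustrative figure:

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of absolute differences between two 16x16 luma blocks."""
    return int(np.abs(block_a.astype(np.int16) - block_b.astype(np.int16)).sum())

def best_match(prev_frame, cur_block, bx, by, search=4):
    """Exhaustively search the previous frame within +/-search pixels of
    (bx, by) for the best-matching 16x16 block.  Returns (dx, dy, sad)."""
    h, w = prev_frame.shape
    best = (0, 0, sad(prev_frame[by:by + 16, bx:bx + 16], cur_block))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if 0 <= x and x + 16 <= w and 0 <= y and y + 16 <= h:
                s = sad(prev_frame[y:y + 16, x:x + 16], cur_block)
                if s < best[2]:
                    best = (dx, dy, s)
    return best
```

A perfect match gives a SAD of zero; featureless or blurred blocks give vectors with high SADs that deserve little trust.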

Because of the fixed frame rate, these are effectively velocity vectors, and because the camera would be strapped under the quadcopter body, the velocities are in the quadcopter reference frame.  A crude processing of this data would simply average all the X, Y vectors, weighted by their SAD values, to come up with a best-guess estimate of an overall linear vector per frame.  Better processing would need to accommodate yaw too.
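The crude averaging step might look like this – a sketch assuming inverse-SAD weighting (low SAD = good match = high weight), which is my choice of weighting here rather than anything mandated by the encoder:

```python
import numpy as np

def average_flow(dx, dy, sad, eps=1.0):
    """SAD-weighted average of per-macro-block vectors: blocks with a low
    SAD (trustworthy match) count for more, high-SAD blocks for less.
    eps avoids division by zero for perfect (SAD == 0) matches."""
    weights = 1.0 / (sad.astype(np.float64) + eps)
    return (np.average(dx, weights=weights),
            np.average(dy, weights=weights))
```

Yaw would show up as the vectors rotating around the frame centre rather than all pointing the same way, which is why a straight average isn't the whole story.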

All this is now available from the Python picamera interface, making it very easy to create a motion tracking process which feeds the macro-block vectors + SAD data to the HoG code, which can do the averaging and produce the quad-frame velocity PID inputs for the X and Y axes.  The velocity PID targets are set, as now, as earth-frame speeds reorientated to the quad-frame.
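picamera delivers this data as one little record per macro-block: a signed byte each for X and Y plus an unsigned 16-bit SAD.  Here's a sketch of unpacking one frame of it – the one-extra-column detail matches my understanding of picamera's motion output, and the 640 x 480 figures in the test are just illustrative:

```python
import numpy as np

# Per-macro-block record layout as exposed by picamera's motion output:
# signed byte X vector, signed byte Y vector, unsigned 16-bit little-endian SAD.
MOTION_DTYPE = np.dtype([('x', 'i1'), ('y', 'i1'), ('sad', '<u2')])

def parse_motion_frame(raw, width, height):
    """Reshape one frame of raw motion-vector bytes into a (rows, cols)
    record array.  picamera emits one extra column beyond width // 16."""
    cols = (width // 16) + 1
    rows = height // 16
    return np.frombuffer(raw, dtype=MOTION_DTYPE).reshape(rows, cols)
```

In practice picamera can do this reshaping for you via its `PiMotionAnalysis` helper class, whose `analyse()` callback hands over exactly this kind of record array per frame.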


Pros:

  • frame-rate speed and processing on the GPU
  • no integration of (accelerometer – gravity rotated to the quad-frame) to get velocity, hence no cumulative error from offsets

Cons:

  • need some height information – possibly just the best estimate as now
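The height requirement comes from the fact that the vectors are in pixels per frame: how many metres of ground one pixel covers depends on how high the camera is.  A sketch of the conversion, assuming a pinhole camera pointing straight down – the field-of-view and resolution figures are illustrative placeholders, not measured values:

```python
import math

def pixels_to_velocity(dx_px, dy_px, height_m, fps=30.0,
                       hfov_deg=53.5, width_px=640):
    """Convert a per-frame pixel displacement into a velocity in metres
    per second, for a camera looking straight down from height_m."""
    # Ground width seen by the camera at this height, divided by the
    # horizontal resolution, gives metres of ground per pixel.
    m_per_px = 2.0 * height_m * math.tan(math.radians(hfov_deg) / 2.0) / width_px
    return dx_px * m_per_px * fps, dy_px * m_per_px * fps
```

Note the scaling is linear in height, so even a rough height estimate keeps the velocity PIDs pointing the right way.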

This opens up the possibility of indoor and outdoor motion processing over varying contrast surfaces, or laser tracking over surfaces without significant contrast difference.

First step is to adapt the existing Kitty code to churn out these motion vectors and stream them to a dummy module to produce the resultant velocity PID inputs.  More thoughts anon.

