This program allows the tracking of a specimen (e.g. a mouse or a rat) in an open-field maze, either from pre-recorded videos or from a live feed. The live stream of frames can come from a USB camera or from the Raspberry Pi camera (normal or NoIR). On the Raspberry Pi, a subclass of the standard PiCamera object is used to speed up video acquisition and online processing.
Combining recording with online position tracking allows the definition of complex behavioural experiments: the program can send a TTL pulse to another computer upon detecting the specimen (for example, when it enters a given region of the arena).
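As a rough illustration of this region-based triggering (this is a hypothetical sketch, not Pyper's actual API; the function names and the rectangular ROI format are assumptions, and `send_ttl` stands in for whatever GPIO call drives the pulse on your hardware):

```python
def in_roi(position, roi):
    """Return True if an (x, y) position lies inside a rectangular
    region of interest given as (x_min, y_min, x_max, y_max)."""
    x, y = position
    x_min, y_min, x_max, y_max = roi
    return x_min <= x <= x_max and y_min <= y <= y_max


def maybe_pulse(position, roi, send_ttl):
    """Call the supplied TTL callback when the tracked position
    falls inside the ROI; return whether a pulse was sent."""
    if in_roi(position, roi):
        send_ttl()  # e.g. toggle a GPIO pin on the Raspberry Pi
        return True
    return False
```

In a live session, `maybe_pulse` would be called once per tracked frame with the current centroid of the specimen, so the downstream computer receives a pulse on every frame the animal spends inside the region.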
The modules can be used interactively from the Python interpreter or through the provided interfaces: the program supplies both a command line interface (CLI) and a graphical user interface (GUI). For the CLI, the defaults are saved to a user-specific preference file.
Some basic analysis of the extracted position data is also available, and two classes are supplied for viewing recorded videos and for transcoding them.
![A usage example of the software](https://github.com/SainsburyWellcomeCentre/pyper/raw/master/doc/source/exampleCapture.gif)
An example of the tracking software in action.
For further information, you can build the documentation with Sphinx or, alternatively, head to http://pyper.readthedocs.org/en/latest/
Charly V Rousseau¹, Antonio Gonzalez², Andrew Erskin³, Christian J Niedworok¹, Troy W Margrie¹.
- Author information:
- ¹ Margrie lab, Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London, U.K.
- ² Burdakov lab, Mill Hill Laboratory, The Francis Crick Institute, London, U.K.
- ³ Schaefer lab, Mill Hill Laboratory, The Francis Crick Institute, London, U.K.
The authors would like to thank Edward F Bracey, Nicholas Burczyk, Julia J Harris, Cornelia Schöne and Mateo Vélez-Fort for their useful comments about the interface design and the user instructions.