# Track: Animals in Motion
Machine learning methods for motion tracking have transformed a wide range of scientific disciplines—from neuroscience and biomechanics to conservation and ethology. Tools such as DeepLabCut and SLEAP enable researchers to track animal movements in video recordings with impressive accuracy, without the need for physical markers.
However, the variety of available tools can be overwhelming. It’s often unclear which tool is best suited to a given application, or how to get started. Moreover, generating motion tracks is only the first step: these tracks must then be further processed, visualised, and analysed to yield meaningful and interpretable insights into animal behaviour.
**Target audience**
This course is designed for researchers and students interested in learning about the latest free open-source tools for tracking animal motion from video footage and extracting quantitative descriptions of behaviour from motion tracks.
## Course overview
Tuesday morning: We’ll start with a primer on computer vision approaches for detecting and tracking animals in videos. We’ll also cover key concepts and terminology, and provide an overview of the most widely used tools.
Tuesday afternoon: We’ll continue with a hands-on tutorial on using SLEAP—a popular software library for animal pose estimation and tracking. The typical workflow, from annotating body parts to training a model and generating predictions, is common to most pose estimation tools, including DeepLabCut.
Wednesday: The second day will be dedicated to a practical tutorial on movement—a Python toolbox for analysing animal body movements across space and time. You’ll learn how to load, clean, visualise, and quantify motion tracks, and apply this knowledge to specific use cases through computational exercises.
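To give a rough sense of the kind of analysis this tutorial builds towards, here is a minimal, library-free numpy sketch (using made-up data, not the `movement` API itself) that masks low-confidence detections, interpolates over the resulting gap, and converts per-frame displacement into speed:

```python
import numpy as np

# Toy motion track: (n_frames, 2) x/y positions of a single keypoint,
# plus per-frame confidence scores. This is hypothetical data for
# illustration only, not the workshop datasets.
fps = 30
position = np.cumsum(np.ones((100, 2)) * 0.5, axis=0)  # steady diagonal drift
confidence = np.ones(100)
confidence[40:45] = 0.1  # a stretch of unreliable detections

# Clean: discard low-confidence points, then interpolate over the gaps.
cleaned = position.copy()
bad = confidence < 0.5
for dim in range(2):
    cleaned[bad, dim] = np.interp(
        np.flatnonzero(bad), np.flatnonzero(~bad), cleaned[~bad, dim]
    )

# Quantify: per-frame displacement -> speed in spatial units per second.
displacement = np.diff(cleaned, axis=0)
speed = np.linalg.norm(displacement, axis=1) * fps

print(round(speed.mean(), 2))  # → 21.21
```

The `movement` toolbox wraps operations like these (and many more) behind a labelled, xarray-based interface, so in the tutorial you won’t be writing the low-level array code yourself.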
**Course materials**
All course materials will be made available at https://animals-in-motion.neuroinformatics.dev/latest/ during the workshop and will remain accessible afterwards.
The source code for the course materials is publicly hosted at neuroinformatics-unit/course-animals-in-motion.
## Instructors
## Prerequisites
### Hardware
This is a hands-on course, so please bring your own laptop and charger. A mouse is strongly recommended, especially for tasks like image annotation. A dedicated GPU is not required, though it may speed up some computations.
### Software
For general software requirements, please see the prerequisites on the main event page and make sure you have these installed and properly configured.
In addition to the general tools, you will need to install the following specialised software:
Please install SLEAP following the official installation instructions.
For this workshop, use SLEAP version 1.3.4. Be sure to replace the default version number (e.g. 1.4.1) in the instructions with 1.3.4.
This should create a `conda` environment named `sleap` with the necessary dependencies. You can verify the installation by running:
```shell
conda activate sleap
sleap-label
```
This should launch the SLEAP graphical user interface (GUI).
You will also need a separate `conda` environment for the interactive coding exercises. This environment will include the `movement` and `jupyter` packages. We recommend cloning the workshop’s GitHub repository and creating the environment using the provided `environment.yaml` file:
```shell
git clone https://github.com/neuroinformatics-unit/animals-in-motion.git
cd animals-in-motion
conda env create -n animals-in-motion-env -f environment.yaml
```
To test your setup, run:
```shell
conda activate animals-in-motion-env
movement launch
```
This should open the `movement` GUI, i.e. the napari image viewer with the `movement` plugin docked on the right.

There are other ways to install the `movement` package. However, for this workshop, we recommend using the `environment.yaml` file to ensure that all necessary dependencies, including those beyond `movement` itself, are included.
### Python knowledge
If you’re new to Python, we recommend attending our Intro to Python workshop on Monday, or completing an equivalent course beforehand. This hands-on session will cover the basics, including data types, control flow, functions, and core libraries—a great way to get up to speed before this event.
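If you want a sense of the level assumed after that session, the following short sketch (with made-up example data) uses exactly those building blocks: a list, a function with a default argument, a `for` loop, and an `if` test:

```python
# A small taste of the Python basics the intro workshop covers:
# data types (list, str, int), control flow (for/if), and functions.

def count_long_words(words, min_length=5):
    """Return how many words are at least min_length characters long."""
    count = 0
    for word in words:
        if len(word) >= min_length:
            count += 1
    return count

species = ["mouse", "zebrafish", "fly", "macaque"]
print(count_long_words(species))  # → 3 ("mouse", "zebrafish", "macaque")
```

If code like this already feels comfortable, you can safely skip the Monday session.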
### Data
Bringing your own data is encouraged but not required. This could include video recordings of animal behaviour and/or motion tracks you’ve already generated. It’s a great chance to get feedback on your data and learn from others.
**Download example datasets**
We also provide some example datasets for you to use during the workshop. Please download these from Dropbox before the workshop starts (they are a few GB in size).
We expect that participant-led ideas emerging from this track may inspire collaborative projects during the Hackday on Friday.