Video-based analysis of animal behaviour

SWC/GCNU Neuroinformatics Unit

Niko Sirmpilatze, Chang Huan Lo, Alessandro Felder

Introductions

Neuroinformatics Unit (NIU)

Niko Sirmpilatze

Chang Huan Lo

Alessandro Felder

Schedule: morning

10:00 - 10:20: Welcome and troubleshooting

10:20 - 11:00: Background

  • What is behaviour and why do we study it?
  • Tracking animals with pose estimation

11:00 - 12:00: Practice with SLEAP

  • Annotate video frames
  • Train a pose estimation model

12:00 - 13:30: Lunch break and SWC lab meeting

Schedule: afternoon

13:30 - 14:30: Practice with SLEAP cont.

  • Evaluate trained models
  • Run inference

14:30 - 15:00: Coffee break and discussion

15:00 - 16:30: Practice with Jupyter notebook

  • Load and visualise pose tracks
  • Filter pose tracks
  • Quantify time spent in ROIs

16:30 - 17:00: Further discussion

  • Behaviour classification and action segmentation

Course materials

These slides

Course webpage

GitHub repository

Sample data

Install SLEAP via conda

Read the official SLEAP installation guide. If you already have conda installed, you may skip the mamba installation steps outlined there. Instead, install the libmamba-solver for conda:

conda install -n base conda-libmamba-solver
conda config --set solver libmamba

After that, you can follow the rest of the SLEAP installation guide, substituting conda for mamba in the relevant commands.

# Windows / Linux (with NVIDIA GPU support):
conda create -y -n sleap -c conda-forge -c nvidia -c sleap -c anaconda sleap=1.3.1

# macOS:
conda create -y -n sleap -c conda-forge -c anaconda -c sleap sleap=1.3.1

What is behaviour?

Answer on Mentimeter

Defining behaviour

The total movements made by the intact animal

Tinbergen, 1955

Behavior is the internally coordinated responses (actions or inactions) of whole living organisms (individuals or groups) to internal and/or external stimuli, excluding responses more easily understood as developmental changes

Levitis et al., 2009

Neural activity and behaviour

Marr’s three levels of analysis

Neuroscience needs behaviour

…detailed examination of brain parts or their selective perturbation is not sufficient to understand how the brain generates behavior

…it is very hard to infer the mapping between the behavior of a system and its lower-level properties by only looking at the lower-level properties

The behavioral work needs to be as fine-grained as work at the neural level. Otherwise one is imperiled by a granularity mismatch between levels…

…the explanations of the results at the neural level are almost entirely dependent on the higher-level vocabulary and concepts derived from behavioral work. Lower levels of explanation do not “explain away” higher levels.

Krakauer et al., 2017

Quantifying behaviour: ethogram

Ethogram: a list of typical behaviours performed by an animal, including when and how often they occur

Time after start (min)   Foraging   Eating   Grooming
0:30                     0          0        1
1:00                     0          0        1
1:30                     1          0        0
2:00                     0          1        0
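
In code, such an ethogram is naturally held as a table indexed by time. A minimal sketch with pandas, mirroring the toy table above (column names are our own):

import pandas as pd

# Toy ethogram: one row per 30 s time bin, one column per behaviour,
# 1 = behaviour observed in that bin, 0 = not observed
ethogram = pd.DataFrame(
    {
        "foraging": [0, 0, 1, 0],
        "eating":   [0, 0, 0, 1],
        "grooming": [1, 1, 0, 0],
    },
    index=pd.to_timedelta(["00:00:30", "00:01:00", "00:01:30", "00:02:00"]),
)

# Fraction of time bins spent on each behaviour
print(ethogram.mean())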

Quantifying behaviour: modern

flowchart TB

    video -->|compression/re-encoding | video2["compressed video"]
    video2 -->|pose estimation + tracking| tracks["pose tracks"]
    tracks --> |calculations| kinematics
    tracks -->|classifiers| actions["actions / behav syllables"]
    video2 --> |comp vision| actions

Finding and tracking animals

Pose estimation

flowchart TB
    classDef emphasis fill:#03A062;

    video -->|compression/re-encoding | video2["compressed video"]
    video2 -->|pose estimation + tracking| tracks["pose tracks"]
    tracks --> |calculations| kinematics
    tracks -->|classifiers| actions["actions / behav syllables"]
    video2 --> |comp vision| actions

    linkStyle 1 stroke:#03A062, color:;
    class video2 emphasis
    class tracks emphasis

  • “easy” in humans: vast amounts of annotated training data
  • “harder” in animals: less data, more variability

Pose estimation software

DeepLabCut: transfer learning

SLEAP: smaller networks

source: sleap.ai

Multi-animal part grouping

Top-down vs bottom-up

Multi-animal identity tracking

3D pose estimation

Which mouse is more anxious?

Click here to post your answers

sub-01

sub-02

The Elevated Plus Maze

  • Structure: 2 open arms, 2 closed arms, central area
  • Exploits rodents’ natural aversion to open spaces and height
  • Less anxious animals spend more time in open arms

Task: quantify time spent in open arms / closed arms
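
This is exactly what we will compute in the notebook later. The core idea is a point-in-polygon test per frame; a minimal sketch with shapely, where the ROI coordinates, frame rate, and tracks are all made up for illustration:

import numpy as np
from shapely.geometry import Point, Polygon

# Hypothetical open-arms ROI, in the video's pixel coordinates
open_arms = Polygon([(250, 0), (350, 0), (350, 600), (250, 600)])

# Dummy (n_frames, 2) array of x, y positions of the mouse's centre keypoint
centre = np.array([[300.0, 100.0], [310.0, 120.0], [500.0, 320.0]])

in_open = np.array([open_arms.contains(Point(x, y)) for x, y in centre])
fps = 30  # assumed frame rate
print(f"Time in open arms: {in_open.sum() / fps:.2f} s")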

The dataset

$ cd behav-analysis-course

.
├── LICENSE
├── README.md
└── mouse-EPM
    ├── derivatives
    └── rawdata

$ cd mouse-EPM/rawdata

.
├── sub-01_id-M708149
│   └── ses-01_date-20200317
│       └── behav
│           └── sub-01_ses-01_task-EPM_time-165049_video.mp4
└── sub-02_id-M708154
    └── ses-01_date-20200317
        └── behav
            └── sub-02_ses-01_task-EPM_time-185651_video.mp4

The SLEAP workflow

Create a new project

Define a skeleton

Source      Destination
snout       left_ear
snout       right_ear
snout       centre
left_ear    centre
right_ear   centre
centre      tail_base
tail_base   tail_end

Save the project right after defining the skeleton!
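
We define the skeleton in the GUI, but it can also be built programmatically. A sketch using SLEAP's Python API, assuming the sleap package's Skeleton class with its add_node/add_edge methods:

import sleap

skeleton = sleap.Skeleton()
for node in ["snout", "left_ear", "right_ear", "centre", "tail_base", "tail_end"]:
    skeleton.add_node(node)

# Edges mirror the Source/Destination table above
for source, destination in [
    ("snout", "left_ear"), ("snout", "right_ear"), ("snout", "centre"),
    ("left_ear", "centre"), ("right_ear", "centre"),
    ("centre", "tail_base"), ("tail_base", "tail_end"),
]:
    skeleton.add_edge(source, destination)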

Generate labeling suggestions

Label initial ~20 frames

Start a training job 1/3

Start a training job 2/3

Start a training job 3/3

Monitor training progress

Evaluate trained models

Run inference on new frames

Using SLEAP on the HPC cluster

Predictions in the sample dataset

$ cd behav-analysis-course/mouse-EPM

.
└── derivatives
    └── behav
        ├── software-DLC_predictions
        └── software-SLEAP_project
            └── predictions

  • Different pose estimation tools produce predictions in different formats.
  • Importing predicted poses into Python for further analysis therefore requires format-specific workflows, as sketched below.
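
For the two formats in this dataset, the movement package (introduced below) provides dedicated loaders. A minimal sketch, with placeholder file paths and an assumed frame rate:

from movement.io import load_poses

ds_sleap = load_poses.from_sleap_file("predictions.slp", fps=30)
ds_dlc = load_poses.from_dlc_file("predictions.h5", fps=30)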

What happens after tracking?

flowchart TB
    classDef emphasis fill:#03A062;

    video -->|compression/re-encoding | video2["compressed video"]
    video2 -->|pose estimation + tracking| tracks["pose tracks"]
    tracks --> |calculations| kinematics
    tracks -->|classifiers| actions["actions / behav syllables"]
    video2 --> |comp vision| actions

    linkStyle 2 stroke:#03A062, color:;
    class tracks emphasis
    class kinematics emphasis
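
The "calculations" step above needs little more than numpy. A minimal sketch of deriving speed and acceleration from a pose track (the trajectory and frame rate below are made up):

import numpy as np

fps = 30  # assumed frame rate
# Dummy (n_frames, 2) trajectory of one keypoint: x, y in pixels
positions = np.cumsum(np.random.randn(100, 2), axis=0)

velocity = np.gradient(positions, 1 / fps, axis=0)    # px/s, shape (100, 2)
speed = np.linalg.norm(velocity, axis=1)              # px/s, shape (100,)
acceleration = np.gradient(velocity, 1 / fps, axis=0)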

Enter movement

Python tools for analysing body movements across space and time.

movement features

I/O:

  • ✅ import pose tracks from DeepLabCut and SLEAP
  • ✅ represent pose tracks in a common data structure
  • ⏳ export pose tracks in various formats

In progress / planned:

  • ⏳ Interactive visualisations: plot pose tracks, ROIs, etc.
  • 🤔 Data cleaning: drop bad values, interpolate, smooth, resample etc.
  • 🤔 Derive kinematic variables: velocity, acceleration, orientation, etc.
  • 🤔 Integrate spatial information about the environment (e.g. ROIs, arena)
  • 🤔 Coordinate transformations (e.g. egocentric)

The movement data structure

single-animal

multi-animal
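
Both single- and multi-animal tracks live in one xarray Dataset. A minimal sketch of selecting data from it, assuming a recent movement API (variable and individual names below may differ in your version):

from movement.io import load_poses

ds = load_poses.from_sleap_file("predictions.slp", fps=30)  # placeholder path

# Dataset dimensions: time, individuals, keypoints, space (x/y)
snout = ds.position.sel(individuals="individual_0", keypoints="snout")
x = snout.sel(space="x")
y = snout.sel(space="y")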

Time to play 🛝

In a terminal, clone the course repository and go to the notebooks directory:

git clone https://github.com/neuroinformatics-unit/course-behavioural-analysis.git
cd course-behavioural-analysis/notebooks

Create a new conda environment and install required packages:

conda create -n epm-analysis -c conda-forge python=3.10 pytables
conda activate epm-analysis
pip install -r notebook_requirements.txt

Once all requirements are installed, you can:

  • open the EPM_analysis.ipynb notebook
  • select the environment epm-analysis as the kernel

We will go through the notebook step by step, together.

Which mouse was more anxious?

This time, with numbers!

Answer on Mentimeter

From behaviour to actions

flowchart TB
    classDef emphasis fill:#03A062;

    video -->|compression/re-encoding | video2["compressed video"]
    video2 -->|pose estimation + tracking| tracks["pose tracks"]
    tracks --> |calculations| kinematics
    tracks -->|classifiers| actions["actions / behav syllables"]
    video2 --> |comp vision| actions

    linkStyle 3 stroke:#03A062, color:;
    linkStyle 4 stroke:#03A062, color:;
    class tracks emphasis
    class video2 emphasis
    class actions emphasis

Several tools:

Classifying behaviours

Supervised vs unsupervised approaches

Answer on Mentimeter

Feedback

Tell us what you think about this course!

Write on IdeaBoardz or talk to us anytime.

Join the movement!