LabGym

LabGym is a free, open-source deep learning platform that automatically tracks, identifies, and quantifies user-defined animal behaviors from video, supporting any species or behavioral task.

About

LabGym is an open-source, AI-powered software tool designed for behavioral scientists who need a flexible, reproducible way to analyze animal behavior from video data. Built on deep learning and a holistic assessment methodology, it automatically tracks subjects within video footage, recognizes user-defined behaviors, and outputs structured event records, annotated visual files, and 13 distinct quantification metrics per behavior. Unlike rigid behavior-analysis tools locked to specific species or predefined behavioral categories, LabGym is highly adaptable: researchers can train custom models for virtually any behavioral task, species, or experimental context, from rodent locomotion studies to complex group social interactions.

The platform is developed and maintained by the Ye Lab at the University of Michigan's Life Sciences Institute, reflecting a strong academic research pedigree. Key capabilities include multi-subject tracking, behavior recognition across diverse experimental setups, automated annotation overlay on output videos, and the generation of rich quantitative data ready for downstream statistical analysis. Its open-source nature means full transparency, community-driven development, and no licensing costs, making it accessible to academic labs worldwide.

LabGym is ideal for neuroscientists, behavioral ecologists, pharmacologists, and any researcher who needs to turn raw experimental video into objective, quantifiable behavioral data without relying on manual scoring. It runs locally on standard research computing hardware across major operating systems.

Key Features

  • User-Defined Behavior Recognition: Train custom deep learning models to recognize any behavior you define, across any species or experimental context—no predefined categories required.
  • Automated Multi-Subject Tracking: Automatically detects and tracks one or multiple subjects throughout a video, handling both individual and group behavior scenarios.
  • 13 Unique Behavioral Quantifications: Generates 13 distinct quantification metrics per recognized behavior, providing rich numerical data for downstream statistical analysis.
  • Annotated Visual Output: Produces annotated video overlays and structured event records, making results easy to review, share, and include in publications.
  • Species and Task Flexibility: Scales from simple locomotion tracking to complex social group interactions, and works with footage from any organism or research paradigm.

Use Cases

  • Neuroscience researchers automating the scoring of rodent behavioral assays (e.g., open field, elevated plus maze, social interaction tests) to eliminate manual annotation bottlenecks.
  • Pharmacology labs quantifying drug-induced changes in animal movement and behavior patterns across large video datasets.
  • Behavioral ecologists studying social dynamics and group interaction patterns in non-model organisms recorded in naturalistic settings.
  • Graduate students and academic labs building custom classifiers for novel, lab-specific behavioral paradigms without needing commercial software licenses.
  • Research groups generating reproducible, publication-ready behavioral data with consistent automated scoring across large experimental cohorts (see the sketch after this list).
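
As a loose illustration of that last downstream step, the Python sketch below aggregates per-video behavioral exports for a cohort-level comparison. The directory layout, the file name events.csv, and the columns behavior and duration are assumptions made for illustration only; consult LabGym's documentation for its actual export format and file names.

    import glob
    import pandas as pd

    # Hypothetical layout: one CSV of behavior events per analyzed video,
    # with columns "behavior" and "duration" (assumed for illustration;
    # LabGym's real export format may differ).
    records = []
    for path in glob.glob("results/*/events.csv"):
        df = pd.read_csv(path)
        df["video"] = path  # remember which recording each event came from
        records.append(df)

    events = pd.concat(records, ignore_index=True)

    # Summarize each behavior per video: how often it occurred, and for how long.
    summary = (
        events.groupby(["video", "behavior"])["duration"]
        .agg(count="count", total_duration="sum", mean_duration="mean")
        .reset_index()
    )
    summary.to_csv("cohort_summary.csv", index=False)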

Pros

  • Completely Free and Open Source: No licensing costs, full code transparency, and an active academic community make LabGym accessible to labs of any size or budget.
  • Highly Flexible and Customizable: Can be trained for virtually any species, behavior, or experimental setup, removing the constraints of species-specific or pre-labeled software.
  • Reproducible and Quantitative Results: Outputs structured data records and 13 quantification metrics per behavior, supporting rigorous, reproducible scientific workflows.
  • No Manual Scoring Required: Automates the tedious process of frame-by-frame manual annotation, saving researchers significant time and reducing human scoring bias.

Cons

  • Requires Technical Setup: As an open-source Python-based tool, initial installation and model training may require more technical expertise than typical point-and-click software demands.
  • Custom Training Data Needed: Training a new behavior classifier requires labeled example videos, which demands an upfront investment of time for annotation before automated analysis can begin.
  • Limited to Video-Based Analysis: LabGym is purpose-built for video data; it does not support other sensor modalities such as electrophysiology, audio, or wearable tracking.

Frequently Asked Questions

What types of animals or species does LabGym support?

LabGym is species-agnostic. Because users define and train their own behavior categories, it can be applied to any animal—from common model organisms like mice and zebrafish to insects, primates, or other species—as long as video recordings are available.

Do I need machine learning expertise to use LabGym?

LabGym is designed to make deep learning accessible to behavioral researchers. While some familiarity with Python and data preparation helps, the platform abstracts many ML complexities through a guided workflow for training and running models.

What does LabGym output after analyzing a video?

LabGym produces annotated videos with behavior labels overlaid, structured event records of when behaviors occurred, and 13 unique quantification metrics per behavior (such as duration, frequency, and intensity measures).
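
For a concrete sense of how those outputs can be consumed, here is a minimal Python sketch that loads a single quantification export into pandas and summarizes one metric per behavior. The file name and column names are illustrative assumptions, not LabGym's documented schema.

    import pandas as pd

    # Load a hypothetical per-behavior quantification export
    # (file name and columns are assumptions for illustration).
    metrics = pd.read_csv("experiment1_quantifications.csv")

    # Inspect which metrics were exported, then summarize one of them
    # (here, duration) for each recognized behavior.
    print(metrics.columns.tolist())
    print(metrics.groupby("behavior")["duration"].describe())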

Is LabGym free to use?

Yes, LabGym is completely free and open source. It is developed and maintained by the Ye Lab at the University of Michigan and is available without any licensing fees.

Can LabGym track multiple animals simultaneously?

Yes, LabGym supports multi-subject tracking and can analyze both individual behaviors and group interactions, making it suitable for social behavior studies.
