A Major LEAP for Motion Capture Technology

Researchers at Princeton University are using artificial intelligence (AI) to more accurately predict and model the locomotive patterns of animals in various situations. The cross-departmental effort has yielded a platform that could revolutionize the way movement is studied. Dubbed LEAP (LEAP Estimates Animal Pose), the new tool uses AI to track how individual body parts move in live video. The availability of such technology could have major implications in linking physical behaviors to neural processes.

How LEAP Works

LEAP can learn from a relatively small sample of footage of animal movements and then predict how those movements play out in new situations. A researcher labels body parts in a small subset of frames, a process that takes just minutes, and the AI then takes over, tracking those parts through the rest of the video. Compared with earlier studies in this area, the platform’s neural network is remarkably adept at generalizing from a limited video sample to model movement it has not seen. In this respect, LEAP marks a true departure from similar efforts made recently across the field.
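The core idea can be sketched in a few lines. Pose-tracking networks of this kind are typically trained to output a "confidence map" for each labeled body part: an image whose brightest point marks where that part sits in the frame. The snippet below is a minimal illustration in NumPy, not LEAP's actual implementation, showing how a labeled keypoint is encoded as such a map and decoded back into pixel coordinates.

```python
import numpy as np

def confidence_map(shape, keypoint, sigma=2.0):
    """Render a 2-D Gaussian "heat map" centered on a labeled keypoint,
    the kind of target a pose-estimation network learns to predict."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    y, x = keypoint
    return np.exp(-((ys - y) ** 2 + (xs - x) ** 2) / (2 * sigma ** 2))

def decode_keypoint(cmap):
    """Recover a body part's (row, col) position as the map's peak."""
    return np.unravel_index(np.argmax(cmap), cmap.shape)

# Round-trip: encode a labeled joint at (12, 30), then decode it back.
cmap = confidence_map((64, 64), (12, 30))
print(decode_keypoint(cmap))  # (12, 30)
```

Training then amounts to showing the network the few hand-labeled frames and their maps; for every new frame, decoding the predicted map gives the part's location automatically.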

The Princeton team recognized that other methods were designed to work from data collected under tightly controlled laboratory conditions. Those neural networks therefore constrain the user; the infrastructure simply isn’t there to make inferences about behavior that occurs in dynamic, live situations. LEAP is optimized to interpret raw material (read: video footage of all kinds) by labeling and sorting points across samples that may not have been generated under lab conditions.

Potential Applications

The most obvious use for LEAP’s AI-enabled process will be in animal behavior studies. The flexibility of the interface will let researchers determine how animal motion differs under a variety of conditions. For instance: Do animals that have been exposed to certain chemicals move differently than those without such exposure? Does posture change with seasonality or overall population health? What other environmental factors might play a role?

Perhaps the most important innovation LEAP brings to the table is its versatility across datasets of varying quality. Studying how animals negotiate their surroundings is highly labor-intensive when footage must be analyzed manually by zoologists. Automating the build-out of an animal’s “movement catalogue” from just a few samples of limited quality could dramatically deepen our understanding of which neural processes actually drive animal behavior.
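To make "catalogue" concrete: once a tracker has produced per-frame coordinates for each body part, catalogue entries are simple feature extraction over those tracks. The sketch below uses hypothetical data and a feature choice of my own (mean per-part speed), not LEAP's output format, to show the kind of summary statistic a behavioral analysis might compute.

```python
import numpy as np

# Hypothetical tracker output, indexed (frame, body part, coordinate):
# two body parts followed across three frames of video.
tracks = np.array([
    [[0.0, 0.0], [1.0, 0.0]],
    [[0.5, 0.2], [1.5, 0.2]],
    [[1.0, 0.4], [2.0, 0.4]],
])

# One simple catalogue feature: mean speed per body part, i.e. the
# average displacement between consecutive frames.
speeds = np.linalg.norm(np.diff(tracks, axis=0), axis=2).mean(axis=0)
print(speeds)  # both parts move at the same steady pace
```

Comparing such features between, say, exposed and unexposed animals is exactly the kind of question the paragraph above raises.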

For more on how artificial intelligence is beginning to intersect with the animal kingdom, check out this article on what AI researchers can learn from the synchronized movements of groups of animals.