Despite the importance of behavioural data, our ability to collect it is still constrained by the need to do so manually. Timing and distinguishing the behaviour of even a single animal in a laboratory experiment is tedious and prone to observer bias. Obtaining data for hundreds of individuals is an even more forbidding task, and given the additional difficulties biologists face in the field, the problem can seem near insurmountable.
In recent years, however, advances in machine vision have brought us closer to automated, consistent observation in tracking and timing individuals. The drawback is that these systems rely heavily on strict experimental and environmental constraints, and typically require custom apparatus.
AMPtrack is software under development at Microsoft Research, capable of accurately tracking multiple unmarked, interacting individuals. By separating automated preprocessing from interactive tracking, we ensure that tracks can be generated with minimal user input and time. Because the software is independent of any particular recording device, users may provide recordings from any habitat type. With its intended ability to cope robustly with variations in lighting, camera movement, and changes in object appearance, AMPtrack will be a major step forward in facilitating the collection and analysis of behaviour, not just in laboratory experiments but, more importantly, for biologists in the field.