Researchers at the University of Michigan have developed new software to help scientists in the biological sciences conduct more effective analyses of animal behavior. LabGym, an open-source program, uses artificial intelligence to identify, categorize and count defined behaviors in a variety of animal model systems.
Scientists need to measure animal behaviors for a variety of reasons, from understanding the full effects of a given medicine on an organism to mapping how the brain's circuits interact to produce a particular behavior.
Researchers in the lab of U-M faculty member Bing Ye, for example, analyze movements and behaviors in Drosophila melanogaster, or fruit flies, as a model for studying the development and function of the nervous system. Because fruit flies and humans share many genes, these studies often shed light on human health and disease.
“Behavior is a function of the brain. So analyzing animal behavior provides essential information about how the brain works and how it changes in response to disease,” said Yujia Hu, a neuroscientist in Ye’s lab at the U-M Life Sciences Institute and lead author of a Feb. 24 Cell Reports Methods study describing the new software.
Manually identifying and counting animal behaviors, however, is labor-intensive and highly subjective to the researcher doing the analysis. And while some software tools can automatically quantify animal behaviors, they have limitations.
“Many of these behavior analysis programs are based on pre-set definitions of a behavior,” said Ye, who is also a professor of cell and developmental biology at the Medical School. “If a Drosophila larva rolls 360 degrees, for example, some programs will count a roll. But why isn’t 270 degrees also a roll? Many programs don’t necessarily have the flexibility to count that, without the user knowing how to recode the program.”
Thinking more like a scientist
To overcome these challenges, Hu and his colleagues decided to create a new program that more closely mimics human cognition, "thinking" more as a scientist would, and that is easier to use for biologists who may not have coding expertise.
Using LabGym, researchers can teach the program what to count by providing examples of the behaviors they want to study. The software then uses deep learning to improve its ability to recognize and quantify those behaviors.
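In broad strokes, that workflow resembles standard supervised learning. The sketch below is not LabGym's actual code or architecture; it simply illustrates the idea of training a deep-learning classifier from researcher-supplied, labeled examples, using random tensors and made-up behavior categories as stand-ins for real clips.

```python
# Illustrative sketch only: not LabGym's architecture or API. It shows the
# general workflow the article describes: a researcher supplies labeled
# examples of behaviors, and a deep-learning classifier is trained on them.
# All data here are random stand-ins for real behavior clips.
import torch
import torch.nn as nn

# Pretend each example is a short clip summarized as a 128-dim feature vector,
# labeled with one of three hypothetical behavior categories.
num_examples, feature_dim, num_behaviors = 200, 128, 3
features = torch.randn(num_examples, feature_dim)
labels = torch.randint(0, num_behaviors, (num_examples,))

# A small fully connected network standing in for a deep classifier.
model = nn.Sequential(
    nn.Linear(feature_dim, 64),
    nn.ReLU(),
    nn.Linear(64, num_behaviors),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train on the researcher-provided examples.
for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()

# Once trained, the model can label a new clip with the behavior it predicts.
new_clip = torch.randn(1, feature_dim)
predicted_behavior = model(new_clip).argmax(dim=1)
```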
One novel feature of LabGym that enables this more flexible cognition is its use of both video data and a so-called "pattern image" to improve the program's reliability. Scientists use videos of animals to study their behavior, but time-series data from videos can be difficult for AI programs to process.
To help the program identify behaviors more easily, Hu created a still image that shows the pattern of the animal's movement by merging outlines of the animal's position at different timepoints. The group found that combining the video data with these pattern images improved the program's ability to identify different types of behaviors.
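The general idea can be pictured with a toy example. The following sketch is a simplified illustration of the pattern-image concept described above, not LabGym's implementation: it generates a synthetic "video" of a moving blob, then draws the blob's outline from each frame onto a single image, with brightness encoding time.

```python
# Illustrative sketch only: a toy version of merging an animal's outlines at
# different timepoints into one "pattern image." Assumes OpenCV and NumPy.
import cv2
import numpy as np

def make_pattern_image(frames, threshold=50):
    """Merge the animal's outline from each frame into one still image.

    Each outline is drawn with a brightness proportional to its timepoint,
    so earlier positions appear dim and later positions appear bright.
    """
    h, w = frames[0].shape
    pattern = np.zeros((h, w), dtype=np.uint8)
    n = len(frames)
    for t, frame in enumerate(frames):
        # Segment the animal from the (assumed dark) background.
        _, binary = cv2.threshold(frame, threshold, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Encode time as brightness: later timepoints are drawn brighter.
        intensity = int(255 * (t + 1) / n)
        cv2.drawContours(pattern, contours, -1, intensity, 1)
    return pattern

# Toy "video": a bright blob moving diagonally across a dark background.
frames = []
for t in range(10):
    frame = np.zeros((100, 100), dtype=np.uint8)
    cv2.circle(frame, (20 + 6 * t, 20 + 6 * t), 8, 255, -1)
    frames.append(frame)

pattern = make_pattern_image(frames)
cv2.imwrite("pattern_image.png", pattern)
```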
Like a human researcher, LabGym is also designed to ignore irrelevant background information and to consider both the animal's overall movement and its changes in position over time. The program can also track multiple animals simultaneously.
Species flexibility improves utility
Another key feature of LabGym is its species flexibility, Ye said. While it was designed using Drosophila, it is not restricted to any one species.
“That’s actually rare,” he said. “It’s written for biologists, so they can adapt it to the species and the behavior they want to study without needing any programming skills or high-powered computing.”
After hearing a presentation about the program’s early development, U-M pharmacologist Carrie Ferrario offered to help Ye and his team test and refine the program in the rodent model system she works with.
Ferrario, an associate professor of pharmacology and adjunct associate professor of psychology, studies the neural mechanisms that contribute to addiction and obesity, using rats as a model system.
To observe drug-induced behaviors in the animals, she and her lab colleagues have relied largely on hand-scoring, which is time-consuming and subjective.
“I’ve been trying to solve this problem since graduate school, and the technology just wasn’t there, in terms of artificial intelligence, deep learning and computation,” Ferrario said. “This program solved an existing problem for me, but it also has really broad utility. I see the potential for it to be useful in almost limitless conditions to analyze animal behavior.”
The team next plans to refine the program to perform well in even more challenging conditions, such as observing animals in the wild.
This research was supported by the National Institutes of Health.
In addition to Ye, Hu and Ferrario, study authors are: Alexander Maitland, Rita Ionides, Anjesh Ghimire, Brendon Watson, Kenichi Iwasaki, Hope White and Yitao Xi of the University of Michigan, and Jie Zhou of Northern Illinois University.