Forget Finding Nemo: This AI can identify a single zebrafish out of a 100-strong shoal
Sounds fishy, yet it works for fruit flies, too. So take that, fish/fly-spotting humans
AI systems excel in pattern recognition, so much so that they can stalk individual zebrafish and fruit flies even when the animals are in groups of up to a hundred.
To demonstrate this, a group of researchers from the Champalimaud Foundation, a private biomedical research lab in Portugal, trained two convolutional neural networks to identify and track individual animals within a group. The aim is not so much to match or exceed humans' ability to spot and follow stuff, but rather to automate the process of studying the behavior of animals in their communities.
“The ultimate goal of our team is understanding group behavior,” said Gonzalo de Polavieja. “We want to understand how animals in a group decide together and learn together.”
The resulting machine-learning software, known as idtracker.ai, is described as “a species-agnostic system.” It’s “able to track all individuals in both small and large collectives (up to 100 individuals) with high identification accuracy—often greater than 99.9 per cent,” according to a paper published in Nature Methods on Monday.
The idtracker.ai software is split into a crossing-detector network and an identification network. First, it is fed video footage of the animals interacting in their enclosures. In the zebrafish experiment, for example, the system pre-processes each fish into a blob, and the crossing-detector network learns to classify blobs as single individuals or as animals touching or crossing one another in groups. The identification network is then used to work out which animal is which during each crossing event.
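In rough outline, that division of labour looks something like the following sketch. To be clear, this is a toy illustration, not the authors' code: the blob areas, the size threshold, and the identity lookup are made-up stand-ins for what the two convolutional networks actually learn from pixel data.

```python
# Schematic sketch of idtracker.ai's two-stage design (not the authors'
# code): a crossing detector separates single-animal blobs from blobs
# containing several touching animals, and an identification step then
# assigns an identity to each single-animal blob.

def detect_crossing(blob_area, single_animal_area=300):
    """Stand-in for the crossing-detector network: a blob much larger
    than a single animal is assumed to hold several animals crossing."""
    return blob_area > 1.5 * single_animal_area

def identify(blob_id, known_identities):
    """Stand-in for the identification network: look up which animal
    this single-animal blob belongs to."""
    return known_identities.get(blob_id, "unknown")

def track_frame(blobs, known_identities):
    """Label each blob in a frame: singles get an identity, oversized
    blobs are flagged as crossings to resolve once the animals part."""
    results = []
    for blob_id, area in blobs:
        if detect_crossing(area):
            results.append((blob_id, "crossing"))
        else:
            results.append((blob_id, identify(blob_id, known_identities)))
    return results

# b3 is roughly twice the size of one fish, so it is flagged as a crossing.
frame = [("b1", 290), ("b2", 310), ("b3", 650)]
ids = {"b1": "fish_07", "b2": "fish_42"}
print(track_frame(frame, ids))
```

In the real system both decisions are made by trained convolutional networks operating on the images themselves, not on blob areas; the sketch only captures how the work is split between the two networks.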
Surprisingly, it reached an accuracy rate of up to 99.96 per cent for groups of 60 zebrafish, and 99.99 per cent for 100 zebrafish. Fruit flies are slightly harder: idtracker.ai was accurate to 99.99 per cent for 38 fruit flies, but that dipped to 99.95 per cent for 72 fruit flies.
“I didn't believe we could reach those numbers; it was a surprise,” said de Polavieja. “I thought there wouldn't be enough information in the images.” His team's earlier software, from four years ago, was far more limited: “we could track ten animals back then,” he added.
The training process is a little tedious. All the animals in the group have to be individually imaged, as well as the group crossings. Sometimes this process can be automated, and sometimes humans have to manually label the pictures. Each animal in the group is then labelled with a name like George or Tom or assigned a number for identification.
Idtracker.ai also requires quite a lot of training data. The footage needs to capture 30 images of each animal, so if there are 100 zebrafish in the tank, the camera has to keep rolling for a while. About 300 pixels per animal are analysed for the crossing network, and about 100 pixels per animal for the identification network.
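As a back-of-the-envelope check of what that implies, only the 30-images-per-animal and 100-zebrafish figures come from the article; the rest is plain arithmetic:

```python
# Minimum number of labelled single-animal images needed for the
# 100-zebrafish experiment, using the figures quoted in the article.

images_per_animal = 30   # from the article
animals = 100            # zebrafish in the tank
labelled_images = images_per_animal * animals
print(labelled_images)   # 3000 labelled single-animal images at minimum
```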
The researchers believe that transfer learning can be used to perform the same experiments with similar animals. “The transfer-learning technique can be applied at the identification stage to reuse knowledge from a network previously trained with similar animals and light conditions,” the paper stated. But it might require retraining the whole network or a few of its layers. They also used idtracker.ai for ants, Japanese rice fish, and mice.
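The transfer-learning idea mentioned above can be sketched in a few lines. This is a toy illustration, not the idtracker.ai implementation: the layer names and the choice of how many layers to retrain are hypothetical.

```python
# Toy illustration of transfer learning (not the idtracker.ai code):
# reuse a network trained on similar animals under similar lighting,
# freeze its early layers, and retrain only the last few on the new
# species.

def mark_trainable(layers, retrain_last_n):
    """Freeze all but the last `retrain_last_n` layers of a pretrained
    network; the frozen layers keep the knowledge learned earlier."""
    cut = len(layers) - retrain_last_n
    return [(name, i >= cut) for i, (name, _) in enumerate(layers)]

# Hypothetical identification network trained on zebrafish, about to
# be adapted to Japanese rice fish: True marks a trainable layer.
pretrained = [("conv1", True), ("conv2", True), ("fc1", True), ("fc2", True)]
print(mark_trainable(pretrained, retrain_last_n=2))
```

Retraining everything, as the paper notes may sometimes be necessary, would simply correspond to `retrain_last_n` covering the whole network.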
Neural networks may not understand the concept of objects, but that doesn't stop them being excellent at identifying and tracking them over time. If they can pick out individual animals that look virtually identical to us, then surely they should be able to cope with human faces, which vary far more, right?
The technology isn't good enough yet, however. London's Metropolitan Police reported a whopping 98 per cent false positive rate for one of its facial-recognition systems. And let's not forget the American Civil Liberties Union (ACLU) test that showed Amazon's Rekognition software falsely matching 28 members of the US Congress to mugshot photos.