
Computing

AI systems could monitor animals crossing highways

Study tested the performance of computer vision models

Some five million large animals are killed on Brazilian highways every year, according to the Brazilian Center for Road Ecology

Paulo Baqueta / Getty Images

Computer systems that use artificial intelligence (AI) to detect moving objects can be adapted and trained to identify animals crossing Brazilian roads. These adapted AI systems could be installed in roadside devices to issue almost immediate alerts when animals are spotted on the highway, in addition to automatically classifying which species are most frequently hit by vehicles.

These are the main findings of a study led by scientists from the São Carlos Institute of Mathematics and Computer Science at the University of São Paulo (ICMC-USP), who analyzed the performance of 14 systems based on the You Only Look Once (YOLO) algorithm, which is used to identify and delimit the location of specific objects in an image or video (in this case, animals). The study was published in the journal Scientific Reports in January.

None of the systems performed perfectly when tasked with analyzing images of five wild animal species they had been trained to recognize: the tapir, the jaguarundi (a small wild cat), the maned wolf, the cougar, and the giant anteater. Some, however, such as Scaled-YOLOv4, achieved performance greater than 85% in most situations. “Comparative studies are essential to determining the response time needed for these systems to work efficiently on the roads, a scenario that involves high-speed vehicles, and to evaluate the feasibility of their implementation,” says computer scientist Rodolfo Meneguette, head of the research group.

The tests were carried out on tiny, simple Raspberry Pi 4 computers, which weigh around 50 grams. Because they are so small and inexpensive, these devices could theoretically be installed in existing roadside structures with a Wi-Fi internet connection. The microcomputer would analyze and classify the images locally, transmitting only its verdict (whether or not there is an animal on the road) to a cloud-based system, which would then trigger some form of warning almost immediately to drivers traveling on the same stretch of road.
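The paragraph above describes an edge-computing arrangement: detection runs on the device itself, and only a small verdict travels over the network. The sketch below illustrates the general idea in Python, assuming the ultralytics YOLO package; the model file, alert endpoint, confidence threshold, and sampling interval are all hypothetical and not the configuration used in the study.

```python
# Illustrative edge-inference loop for a roadside microcomputer (a sketch, not
# the study's implementation). Frames are analyzed locally; only the verdict
# (which animals were seen, if any) is sent to a hypothetical cloud endpoint.
import time

import cv2                    # camera capture
import requests               # HTTP call to the cloud system
from ultralytics import YOLO  # assumes the ultralytics package is available

ALERT_URL = "https://example.org/road-alert"  # hypothetical cloud endpoint
CONFIDENCE_THRESHOLD = 0.5                    # illustrative cutoff

model = YOLO("bra_animals.pt")  # hypothetical model fine-tuned on the five species
camera = cv2.VideoCapture(0)    # roadside camera

while True:
    ok, frame = camera.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]  # detection runs on the device itself
    sightings = [
        (model.names[int(box.cls)], float(box.conf))
        for box in result.boxes
        if float(box.conf) >= CONFIDENCE_THRESHOLD
    ]
    if sightings:
        # Transmit only the verdict, never the raw images, to the cloud system
        requests.post(ALERT_URL, json={"animals": sightings, "time": time.time()})
    time.sleep(1)  # illustrative sampling interval
```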

According to estimates by the Brazilian Center for Road Ecology (CBEE), linked to the Federal University of Lavras (UFLA) in Minas Gerais, approximately five million large animals are killed on Brazilian roads every year, including capybaras, jaguars, monkeys, and maned wolves.

To train the YOLO models to recognize the five selected species, the researchers created a database called BRA-Dataset, containing 1,458 images of these animals. The database was populated from free online sources found using the Google Images search engine. To test how quickly the systems could recognize the animals, the team used videos they themselves recorded at the São Carlos Ecological Park, in addition to free footage found online.
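As a rough sketch of how a YOLO model can be fine-tuned on a custom image collection like this one (using the ultralytics library purely for illustration; the bra_dataset.yaml file, checkpoint, and hyperparameters are assumptions rather than the study's actual setup):

```python
from ultralytics import YOLO

# Start from a small pretrained checkpoint and fine-tune it on the custom data.
# "bra_dataset.yaml" is a hypothetical file pointing at the image folders and
# listing the five classes (tapir, jaguarundi, maned wolf, cougar, giant anteater).
model = YOLO("yolov8n.pt")
model.train(data="bra_dataset.yaml", epochs=100, imgsz=640)

# Check detection quality on the validation split of the dataset
metrics = model.val()
print(metrics.box.map50)  # mean average precision at an IoU threshold of 0.5
```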

Screenshots from three systems designed to identify and classify wild animals in images

BRA-Dataset

The YOLO architecture combines image processing and AI through convolutional neural networks, which are widely used in the field of computer vision. “This approach allows the machine, when receiving new images or videos, to compare the learned characteristics against predefined classes,” explains computer scientist Gabriel Ferrante, lead author of the article, who defended his master’s thesis on the topic at ICMC-USP in 2023, supervised by Meneguette.

The neural network divides a still or moving image (video) into smaller parts, as sets of pixels (points) that are transformed into numerical data. Through mathematical and probabilistic calculations, this data is used to classify what type of object appears in the image and its location, to a given degree of certainty.

The images accompanying this article show the kinds of results provided by the YOLO systems when looking for animals on roads. They drew boxes around the recognized species and classified them as belonging to one of the five classes they had learned to recognize. At the end of the process, the name of the animal recognized by the model appears in the image, followed by a number between 0 and 1. The label “anta 0.90,” for example, means the system is 90% certain that the object identified in the image is a tapir (anta, in Portuguese).
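For illustration, boxes and “anta 0.90”-style labels like those described above can be reproduced with a few lines of OpenCV code, again assuming the ultralytics result objects and a hypothetical fine-tuned model:

```python
import cv2
from ultralytics import YOLO  # same assumed library as in the sketches above

model = YOLO("bra_animals.pt")        # hypothetical fine-tuned model
image = cv2.imread("road_scene.jpg")  # illustrative input image
result = model(image, verbose=False)[0]

for box in result.boxes:
    x1, y1, x2, y2 = map(int, box.xyxy[0])  # bounding-box corners
    label = model.names[int(box.cls)]       # e.g. "anta"
    confidence = float(box.conf)            # a number between 0 and 1
    # Draw the box and write an "anta 0.90"-style label above it
    cv2.rectangle(image, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.putText(image, f"{label} {confidence:.2f}", (x1, y1 - 5),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

cv2.imwrite("road_scene_annotated.jpg", image)
```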

“We tested different models based on the YOLO architecture to try to see if one could be ideal for specific contexts,” explains computer scientist Luís Nakamura of the Federal Institute of São Paulo (IFSP) Catanduva campus, coauthor of the article. Despite being trained, the systems were inaccurate at distinguishing animals in more challenging scenarios, such as when they were hidden by other objects, camouflaged against the landscape, or very far from the camera.

“To understand the pixel patterns in an image, convolutional neural networks scan parts of it in sequence,” explains Ferrante. “If the environment interferes with the recognition of important visual characteristics, such as edges, textures, and colors, the software has difficulty classifying and defining the area of a possible object.”
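The feature extraction Ferrante describes can be illustrated with a single hand-written filter: a small kernel slides across the pixel grid and responds strongly wherever an edge appears. Real convolutional networks learn many such filters automatically; the toy example below (using NumPy and SciPy, not part of the study) only shows the mechanism.

```python
import numpy as np
from scipy.ndimage import convolve

# Toy grayscale "image": a bright square on a dark background
image = np.zeros((8, 8))
image[2:6, 2:6] = 1.0

# A simple horizontal edge-detection kernel; a CNN learns filters like this
# during training rather than having them written by hand
kernel = np.array([[-1.0, -1.0, -1.0],
                   [ 0.0,  0.0,  0.0],
                   [ 1.0,  1.0,  1.0]])

# Sliding the kernel over the pixel grid yields a feature map whose largest
# values mark the top and bottom edges of the square
feature_map = convolve(image, kernel)
print(np.round(feature_map, 1))
```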

Systems designed to analyze images taken in daylight are not suitable for night surveillance or low-visibility conditions. In such cases, infrared cameras capable of “seeing” in the dark could serve as an alternative. This approach, however, was not tested in the study.

Data scientist Alexandre de Siqueira, who was not involved in the research, believes future studies could expand the number of animal species included in the database used to train the systems. “If this technology were installed in static cameras, it could even be used to observe species migrating between different regions of the country, for example,” says Siqueira, who worked at the Berkeley Institute for Data Science (BIDS) of the University of California between 2019 and 2022. “It is also important to test networks with models other than YOLO to assess which is the fastest or cheapest, depending on the purpose of the application.”

Projects
1. Services for an intelligent transportation system (nº 20/07162-0); Grant Mechanism Regular Research Grant; Principal Investigator Rodolfo Meneguette (USP); Investment R$146,438.83.
2. Dynamic resource management for intelligent transportation system applications (nº 22/00660-0); Grant Mechanism Regular Research Grant; Sprint Program; Agreement University of Manchester; Principal Investigator Rodolfo Meneguette (USP); Investment R$64,930.00.

Scientific article
FERRANTE, G. S. et al. Evaluating YOLO architectures for detecting road killed endangered Brazilian animals. Scientific Reports. Jan. 16, 2024.
