How AI and US researchers are decoding the audible world of insects

Tiny, veined wings beat rapidly over a feast of goldenrod illuminated by a late summer sun.

Without even turning around, Anna Kohlberg knows whether it’s a honeybee or a bumblebee, just from the buzz and gyrations. She learned to make the distinction during hundreds of hours of field work as an undergraduate at the University of Massachusetts Amherst.

It’s a skill, she said, that most people can scarcely believe when she tells them about it.

But even the most carefully trained human ear can’t make a dent in deciphering the collective chorus of the world’s 1.1 million insect species with scientific names, plus the millions more that haven’t been formally described.

In this endeavour, humans need help. And via an emerging application of artificial intelligence, they’re getting it.

Newly published research led by UMass Amherst evaluates how well AI can identify different insect species by sound, and, in turn, assist in the field of bioacoustics, defined as the study of animal sounds in nature.

Listening in on the world of insects can reveal critical details about an ecosystem and the invertebrates themselves. Honeybees, for example, emit different sounds based on the presence of toxic pollutants in the air.

“I think there is a lot of uncertainty when it comes to AI as a tool for ecology, but it’s actually a really exciting and awesome way forward that will allow us to ask and answer questions that are of unprecedented scales,” Kohlberg, an Arlington native and now a graduate student at Oregon State University, said.

In fact, it’s a use of AI that Laura Figueroa, an assistant professor of environmental conservation at UMass Amherst, calls “mind-boggling”, in a good way.

One model can differentiate between up to 20 species of host-seeking mosquitoes based on wingbeat frequency.

While still an emerging field, scientists have already paired AI and bioacoustics to determine when honeybee colonies are likely to swarm, or when weevils are present in grain barrels. They’ve used the calls of katydids to inform rainforest reforestation efforts, and detected endangered cicadas by their mating calls.

Most of the AI models deployed for this purpose use neural networks, a machine-learning approach that processes data in a way loosely modelled on the human brain.
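The core idea can be illustrated without a neural network at all. The sketch below, a deliberately simplified stand-in for the models described in the research, synthesises a one-second clip of a wingbeat tone, finds its dominant frequency with a Fourier transform, and matches it to the nearest entry in a table of reference frequencies. The species labels and frequency values are hypothetical, chosen only to show the mechanics; real systems learn far richer acoustic features from recorded audio.

```python
import numpy as np

# Hypothetical reference wingbeat frequencies in Hz -- illustrative values,
# not real measurements for any mosquito species.
REFERENCE_HZ = {"species-A": 600.0, "species-B": 450.0, "species-C": 300.0}

def dominant_frequency(signal, sample_rate):
    """Return the strongest frequency component (Hz) of an audio clip."""
    spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def classify(signal, sample_rate):
    """Label a clip with the species whose reference frequency is nearest."""
    f = dominant_frequency(signal, sample_rate)
    return min(REFERENCE_HZ, key=lambda s: abs(REFERENCE_HZ[s] - f))

# Simulate a one-second recording of a 450 Hz wingbeat.
rate = 8000
t = np.linspace(0.0, 1.0, rate, endpoint=False)
clip = np.sin(2 * np.pi * 450.0 * t)
print(classify(clip, rate))  # -> species-B
```

A neural-network classifier replaces the hand-written frequency table with parameters learned from labelled recordings, which is what lets the published models separate overlapping, noisy sounds that a single dominant-frequency rule cannot.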

Figueroa and Kohlberg’s study, published earlier this month in the Journal of Applied Ecology, points to AI as an emerging gold standard for getting the most out of bioacoustics when it comes to detecting, classifying and monitoring insects.

Historically, bioacoustics has been used to monitor birds, bats, whales and elephants.

Sound is “a telling indicator of the health or wellness of components that make up an environment, and you can learn a lot about how an ecosystem is doing based on the sounds that you hear”, Kohlberg said.

After evaluating more than 170 peer-reviewed studies from around the world, Figueroa and Kohlberg were able to identify AI models that can classify hundreds of insect species with over 90% accuracy.

And the findings come at a time when colliding factors such as chemical pesticides, climate change, and additional environmental stressors are impacting insect populations drastically.

While some species considered essential to the ecosystem are seeing alarming declines, others that are disease-carrying, damage-causing pests are multiplying and expanding their ranges.

Bees, which are responsible for pollinating about one-third of the world’s food supply, are declining rapidly in the US, for example. In 2021, colonies of bees in Massachusetts dropped by 47%, according to the Bee Informed Partnership.

Meanwhile, mosquitoes carrying Eastern equine encephalitis are on the rise due to climate change-induced rainfall. And the emerald ash borer, a beetle devastating to all ash tree species, is rapidly expanding its range.

The sounds of insects, the researchers say, can tell us a lot about these shifts and how to manage them.

“We want to make sure that the insects that can cause harm, we want to monitor their populations and catch outbreaks,” Figueroa said. “And for a lot of the insects that are really important for the wellbeing of the environment, we want to catch them if there’s any potential declines so that we can promote conservation and restoration.”

Both Kohlberg and Figueroa hailed bioacoustics as an efficient and ethical way to study insects because of its non-lethal and non-invasive methods. Pairing audio with camera-trap images is especially effective when dealing with insects of conservation concern, they said.

When recording insects in the field, Kohlberg said she holds a small microphone up as if she’s interviewing them. If it’s a bee, she’ll follow it as it meanders from flower to flower.

The result is a crystal-clear audio recording that can be used to decipher elusive details about insect populations and environmental health.

And it appears AI could assist in making the deciphering that much easier. – masslive.com/Tribune News Service
