Newswise — ITHACA, N.Y. – Cornell University’s Elephant Listening Project tracks African forest elephants with acoustic sensors, but the forests are so remote and the sound files so huge that it takes months to collect and analyze the data – too long to protect the animals from poachers and other threats.

Now scientists can learn critical information about the elephants’ habits and patterns in a fifth of the time. A startup, Conservation Metrics, developed a tool that uses artificial intelligence to distinguish the low-frequency, long-duration elephant calls from other rainforest sounds.

The Elephant Listening Project’s 50 sensors, installed throughout a 580-square-mile area of Nouabalé-Ndoki National Park in the Republic of Congo, generate 7 terabytes of data every three months – the equivalent of 2 million iTunes songs. New developments in machine learning and deep neural networks make it possible to analyze these enormous files faster and with much higher accuracy, so scientists can alert park managers to red flags in time to respond.
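The article does not describe the classifier’s internals; as a rough, simplified illustration of the underlying idea – picking out low-frequency, long-duration energy from other rainforest sound – the Python sketch below uses a plain band-pass energy detector. The frequency band, thresholds and function names are assumptions for illustration only; Conservation Metrics’ actual tool relies on trained deep neural networks, not this simple detector.

    # Illustrative sketch only: flag stretches of a recording where energy in the
    # infrasonic band (where forest-elephant rumbles concentrate) stays high for
    # several seconds. All parameter values here are assumptions, not ELP's.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def detect_rumbles(audio, sample_rate, low_hz=15.0, high_hz=35.0,
                       window_s=1.0, min_duration_s=2.0, threshold_db=-30.0):
        """Return (start_s, end_s) spans where band-limited energy exceeds a threshold."""
        # Isolate the low-frequency band and suppress higher-pitched forest sound.
        sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=sample_rate, output="sos")
        band = sosfiltfilt(sos, audio)

        # Short-time energy in non-overlapping windows, expressed in decibels.
        hop = int(window_s * sample_rate)
        n_windows = len(band) // hop
        energy = np.array([np.mean(band[i * hop:(i + 1) * hop] ** 2) for i in range(n_windows)])
        energy_db = 10.0 * np.log10(energy + 1e-12)

        # Keep only runs of loud windows long enough to be a candidate call.
        active = energy_db > threshold_db
        spans, start = [], None
        for i, flag in enumerate(np.append(active, False)):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if (i - start) * window_s >= min_duration_s:
                    spans.append((start * window_s, i * window_s))
                start = None
        return spans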

“A key thing this collaboration will do is speed things up, so we can show the people who manage the national park that we can provide information that will make a difference,” said Peter Wrege, director of the Elephant Listening Project (ELP), part of the Cornell Lab of Ornithology. “If it takes us a year to figure out what elephants are doing in the forest, it’s already too late.”

The number of forest elephants in central Africa has plummeted – from an estimated 100,000 in 2011 to fewer than 40,000 today. Their vast range and the thick rainforest canopy make them extremely difficult to track by land or air. Forest elephants are a distinct species from the more populous, though also endangered, savanna elephants.

In 2014, scientists learned that 25,000 forest elephants had been slaughtered by ivory poachers in Gabon’s Minkébé National Park between visits to the park to count them. Before anyone realized those elephants were in danger, they had been wiped out.

That makes it imperative to develop a system to track elephants in as close to real time as possible. Using the acoustic data, Wrege and his team can create maps showing the elephants’ habits. For example, a recent map revealed large numbers of elephants congregating in an area adjacent to a logging site and close to roads, making them highly vulnerable to poachers.

“What the Elephant Listening Project is doing in terms of working with collaborators on these sites in Africa is really impressive, but the logistics are really hard,” said Matthew McKown, CEO of Conservation Metrics, which recently received a two-year Microsoft AI for Earth grant for this work. “It’s a truly ambitious project, and it’s the first time we’re actually realizing the potential of these automated monitoring approaches.”

Before using Conservation Metrics’ new tool, it took ELP six to eight weeks to run the sound data through its five computers. Samples of the data then had to be checked by hand to verify the accuracy of the automated detections; the entire process took around three months.

Conservation Metrics was able to analyze the data in 22 days, and McKown hopes it can soon be done far faster. Part of the grant from Microsoft includes access to Azure, its cloud computing platform; once his team develops compatible software, it could potentially complete the analysis on Azure in a single day.
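For a sense of how such a speedup could work, the short sketch below parallelizes the per-file analysis – here only across local CPU cores with Python’s standard library, rather than across Azure machines. The directory layout and the detect_calls placeholder are assumptions for illustration, not Conservation Metrics’ actual pipeline.

    # Illustrative sketch only: fan the per-file analysis out across worker
    # processes; a cloud deployment would distribute the same work across many
    # machines instead of local cores.
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def detect_calls(path):
        """Placeholder for the per-file classifier; returns a count of detected calls."""
        # In practice this would load the recording and run the trained model.
        return 0

    def analyze_archive(recording_dir, workers=8):
        files = sorted(Path(recording_dir).glob("*.wav"))
        with ProcessPoolExecutor(max_workers=workers) as pool:
            counts = pool.map(detect_calls, files)
        return {f.name: c for f, c in zip(files, counts)}

    if __name__ == "__main__":
        print(analyze_archive("recordings/"))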

The sensors’ sound files are currently stored on memory cards that must be collected on foot by workers in the park, transported 500 miles to the city of Brazzaville and express-mailed to Ithaca. Eventually, ELP and Conservation Metrics hope to create a way to run the artificial intelligence tools directly in the African forests; then the numbers of elephants in each area could simply be texted to the United States.

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews. For additional information, see this Cornell Chronicle story.

-30-