Team in $10M XPRIZE contest combines Indigenous wisdom, AI data analysis
What is the rainforest worth?
Six Arizona State University researchers are part of a team that’s creating a new way to measure rainforest biodiversity in the $10 million XPRIZE Foundation competition.
The group, called Team Waponi, is one of six finalist teams in the competition. They’re now working on refining their solution, which involves using a drone to drop a device onto the tree canopy to measure sound, record images and take insect and plant samples from one square kilometer of rainforest.
The project will combine artificial intelligence data analysis with the knowledge of the Indigenous people in the region, who are sharing their insights into the plants, animals and insects of their ecosystem.
The final round of the competition will be in summer 2024 in the Amazon rainforest in Brazil, where each team will deploy its solution for 24 hours and then have 48 hours to analyze the results. The team that is judged to have the most viable, scalable solution gets $10 million.
The ASU members of Team Waponi are:
- David Manuel-Navarrete, associate professor in the School of Sustainability and affiliate faculty in the Center for Behavior, Institutions and the Environment.
- Garth Paine, professor of digital sound and interactive media in the School of Arts, Media and Engineering and the School of Music, Dance and Theatre and a Senior Global Futures Scientist.
- Nicholas Pilarski, associate professor in the School of Arts, Media and Engineering and The Sidney Poitier New American Film School and a Global Futures Scientist.
- Ankita Shukla, postdoctoral research scholar in the School of Arts, Media and Engineering.
- Tod Swanson, associate professor of religious studies in the School of Historical, Philosophical and Religious Studies and a Senior Global Futures Scientist.
- Pavan Turaga, professor and director of the School of Arts, Media and Engineering and professor in the School of Electrical, Computer and Energy Engineering.
Team Waponi is led by Tom Walla, a professor of biology at Colorado Mesa University who focuses on tropical ecology. The entire group has more than three dozen members from institutions all over the world.
“What I really liked about the XPRIZE is that it combines my value system for preserving the rainforest with impact,” Walla said.
“I would really like to make a difference in conserving the rainforest directly, in addition to the impact I’ve had indirectly through teaching.”
The XPRIZE Foundation is a nonprofit with the goal of incentivizing projects that will improve the planet. Other XPRIZE categories include space, climate, technology, food and water, health and education. The goal of the rainforest contest is to quantify biodiversity and bring its value into the economy.
“XPRIZE has done a good job of framing the ultimate conundrum in that the nature of what a rainforest is worth is the amount you would pay for a cow to feed on the grass where the rainforest used to be,” Walla said.
“The people who sell cows and who grow soybeans know exactly what an acre of rainforest is worth to them and they will pay it.
“But we know that an intact rainforest is worth so much more. And yet we don’t have a measurement for it.”
So the goal is to produce a metric to value the rainforest.
“If we do a pretty good job, we might save the rainforest by bringing its value to the marketplace,” Walla said.
“What we hope to generate is a solution that will land on the world stage as a tool that conservation organizations, large companies and businesspeople who are interested in preserving the value of an intact tropical rainforest can use to measure the positive effects of conservation.
“But it’s hard because many of the species are not described. When we collect insects, 60% or 70% don’t have a name. We’re trying to quantify something and we don’t know what it is.”
Decoding the cacophony
Team Waponi, which has been working on the project for a few years, became a finalist in June after the semifinal competition in a Singaporean rainforest.
The team, in partnership with Outreach Robotics, used drones to drop five raft-like devices of its own design, called Limelights, onto the canopy. Each Limelight records sounds and images and carries traps with special lights to attract insects. In Singapore, the devices collected hundreds of plant and insect samples, whose DNA was then analyzed.
Paine created the bioacoustic recorders. His specialty is acoustic ecology and the maintenance of biodiversity for a healthy environment. He built on his previous work creating a mesh of machine-learning devices to identify bird species in the rainforest of Costa Rica.
“We overlook sound all the time, but sound is probably the richest data source that's available to us, whether it's in our daily life or whether it's in saving the rainforest,” he said.
Human hearing is not very refined, he said.
“It’s really attuned to hearing our mother's voice and not very good in high frequencies or low frequencies.
“So when you hear recordings of the Amazon at night, it's just a cacophony, right? But each of these species has what's called a sonic niche — a frequency range in which they talk to each other.”
Paine developed recording devices to analyze species density in the sounds.
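The “sonic niche” idea can be illustrated with a toy example. The Python sketch below is not Team Waponi’s code; it is a hypothetical, minimal demonstration of the underlying principle: if each species calls in its own frequency band, then measuring the energy in each band of a recording reveals which niches are occupied.

```python
# Toy illustration of the "sonic niche" idea: species vocalize in distinct
# frequency bands, so band-wise spectral energy hints at which niches are
# occupied. All frequencies and thresholds here are illustrative.
import numpy as np

def band_energies(signal, sample_rate, bands):
    """Return the spectral energy in each (low_hz, high_hz) band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

# Synthesize two "species" calling in separate niches: 1 kHz and 6 kHz tones.
sr = 22050
t = np.arange(sr) / sr  # one second of audio
audio = np.sin(2 * np.pi * 1000 * t) + 0.5 * np.sin(2 * np.pi * 6000 * t)

bands = [(500, 1500), (3000, 5000), (5500, 6500)]
energies = band_energies(audio, sr, bands)
occupied = [b for b, e in zip(bands, energies) if e > 1e-3 * sum(energies)]
print(occupied)  # the 1 kHz and 6 kHz niches light up; the middle band is quiet
```

Real rainforest audio is far messier than two clean tones, which is exactly the challenge the team describes below.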
“I'm also on the composition faculty in the School of Music, and my other hat is music and sound. So I produced some soundscapes of the Ecuadorian rainforest, which were part of our early-stage submissions to the XPRIZE.
“The judges loved them and said, ‘Oh, you can really do this. You can get the data and dig into what sound means here.’”
Now, Team Waponi is working on upgrading the Limelight devices before the final competition in Brazil next year.
Turaga said the cacophony in the rainforest recordings is like the “cocktail party effect,” making it hard even for artificial intelligence to discern individual sources.
“We couldn’t differentiate the sounds, but we can use machine learning to see how diverse the sounds were,” said Turaga, who had used AI with photographic images but not with sound before.
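Measuring how diverse a soundscape is without naming any species resembles published ecoacoustic measures such as the Acoustic Diversity Index. The sketch below is a hypothetical illustration of that general idea, not the team’s pipeline: split the spectrum into bands and take the Shannon entropy of the energy distribution, so a recording with energy spread across many bands scores higher than one dominated by a single band.

```python
# Hypothetical sketch of a species-agnostic diversity score: Shannon entropy
# of spectral energy across frequency bands (the idea behind ecoacoustic
# indices such as the Acoustic Diversity Index). Not Team Waponi's code.
import numpy as np

def spectral_entropy(signal, sample_rate, n_bands=10, max_hz=10000):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    edges = np.linspace(0, max_hz, n_bands + 1)
    energies = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())  # higher = energy spread over more bands

sr = 22050
t = np.arange(sr) / sr
one_species = np.sin(2 * np.pi * 1500 * t)
many_species = sum(np.sin(2 * np.pi * f * t) for f in (500, 2500, 4500, 6500, 8500))

# A chorus of five "species" scores higher than a lone caller.
assert spectral_entropy(many_species, sr) > spectral_entropy(one_species, sr)
```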
Shukla is working on improving her algorithm to identify bird species from the enormous amount of bioacoustics data, which has been challenging, she said.
“The data used to train the model were nicely cleaned bird recordings from Xeno-Canto. But the acoustic data from the rainforest had constant noise masking a lot of the signals.
“I had to identify the chunks where it heard something like a bird and then I used my machine learning AI model to identify the potential bird in it,” she said.
“The model said it identified a bird but when we cross verified it, it didn’t turn out to be that bird.”
So now, people who are experts in bird species are labeling the data sets so Shukla can retrain her machine-learning model.
“The best way is a hybrid system — using expertise from humans to improve your system and vice versa. I think of machine learning as an assistance tool, especially when you’re dealing with problems in social-good applications,” she said.
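The hybrid loop Shukla describes — the model makes a guess, human experts check and correct it, and the corrections feed back into training — can be sketched minimally. Everything below is hypothetical (the species names, the confidence threshold, the data structures); it only illustrates the routing step of such a system.

```python
# Minimal, hypothetical sketch of a human-in-the-loop routing step: the model's
# least-confident detections are sent to expert ornithologists for labeling,
# and those corrected labels would then be used to retrain the model.
from dataclasses import dataclass

@dataclass
class Detection:
    clip_id: str
    predicted_species: str
    confidence: float

def clips_for_expert_review(detections, threshold=0.8):
    """Route low-confidence model predictions to human experts."""
    return [d for d in detections if d.confidence < threshold]

detections = [
    Detection("clip_001", "screaming piha", 0.95),
    Detection("clip_002", "musician wren", 0.42),    # noisy clip: flag for review
    Detection("clip_003", "undulated tinamou", 0.63),
]
to_review = clips_for_expert_review(detections)
print([d.clip_id for d in to_review])  # ['clip_002', 'clip_003']
```

The design choice is the threshold: set it too low and mislabeled clips slip through; too high and the experts drown in work, defeating the point of the model.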
Turaga said, “Machine learning or AI isn’t going to take our jobs yet.”
Elevating Indigenous knowledge
While the Limelight devices literally swoop onto the rainforest canopy and then fly away after 24 hours, the overall project must be deeply embedded in the local community, which takes trust and patience to build. Two ASU professors who already have years of experience in the rainforest are working to ensure that Indigenous knowledge is embraced in the project.
Manuel-Navarrete and Swanson both work with Iyarina, an Indigenous-focused research and education organization in Ecuador. Swanson is academic director of the Andes and Amazon Field School, which is part of Iyarina, and Manuel-Navarrete is working on a solar-powered canoe project with the nonprofit.
“There are two ways in which Indigenous knowledge can contribute,” Manuel-Navarrete said.
“One is to help us design the measuring devices by incorporating their actual knowledge about species — where to find them, how they behave, what sounds they produce,” he said.
“The other part is the relationship they have with these species, which goes beyond the perceptions of the senses and beyond the sounds they make. It has to do with the meanings of the species within their culture.”
This requires a shift in thinking, he said.
“It’s easy to think we can solve this problem with just more information and better governing and policies without changing our own culture,” he said.
“The Western culture that has seen humans as apart and as separated from ecosystems is the root cause of this problem, and we cannot solve it without changing the way we know and the way we relate and the way we are.”
Swanson, who grew up in Ecuador and is fluent in the Indigenous language of Kichwa, researches the Indigenous peoples’ social relation to plants, animals, the Earth and water in the Amazon.
“Traditionally, Amazonian Kichwa people engaged plants and animals as though they were human-like, and as though they could respond to empathy and politeness from humans as well as respond to bitterness. And they have language to communicate with each other,” he said.
“So they are not manageable as just natural resources. They have to be engaged more like a human being would need to be engaged.”
Swanson and his colleagues have been creating a series of short videos to capture key aspects of the Indigenous relationship to plants and animals through the voice of community elders. There are already more than 300 videos in three Indigenous languages. Through the XPRIZE work, the videos will eventually be turned into a searchable platform that will include scientific names and data.
“The end goal will be a linking of Indigenous knowledge and scientific knowledge of the species. And this will be connected to a technology where you could go into the area and it would be able to give you a good sense of the species that are available locally,” Swanson said.
A drone picks up the Limelight device from the rainforest canopy in Singapore during the XPRIZE semifinal competition in June. Photo courtesy Team Waponi
ASU Professor Garth Paine created the bioacoustics recording device for the $10 million XPRIZE Foundation competition to measure biodiversity in the rainforest. Photo courtesy Team Waponi
An image from a Team Waponi video shows some of the bugs collected by the Limelight device during the semifinal competition in Singapore in June. Photo courtesy Team Waponi
Creating a process
Another crucial component of the project is communication. How can Team Waponi share what they have learned and express the value of an intact rainforest?
Pilarski, whose expertise is in interactive media such as virtual reality, said the end result is unknown at this point, but will combine all of the elements into an immersive experience that feels true to everyone involved.
“Actually, we're not too interested in telling a story. We’re interested in co-creating a process in which there is a mutual pathway of information and different forms of knowledge being used, and then using that actually to develop VR,” he said.
“This is not something that you can just dream up, write on a piece of paper and then execute. This mutual exchange is going to get us something that no one has seen before.”
Pilarski, who will travel to the site at some point, is having people collect 360-degree video now, which will be combined with the bioacoustics soundscapes.
“We’re not just thinking of this as a final product, which will be fantastic for the XPRIZE,” he said.
“We’re creating a replicable model that communities can use as a roadmap.”
Top image: A drone drops a Limelight device onto the rainforest canopy in Singapore during the XPRIZE semifinal competition in June. Photo courtesy Team Waponi