By Kathleen Wong, Director of Communications at the UC Natural Reserve System
Animal whereabouts are a prime indicator of ecosystem health. Species on the move can suggest local conditions are becoming inhospitable to some or shifting to favor others. Understanding where species are going, and how they’re faring, can aid conservation by identifying the need for migration corridors and establishing protections for those lands.
Yet keeping tabs on animals is no easy feat. For accurate identification, animals have long had to be physically captured and examined by experts.
Technology has the potential to make biodiversity monitoring far more efficient. Scientists can review photos of animals from camera traps or listen for their calls on audio recordings. But managing these devices remains labor intensive. Camera traps are notorious for snapping photos of waving branches, leaves, and falling rain, leaving human reviewers to sift through thousands of photos to glean a handful of animal sightings. Accessing the images involves trekking into the field to remove memory cards. And experts are still needed to distinguish between similar species, such as a deer mouse versus a California vole.
A new project funded by a UC Natural Reserve System Climate Award grant aims to make those problems artifacts of the past. “We’re putting a generalized recognition system out in the field to get some very sophisticated processing to occur, and then get data returned to you from very hard to reach places,” says UC Berkeley data scientist Collin Bode, a principal investigator on the project.
“We don’t want to totally replace biologists. But we’re limited in the supply of them. And this system is like putting a biologist out in the field 24 hours a day,” says Lisa Micheli, a co-principal investigator on the project and director of Pepperwood, an independent field station in Sonoma County.
The system has two main linchpins: using AI in the field to identify species, and communicating those records to a database via satellite.
“The datasets from camera traps are ginormous,” Bode says. “I can put ten years of weather station data” (stored as text) “in the same space as a single image.” Identifying an animal in the field means the system has no need to transmit gigabytes of images. Instead, it can record a species detection event as “Western fence lizard, 90% confidence level identification, July 2, 2023, 1400 PST” and send the text via satellite in real time.
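A detection event like the one quoted above amounts to a few dozen bytes of text. As a rough illustration (the field names and message format here are assumptions for demonstration, not the project's actual schema), a minimal sketch in Python:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DetectionEvent:
    """One species detection, compact enough for a satellite uplink."""
    species: str
    confidence: float  # classifier confidence, 0.0 to 1.0
    timestamp: datetime

    def to_message(self) -> str:
        # Compact comma-separated text record in place of a multi-megabyte image
        return f"{self.species},{self.confidence:.2f},{self.timestamp.isoformat(timespec='minutes')}"

event = DetectionEvent("Western fence lizard", 0.90, datetime(2023, 7, 2, 14, 0))
message = event.to_message()
print(message)  # Western fence lizard,0.90,2023-07-02T14:00
```

A single camera trap image can run to several megabytes; a record like this fits in well under 100 bytes, which is what makes transmission over a low-bandwidth cubesat link practical.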
The project will integrate smart devices able to transmit sensor information with advances in artificial intelligence and the ubiquity of cube satellites. The resulting system aims to gather, send, and record animal detections autonomously, minimizing the need for biological experts and hands-on equipment maintenance. And with a new network of sites monitoring climate impacts being adopted statewide, the project has the potential to exponentially increase the amount of biodiversity information being gathered from every corner of California.
All images will be stored for validation by a human being. “Even with all the advancements, machine learning still makes a lot of mistakes,” Bode says.
Image recognition advances
Automating animal identifications is possible thanks to several technological improvements. Among these is the appearance of sophisticated image recognition software. A decade ago, Bode recalls struggling to get computers to distinguish between houses and trees in aerial LIDAR images, and needing weeks for a computer to do the image processing. “Now they do that in real time in a car that drives itself,” he says. Processing such as this conducted on a local computer is known as edge computing, as opposed to cloud computing, where processing is done by remote servers.
Machine learning and image recognition software packages are now not only readily available, but come pre-installed on computer boards engineered for object recognition. These optimized systems are not only fast, but eerily accurate. Bode cites a manufacturing plant requiring hard hats that installed such a system at its doors. The system can spot a person trying to enter without a hard hat, sound an alarm, and close the gates to keep the bareheaded out of the building.
“It’s doing all the image processing to recognize a head, hat, no hat, all on that little machine. And if it can recognize a hard hat on a head, I’m pretty sure it can recognize a California bullfrog,” Bode says.
Integrating different devices
Once the computer identifies the species, it stores this information until it can hail a cube satellite orbiting overhead. Thousands of these hand-sized satellites have been launched into orbit in recent years, offering coverage to every corner of the globe. The cubesat can then relay the data to Dendra, a cyberinfrastructure service that ingests, monitors, and delivers time-stamped information from environmental sensors. The NRS uses Dendra to manage data from its climate stations, sap flow meters, stream gauges, and other types of field sensors. Bode is also one of its developers.
“The goal here is to create a small, all-in-one system that you can deploy anywhere in the world,” says J. Scott Smith, a freelance full stack developer working with Bode on the animal recognition and reporting system. Smith and Bode also co-developed Dendra.
The system also has the potential to host multiple types of sensors at once. These sensors could provide additional clues to help narrow down animal identifications. For example, an infrared sensor could distinguish whether an animal was warm and likely to be a mammal, or cooler and probably a reptile or amphibian. Or a humidity sensor could demonstrate that some species emerge only when it rains.
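One way such fusion could work, sketched below purely as an illustration (the species groupings, temperature threshold, and weighting factors are assumptions, not project specifics): re-weight the image classifier's candidate species according to whether an infrared reading suggests a warm-blooded animal.

```python
# Illustrative sensor-fusion sketch: the groupings, threshold, and
# weights below are assumptions for demonstration only.
ENDOTHERMS = {"deer mouse", "california vole"}          # warm-blooded
ECTOTHERMS = {"western fence lizard", "pacific newt"}   # run near ambient temperature

def refine(candidates: dict, subject_temp_c: float, ambient_c: float) -> dict:
    """Re-weight image-classifier candidates using an infrared reading."""
    looks_warm = (subject_temp_c - ambient_c) > 5.0  # assumed threshold
    scored = {}
    for species, conf in candidates.items():
        agrees = (species in ENDOTHERMS) == looks_warm
        scored[species] = conf * (1.5 if agrees else 0.5)
    total = sum(scored.values())
    return {s: v / total for s, v in scored.items()}

# A subject notably warmer than ambient shifts the odds toward the mammal
refined = refine({"deer mouse": 0.5, "western fence lizard": 0.5},
                 subject_temp_c=34.0, ambient_c=20.0)
```

In a real deployment these weights would presumably be learned from labeled data rather than hand-set, but the principle is the same: an extra sensor channel breaks ties the camera alone cannot.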
A studio for animal portraits
The researchers will test their systems with two types of sensor inputs: images from camera traps, and sound clips from audio recorders. The camera traps will target small animals—the lizards and mice, snakes and newts that make up the foundation of the food web. These species are difficult for observers to spot and too tiny to wear radio collars.
To funnel animals toward the camera traps, the scientists will set up drift fences. These are lengths of fabric staked to the ground to form a foot-high barrier. “If a small creature, a lizard or a mouse, runs into it, they can’t climb over, so they’ll go left or right. And at the end, we have a little turnaround, so they’ll essentially run down the opposite side of the fence until they run through the camera trap,” Bode says.
The fence will direct animals into an upside-down plastic bucket into which entrance and exit holes have been cut. A camera trap mounted in the chamber’s ceiling will snap photos of anything passing through. The chamber provides a protected space that prevents blowing dirt, waving vegetation, or precipitation from accidentally triggering the camera.
“It’s important to put the camera facing down so it can capture the little guys. A lot of our endangered species, especially amphibians, are in this size class. And having a noncontact way of sampling them reduces the risk of harm to the animal,” Micheli says.
Training the machine
The researchers will train the camera trap AI by showing it images of known species photographed from above. They plan to set up sample camera traps at Pepperwood, and ask staff biologists to identify the species that pass through. They also have access to half a million camera trap images gathered by the California Department of Fish and Wildlife, which operates drift fence setups of a similar design. The teaching dataset will need to include images of animal species found in the region photographed in various locations within the bucket, in a range of lighting, in different poses.
The camera trap systems will be deployed at Pepperwood Preserve and at either the NRS’s Fort Ord Natural Reserve near Monterey or the NRS’s Hastings Natural History Reservation in Carmel Valley. Their respective directors, Joe Miller and Jen Hunter, have extensive experience monitoring local fauna with camera traps.
“Having protected places where people have this biological knowledge available to test this system is rare. So this is definitely a project that could only be done in collaboration with field stations like those in the NRS; you need multiple flavors of nerds” to make it work, Bode says.
The sensors in the second system will be AudioMoths, devices that can be programmed to record sounds during certain times of day. The system will be trained to listen for the calls of American bullfrogs. An invasive species in California, this amphibian has been documented downstream of the river reach that flows toward the NRS’s Angelo Coast Range Reserve. The system could notify the reserve when the bullfrog nears reserve waters.
Bode suspects that for the AI, bullfrog IDs will be a piece of cake. “Bullfrog croaks are really distinctive; all the other competing native frogs are tree frogs that make much higher-pitched tones,” he says.
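That pitch difference suggests a very simple first-pass detector. The sketch below (the 400 Hz cutoff and the synthetic test tones are illustrative assumptions, not the project's method) flags clips whose spectral energy is concentrated in the low band where a bullfrog croak would sit:

```python
import numpy as np

def low_band_ratio(signal: np.ndarray, sample_rate: int, cutoff_hz: float = 400.0) -> float:
    """Fraction of a clip's spectral energy below cutoff_hz.
    A low-pitched bullfrog croak should dominate this band; higher-pitched
    tree frog calls should not. (The cutoff is an illustrative assumption.)"""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(spectrum[freqs < cutoff_hz].sum() / spectrum.sum())

# Synthetic check: a 150 Hz tone stands in for a croak, a 2 kHz tone for a tree frog call
sr = 8000
t = np.arange(sr) / sr
croak_like = np.sin(2 * np.pi * 150 * t)
chirp_like = np.sin(2 * np.pi * 2000 * t)
```

A trained classifier on labeled recordings would do the real work; a band-energy ratio like this only illustrates why the croak's distinctively low pitch makes bullfrogs an easy target.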
Getting apples and oranges to converse
A major challenge is to get the various technologies in the system talking to one another. “It’s like connecting Lego bricks to Tinker Toys: you’re trying to combine different sets of equipment together,” says Smith. Smith is charged with getting all of the devices to work together, so that the solar panels can power the sensors and computers, and the computer can talk to the radio antennas and connect to the cube sats.
Making the system stable and autonomous is yet another goal. “It’s going to have to operate on its own, so no one has to come out and reboot it,” Smith says.
If anyone can get these devices working together reliably, it would be Smith and Bode. They’ve become experts at importing data from technologies old and new onto Dendra. Having so much sensor information in one system should help scientists make new correlations, such as identifying that salamanders are more likely to emerge from their burrows on rainy nights.
A tool for conservation
The new system promises to help California preserve its irreplaceable biodiversity. It will enable scientists to observe animal activity in areas such as remote mountains that are difficult for people to reach. It will also make better use of biologists, who can focus their time devising ways to improve species survival instead of clicking past empty camera trap photos.
The drift fence with bucket camera setup is likely to be adopted by a nascent sentinel site network established to monitor climate change impacts across the state. The sentinel sites are a project of the California Biodiversity Network, a consortium of environmental and community organizations working to conserve the natural heritage of the Golden State. Sentinel site participants include the NRS, private field stations such as Pepperwood, and the California Department of Fish and Wildlife, among other entities. CDFW has already decided to deploy the method at 42 new sentinel sites it has established in habitats across the state.
The system being developed by Bode, Smith, and Micheli has great potential to streamline the intake of biodiversity information from these locations. “It would be a huge advancement if this data could automatically be relayed from the many sentinel sites already using Dendra to capture and manage climate data,” says Micheli.
The additional data gleaned from the system offers a means to ground truth California’s efforts to conserve 30 percent of state lands by 2030. “The assumption is that the land being protected is actually sheltering biodiversity. But if you do not have a mechanism to determine whether that land is being effective at preserving species, then we could lose species at the same rate we are now,” Bode says.
“The focus on monitoring climate with biodiversity will enable us to track trends over time in the face of an uncertain future,” Micheli says.
The project’s potential to improve state resilience to climate change enabled it to qualify for $150,000 in climate award funding from the UC Natural Reserve System. The grant comes from a $1 million award to the UC Natural Reserve System to seed climate-focused entrepreneurial efforts. This is part of $100 million received by the University of California Office of the President from the state of California to fund applied solutions to California’s pressing climate needs.
This article was originally featured on the UC Natural Reserve System website as a news post on August 10, 2023. All rights belong to the UCNRS, and permission to post on Pepperwood’s website was granted August 14, 2023. To view the original article or leave comments for UCNRS, please visit https://ucnrs.org/using-tech-to-break-bottlenecks-in-wildlife-monitoring/.