Humpback whales, such as the mother and baby photographed here in Tonga, engage in complicated communications. The Earth Species Project (ESP) is partnering with researchers to develop new tools for understanding animal vocalizations to help save wildlife. (Katie Zacarian Photo)
My whippet Roxy is smart. She understands dozens of words and phrases well beyond "sit" and "stay." She'll "go find dad" when commanded and run to the window when asked "who's here?"
So a couple of years ago I bought these big, paw-friendly buttons that play a recorded word when pressed. The YouTube video that sold me on the technology features a pup pushing buttons to elicit the phrase "love mom." I thought Roxy might find the tool empowering. Her big brown eyes and expressive brows can only say so much.
It turned out that while Roxy seemed to understand the concept, she despised the buttons.
But what if instead of pressuring Roxy to speak English to me, I could go full Dr. Dolittle and speak whippet to her?
A nonprofit called Earth Species Project, or ESP, is on a trajectory to talk with animals, but with a much loftier purpose than my goals with Roxy.
ESP is working with more than 40 research efforts around the globe, using machine learning and artificial intelligence to help scientists understand animal communications in pursuit of saving imperiled species. The organization recently received $1.2 million in funding from the Seattle-based Paul G. Allen Family Foundation to support its work. Allen, the Microsoft co-founder who passed away in 2018, was interested in both wildlife protection and AI research.
"[ESP has] this amazing practical near-term conservation benefit, with this long-term, moonshot vision of being able to communicate with animals," said Gabriel Miller, technology director for the foundation.
The technology could unlock valuable insights into the lives of wild animals and lead to more effective conservation, while also inspiring a greater human kinship with the natural world, say ESP leaders and supporters. They suggest the research could have an impact akin to the 1970s album of humpback whale songs that spurred critical marine mammal protections, or the "Earthrise" photo from Apollo 8 that captured the planet's vulnerability and stoked environmental concerns.
"Understanding animal communication can essentially transform human perspective on how we relate to the rest of nature," said Jane Lawton, ESP's director of impact.
But while ESP's intentions are to bolster a connection to animals and increase protections, the work raises serious moral and safety concerns for wildlife. Just as there are fears about the impact of AI and large language models on the human condition, there are significant risks to the natural world.
A paper this year from Princeton University professors called out the neglect of animals in the field of AI ethics. The article, published in the journal AI Ethics, said that it remains "the responsibility of AI companies, AI developers and scientists to identify, predict, and thereby as far as possible prevent, harms that are done to animals."
Karen Bakker, a University of British Columbia professor and expert on the issue, is likewise worried.
"The ethical questions that arise when we imagine using digital technology to try to talk to animals are really complex," Bakker said in an interview on the KUOW podcast The Wild.
ESP is developing foundational machine learning models that perform operations such as the detection and classification of animal vocalizations, as well as separating vocalizations from background noise. Funding from the Allen Family Foundation will support its work on multi-modal models that pair audible communications with visual observations, videos and movement-detecting monitors, providing context for the sounds.
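To make that concrete, here is a minimal, hypothetical sketch (not ESP's actual code) of what a detection-and-classification pipeline can look like: raw audio is converted to a log-mel spectrogram, and a small classifier scores whether a clip contains a target vocalization or only background noise. The file name, label count and toy network are illustrative assumptions.

```python
# Hypothetical sketch of vocalization detection/classification, not ESP's code.
import torch
import torch.nn as nn
import torchaudio

def load_spectrogram(path: str, sample_rate: int = 16_000) -> torch.Tensor:
    """Load an audio clip and convert it to a log-scaled mel spectrogram."""
    waveform, sr = torchaudio.load(path)  # (channels, samples)
    if sr != sample_rate:
        waveform = torchaudio.functional.resample(waveform, sr, sample_rate)
    mel = torchaudio.transforms.MelSpectrogram(sample_rate=sample_rate, n_mels=64)(waveform)
    return torch.log(mel + 1e-6)  # log scale keeps quiet calls from vanishing

class CallClassifier(nn.Module):
    """Toy CNN that labels a clip, e.g. 'crow call' vs. 'background noise'."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, n_classes),
        )

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        # Add a channel dimension: (batch, 1, mel bins, time frames)
        return self.net(spec.unsqueeze(1))

# Hypothetical usage: class scores for one recording.
# spec = load_spectrogram("beluga_clip.wav")
# logits = CallClassifier(n_classes=2)(spec[:1])
```

In practice a backbone like this would be trained on large sets of labeled field recordings, and one plausible reading of "species-agnostic" tooling is a shared backbone whose final layers are fine-tuned for each project, in the spirit of the partnerships described next.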
The nonprofit is developing tools that are species-agnostic and can be fine-tuned for specific research. It has dozens of partnerships, including with scientists who are researching vocal communications between carrion crows and a team investigating the calls of beluga whales and what they reveal about their social structure, among other projects.
To understand how such decoding could aid protections, consider elephants. Insight into their calls could tell conservationists when and where a herd is planning to migrate, so they could take action to ensure safe passage. Or if humans could eavesdrop on whales to learn when they're surfacing, diving or hunting prey, it could be possible to prevent deadly ship collisions.
But while AI could be used to guide animals away from danger, it could also be used to lure them to hunters and poachers.
The ESP co-founders all come from technology backgrounds. CEO Katie Zacarian was an early Facebook employee in New York; President Aza Raskin is a tech entrepreneur and also a co-founder of the Center for Humane Technology; and senior advisor Britt Selvitelle was on the founding team at Twitter.
The most ambitious element of ESPs work is creating two-way communication with animals. The idea follows the approach used in developing large language models like those that power ChatGPT and related generative AI tools that allow humans to engage in verbal dialogue with machines. The models are trained on massive volumes of writing, learning the patterns of speech.
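As a rough illustration of how those models learn (a toy sketch, not any production system), the code below trains a tiny network to predict the next token in a sequence. In a real LLM the integer tokens stand for pieces of text and the network is a far larger transformer; the vocabulary size, random training data and small GRU here are assumptions chosen for brevity.

```python
# Toy sketch of next-token training, the core recipe behind large language
# models: given the tokens seen so far, predict the one that comes next.
# Vocabulary size, random data and the small GRU are illustrative assumptions.
import torch
import torch.nn as nn

VOCAB = 128  # hypothetical vocabulary of integer token IDs

class NextTokenModel(nn.Module):
    def __init__(self, vocab: int = VOCAB, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        hidden, _ = self.rnn(self.embed(seq))
        return self.head(hidden)  # logits for the following token at each position

model = NextTokenModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on random sequences standing in for real text (or, in
# principle, for discretized animal call units).
sequences = torch.randint(0, VOCAB, (8, 20))    # 8 sequences of 20 tokens each
logits = model(sequences[:, :-1])               # predict each token from its predecessors
loss = loss_fn(logits.reshape(-1, VOCAB), sequences[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
```

Swap the text tokens for units derived from animal recordings and, at least in principle, the same recipe applies, which is the extension described next.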
Researchers can also use recordings of animal communications to build a model that converses with non-humans. But the pursuit has a particularly curious and controversial twist.
There's a high likelihood, ESP's Lawton said, that if a model achieves fluent, two-way communication between a computer and an animal, we will not, as human beings, understand the meaning of that communication.
That could be problematic. There is plenty of evidence that generative AI can use inappropriate and erroneous language in the human realm, suggesting that a computer could likewise misspeak to an animal without the knowledge or understanding of humans.
There's a lot of uncertainty about the outcome of animal-machine dialogue.
"So this could simply land flat, like the animals don't care, they dismiss it," Bakker said on KUOW. "Or it could actually do a great deal of harm."
The Allen Family Foundation funding will also support ESP's generation of novel vocalizations for use in playback experiments, which may result in machine learning models engaging in two-way communication with another species. Researchers working with ESP are currently experimenting with communication with zebra finches in a lab setting.
ESP is working with research partners and ethics experts to determine whether there is a need to develop a set of guardrails to regulate the use of AI in animal communication research, Lawton said. The ethical issues in this space are a concern for the foundation, said Miller, and a point of active discussion.
Speaking on KUOW, ESP's Raskin suggested the need for laws addressing cross-species communication akin to the Geneva Conventions' humanitarian rules. Bakker countered that wasn't enough, and called for the scientific community to develop protocols in this space that are similar to those restricting the use of CRISPR gene editing.
While the researchers and supporters acknowledge the serious ethical concerns that need attention, they champion the technology's potential for good.
Decades ago, Miller himself studied hummingbird vocalizations. He would have loved access to high-tech tools, he said, to decipher the diversity of the calls and discover how the birds learn to communicate with them.
"It makes data a lot more usable and practical," he said. "The amount of power in these approaches is hard to overstate."