'Big risks': Obama and tech experts address harms of AI to marginalized communities – NBC News

CHICAGO – More must be done to curb AI's potential to harm or further marginalize people of color, a panel of experts weighing the ever-widening reach of AI warned last week.

The warning came during a panel discussion here at the Obama Foundation's Democracy Forum, a yearly event for thought leaders to exchange ideas on how to create a more equitable society. This year's forum was focused on the advances and challenges of AI.

During a panel titled "Weighing AI and Human Progress," Alondra Nelson, a professor of social science at the Institute for Advanced Study, said AI tools can be incorrect and even perpetuate discrimination.

"There's already evidence that the tools sometimes discriminate and sort of amplify and exacerbate bias in life – big problems that we're already trying to grapple with in society," Nelson said.

A 2021 paper published by AI researchers showed how large language models can reinforce racism and other forms of oppression. People in positions of privilege tend to be overrepresented in the training data for language models, and that data carries encoded biases like racism, misogyny and ableism.

Furthermore, in the last year alone, multiple Black people have said they were misidentified by facial recognition technology, which is based on AI, leading to unfair criminalization. In Georgia, 28-year-old Randall Reid said he was falsely arrested and jailed in 2022 after Louisiana authorities used facial recognition technology to secure an arrest warrant linking him to three men involved in a theft. Noticeable physical differences, including a mole on his face, prompted a Jefferson Parish sheriff to rescind the warrant.

Porcha Woodruff sued the city of Detroit for a false arrest in February. Her lawsuit accuses authorities of using an unreliable facial recognition match in a photo lineup linking her to a carjacking and robbery. Woodruff, who was eight months pregnant at the time, was charged and released on a $100,000 personal bond. The case was later dropped for insufficient evidence, according to the lawsuit.

In polls, Black people have already expressed skepticism over the technology. In April the Pew Research Center found that 20% of Black adults who see racial bias and unfair treatment in hiring as an issue said they think AI would make it worse, compared to about 1 in 10 white, Asian and Latino adults.

Former President Barack Obama, in the forum's keynote address, said he was encouraged by the Biden administration's recently signed executive order on AI, which established broad federal oversight of and investment in the technology and which Obama provided advice on, but acknowledged that there are some "big risks" associated with it.

During the panel, Hany Farid, a professor at the University of California, Berkeley, said that predictive AI in hiring, in the criminal legal system and even in banking can sometimes perpetuate human biases.

"That predictive AI is based on historical data," Farid said. "So, if your historical data is biased – which it is, against people of color, against women, against the LGBTQ community – well, guess what? Your AI is going to be biased. So, when we push these systems without fully understanding them, all we are doing is repeating history."
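Farid's point can be made concrete with a toy model. The sketch below is purely illustrative – the data, group names and scoring rule are all invented – but it shows how a predictor trained only on past decisions reproduces whatever disparity those decisions contain.

```python
# A minimal, self-contained sketch (all numbers hypothetical): a "predictive"
# hiring model that learns nothing more than the historical hire rate for each
# group. If the history is biased, the predictions reproduce that bias.

from collections import defaultdict

# Hypothetical historical hiring decisions: (group, hired)
history = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

# "Training": tally past outcomes per group.
totals = defaultdict(int)
hires = defaultdict(int)
for group, hired in history:
    totals[group] += 1
    hires[group] += int(hired)

def predicted_hire_probability(group: str) -> float:
    """The model's score is simply the historical hire rate for the group."""
    return hires[group] / totals[group]

for group in ("group_a", "group_b"):
    print(group, predicted_hire_probability(group))
# Output: group_a 0.75, group_b 0.25 -- the historical disparity carries
# straight through into every future prediction.
```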

Over the past two years, Nelson has been working within the White House Office of Science and Technology Policy, focusing on the equitable innovation of AI to include many people and voices, she said. Under the Biden administration, her team developed the Blueprint for an AI Bill of Rights, a guide to protecting people from the threats of automated systems that includes insights from journalists, policymakers, researchers and other experts.

More conversations about AI are happening around the globe, Nelson said, which is "really important," and she hopes that society will seize the opportunity.

"Even if you're not an expert in mathematics, you can have an opinion about this very powerful tool that's going to accomplish a quite significant social transformation," Nelson said. "We have choices to make as a society about what we want our future to look like, and how we want these tools to be used in that future – and it really is going to fall to all of us and all of you to do that work."


Claretta Bellamy is a fellow for NBC News.
