Brown computer scientist aims to protect people in an age of artificial intelligence

On the occasion of the AI Bill of Rights announcement, Venkatasubramanian, who is deputy director of Brown's Data Science Initiative, shared insights and perspectives on his stint at the White House, his humanistic approach to computer science, and what he looks forward to accomplishing at Brown in the years to come.

We recognize that there are a lot of potential benefits from automation and data-driven technology: all these promises of what could be. But we also see that the promises often tend not to pan out. For example, we can try to build an AI system to make sure we can't discriminate in the criminal justice system, but systems that suck up data from previous arrests are irrevocably tainted by the history of racial injustice in that system. And then, implemented at scale, this taint spreads. Any system fed that data is just going to amplify the biases in the data, unless there are rigorous and carefully designed guardrails.

These technological systems impact our civil rights and civil liberties with respect to everything: credit, the opportunity to get approved for a mortgage and own land, child welfare, access to benefits, getting hired for jobs, all the opportunities for advancement. Where we put these systems in place, we need to make sure they're consistent with the values we believe they should have, and that they're built in ways that are transparent and accountable to the public. It's not something we can slap on after the fact.

I have been studying these issues for almost a decade, thinking about what's coming next and what the world will look like when algorithms are ubiquitous. Ten years ago, one concern I thought we were likely to have was whether we can trust these systems to work the way they're supposed to, and how we would know these systems are accountable to the public and our representatives.

Whether you like it or not, the technology is here, and it's already affecting everything that shapes you. You are, without your knowledge, adapting how you live and function to make yourself more readable to technology. You are making yourself machine-readable, rather than making machines human-readable. If we don't pay attention to this, the technology will be driving how we live as a society rather than society making technology that helps us flourish and be our true selves. I don't like to frighten people, but it's true and it's important.

Neither, really. It's not the technology that's good or bad, AI or not. It's the impact, the harms, that we should be concerned about. An Excel spreadsheet that produces a score that confines someone to detention before standing trial is as bad as a sophisticated AI system that does the same thing. And a deep learning algorithm that can help with improving crop yields is amazing and wonderful. That's why the AI Bill of Rights focuses on the impact on people's rights, opportunities and access to services, rather than on the technology itself, which changes and evolves rapidly.

Think about prescription drugs, for example. You don't have to worry that the drug you're taking has not been tested, because the FDA won't let it come onto the market until it's gone through rigorous testing. Similarly, we're confident that our cars will work and that recalls happen whenever the National Highway Traffic Safety Administration discovers a problem; and we're confident that our planes work and that every new kind of jet goes through rigorous testing before being flown. We have many examples to draw from where we don't let new technology be used on people without checking it first. We can look to those as a guide for what we think is important, because technology affects everyone.

This AI Bill of Rights is a blueprint that goes beyond principles. It provides actionable advice to developers, to civil society, to advocates, to corporations, to local governments and to state governments. There are various levers to advance it: regulation, industry practices, guidance on what governments will or won't build. There is no silver bullet here, but all the levers are within reach. It will take the whole of society to advance this work.

It was life-altering. My brain now works in ways I cannot and don't want to undo. I'm constantly thinking about the bridges between research and innovation, society and policy. As a country and as researchers, we're still coming to terms with this. For a long time, we've thought of technology as a thing we use to make life better. But we're not as familiar with technology as a thing that changes our world. Trying to make policy for an entire country, and in some ways the entire world because the U.S. is a leader, is challenging because there are so many competing interests that you must balance.

In my time in government, I was impressed by how complex and subtle these issues are as they unfold in different domains: what makes sense when thinking about health diagnostic tools doesn't really work if you're thinking about tools used in the courtroom. I have a deeper appreciation for how many dedicated people there are within government who want to make a difference and need help and bandwidth to do it.

One thing that I've realized in the years I've spent working in policy spaces is that it's critical to help policymakers understand that technology is not a black box; it's malleable and evolving, and it helps shape policy in ways that we might not expect. Technology design choices are policy choices, in so many ways. Coming to terms with how tech and policy influence each other requires a lot of education, both for technologists and for policymakers.

I cannot think of a place that better embodies the values of transdisciplinarity and scholarship in service of the public good than Brown. In my years studying the impact of data-driven technology on people and communities, I've learned the critical importance of bringing a variety of perspectives to bear on any specific problem. Technologists alone cannot solve the problems caused by the clash of tech and society, but neither can any other group of thinkers and actors.

The Brown campus ethos is incredibly cooperative, and a commitment and passion for public service run deep among the students, faculty and administration. As my colleagues and I at the Data Science Initiative work toward building a new center that will focus on tech responsibility, I'm focused on the mission of redefining how we design and teach technology to center the needs, problems and aspirations of all, especially those whom technology has left behind.

I'm convinced that we have the creativity and the tools to build tech that helps us flourish and lets all of us benefit from advancements in tech. In order to do this, we have to bring together all the amazing ideas from engineering, public health, medicine, the social sciences, the humanities, policy leaders and technologists. I'm committed to encouraging and contributing to that ongoing, vibrant dialogue on campus and to creating a transdisciplinary home where we can come together to solve problems and solve them well.
