Intel v-p: 'research needed to open machine learning black box'

Academic research is needed to give an ethical foundation to machine learning, according to the head of the AI engineering team at Intel, one of the world's largest semiconductor chip manufacturers.

Speaking at the Digital Universities UK event, held by Times Higher Education in partnership with the University of Leeds, Wei Li, vice-president and general manager of artificial intelligence and analytics at Intel, said: "Machine learning today is a black box. You get what you get, and you don't really know why."

He added: "In some applications [such as healthcare], you will want to know why that system gave you that answer."

Academic research can help to expose fundamental ethical issues in AI, such as in-built bias around gender, Dr Li told the event.

"There are a lot of unknowns and open questions in machine learning today, and it really demands fundamental research that universities can do and industry can't," he said.

Dr Li said his team at Intel was working to develop technology that would help to make fair and inclusive AI systems.

But speaking with THE, he warned that industry was more focused on building AI systems than on addressing the ethical questions they provoke. "These problems have roots in how the overall machine learning works," he said. "These things, I hope people in academia can do something deeper than what we're doing."

Despite a recent open letter signed by AI experts and industry executives, including Elon Musk, calling for a pause in the development of AI "until we are confident that their effects will be positive and their risks will be manageable", Dr Li does not expect AI advancements to slow.

"It's not realistic," he said. "It's a risk in terms of commercialisation, and it's a race to be the fastest and the first in the industry. That's enough motivation for people to go for these things."

So, with the rapid advancement of AI systems, will academic research into machine learning models quickly become obsolete? No, said Dr Li, who argued that universities could influence the way that future systems are built. "I don't expect them [researchers] to dig into ChatGPT and explain ChatGPT; that's impossible to do given the state of the art we have today. But if people have a better foundation for machine learning, then maybe the next generation can be a safer and less biased model."

When it comes to university teaching, Dr Li said, higher education has a challenge and an opportunity to better train the next generation of students in the new AI environment. Institutions should teach students to be "more than just simply a messenger for ChatGPT", he added.

"The products of a university are the students you're producing. If they're not better than ChatGPT, then why do we bother to send them to university?" Dr Li asked.

sara.custer@timeshighereducation.com
