Government Inquiry Into AI Algorithm Bias


Worried that AI may discriminate on race or gender? Government orders inquiry despite ongoing use in policing

The government has ordered an independent watchdog to investigate the potential for bias in algorithmic decision-making and artificial intelligence (AI) in crime and justice cases.

The AI bias inquiry will be led by the Centre for Data Ethics and Innovation (CDEI), even though the police already use artificial intelligence to predict crimes, a practice that has raised concerns among some civil liberty groups.

It is fair to say, however, that the government has also had its own reservations. Since at least 2016, it has been concerned about the impact that AI would have on privacy, transparency and data use.

AI inquiry

The government announcement said that CDEI would also investigate how to maximise the benefits in the use of algorithms in recruitment, local government and financial services.

The government said that algorithms have huge potential for preventing crime, protecting the public and improving the way services are delivered. But decisions made in these areas are likely to have a significant impact on people’s lives, and public trust is essential.

It recognised that professionals in those fields are increasingly using algorithms built from data to help them make decisions. But there is a risk that any human bias in that data will be reflected in recommendations made by the algorithm.
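One simple way practitioners look for this kind of bias is to compare how often an algorithm reaches a favourable decision for different demographic groups, a check sometimes called demographic parity. A minimal sketch, using entirely made-up data and a hypothetical `selection_rate` helper:

```python
# Hypothetical illustration: comparing selection rates across groups
# to spot one common symptom of bias. All data below is invented.

def selection_rate(decisions, groups, target_group):
    """Fraction of favourable (1) decisions given to one group."""
    picks = [d for d, g in zip(decisions, groups) if g == target_group]
    return sum(picks) / len(picks)

# 1 = favourable decision (e.g. loan approved), 0 = unfavourable
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
groups    = ["a", "a", "a", "a", "b", "b", "b", "b"]

rate_a = selection_rate(decisions, groups, "a")  # 0.75
rate_b = selection_rate(decisions, groups, "b")  # 0.25

# A large gap between groups is a red flag worth investigating:
# it may reflect human bias baked into the training data.
print(f"group a: {rate_a:.2f}, group b: {rate_b:.2f}, "
      f"gap: {abs(rate_a - rate_b):.2f}")
```

A gap alone does not prove unfairness, but it is the sort of measurable signal the CDEI wants organisations to monitor and be able to explain.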

The CDEI wants to ensure those using such technology can understand the potential for bias and have measures in place to address this.

It cited examples in the crime and justice arena, where algorithms could be used to assess the likelihood of re-offending and inform decisions about policing, probation and parole.

“Technology is a force for good which has improved people’s lives but we must make sure it is developed in a safe and secure way,” explained digital secretary Jeremy Wright. “Our Centre for Data Ethics and Innovation has been set up to help us achieve this aim and keep Britain at the forefront of technological development.

“I’m pleased its team of experts is undertaking an investigation into the potential for bias in algorithmic decision-making in areas including crime, justice and financial services,” he added.

“The Centre is focused on addressing the greatest challenges and opportunities posed by data driven technology,” said CDEI chair Roger Taylor. “These are complex issues and we will need to take advantage of the expertise that exists across the UK and beyond.”

“We want to work with organisations so they can maximise the benefits of data driven technology and use it to ensure the decisions they make are fair,” said Taylor. “As a first step we will be exploring the potential for bias in key sectors where the decisions made by algorithms can have a big impact on people’s lives.”

Financial services

Another area where computer algorithms are used is recruitment, where they can screen CVs and shortlist candidates.

This could potentially help limit the impact of recruiters’ unconscious bias, but there have also been reports that such technology can inadvertently exacerbate gender bias.

And in the financial services sector, data analysis has long been used to inform decisions about whether people can be granted loans. When this is coupled with AI, concerns are raised about the transparency and fairness of such decisions.

The CDEI will publish interim reports in the summer, with final reports set to be published early next year.

This is not just a government tick-box exercise, as AI bias is also a concern for the business world.

Last September, for example, IBM rolled out a cloud-based tool designed to detect bias in artificial intelligence models and give the organisations using them better visibility into why those models make the decisions they do.


Author: Tom Jowitt