
Fighting AI bias

Posted on Jan 17, 2020 by FEED Staff

Professor Noel Sharkey, an expert in the field of AI, has urged the UK government to ban the use of all decision algorithms that could lead to bias and impact people's lives. In an interview with The Guardian, Professor Sharkey expressed concern over a series of examples of machine-learning systems being loaded with bias.

On inbuilt bias in algorithms, Sharkey said: “There are so many biases happening now, from job interviews to welfare to determining who should get bail and who should go to jail. There should be a moratorium on all algorithms that impact on people’s lives. Why? Because they are not working and have been shown to be biased.”

According to The Guardian, Sharkey has had discussions with the biggest global social media and computing corporations – Google and Microsoft – about the innate bias problem. “They know it’s a problem and they’ve been working, in fairness, to find a solution over the past few years, but none so far has been found.”

He added: “Until they find that solution, what I would like to see is large-scale pharmaceutical-style testing. Which means testing these systems on millions of people, or at least hundreds of thousands of people, in order to reach a point that shows no major inbuilt bias. These algorithms have to be subjected to the same rigorous testing as any new drug produced that ultimately will be for human consumption.”

This article first appeared in the January 2020 issue of FEED magazine.
