Police in South Wales did not do enough to verify that automated facial recognition technology, deployed on two occasions, was free of racial or gender bias, the Court of Appeal has said in a ruling that could have significant implications for police forces.
The judges expressed the hope that, as facial recognition is a novel and controversial technology, all police forces intending to use it in the future will wish to satisfy themselves that everything reasonable has been done to make sure the software they use does not have a racial or gender bias.
Read the judgment
Find out more at The Law Society Gazette