An innocent man arrested after an AI system wrongly identified him as a burglar says police should not be allowed to use the technology because of their ‘racial bias’ and the software’s ‘horrific’ error rate.
Alvi Choudhury, 26, was working from home in Southampton at the house he shares with his parents on January 7 when officers arrested him and held him in custody for 10 hours.
Thames Valley Police’s (TVP’s) automated facial recognition system had matched his mugshot – taken after he was falsely arrested five years ago – to a clip of a thief who stole £3,000 and some jewellery from Milton Keynes Buddhist Vihara a month earlier.
He was finally freed with no further action at 2am after cops realised he was not the man in the CCTV clip.
Software engineer Mr Choudhury is now suing police and says he wants both cash and an apology to make up for the miserable ordeal.
He told the Daily Mail he blames the software – which returns false matches 4 per cent of the time among Asian faces – and the detectives analysing the clips.
Mr Choudhury said: ‘No tech company would ever put a system into production with a failure rate of one in 25. That’s horrific. It is filled with bugs.
‘They said they had officers visually review it. That is even more concerning because that is probably racial discrimination.
‘You’ve probably just seen two brown people, even though they have completely different features and said, “yeah, they look close enough. Let’s arrest them.”‘
According to Mr Choudhury, CCTV footage of the crime featured a younger man who looked different to him in every way except for their shared curly hair.
TVP said that another man, Eduard Zlatineanu, 23, had been arrested on the day of the crime and pleaded guilty at Aylesbury Crown Court just five days after Mr Choudhury’s false arrest.
A second offender who helped steal the loot from a charity fund for Sri Lanka flood victims at the vihara remains unidentified.
The force confirmed Mr Choudhury’s arrest was ‘based on the investigating officers’ own visual assessment’ following the initial automated match.
‘They saw I was a match, so they could have done some research, some background information on me and not just look at the two pictures and come and arrest me,’ Mr Choudhury added.
‘If they did any actual detective work, they would have crossed me out straight away, even if their facial recognition system identified me as a suspect.’
After being arrested at 4pm by Hampshire Constabulary officers – carrying out the arrest on behalf of TVP – the software engineer was finally interviewed at around midnight.
But it took just 10 minutes for questioning to conclude, with detectives satisfied he was not in fact the person captured on the CCTV clip.
‘When I was released, police were laughing because they saw the footage and it was clearly two different people,’ Mr Choudhury said.
‘The TVP officer admitted to me that before she even interviewed me, she knew I wasn’t the suspect because she had seen my custody photos and she had seen the footage of the suspect and she knew straight away.’
The software engineer’s mugshot was on the system after a previous false arrest in 2021 while he was a student in Portsmouth.
On that occasion, he and his group of four women and four men were attacked by a gang of eight to 10 men while getting a takeaway following a night out.
As his friend lay there with his teeth knocked out and Mr Choudhury nursed wounds ‘all over his head’, the group were stunned to see that police were arriving not to help them – but to arrest them.
He was arrested once police became aware that another couple had been attacked on the same night, and was only questioned after 16 to 17 hours in custody.
It was only when officers watched CCTV of both attacks that Mr Choudhury was finally cleared.
They said his info and DNA would be removed from the system but his face was still on the software when he was arrested in January.
‘Now they have another photo of me, in theory, with this facial recognition system, they will match me double the amount of times,’ he added. ‘I could keep getting arrested.’
The facial recognition software is far from foolproof: Home Office research revealed in December that matches for black faces are false positives 5.5 per cent of the time, far higher than the 0.04 per cent false-positive rate for white faces.
The systems are generally approved by individual police forces, but the Home Office has pushed for their implementation and procured the German algorithm used to trawl through around 19million mugshots on the national database.
They run around 25,000 searches a month and, according to the National Police Chiefs’ Council, the matches should be treated as intelligence and not fact.
And Mr Choudhury has called for the Government to take responsibility for the system’s shortcomings and review its use.
He said: ‘They really need to look at this. Someone needs to be held accountable and there needs to be consequences, new laws and legislations implemented to protect members of the public.
‘There needs to be legislation on how AI facial recognition systems are used. There needs to be an investigation into the police force and they need to have more professionalism in how they carry out their work.’
Mr Choudhury recalled how he was not even allowed to collect his coat as officers placed him in handcuffs and searched his parents’ property.
‘I just said, “Hello officers, I’m not in any trouble am I?” as a joke,’ he added.
‘I have never even been to Milton Keynes, I was at work on the day, I would have had meetings, would have spoken to clients and my internal team, would have gone to Tesco during my lunch break, would have had bank transactions in Southampton.’
Mr Choudhury was concerned officers were trying to plant evidence as he waited outside and asked if he could watch them search his room.
Once at the station, the software engineer initially refused to have a custody photo taken, given his disappointment over his previous mugshot being kept on the system, before later agreeing to it.
He recalled how he was kept in a dark, echoey cell – silent but for the sound of water dripping.
Despite being cleared shortly after questioning, Mr Choudhury is worried the repercussions of his latest false arrest might bring him strife at work.
The software engineer has Home Office and Met Police security clearance and his ordeal had to be declared.
‘This just now looks very suspicious,’ he said.
Police and crime commissioners have warned of ‘concerning in-built bias’ and insisted that while ‘there is no evidence of adverse impact in any individual case, that is more by luck than design’.
A TVP spokesperson previously said: ‘While we apologise for the distress caused to the complainant in this case, their arrest was based on the investigating officers’ own visual assessment that the individual matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling.
‘To confirm, retrospective facial recognition technology did initially provide intelligence, but did not determine the arrest.
‘Although later enquiries eliminated the individual from the investigation, this does not make the arrest unlawful.
‘We continue to use policing tools responsibly while striving to improve and build trust in our communities.’
Hampshire Constabulary previously declined to comment.
Mr Choudhury’s lawyer, Iain Gould of DPP Law said: ‘This isn’t policing by consent, and nor is it policing by common sense.
‘In this case, the police have been playing AI lottery with people’s lives, and Alvi has been wrongly arrested; now the police must pay the price for that.’
In response to Mr Choudhury’s ordeal, Dr Mary-Ann Stephenson, Chair of the Equality and Human Rights Commission, said: ‘When used in a way which respects and protects people’s rights, facial recognition and similar technologies may help to combat serious crime and keep people safe.
‘However, as we have seen, there is a danger that these technologies can be inaccurate and falsely identify people.
‘The data shows that there are racial disparities for false positive identification, causing human rights infringements and distress to those affected.
‘That means we need clear rules, to guarantee that facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards.
‘To ensure the new framework is being followed correctly, a new independent body should be established with appropriate enforcement and oversight powers to ensure compliance.’
Last month an innocent Sainsbury’s customer was marched out of his local store by staff after facial recognition software correctly identified that an offender was inside – but staff ejected the wrong man.
Warren Rajah was in the Elephant and Castle branch of the supermarket when two members of staff and a security guard suddenly escorted him outside in what he described as ‘the most humiliating moment of his life’.
When the 42-year-old asked why, they pointed to a sign showing that the store used facial recognition technology.
In fact, they had mistaken him for someone who was on the system for shoplifting who had also entered the store at the same time.
It came after South Wales Police in January paid damages to a black man who was identified as a possible match to a stalking suspect despite being 32nd on the list of suggested matches on the facial recognition technology.