TDPel Media News Agency

Facial Recognition Mistake Prompts Southampton Man Alvi Choudhury to Take Legal Action Against Police Following Wrongful Detention Linked to Milton Keynes Burglary

By Gift Badewo

Alvi Choudhury was at home in Southampton, getting on with his day and working remotely, when everything suddenly shifted.

Police officers arrived at the house he shares with his parents and arrested him in full view of neighbours.

The reason? A computer had decided his face looked like that of a burglary suspect in Milton Keynes — a town more than 100 miles away.

Choudhury, 26, ended up spending 10 hours in custody before being released at 2am with no further action.

No apology could erase the humiliation of being led away from his home while neighbours watched, or the anxiety it caused his family — particularly his father, who he says was left deeply shaken.

Now, he is suing both Thames Valley Police, which used the facial recognition system, and Hampshire Constabulary, which physically arrested him.

When Software Becomes a Suspect

The arrest was triggered by automated facial recognition technology.

Thames Valley Police had run an image of a burglary suspect — accused of stealing around £3,000 worth of goods — through a system linked to a massive national database containing around 19 million custody images.

The software flagged Choudhury as a match.

UK police forces rely on an algorithm developed in Germany and procured through the Home Office.

Around 25,000 searches are carried out each month.

According to the National Police Chiefs’ Council, these matches are meant to be treated as intelligence leads — not hard evidence.

But in practice, intelligence turned into handcuffs.

Police have insisted the arrest wasn’t based solely on the software’s suggestion.

They say a human officer also carried out a visual comparison before deciding to detain him.

Even so, the force later acknowledged the mistake “may have been the result of bias within facial recognition technology.”

“He Looks Nothing Like Me”

Choudhury says the CCTV footage showed someone about ten years younger than him, with lighter skin, a larger nose, smaller lips, and no facial hair.

The only similarity, he claims, was curly hair.

He also told officers he had proof he was working in Southampton on the day of the alleged burglary — 115 miles from the scene.

He even provided details of work meetings.

Despite that, he was taken into custody.

At the Hampshire police station, he says he asked officers whether he truly resembled the man in the footage.

He claims they laughed.

Only when Thames Valley officers later interviewed him in person did they realise the obvious — he was not the man in the CCTV clip.

By then, however, the damage had already been done.

The Shadow of a Previous Mistake

One uncomfortable detail makes this case even more troubling.

Choudhury’s image was on the police database because of a separate wrongful arrest in 2021.

Back then, he had actually been the victim — attacked during a night out in Portsmouth.

Yet his custody image remained on the system.

Facial recognition searches depend on those mugshots being stored and searchable.

Once an image is in the database, it becomes part of a vast digital lineup that can be scanned in seconds.

Choudhury now fears that this second arrest will mean another image is stored, increasing the risk of future false matches.

For someone who works as a software engineer and occasionally requires security clearance for government clients, that concern is not abstract.

He says the ordeal makes him appear “dodgier” — a perception that could have real consequences for his career.

The Bias Problem Nobody Can Ignore

This case lands in the middle of a long-running debate about algorithmic bias.

Home Office research published in December revealed that false positives for Black faces occur about 5.5% of the time.

For Asian faces, it is around 4%. For white faces, the false positive rate drops dramatically to roughly 0.04%.

Those numbers have alarmed Police and Crime Commissioners, some of whom have warned about “concerning in-built bias.”

Critics argue that even if no single case proves systemic discrimination, the statistical imbalance itself raises serious questions.

An officer reportedly told Choudhury that because facial recognition was already under strategic review, there was no need to escalate his case further for organisational learning.

That remark alone is likely to fuel further criticism.

Not an Isolated Incident

Choudhury’s experience is far from unique.

Recently, South Wales Police paid damages to a Black man who had been flagged as a possible stalking suspect — despite ranking 32nd on a list of suggested matches.

Elsewhere, 67-year-old Ian Clayton found himself accused of theft after facial recognition technology used by Facewatch identified him as someone who had stolen items from a Home Bargains store in Chester.

He was asked to leave the shop, only to later discover the system had wrongly linked his image to suspicious activity.

Facewatch eventually admitted the mistake and permanently removed his image and associated record.

These cases highlight a recurring theme: when machines are wrong, ordinary people pay the price first — and corrections come later.

The Legal Battle Ahead

Choudhury is now pursuing damages for wrongful arrest, distress, reputational harm, and loss of work.

His claim targets both the force that relied on the facial recognition system and the one that executed the arrest.

Legal experts say such cases could test how much weight police are allowed to place on automated tools.

If courts determine that officers leaned too heavily on algorithmic suggestions, future operational guidelines may need tightening.

There is also a growing call for clearer safeguards — such as stricter human verification standards, independent audits of algorithms, and improved processes for removing innocent people’s images from databases.

What’s Next?

The lawsuit will likely force closer scrutiny of how facial recognition is used across UK forces.

If Choudhury succeeds, it could encourage others who believe they were wrongly flagged to step forward.

At a broader level, pressure is mounting on the Home Office and police chiefs to address racial disparities in false positives.

That could mean updates to the algorithm, revised operational policies, or even temporary restrictions on certain deployments.

For Choudhury personally, the next chapter is about restoring his reputation and ensuring he does not face the same ordeal again.

But for policing in Britain, this case could become part of a much larger reckoning over the balance between technology and civil liberties.

Summary

Alvi Choudhury, a 26-year-old software engineer from Southampton, is suing Thames Valley Police and Hampshire Constabulary after facial recognition software wrongly identified him as a burglary suspect in Milton Keynes.

He was arrested at home, held for 10 hours, and released without charge.

The match was generated through automated facial recognition technology scanning a national database of 19 million mugshots.

Despite visible differences between Choudhury and the suspect, officers detained him.

Research shows significantly higher false positive rates for Black and Asian faces compared to white faces, raising concerns about bias.

Similar incidents involving other innocent individuals have intensified debate over the reliability and fairness of facial recognition systems.

Choudhury’s legal action could influence how UK police forces use such technology in the future — and whether stronger safeguards are introduced to prevent further wrongful arrests.


