Mass Surveillance

George Floyd’s Murder, Three Years On: Institutional Racism Hardwired in Police Tech

Three years ago today, the rumblings of a global reckoning on racial injustice began, leading many people to reconsider their own experiences of, and roles in, anti-Blackness and racial discrimination. And, unfortunately, it took one more violent incident to get there: the murder of an innocent Black man at the hands of American police officers.

It would be cynical to say that nothing has changed, but it is realistic to add that we are all still learning.

Institutional racism in the UK

When George Floyd was killed, the outrage was met by cynics who still deny that systemic oppression exists, or who insist that these problems belong to the US alone. Neither claim is true.

Institutional racism within policing has a sizeable history in the UK and is very much present today.

From the Macpherson Report to Louise Casey’s damning assessment of the Metropolitan Police Service (MPS), the police in the capital have repeatedly been found to be “institutionally racist”, as well as sexist and homophobic. Elsewhere, the findings of Greater Manchester Police’s Achieving Race Equality report in 2021 led the chair of Greater Manchester’s Race Equality Panel to conclude that the force was institutionally racist.

Earlier this year, the charity INQUEST found that Black men were seven times more likely than white people to die following police restraint, and that there was no adequate accountability for racism within police forces.

What we know about technology is that it accelerates outcomes. So when an institutionally racist organisation like the police uses technology to make policing more efficient, that technology further embeds and exacerbates the discrimination already built in.
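To make that amplification concrete, here is a deliberately simplified, hypothetical sketch in Python. The areas, numbers and allocation rule are all invented, not any force’s real system: a tool that allocates patrols according to past recorded incidents keeps sending officers to the places that were most heavily policed before, so the disparity never corrects itself.

```python
# Deliberately simplified, hypothetical sketch: a deployment loop that reads
# past recorded incidents (shaped by who was policed, not by true crime rates)
# as "risk", and so keeps reinforcing the original disparity.

recorded_incidents = {"Area A": 120, "Area B": 40}  # invented starting figures

for week in range(1, 6):
    total = sum(recorded_incidents.values())
    # Allocate 100 patrols in proportion to past recorded incidents.
    patrols = {area: round(100 * count / total)
               for area, count in recorded_incidents.items()}
    # More patrols mean more stops and therefore more recorded incidents,
    # even though the underlying crime rate is assumed equal in both areas.
    for area, n in patrols.items():
        recorded_incidents[area] += n
    print(f"Week {week}: patrols={patrols}, recorded={recorded_incidents}")
```

Even in this toy version, Area A receives three times the patrols every week, purely because of the data it started with.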

That’s why we urge transparency, public consultation and a moratorium on the use of tech in policing that will only amplify systemic oppression, such as:

Gangs Databases

Gangs databases like the Met’s Gangs Matrix erroneously and disproportionately pre-criminalise young Black men and impact their life chances. See Amnesty International’s report Trapped In The Matrix.

The ICO issued an enforcement notice against the Met over the Gangs Matrix, but other regions still operate gangs databases. See Dangerous associations: Joint enterprise, gangs and racism by Patrick Williams.

Social Media Weaponisation

Project Alpha is a Met project targeting serious gang violence allegedly incited on social media. It refers posts, including drill music videos, for removal. The project involves the mass surveillance of children and, on the precedent of the Gangs Matrix, will disproportionately target young people of colour. This is harmful enough in itself, but where young people are creating content it can also impact their livelihoods. See Open Rights Group’s wiki on Project Alpha.

Online content is being mined and used as digital evidence to construct gang narratives and, in conjunction with extended criminal liability, is increasingly being used to imprison young Black people and other people of colour for offences they have not committed. See Ciaran Thapar’s Guardian article, What kind of society sends young men to jail and ruins lives because of the lyrics in a song?.

Facial Recognition

Facial recognition is increasingly being used by police forces to try to match people against watchlists, but its error rate is unacceptable, particularly for younger people and people with darker skin. See Big Brother Watch’s briefing on facial recognition surveillance.
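One reason a single headline accuracy figure hides the problem is that error rates differ between groups. The sketch below, using invented records and group labels, shows the kind of per-group check that would expose this: the false match rate is calculated separately for each group rather than once overall.

```python
# Hypothetical sketch with invented records: compute the false match rate
# per demographic group instead of one overall accuracy figure.
# "matched" is the system's output; "on_watchlist" is the ground truth.

records = [
    {"group": "darker-skinned",  "matched": True,  "on_watchlist": False},
    {"group": "darker-skinned",  "matched": False, "on_watchlist": False},
    {"group": "lighter-skinned", "matched": False, "on_watchlist": False},
    {"group": "lighter-skinned", "matched": False, "on_watchlist": False},
]

def false_match_rate(rows):
    """Share of people NOT on the watchlist who were flagged anyway."""
    innocent = [r for r in rows if not r["on_watchlist"]]
    return sum(r["matched"] for r in innocent) / len(innocent) if innocent else 0.0

for group in ("darker-skinned", "lighter-skinned"):
    group_rows = [r for r in records if r["group"] == group]
    print(group, false_match_rate(group_rows))
```

A system can look acceptable on average while wrongly flagging one group far more often than another.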

In a landmark ruling, South Wales Police was found to have breached human rights by deploying live facial recognition, yet only a few years later its use has resumed.

See Open Rights Group’s blog, Don’t Use Beyonce to Normalise Live Facial Recognition.

Predictive Policing

Police forces across the UK are using algorithms to analyse massive amounts of information in order to ‘predict’, and try to prevent, potential future crime. Discriminatory factors such as ‘cramped houses’ and ‘jobs with high turnover’ have been used to determine whether someone is likely to commit a crime. See Liberty’s factsheet on Predictive Policing.
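To illustrate how such factors work in practice, here is a hypothetical, points-based risk score with invented weights. It is a sketch, not any force’s actual model, but it shows how features like housing conditions or job turnover act as proxies for poverty: people are scored by their circumstances rather than by anything they have done.

```python
# Hypothetical points-based risk score with invented weights, illustrating
# how proxy factors drive the score rather than a person's own behaviour.

WEIGHTS = {
    "lives_in_cramped_housing": 2.0,   # invented weight
    "high_turnover_job": 1.5,          # invented weight
    "prior_stop_and_search": 1.0,      # itself a product of who gets stopped
}

def risk_score(person):
    """Add up the weight of every factor flagged for this person."""
    return sum(w for factor, w in WEIGHTS.items() if person.get(factor))

# Two people who have done nothing different receive very different scores.
print(risk_score({"lives_in_cramped_housing": True, "high_turnover_job": True}))  # 3.5
print(risk_score({}))                                                              # 0.0
```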

Civil society groups say the use of sensitive metrics is exacerbating discrimination in policing. See Fair Trials’ predictive policing tool.

The Prevent Duty

The Prevent duty supposedly aims to safeguard people from becoming terrorists or supporting terrorism. It does so by mandating people in positions of trust and authority, such as doctors, teachers and social workers, to report behaviour they subjectively deem radical.

Even in frivolous circumstances, and even when they concern minors, these reports remain on police databases alongside criminal data and are retained for 6-100 years. They can be shared across systems and watchlists and can impact people’s life chances, including their educational opportunities. The duty disproportionately impacts British Muslims. See Prevent Watch’s People’s Review of Prevent Report Part III.

End Pre-Crime

Gang surveillance in the UK: the case of the Manchester 10 and pre-crime policing.

End racialised surveillance: an open letter to Andy Burnham to end discriminatory policing.

Prevent and the pre-crime state: ORG’s report on how unaccountable data sharing is harming a generation.