EDRi’s first Colour of Surveillance Europe Conference
In September, European Digital Rights (EDRi) held the first Colour of Surveillance Europe Conference in Amsterdam, and it was a privilege to be selected to attend both days. The Colour of Surveillance Conference was established in 2016 by the Center on Privacy & Technology at Georgetown Law to explore the intersection between the brutality of policing in Black communities and surveillance.
Since 2016, the Center on Privacy & Technology has held the Colour of Surveillance Conference annually (except in 2020 and 2021). The themes addressed so far include government monitoring of the African American community, of American immigrants, of American religious minorities, and of poor and working people. This year’s theme was the policing of abortion and reproduction.
I knew the EDRi Colour of Surveillance Europe Conference was going to be something special when I saw the agenda: from the panels and the workshops to the somatic practitioner who guided us through the day as we ‘explore[d] key themes, issues, tensions and current organising linking these topics [racism and surveillance].’ What I was most excited about was the opportunity to build connections with others exploring the intersections between racial justice and digital rights, and the disproportionate impact of surveillance on marginalised communities.
Since entering the digital rights space, I have often highlighted the specific impact of surveillance on vulnerable and marginalised communities, only to hear in response: ‘well, it affects all of us.’ To paraphrase Alvaro M. Bedoya, founding director of the Center on Privacy & Technology at Georgetown Law: ‘There is a myth in the [digital rights sector] that in a world where everyone is watched, everyone is watched equally.’
As Gracie Mae Bradley, policy expert and co-author of Against Borders, emphasised at the launch event for Open Rights Group’s (ORG) How to Fight Data Discrimination toolkit:
“To succeed at fighting the harms of certain technologies we really need to understand our history. We need to understand how state racism works … we also need to respect the expertise of people at the sharpest ends of state power and that’s something that hasn’t traditionally been the strongest suit of the digital rights sector.”
On the first day of the conference, panel discussions were held on racialised digital criminalisation in the Netherlands, digital criminalisation across borders, and contesting digital oppression. Participants heard from Franziska Manuputty, a survivor of the Dutch child benefits scandal, in which the Dutch Tax and Customs Administration used algorithms to flag suspected benefits fraud. Listening to Franziska, I immediately thought of ORG’s efforts to challenge the UK Government’s proposals to extend the National Fraud Initiative (NFI) data matching powers.
Last year, data protection expert Chris Pounder raised the alarm in a blog about the Government’s proposals to extend data matching from its current anti-fraud base to include any other criminal activity, debt recovery and data quality (e.g. improving the accuracy of personal data). He suspects that data matching for immigration purposes is on the cards: identifying migrants who cannot work in the UK. ORG published a briefing for migrants’ rights organisations to inform them about the likely impact of the National Fraud Initiative proposals on migrants.
I was also reminded of the legal challenge launched by Migrants’ Rights Network and Foxglove against the unlawful sharing of migrants’ tax data between HMRC and the Home Office for immigration enforcement, which affected nearly 500,000 highly skilled migrants, many of them from Pakistan.
At ORG, we have been doing work on the expansion of police data powers, so on the second day of the conference I attended a workshop on the policing laws enabling racialised surveillance, and the resistance to them, in the UK and France. I learned more about racialised surveillance in the banlieues of France from Safia Oulmane of Ghett’up, and heard Zara Manoehoetoe, a youth worker and community activist, speak about community organising against state harms, specifically the Police, Crime, Sentencing and Courts Act 2022. Although surveillance is nothing new to people from racialised communities, it was interesting to learn how communities are resisting digital surveillance in different European states.
Later that day, I attended a workshop, led by Eva Okunbor of Glitch, on trauma-informed practice and the impact of increasing AI surveillance on Black women. Glitch is an award-winning UK charity ending online abuse and championing digital citizenship, with a particular focus on Black women and marginalised people. Eva emphasised that ‘how’ we do the work is just as important as – maybe even more important than – the work itself; something that is constantly at the forefront of my mind, and something Sarah Chander, EDRi’s Senior Policy Advisor, also emphasised during the opening of the conference.
The third and final workshop I attended, led by Patrick Williams (Senior Lecturer at Manchester Metropolitan University) and Roxy Legane (Founder of Kids of Colour), was on the racialised criminalisation of texts and text messages. They spoke to us about a conspiracy case that Roxy, who documented the whole trial, referred to as the Manchester 10: ten Black boys found guilty by association, the majority for harms they did not commit. The evidence used to charge the boys included text messages, rap, drill and grime lyrics, and expressions of grief in the aftermath of the loss of their childhood friend. ORG’s Sector Support Policy Manager, Sophia Akram, recently wrote a blog about young people being criminalised for content, and about how young people can exercise their data rights to find out what information is held about them through a subject access request (SAR).
In my opinion, EDRi’s Colour of Surveillance Europe Conference succeeded in achieving its objectives:
- Raising awareness of the issue of racialised surveillance and criminalisation in Europe.
- Building connections and facilitating collaborative efforts between those organising on these issues.
Participants were already aware of the issue of racialised surveillance and criminalisation, with many of us coming from communities at the sharpest end. However, there was still learning to be done: listening to the many examples of racialised surveillance carried out by different European states, told from different perspectives. The conference brought together a broad audience, including activists, researchers, funders, journalists and others engaged in the fields of racial justice, digital rights and anti-surveillance. This breadth reminded me of The Social Change Ecosystem Map created by writer, facilitator and activist Deepa Iyer, who says that ‘many of us play different roles in pursuit of equity, shared liberation, inclusion and justice.’
Reflecting on the conference and the roles mapped by Iyer, I thought of the work I have been doing on the Migrant Digital Justice Programme, and the work Sophia has been doing to build relationships with new sectors, such as those challenging counter-terrorism policing.
We have been working hard to widen the lens through which ORG addresses digital rights issues and to reach out to community groups and organisations fighting for those at the sharpest end of data discrimination, exploitation and abuse.
In the fight for digital justice, we recognise that it is imperative that the digital rights sector engages carefully and thoughtfully with individuals, groups and communities to understand how they are impacted, what they are doing to resist, the challenges they face in this work, and why and how we must work in solidarity. Activist Mariame Kaba’s ‘Questions I regularly ask myself when I’m outraged about injustice’ can help those of us working for digital rights organisations like ORG to check in with ourselves on how best to work in solidarity with others:
- What resources exist so I can better educate myself?
- Who’s already doing work around this injustice?
- Do I have the capacity to offer concrete support and help to them?
- How can I be constructive?
There is much more that we at ORG need to learn and do, but we are grateful for the trust that community groups and organisations have placed in us to support them on issues affecting migrants, including digital right to work checks, age verification and counter-terrorism policing.