How GDPR stops discrimination and protects equalities
Workers and Trade Unions
Government plans would grant employers greater freedom to collect and use personal data, providing them with unprecedented insight into their workers’ buying habits, social relationships, creditworthiness, lifestyle, and personality.
Watered-down accountability rules and the imposition of fees for personal data access requests will make it more difficult and onerous for workers to identify and challenge abuses, for instance in the context of timekeeping, scheduling, and monitoring, whether by traditional or automated means.
Plans to scrap the obligation of human review for algorithmic decisions would also facilitate discrimination and arbitrariness in automated processes such as recruitment, dismissal or management. In turn, this will reduce workers’ safeguards and redress options against wage theft, union-busting, discriminatory practices and work-related abuses.
Case-study: Uber’s robo-firing
Uber drivers have been unfairly dismissed by Uber’s algorithms without the right to appeal, a practice known as “robo-firing”.
Drivers challenged the decision under Article 22 of the GDPR, which provides the right not to be subject to automated decisions that have a life-changing impact on individuals. They won, and the court ordered Uber to reinstate the licences of drivers who had been robo-fired.
In order to gather evidence of these abuses, drivers have had to access the personal data Uber held about them. Thanks to the GDPR right of access, Uber drivers won a historic victory in their legal battle to obtain their personal data from Uber.
However:
- The UK Government would scrap Article 22, denying workers recourse against robo-firing.
- The UK Government would impose data access fees, making it expensive for Uber drivers to access their own data. Uber would also be allowed to refuse to hand over data if it is “too onerous for them”.
Immigrants
Government plans would introduce a legal ground for public and private bodies to share information to detect crimes and other safeguarding interests. The interplay with the National Fraud Initiative and the Hostile Environment would pave the way for moment-by-moment monitoring of migrants’ activities, using information obtained via data brokers, credit history, utility, employment, housing, and criminal records.
Furthermore, plans to introduce fees for data access requests will create significant barriers for migrants who wish to exercise their rights but may not yet have the resources, or even a bank account. In turn, this will replicate, in essence, the restrictions of the Immigration Exemption in the Data Protection Act 2018.
Case-study 1: Racist visa algorithm
The Home Office was processing visa applications via a “racist” algorithm that sorted applications on the basis of problematic and biased criteria, like nationality. An individual visa applicant allocated by the algorithm to the ‘Red’ category because of their nationality had much lower prospects of a successful application than an otherwise equivalent individual with a different nationality allocated to the ‘Green’ category.
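To make the mechanism concrete, here is a minimal, purely hypothetical Python sketch (not the Home Office’s actual code; the nationality list and categories are invented) of how a streaming rule keyed on nationality sorts otherwise identical applications into different risk categories before any human sees them:

```python
# Hypothetical illustration of nationality-based visa "streaming"; not the Home Office's
# actual system. The nationality list and risk categories below are invented.

SUSPECT_NATIONALITIES = {"Country A", "Country B"}  # invented placeholder values

def stream_application(nationality: str) -> str:
    """Assign a visa application to a risk category using nationality alone."""
    if nationality in SUSPECT_NATIONALITIES:
        return "Red"    # flagged for intensive scrutiny; far lower approval prospects
    return "Green"      # waved towards approval with lighter checks

# Two otherwise identical applications are treated differently purely because
# of the applicant's nationality.
for applicant, nationality in [("Applicant 1", "Country A"), ("Applicant 2", "Country C")]:
    print(applicant, "->", stream_application(nationality))
```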
The Home Office scrapped the system under threat of judicial review. The Home Secretary agreed to put in place essential legal protections: Equality Impact Assessments and Data Protection Impact Assessments (DPIAs).
However:
- The UK Government would scrap Article 22, thus allowing automated decisions like sorting visa applications to be taken opaquely and without human review.
- The UK Government would scrap DPIA requirements, thus allowing automated systems to be deployed without properly assessing the risks to individuals.
Case-study 2: The immigration exemption
The Government introduced an “Immigration Exemption” in the Data Protection Act 2018, which prevents migrants from exercising their right of access to personal data. This prevented migrants from asking public or private organisations whether and how they use their personal data to determine their eligibility for public benefits, credit, jobs, employment, housing, and other life necessities.
ORG and the3million challenged the Immigration Exemption in court. The Court of Appeal found that the Immigration Exemption was incompatible with the GDPR.
However:
- The UK Government would impose a data access fee, making it expensive for migrants to exercise their rights and dissuading them from doing so. Even if they pay, migrants’ requests may still be denied by an organisation if fulfilling them is “too onerous”. In essence, this will reintroduce the barriers to accessing migrants’ data that were first created by the unlawful Immigration Exemption.
Students and pupils
The Government would reduce the limits and protections on the use of data for safeguarding concerns and on the deployment of automated tools, thus incentivising the adoption of surveillance systems in schools and for remote lessons.
Remote proctoring is another area where disproportionate harm by automated systems could be normalised and find legal standing under the guise of “detecting cheating”. This will reduce safeguards for pupils against data security risks and against algorithmic decisions that harm their free speech, creativity, social development and educational outcomes through over-policing, bias and discrimination.
Increased freedom to share and use personal data for marketing and ad targeting will expose low-income students and their families to predatory practices by for-profit educational providers.
Case-study 1: A-level failure
Ofqual calculated students’ grades with an unfair grading algorithm that downgraded thousands of students’ marks and threw their futures into disarray.
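As a rough illustration only, the sketch below assumes the widely reported core of the approach, namely fitting individual grades to each school’s historical results; it is not Ofqual’s published model, and the numbers are invented:

```python
# Heavily simplified, hypothetical sketch of a "fit to historical results" grading rule.
# It is NOT Ofqual's published model; it only illustrates how tying individual grades to a
# school's past performance can downgrade students regardless of their own work.

def moderate_grade(teacher_grade: str, rank_in_class: int, class_size: int,
                   historical_top_grade_share: float) -> str:
    """Cap the number of top grades at the share the school achieved in previous years."""
    top_slots = round(historical_top_grade_share * class_size)
    if teacher_grade == "A" and rank_in_class > top_slots:
        return "B"  # downgraded: the school has "used up" its historically expected A grades
    return teacher_grade

# A student ranked 4th of 20, at a school where only 10% of pupils historically got an A,
# is downgraded despite a teacher-assessed A.
print(moderate_grade("A", rank_in_class=4, class_size=20, historical_top_grade_share=0.10))
```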
Foxglove threatened to take the Government to court over the algorithm’s violation of key principles of data protection law and its unfair and unlawful processing of data. In a last-minute U-turn, the Government scrapped Ofqual’s unfair grading algorithm in favour of teacher-assessed grades.
However:
- The UK Government would scrap Article 22, and allow automated decisions with life-changing consequences to be taken opaquely and without human review.
Case-study 2: Remote proctoring
During the Covid-19 pandemic, universities began to use proctoring software to monitor students taking their exams. This software violates fundamental rights by processing large amounts of personal data, including students’ identity, location, and video of their movements and rooms. These systems have also proven to be discriminatory, for instance by falsely accusing BAME students of cheating.
In Italy, the DPA fined Bocconi University €200,000 for using US proctoring software. The DPA’s decree also ordered the University to stop processing students’ personal data and prohibited further transfers of this information to the US.
However:
- The UK Government would introduce a legitimate interest basis for “safeguarding interests”, thus allowing surveillance of students for “detecting cheating” even where this harms or overrides the rights of affected students.
- The UK Government would enable the free flow of personal data to the US, exposing UK students to state surveillance by US agencies.
Victims of domestic violence
Government plans would grant greater freedom to share and acquire personal information: this will amplify the harmful and stigmatising effects of coerced debts, identity fraud, or other phenomena linked to, or revealing of, domestic violence — for instance, by making credit reports or court cases more widely available.
Plans to liberalise public authority use of automated systems would lead to increased reliance on predictive systems in social services, without suitable safeguards against errors and adverse consequences for the families and individuals targeted.
Furthermore, the Information Commissioner’s supervisory powers in the context of public safety would be reduced, giving more leeway to dehumanising law enforcement practices such as mobile phone data extraction from victims of violence and sexual assault, or to excessive amounts of personal data being extracted, stored, and made available to others for public safety reasons.
Case-study 1: Digital strip searches
UK police forces adopted a policy of asking victims of sexual violence for consent to extract their phone data. Victims who refused to consent to a full download of their mobile phone and social media data ran the risk of the investigation into their complaint being dropped.
Big Brother Watch challenged police practices on the basis of lack of transparency and coerced consent, relying on the GDPR rights and principles. The ICO investigated these practices, and found that mobile phone extraction raised issues around legality, necessity and proportionality, and transparency.
Under regulatory pressure, the UK Government scrapped digital strip searches.
However:
- The UK Government would grant the police unprecedented powers to collect and use personal data for law enforcement purposes, for instance through further processing for loosely defined substantial public interests, or a legitimate interest in “reporting crimes”.
- The UK Government would water down the ICO’s supervisory powers by tasking it with a “public safety duty”. In practice, the ICO would be required to ease law enforcement use of personal data rather than scrutinise police practices.
BAME
Government plans would grant public and private entities unprecedented freedom to collect, use, and share information regarding buying habits, social relationships, creditworthiness, lifestyle, hobbies, and personality of BAME individuals.
Watered-down accountability rules and increased bureaucracy for individuals seeking redress will make it difficult to challenge the use of this data when it results in exclusion and discrimination — for instance, from affordable credit, jobs, employment, housing, and other life necessities.
Transparency and safeguards against algorithmic decisions will be significantly reduced, in a context where eligibility determination for public benefits and other services is often based on data streams that perpetuate race and class-based discrimination.
Case-study 1: Uber’s racist facial recognition system
Uber prompts workers to provide a real-time selfie; they face dismissal if the system fails to match the selfie with a stored reference photo. The live facial recognition system used by Uber has proved discriminatory, as it struggles to identify BAME individuals, and has led to unfair dismissals.
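As an invented illustration (not Uber’s actual system; the scores and threshold are made up), the sketch below shows how a single pass/fail threshold turns a model that scores some groups’ genuine matches lower into automated dismissal decisions:

```python
# Hypothetical illustration of threshold-based face verification; not Uber's actual system.
# The similarity scores and threshold are invented. It shows how a model that scores some
# groups' genuine matches lower converts that bias into automated dismissal decisions.

MATCH_THRESHOLD = 0.80  # invented pass/fail cut-off

def verify_selfie(similarity_to_reference_photo: float) -> str:
    """Pass or fail a driver's real-time selfie against their stored reference photo."""
    if similarity_to_reference_photo >= MATCH_THRESHOLD:
        return "verified"
    return "identity check failed: account flagged for deactivation"

# Both drivers submit genuine selfies; the model simply scores one group lower on average.
checks = [
    ("Driver 1 (well represented in training data)", 0.91),
    ("Driver 2 (under-represented in training data)", 0.74),
]
for label, score in checks:
    print(label, "->", verify_selfie(score))
```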
Workers submitted access requests that allowed them to collect the selfies the system analysed. As the pictures show that the system is faulty, drivers are challenging their dismissals under Article 22 of the GDPR.
However:
- The UK Government would scrap Article 22, denying workers recourse against discriminatory dismissals.
- The UK Government would impose a data access fee regime, making it expensive for workers to gather evidence about discrimination and unfair treatment. Even if they pay, workers’ requests may be denied by employers if fulfilling them is “too onerous”.
Case-study 2: Racist automated recruiting tools
BAME job applicants are being automatically rejected because of foreign-sounding names or other racist criteria. The GDPR provides affected individuals with recourse against unfair or discriminatory use of their data.
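For illustration only, the hypothetical sketch below (the “preferred names” list is invented and no real vendor’s code is shown) shows how a name-based shortcut hard-wires discrimination into automated rejections:

```python
# Hypothetical illustration of a name-based CV filter; not any real vendor's product.
# The "preferred names" list is invented to show how an apparently neutral shortcut
# hard-wires racial discrimination into automated rejections.

PREFERRED_FIRST_NAMES = {"James", "Emily", "Oliver"}  # invented proxy for "familiar" names

def screen_application(first_name: str, years_experience: int) -> str:
    """Auto-reject applications whose first name is not on the 'preferred' list."""
    if first_name not in PREFERRED_FIRST_NAMES:
        return "rejected before human review"
    return "forwarded to recruiter" if years_experience >= 2 else "rejected"

# Two candidates with identical experience receive different outcomes solely
# because of their names.
print(screen_application("Oliver", years_experience=5))
print(screen_application("Adebayo", years_experience=5))
```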
However:
- The UK Government would water down accountability rules and introduce data access fees. This will make it more difficult for affected parties to challenge unfair practices, as gathering evidence will become expensive and companies will be able to shred records.
Gamblers
Government plans would give gambling companies greater freedom to collect, use and share information regarding buying habits, social relationships, creditworthiness, lifestyle, hobbies, and personality in order to target problem gamblers or individuals at risk.
Watered-down accountability requirements and the imposition of data access fees on individuals seeking redress would make it more difficult to obtain records of personal information and proof of data being used in underhanded or predatory ways.
This will make it easier to use personal data such as patterns of play, spending tendencies and exposure to risk for marketing and sales objectives, while reducing gamblers’ ability to control the use of their information and challenge these practices.
Case-study 1: Leveraging gamblers’ addiction
Gambling companies have been using data-driven targeted advertising to target serial gamblers with gambling advertisements and keep them hooked on their platforms. In other words, problem gamblers who were trying to quit would start receiving gambling ads pressuring them to bet again.
Serial gamblers have been relying on GDPR data rights to hold gambling companies to account, for instance by obtaining the profile data these companies hold about them and confirmation of whether they have been targeted. Lawyers in the UK are submitting data access requests to uncover malpractice.
However:
- The UK Government would impose a data access fee, making it expensive for data subjects to uncover gambling companies’ abuses and dissuading them from trying. Even if they pay, gambling companies may refuse to hand over data if it is “too onerous for them”.
- The UK Government would water down accountability and record-keeping requirements for organisations, allowing gambling companies to shred evidence of malpractice and evade responsibility.
LGBTQIA+
Government plans would grant public and private entities unprecedented freedom to collect, use and share information regarding buying habits, social relationships, creditworthiness, lifestyle, hobbies, and personality, increasing the likelihood of unwanted disclosure of one’s sexual orientation or gender identity.
The imposition of fees on data access requests will discourage and make it harder for members of the LGBTQIA+ community to understand what data is being stored about them, what it is being used for, and with whom it is being shared.
Watered-down accountability rules will make it harder for authorities and individuals to scrutinise discriminatory practices and hold those responsible to account, including practices directed at members of the LGBTQIA+ community.
Case-study 1: Grindr
Grindr collected and shared with advertisers personal details of its users, including their location, sexual orientation and mental health.
The Norwegian Consumer Council filed a GDPR complaint that resulted in the Norwegian DPA fining Grindr 10 million euros. The DPA also prohibited Grindr from sharing personal data without users’ consent.
However:
- The UK Government would liberalise data sharing for commercial reasons, for instance by providing unconditional legitimate interest grounds to share personal data for “improving services” or “profiling”. This will effectively legalise selling personal data to third parties.
NHS patients / Health data
Government plans would redefine data sharing rules for research purposes, allowing corporations and private entities to seize NHS health data without patients’ consent, even where there is no direct-care relationship with the patient.
Watered-down accountability rules would allow outsourcing without suitable safeguards, and hamper scrutiny of practices that could expose health data to breaches and other privacy risks.
Plans to liberalise the use of data for the development of AI, coupled with reduced safeguards against algorithmic decision making, would expose patients to potentially unsound medical assessments or advice provided by automated systems.
Case-study 1: Google DeepMind
Google DeepMind collected the health records of 1.6 million NHS patients without patients’ knowledge or consent. It also failed to consult the ICO as required by law, and to implement contractual safeguards with the Royal Free Trust.
As part of a settlement with the ICO, the Royal Free Trust and Google DeepMind had to conduct an audit of their data protection practices.
A UK-based law firm is now bringing a representative action on behalf of NHS patients over Google DeepMind’s unlawful grab of NHS health records.
However:
- The UK Government would establish a new legal basis for research and undermine purpose limitation, thus allowing NHS patients’ data to be shared with private corporations without their knowledge or consent.
- The UK Government would scrap accountability rules and contractual safeguard duties, thus legalising Google DeepMind’s malpractices.
Children and Parents
Government plans would grant unprecedented freedom to collect, use, and share information regarding buying habits, social relationships, creditworthiness, lifestyle, hobbies, and personality of parents and children for marketing purposes.
This will expose parents and children to increasing exploitation of their information for commercial interests, as well as distress. The imposition of data access fees and increased bureaucracy for seeking redress will make it difficult for parents to be aware of, identify and challenge abuses.
At the same time, watered-down accountability requirements will hamper the Information Commissioner’s ability to scrutinise practices and act upon infringements.
Case-study 1: Bounty UK
Bounty UK illegally sold personal data of 14 million mothers and children to data brokers.
The ICO fined Bounty £400,000 for failing to identify a legal basis for processing and for lack of fairness and transparency. Privacy International is helping victims to submit data access requests to Bounty and the data brokers to regain control of their data.
However:
- The UK Government would provide a legitimate interest basis for processing personal data “to improve customer services”, allowing companies like Bounty to sell personal data for their own profit, even without your consent.
- The UK Government will impose data access fees, making it expensive and more difficult to regain access to and control over personal data. Companies that have broken the law would earn money by charging you a fee to exercise the very rights they violated. Even if you pay, Bounty may deny your request if it is “too onerous for them”.
Mental Health
Government plans would grant unprecedented freedom to collect, use, and share information regarding buying habits, social relationships, creditworthiness, lifestyle, hobbies, and personality. In turn, this will allow organisations to gain knowledge of and exploit individuals’ anxieties or mental health vulnerabilities for marketing and commercial purposes.
The imposition of data access fees and increased bureaucracy for individuals seeking redress will make it difficult for victims to challenge these practices, or ask for the deletion of information regarding their health conditions.
Watered-down accountability requirements and relaxed safeguards over algorithmic decision-making will hamper the Information Commissioner’s ability to prevent and investigate harms.
Case-study 1: Facebook algorithm of trauma
Facebook shares data about its users’ anxiety and mental health with advertisers, allowing them to exploit individuals’ vulnerabilities and traumas for commercial purposes.
Facebook shares data with advertisers without your consent, a practice that is about to be declared illegal in the European Union.
However:
- The UK Government would provide unprecedented freedom to use your data for commercial purposes, which will likely legalise rather than challenge Facebook’s practices.
- The UK Government will impose a duty on the ICO to evaluate the impact on Facebook’s profits before enforcing your rights. Even when Facebook breaks the law, you may be denied a remedy in the name of “innovation” and “economic growth”.