Data Use and Access Bill will fail to protect public from AI harms


The Data Use and Access Bill will fail to protect the public from harmful uses of artificial intelligence, say digital rights campaigners Open Rights Group. The Bill, published today, rehashes many of the provisions of the previous government’s controversial Data Protection and Digital Information Bill, which was ditched in wash-up prior to the General Election.

Legal and Policy Officer Mariano delli Santi said:

“Strong data protection laws are an essential line of defence against harmful AI and automated decision making (ADM) systems which can be used to make life-changing decisions.

“The Data Use and Access Bill weakens our rights and gives companies and organisations more powers to use automated decisions. This is of particular concern in areas of policing, welfare and immigration where life-changing decisions could be made without human review.

“The Government says that this Bill will generate billions for the economy but at what cost to the privacy, security and dignity of the British public?”

Petition: people, not machines should oversee life-changing decisions

The Data Bill must protect our right to request a human review of automated decisions that impact our lives

Sign and Share

Key takeaways

Automated decision-making

Currently, governments and corporations cannot use automated systems to make decisions that have a legal or significant effect on our lives. The Data Use and Access Bill removes this right unless the processing involves special category data, such as health data or political beliefs. This means that organisations can use automated decision-making to make life-changing decisions – such as firing workers, calculating wages, or deciding on visa and benefits applications. It also gives the Secretary of State the power to exempt automated decision-making systems from data protection safeguards outright, regardless of the risk they pose to the public.

Data sharing

It will be easier for organisations to share data, meaning that data collected for one reason can be shared with public authorities and private companies who may use it for something different, such as immigration control, predictive policing or safeguarding. For example, data collected by GPs could be shared with the police or Home Office.

Weakening police accountability

The Bill would remove the need for police to log why they are accessing personal data using automated systems. This supposed cutting of red tape reduces accountability and transparency, and makes abuse more likely.

Undermining our data rights

Individuals’ data rights will be undermined by new loopholes that allow companies and organisations to extend the time limit for responding to individuals’ requests (such as access to data or erasure) by asking for further information.

The Bill will also enable data grabs of our personal information under the guise of ‘research’, allowing personal data to be used for commercial purposes instead.

Undermining the independence of the ICO


The Government will remain fully in charge of selecting members of the new Information Commission at its own discretion, and of determining their salaries. This allows the Government to interfere with the independence of the ICO, which already faces criticism for failing to stand up for the public against government and corporate pressure. There are no measures for Parliament to approve appointments, or for individuals to challenge the ICO in court, leaving substantive ICO decisions very hard to challenge, even at Judicial Review. See https://www.openrightsgroup.org/blog/the-ico-is-leaving-an-ai-enforcement-gap-in-the-uk/

Hands Off Our Data