The UK government is not including the use of AI on its required register

Admin, The UK Times
28 Nov 2024 • 06:27 am
The science and technology secretary has admitted that government departments are not being open about how they use AI and algorithms.

Not a single department has registered the use of an AI system since the government made doing so mandatory. This has raised concerns that the public is in the dark about how algorithmic technology is being used in the public sector, even though it affects millions of people.

AI is already used by the government to help make decisions on matters such as benefit payments and immigration enforcement. Records show that public agencies have awarded many contracts for AI and algorithmic services. For example, a police agency linked to the Home Office recently advertised a £20m contract for facial recognition software, prompting worries about widespread surveillance.

However, only nine algorithmic systems have been added to a public register, and none of the AI programs used in welfare, by the Home Office, or the police are listed. This lack of information comes even though the government made it a requirement in February to register AI use across all departments.

Experts have warned that if AI is used without careful consideration, it could cause harm. There have been recent examples of technology not working as expected, like the Post Office’s Horizon software. In the UK government, AI is being used in many ways, from Microsoft’s Copilot system, which is being tested, to automated checks for fraud in the benefits system. The Department for Work and Pensions (DWP) has mentioned that interest in AI is growing quickly, both within the department and across the government.

Peter Kyle, the secretary of state for science and technology, admitted that the public sector hasn’t been clear enough about how it uses algorithms. When asked about the lack of transparency, Kyle said that the public has the right to know how the government is using algorithms. He believes people should feel that these algorithms are there to help them, not control them. The only way to achieve this is by being open about how they are used.

Big Brother Watch, a privacy rights group, criticized the police’s use of facial recognition technology. They said it is another example of the government not being transparent about how it uses AI, especially since MPs have warned that there is no proper law to regulate its use.

“The secret use of AI and algorithms to affect people’s lives puts everyone’s data rights at risk. Government departments need to be clear and honest about how they use this technology,” said Madeleine Stone, the group’s chief advocacy officer.

The Home Office did not comment.

The Ada Lovelace Institute recently warned that AI systems might seem like they reduce administrative work, “but they can harm public trust and reduce benefits if the predictions or results are unfair, damaging, or ineffective.”

Imogen Parker, an associate director at the data and AI research group, said: “Lack of transparency not only keeps the public unaware, but also means the public sector is making decisions without full understanding. Not sharing transparency records makes it harder for the public sector to figure out if these tools are working, learn from mistakes, and monitor their social effects.”

Since the end of 2022, only three algorithms have been added to the national register. These include a system used by the Cabinet Office to identify valuable historical digital records, an AI-powered camera to analyze pedestrian crossings in Cambridge, and a system to analyze NHS patient reviews.

However, since February, there have been 164 contracts with public bodies that mention AI, according to Tussell, a company that tracks public contracts. Tech companies like Microsoft and Meta are actively promoting their AI systems to the government. A recent report funded by Google Cloud claims that increasing the use of generative AI could save the public sector up to £38 billion by 2030. Kyle called it “a powerful reminder of how generative AI can transform government services.”

Not all AI used by the public sector involves personal data. For example, a £7 million contract with Derby city council is aimed at “Transforming the Council Using AI Technology,” and a £4.5 million contract with the Department for Education is focused on “improving the performance of AI for education.”

A spokesperson for the Department for Science, Innovation and Technology confirmed that transparency standards are now required for all departments and said that “several records are set to be published soon.”

Where is the government already using AI?

  • The Department for Work and Pensions (DWP) is using AI to read over 20,000 documents daily to understand and summarize them. This helps share important information with officials for decision-making. DWP also uses AI to detect fraud and mistakes in universal credit claims. AI helps agents working on personal independence payment claims by summarizing evidence. This fall, DWP began using basic AI tools in jobcentres, allowing work coaches to ask questions about universal credit rules to improve conversations with jobseekers.
  • The Home Office uses an AI system to manage immigration enforcement, sometimes called a “robo-caseworker.” This system helps decide things like returning people to their home countries. The government says it is a “rules-based” system, not one that learns from data. It is designed to prioritize work, but humans are still responsible for making the final decisions. This system is used to manage the rising number of asylum seekers, about 41,000 people.
  • Several police forces, including the Metropolitan Police, South Wales Police, and Essex Police, use facial recognition software to track suspected criminals. Critics say this could turn public spaces into high-tech police line-ups, while supporters argue it helps catch criminals and that the data of innocent people is not retained.
  • NHS England has a £330m contract with Palantir to build a large new data platform. The deal with the US company, which builds AI tools, has raised concerns about patient privacy, although Palantir says it does not control the data; its customers do.
  • An AI chatbot is being tested to help people use the gov.uk website. It was built by the government’s digital service using OpenAI’s ChatGPT technology. Another AI chatbot, called Redbox, is used by civil servants in Downing Street and other departments to quickly look through secure government documents and get summaries and briefings.
