
UK regulators lack the skills and expertise to cope with increasing use of algorithms

MPs told that multiple regulators will be needed to govern the ever-growing use of algorithmic systems in all areas of the economy and public sector

Many of the regulatory bodies overseeing algorithmic systems and the use of data in the UK economy will need to build up their digital skills, capacity and expertise as the influence of artificial intelligence and data increases, MPs have been told.

Expert witnesses told MPs sitting on the Digital, Culture, Media and Sport (DCMS) subcommittee on online harms and disinformation this week that existing frameworks for regulating data use and algorithms were insufficient, and suggested that the way forward is to move to a more fluid regulatory framework in which multiple regulators are empowered to take action.

Carly Kind, director of the Ada Lovelace Institute, said self-regulation of the data economy by the enterprises that operate in it had failed, which she attributed largely to the size and scale of the dominant platforms, as well as the incentives of the data economy itself.

“I think that their expansiveness means they’re not incentivised, necessarily, to meet the kind of standards we might expect from a public trust perspective,” she said.

“I think this relates to the underlying question around the data economy which incentivises the use of personal data for advertising purposes, and it doesn’t incentivise necessarily putting the brakes on.”

Kind added that although she thinks many actors in the digital economy, including platforms such as Google and Facebook which have taken action to address the challenges of mis- and disinformation, for example, are well-intentioned, “the scale of the problem is so great that I think having some external accountability mechanism is absolutely imperative in order to start to create an online space that is more hospitable to a wide range of communities”.

Jeni Tennison, vice-president of the Open Data Institute, said that although the Information Commissioner’s Office can be the general information regulator, different regulators – including the Competition and Markets Authority, Ofqual and Ofgem – will need the internal expertise and capacity to begin interrogating how the misuse of data might affect a particular sector, ecosystem or community.

Kind said that on top of developing the technical skills to understand various aspects of the data economy and technology in general, these regulators need to be empowered to demand access to a range of systems, algorithms and information, in both the public and private sectors, to scrutinise them effectively in the first place.

“Technical skills alone are not all that regulators need to be able to inspect algorithms – they will also need a legal framework, so they will need the powers to get in there and to assess the algorithms against a certain set of standards,” she said, adding that “broad notions of ethics are insufficient” and that “specific thresholds and standards” are needed.

“I think they’ll need a kind of infrastructure in order to match those technical capabilities with the powers, and to do the kind of audit we might have previously seen in the financial sector – real scrutiny of systems that’s not only about a technical audit, but also about interviewing and demanding information,” said Kind.
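Kind’s call for “specific thresholds and standards” points to the kind of concrete check a technically equipped regulator could run. The sketch below is purely illustrative and not anything proposed in the hearing: the demographic parity metric, the 0.05 threshold and all function names are assumptions chosen for the example.

```python
# Illustrative only: a toy audit of the kind a regulator might run against
# an agreed standard. The metric and the 0.05 threshold are assumptions for
# this example, not standards proposed in the committee hearing.
from collections import defaultdict

def demographic_parity_gap(decisions):
    """decisions: iterable of (group, approved) pairs.
    Returns the largest gap in approval rate between any two groups."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    rates = [approved[g] / total[g] for g in total]
    return max(rates) - min(rates)

def audit(decisions, threshold=0.05):
    """Pass/fail check of a decision log against a fixed fairness threshold."""
    gap = demographic_parity_gap(decisions)
    return {"gap": round(gap, 3), "passes": gap <= threshold}

# Example: loan approvals logged by postcode area (fabricated sample data).
sample = [("N1", True), ("N1", True), ("N1", False),
          ("S9", True), ("S9", False), ("S9", False)]
print(audit(sample))  # {'gap': 0.333, 'passes': False}
```

A real inspection regime would go much further – interviewing staff, reviewing training data and documentation – but even a simple numeric threshold of this kind turns “broad notions of ethics” into something a regulator can test for and enforce.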

However, she acknowledged that, when it comes to building technical skills and capacity in regulators, the dominant tech platforms have a near-monopoly on recruiting new talent. “Regulators simply can’t offer the incentives and salary to join that a Google or Facebook can,” she said, adding that this will need to be addressed.

“The ICO has effectively become the regulator for everything because data is everything, and they simply just don’t have the people power to enforce all these micro violations, which are very important in one person’s life, but in the grand scheme of things there are tens of thousands of them happening every day,” she said.


Scores of high-level data ethics frameworks, principles and guidelines have been published over the last few years, but they have not yet been translated into practice and made tangible, she said.

“I think what we are moving towards is an agile regulatory framework which tries to bring in ethical considerations and puts in place processes, rather than hard-and-fast rules,” said Kind.

“Things like impact assessments, risk assessments, audits, inspections, the ability for people to get redress – these types of tools and processes can help us evolve a regulatory framework, without just being a hard-and-fast set of rules that will be outdated when the platforms do their next software update.”

For Jiahong Chen, a research fellow in IT law at Horizon Digital Economy Research at the University of Nottingham, a key part of this process-driven approach is having debate and consultation before any system that makes important decisions about a person’s life is put in place.

“We have anti-discrimination law, under which it is illegal to treat people differently because of protected characteristics, but what we’re seeing now is that a lot of complex systems are put in place, and then we have no idea what factors have been taken into account, or how those factors have been accounted for,” said Chen.

“Decisions as important as, for example, going to university or even financial decisions in the private sector, should be subject to prior impact assessments and also consultations with the general public, so they can really think about ‘is it fair to make decisions based on what postcode I’m in or how many steps I’m taking per day?’”
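Chen’s postcode and step-count examples suggest what a prior impact assessment might actually check before such a system goes live. The following sketch is an assumption for illustration, not a real regulatory tool or a statement of UK law: it simply flags model inputs that are protected characteristics or known proxies for them.

```python
# Illustrative sketch of a pre-deployment impact-assessment gate.
# The attribute lists are assumptions for this example; in the UK the
# protected characteristics are actually defined by the Equality Act 2010,
# and detecting proxy variables is far harder in practice.
PROTECTED = {"age", "sex", "race", "disability", "religion"}
KNOWN_PROXIES = {
    "postcode": "race and socioeconomic status",
    "daily_steps": "disability and health status",
}

def review_features(feature_names):
    """Return the concerns an assessor should raise before deployment."""
    concerns = []
    for name in feature_names:
        if name in PROTECTED:
            concerns.append(f"'{name}' is a protected characteristic")
        elif name in KNOWN_PROXIES:
            concerns.append(f"'{name}' may act as a proxy for {KNOWN_PROXIES[name]}")
    return concerns

# Example: a hypothetical university admissions model.
for issue in review_features(["exam_grades", "postcode", "daily_steps"]):
    print("flag:", issue)
```

Checks like this cannot settle whether a factor is fair – that is exactly the question Chen argues should go to public consultation – but they do make visible, before deployment, which factors a system is taking into account.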

Asked by MPs about the government’s National Data Strategy, published in September 2020, and whether it gets the balance right between the desire to create growth and the need to create a trusted data regime, Tennison said this was a false dichotomy.

“One of the things about the National Data Strategy is that it seems to be trying to set up a dichotomy between innovation and responsibility, when in fact we can do both and we should do both,” she said. “Having responsible processing of data and responsible algorithms is not only the right thing to do, but it’s absolutely necessary in order to win trust, in order to get adoption of those technologies.”

The DCMS subcommittee was established in March 2020 to consider a broad range of issues related to disinformation and online harms, including forthcoming legislation to regulate the latter.
