
Sunak says UK shouldn’t ‘rush to regulate’ AI


Rishi Sunak has said the UK “shouldn’t be in a rush to regulate” the development of artificial intelligence (AI) despite a dossier of potential dangers laid out by the government.

The Prime Minister made a speech on the risks and rewards of the new technology at the Royal Society this morning ahead of the UK’s AI Safety Summit at Bletchley Park next week.

Asked about regulation, Sunak said: “I think we shouldn’t be in a rush to regulate for a couple of reasons.”

He said the UK’s approach of encouraging innovation has “historically” been the right one, and stressed it was “hard to regulate something if you don’t fully understand it”.

Sunak said: “We as a country tend to get this right. We tend to take a principles-based, proportionate approach to regulation that protects the things that we need to protect, whilst allowing the maximum amount of innovation to happen here.

“That is the hallmark of the UK – that’s why we have such successful innovative sectors like technology, life sciences and financial services.

“We need to not lose that as we think about AI and that’s why I think our approach is absolutely the right one for the country.”

The Prime Minister also said mitigating the risk of extinction from AI should be a global priority alongside pandemics and nuclear war, and added that he wanted to be “honest” with the public.

It came after the government revealed a dossier of warnings about how AI could develop up to 2030, saying it could not rule out the technology posing an “existential threat” to humanity.

Potential dangers cited included cyberattacks; terror groups developing bioweapons; rising unemployment; increased poverty; scams, fraud and fake news; election interference; the theft of trade secrets; and “societal unrest” as people “fall victim to organised crime”.

An AI Safety Institute – based on the work of the AI Safety Taskforce – is part of the government’s plan to address these potential threats.

Sunak added: “As we understand what the risks are – if they manifest themselves – then we’ll be in a far better place to figure out what is the appropriate action to take at that moment.

“When you’re dealing with something so fast moving, and not fully understood even by the people who are developing the tech themselves, it’s hard to say ‘this is the best way to regulate’.

“I think first build the understanding and we can maintain our pro-innovation approach.

“Then move to something more practical down the line when we know exactly what we’re dealing with.”

Peter Kyle, Labour’s shadow science and technology secretary, said: “AI is already having huge benefits for Britain, and the potential of this next generation of AI could be endless, but it poses risks as well.

“Safety must come first to prevent this technology getting out of control. Rishi Sunak should back up his words with action and publish the next steps on how we can ensure the public is protected.

“We are still yet to see concrete proposals on how the government is going to regulate the most powerful AI models.”
