Advertorial: in association with Institution of Engineering and Technology
28 April 2025

Artificial intelligence and energy security

Advanced technology can be a double-edged sword.

By Stephanie Baxter

Energy infrastructure outages can have widespread and serious consequences for both individuals and society. As well as the impact of physical damage to infrastructure, such as the recent substation fire that forced the closure of Heathrow Airport, vulnerability to AI misinformation and cyberattacks is rapidly becoming one of the biggest threats to the UK, and one which government must mitigate.

The damage an electricity blackout would do to the UK economy depends on factors such as its duration, its geographic reach and the sectors affected. However, estimates suggest that a nationwide blackout lasting 24 hours could cost billions of pounds. These are not just hypothetical scenarios: between 2023 and 2024 the UK economy lost £17.6bn in economic output to connectivity outages, with the average UK business losing more than £11,000.

These were relatively short power outages. However, if a system blackout lasts more than a few days, the economic damage could reach tens of billions of pounds. There would be supply chain failures, an impact on hospitals and emergency services, mass business closures and potential social unrest. That’s why investment in cybersecurity and network resilience today could save billions in lost revenue tomorrow.

The increasing number of devices connected by digital networks, including in energy infrastructure, exposes these systems to new risks. If they are fed with incorrect or misleading information, they might fail to identify potential issues such as impending equipment failures or capacity shortages. Misinformation can be a tool of cyber-attackers aiming to disrupt grid operations: AI-driven malicious misinformation campaigns could mislead operators or automated systems, causing disruptions and outages. An AI system manipulated by false data could also open vulnerabilities that hackers could exploit, potentially resulting in a third party taking control of critical infrastructure, disrupting operations or gathering confidential information.

To mitigate these risks, it is crucial to ensure that AI systems are robust, transparent and subject to comprehensive validation and verification processes. Developers should also have access to tools and techniques that help them demonstrate to regulators that their systems are safe and fit for purpose, supported by competency frameworks and lists of recognised qualifications to provide assurance of organisational and developer competence. Alongside secure AI systems, training for all staff and effective cybersecurity measures are essential to protect systems from manipulation.

Workers will need differing levels of awareness and training on AI depending on their organisational roles – such as “working”, “practitioner” and “expert”. There is a challenge in finding people with the required skills at competitive salaries, which is why the new Growth and Skills Levy should ensure flexible funding, particularly for SMEs, to upskill existing workers through bespoke short courses (micro-credentials) that guarantee a basic standard of safety and competency for those at “working” and “practitioner” level. At the higher “expert” level, key cybersecurity roles should have protected status (in the same way as “medical doctor”) to help drive up and guarantee standards.

While the integration of new technologies into our energy infrastructure poses threats, there are also significant security and resilience benefits to be gained from harnessing them in a safe way.

For example, by integrating AI alongside the adoption of cyber-physical systems such as “digital twins”, virtual models connected to a real-world counterpart by a two-way flow of right-time data, we can monitor and rapidly address faults, boosting security and resilience. This can be, and already is being, adopted on a case-by-case basis, but the potential benefit of a whole-systems approach is game-changing: if the government coordinates industry to bring the digital twins of critical energy infrastructure together into one holistic model, that model could monitor and address issues across the whole energy system. By joining up monitoring and intervention across generation, transmission and consumption, government can ensure a secure supply of energy across the UK.
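To make the idea concrete, the minimal Python sketch below shows what the monitoring side of such an arrangement might look like: each virtual asset ingests live readings from its physical counterpart, compares them with what its model expects, and a system-level twin joins those views up. All class names, figures and thresholds here are hypothetical and invented purely for illustration; they are not drawn from any real grid operator’s system, and the sketch covers only the monitoring half of the two-way data flow.

    # Illustrative sketch only: names and thresholds are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class AssetTwin:
        """Virtual model of one piece of energy infrastructure (e.g. a substation)."""
        name: str
        expected_load_mw: float            # load the model predicts the asset should carry
        tolerance_mw: float = 5.0          # deviation treated as a potential fault
        latest_reading_mw: float | None = None

        def ingest(self, reading_mw: float) -> None:
            # Physical-to-virtual flow: update the twin with a live measurement.
            self.latest_reading_mw = reading_mw

        def check(self) -> str | None:
            # Compare the measurement with the model and flag any drift beyond tolerance.
            if self.latest_reading_mw is None:
                return f"{self.name}: no data received"
            drift = abs(self.latest_reading_mw - self.expected_load_mw)
            if drift > self.tolerance_mw:
                return f"{self.name}: load off-model by {drift:.1f} MW"
            return None

    @dataclass
    class SystemTwin:
        """Holistic model joining up twins for generation, transmission and consumption."""
        twins: list[AssetTwin] = field(default_factory=list)

        def issues(self) -> list[str]:
            # Whole-system monitoring: gather faults from every connected twin.
            return [fault for twin in self.twins if (fault := twin.check())]

    # Hypothetical example: two asset twins rolled into one system-level view.
    grid = SystemTwin([AssetTwin("Substation A", expected_load_mw=120.0),
                       AssetTwin("Wind farm B", expected_load_mw=80.0)])
    grid.twins[0].ingest(133.5)   # drifts beyond tolerance, so it is flagged
    grid.twins[1].ingest(82.0)    # within tolerance, so no issue is raised
    print(grid.issues())          # ['Substation A: load off-model by 13.5 MW']

In a real deployment the virtual-to-physical half of the loop would carry interventions back to the assets themselves; the point of the sketch is simply how separate twins can be aggregated into one whole-system view.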

But we cannot realise the potential of these technologies without the skilled workforce to adopt and use them. The UK’s engineering and technology firms are the least likely to recognise digital twins as a priority for reaching net zero (5 per cent), and fewer than a quarter of employers think we have the skills in this area, which include data collection and analysis.

The introduction of AI into the systems that control our energy infrastructure is already under way, bringing both the potential to strengthen security and resilience through innovation, and the risk of system failure, whether through unintentional faults or through the ongoing efforts of malicious actors to access and weaken our systems. The best way to reap the benefits of new technologies and mitigate the risks is to ensure all workers are given sufficient and appropriate training in the safe and effective use of AI and digital systems, along with investment in cybersecurity expertise and robust regulation.

This article first appeared in our Spotlight Energy and Climate Change supplement of 24 April 2025
