
Security Events

Protection of AI models against cyber attacks enhanced

New measures, anticipated to establish a global benchmark for enhancing the protection of AI models against hacking and sabotage, were unveiled today by the UK government.


During a speech at CYBERUK, the government’s flagship cyber security conference, Technology Minister Saqib Bhatti announced two new codes of practice to help developers improve the cyber security of AI models and software, putting the UK economy on an even stronger footing to grow safely and supporting the government’s long-term growth ambitions for the British economy.


The codes set out requirements for developers to make their products resilient against tampering, hacking and sabotage. They are expected to boost confidence in the use of AI models across most industries, helping businesses improve efficiency, drive growth and turbocharge innovation.

In the last 12 months, half of businesses (50%) and around a third of charities (32%) reported cyber breaches or attacks, with phishing remaining the most common type of breach. The codes introduced today show developers how software can be built securely, with the aim of preventing attacks such as the 2023 attack on the MOVEit software, which compromised sensitive data in thousands of organisations around the world.

Technology Minister Saqib Bhatti said: "We have always been clear that to harness the enormous potential of the digital economy, we need to foster a safe environment for it to grow and develop. This is precisely what we are doing with these new measures, which will help make AI models resilient from the design phase.

"Today’s report shows not only are we making our economy more resilient to attacks, but also bringing prosperity and opportunities to UK citizens up and down the country. It is fantastic to see such robust growth in the industry, helping us cement the UK’s position as a global leader in cyber security as we remain committed to foster the safe and sustainable development of the digital economy."

The new measures come as the findings of a new report published today show that the cyber security sector has experienced 13% growth on the previous year and is now worth almost £12 billion, on par with sectors such as the automotive industry.

The findings come from the government’s annual Cyber Sectoral Analysis Report and show that the number of cyber security firms making their home in the UK rose in 2023, strengthening the UK’s resilience to attacks and propelling sustainable economic growth.

The new codes of practice will improve cyber security in AI and software, while new government action on cyber skills will help develop the cyber workforce and ensure the UK has the people it needs to protect the nation online.

NCSC CEO Felicity Oswald said: "To make the most of the technological advances which stand to transform the way we live, cyber security must be at the heart of how we develop digital systems.

"The new codes of practice will help support our growing cyber security industry to develop AI models and software in a way which ensures they are resilient to malicious attacks.


"Setting standards for our security will help improve our collective resilience and I commend organisations to follow these requirements to help keep the UK safe online."

These measures are crucial for new businesses in the digital age, demonstrating a commitment to cyber security, safeguarding users’ personal data and fostering global alignment for enhanced cyber resilience.

The AI cyber security code is intended to form the basis of a future global standard.

Rosamund Powell, Research Associate at The Alan Turing Institute, said: "AI systems come with a wide range of cyber security risks which often go unaddressed as developers race to deploy new capabilities. The code of practice released today provides much-needed practical support to developers on how to implement a secure-by-design approach as part of their AI design and development process.

"Plans for it to form the basis of a global standard are crucial given the central role international standards already play in addressing AI safety challenges through global consensus. Research highlights the need for inclusive and diverse working groups, accompanied by incentives and upskilling for those who need them, to ensure the success of global standards like this."

Today also marks the publication of the Capability Hardware Enhanced RISC Instructions (CHERI) report, introducing a new microprocessor technology, known as the 'magic chip', which integrates advanced memory protections that could prevent up to 70% of current cyber attacks.

Alongside this, Minister Bhatti this morning announced new initiatives setting out how the government and regulators will professionalise the cyber security sector, such as incorporating cyber roles into government recruitment and HR policies.

The minister also spoke about his intention to foster cyber skills among young people and inspire them into cyber careers, with the UK launching a campaign to encourage entries to a brand new national cyber skills competition for 18–25-year-olds later this year. The competition will give the winners the opportunity to represent the UK at international cyber competitions.


Defence Security

Peli launches 9730 RALS

8 January 2026

Peli Products has launched the Peli 9730 Remote Area Lighting System (RALS), a next-generation lighting solution combining power, safety and portability.


Defence Security

Cranfield University continues collaboration with HMGCC

7 January 2026

Cranfield University is continuing to help address national security engineering challenges through an ongoing collaboration with HMGCC (His Majesty’s Government Communications Centre) and its Co-Creation initiative: a partnership with Dstl (Defence Science and Technology Laboratory).


Aerospace Defence Security

IFS to acquire Softeon

6 January 2026

IFS today announced that it has entered into a definitive agreement to acquire Softeon, a provider of cloud-native Warehouse Management, Warehouse Execution and Distributed Order Management solutions.


Defence Security

Defence Medical Services awards Project Mercury contract to Avenue3

6 January 2026

A £2.5 million contract to develop a Deployed Clinical Record system to enable defence clinicians to access military medical records anywhere in the world - Project Mercury - has been awarded by the Defence Medical Services to Leeds-based digital healthcare solutions consultancy Avenue3.


Security

Cyber action plan aims to bolster resilience of public services

6 January 2026

Backed by over £210 million, a new UK Government Cyber Action Plan published today sets out how government will rise to meet the growing range of online threats, introducing measures that aim to make online public services more secure and resilient, so people can confidently use them - whether applying for benefits, paying taxes or accessing ...


Aerospace Defence Security

Babcock leads new STEM pilot in Plymouth

5 January 2026

Babcock International Group is to lead a new STEM pilot in Plymouth as part of a major UK Government £182 million national skills drive.
