Australian technology leaders are losing sleep over increasing regulatory complexity, growing damage from ransomware, and the challenges of near-ubiquitous artificial intelligence (AI) and deepfakes, a new survey by cybersecurity body ISACA has found.

Generative AI (genAI) and large language models will drive the agenda in 2026, with 64 per cent of Oceania respondents to ISACA’s 2026 Tech Trends & Priorities Pulse Poll – which surveyed nearly 3,000 global security professionals – naming them as key.

That places genAI ahead of AI and machine learning (60 per cent), data privacy and sovereignty (34 per cent), and supply chain risk (34 per cent) as a transformative force.

Yet for all its promise, genAI has those risk and security professionals worried – with 67 per cent saying that AI-driven cyber threats and deepfakes will keep them up at night in 2026, and only 8 per cent saying they’re very prepared to manage its risks.

Some 45 per cent worry most about the “irreparable harm” of failing to detect or respond to a major breach, while 41 per cent worry about supply chain vulnerabilities like those that hit Qantas, Dymocks, and British Airways.

Technical issues such as cloud misconfigurations and shadow IT (named by 38 per cent of respondents) are also causing security executives to toss and turn, as are fears that regulatory complexity (36 per cent) will put increasing pressure on security practices.

To address this, respondents named regulatory compliance (58 per cent), business continuity and resilience (52 per cent), and cloud migration and security (48 per cent) as top focus areas – with three-quarters expecting that cyber regulations will boost digital trust.

Security leaders “are dealing with constant AI-driven threats, tighter regulation and growing expectations from executives, all while struggling to find and keep the right people,” ISACA Board vice chair Jamie Norton said.

“It’s a perfect storm that demands stronger leadership focus on capability, wellbeing and risk management.”

The monster under the bed

For an industry that was already highly stressful, the new threats posed by genAI have only made things worse – ratcheting up the pressure on chief information security officers (CISOs) who were feeling the strain long before the technology emerged.

ISACA’s findings corroborate other recent surveys, such as Proofpoint’s 2025 Voice of the CISO survey of 1,600 CISOs, which found 76 per cent of Australian CISOs had dealt with the material loss of sensitive information over the past 12 months.


With 80 per cent of Australian CISOs feeling that they are held personally accountable when a cybersecurity incident happens – well above the global average of 67 per cent – genAI is only exacerbating what was already a significant source of stress.

It “adds to the pressure on CISOs to secure their organisations in the face of a rapidly changing threat and technological landscape,” Proofpoint found, with “expectations high and increasing numbers feeling the pressure and experiencing burnout.”

Accounts of the heart attack suffered by former SolarWinds CISO Tim Brown – who not only struggled to clean up the company’s major 2020 breach but was also charged with fraud by the US SEC – have highlighted just how heavy a human toll the pressure is taking.

Setting the agenda for 2026

AI services and infrastructure are driving a surge in global ICT spending, Gartner recently said, predicting spending will grow 9.8 per cent next year to pass $9 trillion ($US6 trillion) for the first time – much of it driven by genAI technologies.

Yet for all their companies’ spending plans, executives see managing those technologies as a major part of the challenge, with many respondents to the ISACA survey worried they won’t be able to find the staff to help them do so properly.

Some 37 per cent of Australian organisations expect to expand their hiring next year compared to this year, ISACA found – but the one-third of respondents who plan to hire audit, risk and cybersecurity professionals expect to have trouble finding the right people.

This disconnect was identified in the recent major OECD Science, Technology and Innovation Outlook report, which noted that the security and resilience aspects of Australia’s science, technology and innovation policies “are relatively less apparent”.

This, then, is the crux of the fears that ISACA survey respondents carry with them – compliance is seen as crucial, yet 30 per cent say they are not very, or not at all, prepared to deliver the oversight they need.

“With only eight per cent saying they feel very prepared for generative AI’s risks,” ISACA Oceania ambassador and ACS Fellow Jo Stewart-Rattray said, “there’s an urgent need to balance experimentation and usage with robust oversight.”