Cybersecurity still fails at the click

By Justin Fong

Human behaviour, not technology, remains the critical factor in public sector cyber resilience.

Cybersecurity programmes assume rational behaviour - that staff will carefully read messages, evaluate risks logically, and follow procedures consistently. But neuroscience suggests vulnerability can arise from the interaction of situation, emotion, and personality. Image: Canva. 

Governments around the world have invested heavily in cybersecurity infrastructure - from zero-trust architectures to artificial intelligence (AI)-driven threat detection.

Yet phishing and social engineering remain the most common entry points for breaches.

The reason is simple. Most attacks don’t target systems; they target human decision-making:

  • Someone clicks a link.
  • Someone responds to an urgent email.
  • Someone trusts what looks legitimate.

These are not failures of training or diligence. They are the result of how the human brain makes decisions under pressure.

Cybersecurity programmes assume rational behaviour - that staff will carefully read messages, evaluate risks logically, and follow procedures consistently.

But neuroscience suggests otherwise. Our brain operates using two interacting systems:

  • A fast, automatic system optimised for speed and response
  • A slow, analytical system responsible for reasoning and verification

Under stress, time pressure, or cognitive overload, the fast system dominates. And phishing attacks succeed because they trigger this system before analytical thinking can engage.

What happens when a phishing email arrives

When an email appears, the brain does not first ask, “Is this legitimate?” It asks:

  • “Is this urgent?”
  • “Is this from someone important?”
  • “Will there be consequences if I ignore this?”

Attackers, therefore, design messages to exploit these instinctive evaluations.

This aligns with behavioural research on phishing susceptibility, which shows that vulnerability arises from the interaction of situation, emotion, and personality.

Context shapes attention

Public sector officers operate in environments characterised by high email volumes, tight deadlines, multiple reporting lines, and strong hierarchical norms.

Justin Fong: Effective cyber resilience requires habits that align with how the brain works.

Cognitive science shows that attention is a limited resource, and when workload increases, the brain prioritises speed over scrutiny.

Capitalising on this vulnerability, attackers deliberately time phishing emails to land during reporting cycles, near financial or compliance deadlines, and before holidays or system cut-offs.

In these contexts, clicking is not irrational. It is predictable.

Feelings precede logic

Emotions play a central role in decision-making.

They are reflexive: they act faster than conscious reasoning and guide behaviour before logic intervenes.

Social engineering, therefore, relies on a small set of emotional triggers:

  • Authority - appearing to come from senior leadership or regulators
  • Urgency - imposing time pressure
  • Curiosity - offering access to sensitive or exclusive information
  • Familiarity - referencing prior interactions or shared context

Once emotion is triggered, the brain seeks resolution through action. This explains why even experienced and well-trained officers can make mistakes under pressure.

Strengths can become vulnerabilities

Individual traits influence which emotional triggers are most effective.

Conscientious officers feel pressure to comply correctly, helpful officers feel compelled to assist, and curious officers are drawn to new information.

These are strengths in public service, not weaknesses. However, attackers deliberately exploit them.

The implication is important: improving cyber resilience is not about “fixing” people but about designing cultures that account for human variability.

Why awareness training alone falls short

Most cybersecurity training focuses on recognition: spot suspicious links, identify unusual senders, and look for technical red flags.

While these skills are necessary, they assume that the analytical brain is active at the moment of decision. As discussed above, under pressure it often is not.

What is missing is a behavioural interruption - a mechanism that slows decision-making long enough for analysis to occur.

A practical behavioural control: spot, pause, verify

Effective cyber resilience requires habits that align with how the brain works.

The Spot-Pause-Verify framework provides such a mechanism:

  • Spot unusual cues (tone, timing, request)
  • Pause briefly to allow emotional arousal to subside
  • Verify using an independent channel

This short pause re-engages analytical thinking and interrupts impulsive action. It functions as a cognitive control rather than a technical one.

Leadership matters

In government, cybersecurity is fundamentally about trust in systems, institutions, and public data. Leaders shape the conditions under which decisions are made:

  • How much urgency is normalised
  • Whether questioning requests is acceptable
  • Whether reporting mistakes is safe

A culture that prioritises speed over verification increases risk. A culture that penalises mistakes discourages early reporting.

Cyber resilience, therefore, extends beyond IT controls into governance, leadership, and organisational design.

 

Every cyber incident begins in the moment a decision is made under pressure. Understanding the science behind that moment is critical.

Cybersecurity does not fail because people are weak. It fails when systems ignore how people make decisions.

Governments that recognise this will be better positioned to protect not just their systems, but the trust the public depends on.

 

---------------- 

 

The author is a former military security officer and senior communications leader with over 30 years of experience. He helps organisations strengthen their human firewall by transforming employees from the weakest link in cybersecurity to the first line of defence. He has previously worked for the Singapore Armed Forces, Prime Minister’s Office, and A*STAR, leading crisis response teams, advising political office holders, and building communication strategies that work under pressure.