Why Good Employees Make Bad Security Decisions: The Psychology Behind Insider Threats

Security as the Enemy of Productivity

Let me share an uncomfortable truth I've learned in four decades of IT and cybersecurity work: most security failures aren't caused by bad people. They're caused by bad systems that force good people to make bad choices.

When the Information Commissioner's Office analyzed school data breaches and found that students were guessing passwords or finding them written down, the immediate reaction was to blame the students. But that misses the point entirely. If people are consistently bypassing security measures, the problem isn't the people; it's the security design.

Think about it: students don't wake up thinking "I fancy committing computer fraud today." They wake up thinking "I need to submit this coursework" or "I need to check my timetable." When the secure way to do that is slower, more complicated, or doesn't work properly, they find another way.

The same applies to your employees.

The Spreadsheet of Shame

I once consulted for a business where the password policy required:

  • Minimum 15 characters

  • At least 3 uppercase letters

  • At least 3 numbers

  • At least 3 special symbols

  • Changed every 30 days

  • No reuse of previous 24 passwords

Sounds secure, right? The IT manager was quite proud of it. Until I discovered that employees had developed an elaborate Excel spreadsheet, shared on a network drive, tracking everyone's passwords so they could help each other when someone inevitably forgot theirs.

They'd essentially created a master key to the entire business, stored in plain text, accessible to anyone who knew where to look. And they weren't trying to compromise security. They were trying to do their jobs.
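For illustration, that policy is trivial to encode as a checker; here's a hedged sketch (the function name and sample password are mine, and the rotation and reuse-history rules would need server-side state, so they're omitted). What no checker can encode is a human memory capable of satisfying it every 30 days:

```python
import string

def meets_policy(pw: str) -> bool:
    """Check a candidate against the policy described above:
    15+ characters, 3+ uppercase letters, 3+ digits, 3+ symbols."""
    symbols = set(string.punctuation)
    return (
        len(pw) >= 15
        and sum(c.isupper() for c in pw) >= 3
        and sum(c.isdigit() for c in pw) >= 3
        and sum(c in symbols for c in pw) >= 3
    )

# A "compliant" password that no one will ever remember unaided:
print(meets_policy("Tr!ck&Tre@t2024AB"))  # True
print(meets_policy("password"))           # False
```

The machine is satisfied either way. The spreadsheet on the network drive is what happens when only the machine is satisfied.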

When Security Becomes Theatre

Here's what happens when security policies become too restrictive:

  1. Shadow IT Systems Emerge: Employees create unofficial workarounds because the official systems don't work for them. They use personal email to share files, personal devices to access company data, or unapproved cloud services to collaborate.

  2. Workarounds Become Standard Practice: What starts as one person's solution to a frustrating policy becomes the team's unofficial procedure. Soon everyone knows that "to really get things done, you need to..."

  3. Security Becomes Adversarial: Instead of protecting the business, security is seen as an obstacle to overcome. The conversation shifts from "how do we stay secure?" to "how do we get around these restrictions?"

  4. Real Threats Go Unreported: When people feel blamed for security issues, they hide problems rather than reporting them. That suspicious email? Better to delete it quietly than admit you might have clicked something dodgy.

The Student Hacker's Mindset

The ICO found that student hackers were motivated by "dares, notoriety, financial gain, revenge and rivalries." But there's another motivation that doesn't make the official list: curiosity and the challenge.

According to the Center for Internet Security, 82% of K-12 schools in the US experienced a cyber incident between July 2023 and December 2024. Many of these incidents started with students who weren't trying to cause harm. They were testing boundaries, exploring systems, or trying to solve problems the official channels wouldn't let them solve.

Consider the case from the ICO report: three Year 11 students hacked into their school's student information system using password-breaking tools. When caught, two admitted they were inspired by curiosity and the challenge. They weren't trying to steal data or cause damage. They wanted to see if they could do it.

Now ask yourself: do you have employees with similar curiosity? Technical people who wonder "what if I tried this?" or "could I access that if I wanted to?" That curiosity isn't a character flaw. It's often what makes them good at their jobs. The question is whether your security systems channel that curiosity constructively or push it underground.

The Password Paradox

The ICO found passwords written on bits of paper. In 2025. Let that sink in.

But before we judge too harshly, consider the cognitive load we're placing on people. The average business professional needs to remember passwords for:

  • Email account

  • Computer login

  • Cloud storage

  • Customer database

  • Accounting software

  • Project management tools

  • Video conferencing

  • Expense reporting

  • HR portal

  • And a dozen other systems

If each password must be unique, complex, and changed regularly, we're essentially requiring people to memorize the equivalent of several chapters from a random character generator. It's not realistic. So they write them down, reuse them, or create predictable patterns like "Password123!" becoming "Password124!" next month.
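That drift from "Password123!" to "Password124!" is exactly what attackers count on. As a rough sketch (the function name is hypothetical), one leaked password under forced rotation predicts the next few almost for free:

```python
import re

def next_guesses(leaked: str, n: int = 5) -> list:
    """Given one leaked password, guess likely successors by bumping
    the trailing number -- the pattern people fall into when forced
    to rotate passwords every month."""
    m = re.search(r"(\d+)(\D*)$", leaked)
    if not m:
        return []
    num, tail = m.group(1), m.group(2)
    prefix = leaked[:m.start(1)]
    width = len(num)  # preserve zero-padding, e.g. "Summer09" -> "Summer10"
    return [f"{prefix}{int(num) + i:0{width}d}{tail}" for i in range(1, n + 1)]

print(next_guesses("Password123!", 3))
# ['Password124!', 'Password125!', 'Password126!']
```

Ten lines of code defeat a policy that took a committee months to write. That's the gap between compliance and effectiveness.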

Design Security for Humans

The key insight from analyzing school breaches applies directly to business security: humans will always follow the path of least resistance. You can't change human nature, but you can design security that works with it rather than against it.

Here's how:

Make the Secure Way the Easy Way

Multi-factor authentication meets resistance because it's seen as adding steps. But modern MFA using biometrics or app notifications is actually easier than remembering complex passwords. Single sign-on systems reduce the number of passwords people need to remember. Password managers make strong, unique passwords effortless.
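Those authenticator-app codes are less mysterious than they look, which helps when explaining MFA to a sceptical team. Here's a minimal sketch of a time-based one-time password (TOTP, the RFC 6238 scheme behind most authenticator apps) using only Python's standard library; the shared secret below is the published RFC test value, not something you'd ever use in production:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): HMAC-SHA1 the counter,
    dynamically truncate to 31 bits, take the last `digits` digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30) -> str:
    """Time-based variant (RFC 6238): the counter is just the number
    of 30-second steps since the Unix epoch."""
    t = int((time.time() if for_time is None else for_time) // step)
    return hotp(secret, t)

# RFC 6238 test vector: SHA-1 secret "12345678901234567890" at T=59
print(totp(b"12345678901234567890", for_time=59))  # 287082
```

The point for employees: the code changes every 30 seconds and proves possession of the secret, so there's nothing to memorize, write down, or increment.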

Provide Clear Alternatives

If employees need to share large files, give them a secure, easy-to-use file sharing system. If they don't have one, they'll email files to personal accounts or use unauthorized cloud services.

Build Trust, Not Fear

When someone makes a security mistake, the response shouldn't be punishment; it should be education. You want people to report when they've clicked a suspicious link or accidentally sent data to the wrong recipient. If the consequence is blame, they'll hide problems until they become disasters.

Test Assumptions Regularly

Your security policies might have made sense when you implemented them, but do they still work? Are people following them, or have workarounds become the norm? Regular reviews of actual behavior (not just policy documents) reveal where security is breaking down.

The Trevor Graves Lesson

Trevor Graves, a University of Iowa student, used hardware keyloggers he named "pineapple" and "Hand of God" to change grades over 90 times. He even charged classmates for the service, running what was essentially a black market grade modification business.

When caught, investigators found text messages revealing the sophistication of his operation: "Pineapple hunter is currently laying in wait in a classroom already." This wasn't a bored teenager clicking around. This was reconnaissance, planning, and execution.

But here's what's interesting: Graves succeeded because of human factors, not technical sophistication. Teachers left computers unattended while logged in. Physical security was lax enough for him to plant hardware keyloggers. The system assumed trusted users in trusted environments.

The same assumptions that let a student run a four-month grade modification business could let an employee (or contractor, or visitor) access your sensitive business data.

The Government's Response

The Department for Education's mandate that all FE colleges must have Cyber Essentials certification by July 2025 isn't just about ticking compliance boxes. It's recognition that baseline security practices matter.

But even with mandatory standards, UK government data shows concerning breach rates:

  • Primary schools: 44%

  • Secondary schools: 60%

  • Further Education: 85%

  • Higher Education: 91%

These institutions have security policies, compliance requirements, and often dedicated IT staff. They still experience high breach rates because policies alone don't change behavior. You need to design systems that people can actually use without heroic effort or frustrating workarounds.

What This Means for Your Business

Most small businesses don't have dedicated IT security staff. They rely on employees following policies and making good decisions. That's not necessarily a problem if you design your security with human factors in mind.

This week, ask yourself:

  1. Can your team actually follow your security policies? Not "should they be able to," but "can they, realistically, given everything else they need to do?"

  2. What workarounds exist in your organization? Where are people bypassing official procedures because those procedures don't work well enough?

  3. When was the last time someone reported a security concern? If the answer is "never," that's probably not because your security is perfect. It's because people don't feel safe reporting problems.

  4. Are you measuring compliance or effectiveness? Having a password policy doesn't help if everyone's writing passwords on sticky notes to comply with it.

The Path Forward

Good security isn't about creating impenetrable fortresses. It's about making the right choices the easy choices. When employees bypass security, they're usually trying to solve real business problems. Your job is to provide secure ways to solve those problems that don't feel like fighting the system.

The students who hacked their school systems weren't inherently criminal. They were curious, persistent, and motivated to solve problems, sometimes their own problems like improving grades, sometimes just the challenge of seeing if they could do it. Those same traits make for excellent employees. The difference between a security vulnerability and a security asset often comes down to whether the system works with human nature or against it.

Tomorrow, we'll look at practical technical solutions that respect human factors while still maintaining strong security. Solutions that work in the real world, not just on paper.

Noel Bradford

Noel Bradford – Head of Technology at Equate Group, Professional Bullshit Detector, and Full-Time IT Cynic

As Head of Technology at Equate Group, my job description is technically “keeping the lights on,” but in reality, it’s more like “stopping people from setting their own house on fire.” With over 40 years in tech, I’ve seen every IT horror story imaginable—most of them self-inflicted by people who think cybersecurity is just installing antivirus and praying to Saint Norton.

I specialise in cybersecurity for UK businesses, which usually means explaining the difference between ‘MFA’ and ‘WTF’ to directors who still write their passwords on Post-it notes. On Tuesdays, I also help further education colleges navigate Cyber Essentials certification, a process so unnecessarily painful it makes root canal surgery look fun.

My natural habitat? Server rooms held together with zip ties and misplaced optimism, where every cable run is a “temporary fix” from 2012. My mortal enemies? Unmanaged switches, backups that only exist in someone’s imagination, and users who think clicking “Enable Macros” is just fine because it makes the spreadsheet work.

I’m blunt, sarcastic, and genuinely allergic to bullshit. If you want gentle hand-holding and reassuring corporate waffle, you’re in the wrong place. If you want someone who’ll fix your IT, tell you exactly why it broke, and throw in some unsolicited life advice, I’m your man.

Technology isn’t hard. People make it hard. And they make me drink.

https://noelbradford.com