Confessions of a Reformed School Hacker: How Getting Caught Changed My Career
The Geography Teacher's Fatal Mistake
I was sixteen when I first accessed a system I shouldn't have. It was the Gordon Brown era; our school had actual PCs running a Novell network, and I'd discovered something remarkable: our IT administrator, who was actually the Geography teacher, had terrible password security habits.
I didn't use sophisticated hacking tools or write complex code. I used something far more effective: I stood behind him and watched him type.
What happened next changed the trajectory of my entire career, though I didn't know it at the time. Getting caught wasn't the disaster I thought it would be. It was the beginning of understanding what cybersecurity really meant.
How It Started: Curiosity and Access
Our school's IT infrastructure was a patchwork of inherited systems and budget constraints. We had a Novell NetWare network connecting about fifty PCs, dial-up internet access that worked sporadically at best, and a Geography teacher who'd been volunteered to manage it all because he "knew computers" (translation: he owned a PC at home).
Looking back, the security was laughable. But at the time, it seemed quite sophisticated to a curious teenager. User accounts for every student, network drives for sharing files, even email accounts, though checking them meant waiting several minutes for the dial-up connection to establish.
The breakthrough came during a particularly boring lesson about oxbow lakes or river deltas (I can't quite remember which). I'd finished the assigned work and found myself daydreaming near the teacher's desk. He needed to check something on the network, logged into the admin console, and didn't think twice about the fact that a student was standing directly behind him with a clear view of his keyboard.
His password was "admin123!"
I remember thinking: "That's it? That's the key to the entire school network?"
The Exploration Phase
I didn't immediately do anything with this information. For a few days, I just carried it around like a secret, occasionally thinking "I could access anything if I wanted to."
But curiosity is a powerful thing when you're sixteen. Eventually, I logged in using the admin credentials from a computer in the library after school. My heart was pounding like I'd just committed a serious crime, even though all I was doing was looking at file directories.
The first thing that struck me was how disorganized everything was. Files scattered randomly, no clear structure, teachers' personal documents mixed with school data. The second thing was how much access the admin account had: literally everything. Student records, exam papers waiting to be printed, even what appeared to be salary information for staff.
I didn't touch any of it. But I did start exploring the network architecture, trying to understand how it all fit together. Where were the backups stored? How did the internet connection work? What determined who could access what?
Without realizing it, I was conducting my first security audit.
When Curiosity Met Opportunity
The trouble started when my friend Tom mentioned that his coursework file had corrupted. He'd spent weeks on a project analyzing local geography data, and suddenly it wouldn't open. He was genuinely distressed, facing a zero mark and having to redo months of work.
"I can probably fix that," I said, with the confidence of someone who really shouldn't have been that confident.
That evening, using the admin credentials, I accessed the corrupted file. It turned out to be a relatively simple problem: the header information was damaged, but the actual data was intact. I rebuilt the file, tested it to make sure it worked, and saved it back to Tom's directory.
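For the technically curious, the fix amounted to overwriting the damaged bytes at the start of the file with a known-good signature, leaving the rest of the data alone. Here's a minimal sketch of that idea in modern Python; the OLE2 magic bytes, the filename, and the function are illustrative assumptions, not a record of the actual repair, which used the tools of the day.

```python
# Hypothetical sketch of a header repair: restore a known-good magic
# signature at the start of a file whose body is still intact.

KNOWN_GOOD_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # OLE2 signature, as an example

def repair_header(path: str) -> bool:
    """Overwrite a damaged magic signature; return True if a repair was made."""
    with open(path, "r+b") as f:
        header = f.read(len(KNOWN_GOOD_MAGIC))
        if header == KNOWN_GOOD_MAGIC:
            return False  # header already intact, nothing to do
        f.seek(0)
        f.write(KNOWN_GOOD_MAGIC)  # bytes beyond the header are untouched
    return True

if __name__ == "__main__":
    if repair_header("coursework.doc"):  # hypothetical filename
        print("Header rewritten; try opening the file again.")
```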
He was amazed. I was pleased with myself. And I made my first serious mistake: I helped a few other students with similar problems.
Before long, I was the unofficial "fix-it" person for file problems. My error wasn't the fixing itself; it was not realizing how suspicious it looked that a student could access and repair files he technically shouldn't have been able to see.
The Shoulder-Surfing Network Grows
What made it worse was that I wasn't the only one who'd noticed the Geography teacher's password habits. A couple of other students had worked out similar approaches, watching over shoulders, noticing patterns, testing logical variations of passwords they'd partially seen.
We started comparing notes. It became a sort of game: figuring out who had access to what, discovering security gaps, finding clever ways to navigate the network. We weren't malicious; we were genuinely fascinated by how it all worked, and perhaps a bit smug about being clever enough to figure it out.
One lad even created a map of the network topology, showing which servers connected to what, where backups were stored, how the internet connection routed through the system. It was genuinely impressive work, the kind of thing that would make a network administrator proud, if it hadn't been done entirely without authorization.
Getting Caught: The Assembly Hall Summons
The call came during a Maths lesson. "The Headmaster wants to see you and three others in his office. Now."
That walk down the corridor was one of the longest of my young life. I knew exactly what this was about, and the fear wasn't just about punishment. It was about disappointing people, about having done something I knew was wrong even if I'd justified it to myself as helping people.
The Headmaster's office was crowded: the Head, the Deputy Head, the Geography teacher looking both angry and embarrassed, and the Local Education Authority's IT specialist, who'd apparently been called in specifically to investigate "unauthorized network access".
They laid it out clearly: they knew we'd been accessing systems without authorization. They had logs showing admin access from student accounts. They'd found the network topology map saved on one of the school computers. We were, in technical terms, comprehensively busted.
The Punishment That Changed Everything
I expected suspension. Possibly expulsion. Maybe even police involvement, though I wasn't sure what laws we'd actually broken.
What I didn't expect was what the IT specialist said: "These students clearly understand the network better than our current setup allows. Rather than punish them, I suggest we put them to work."
The Headmaster wasn't entirely convinced, but the IT specialist was persuasive. "They've identified genuine security weaknesses. They've demonstrated technical aptitude. And they're clearly interested in how systems work. Let's channel that constructively."
Our punishment was this: we were to help implement a new security system for the school network. Under supervision, of course. We'd be the test subjects, the challengers, the ones trying to find holes in the new security while helping to design it.
It was brilliant. Instead of criminalizing curiosity, they were redirecting it.
Learning Security by Breaking It
Over the next several months, I learned more about cybersecurity than any textbook could have taught me. The IT specialist treated us like junior consultants. He'd explain a security measure, then challenge us to defeat it. When we found weaknesses, we'd discuss why they existed and how to address them.
We learned about:
Proper password policies (not "admin123!"; see the sketch after this list)
Access control and the principle of least privilege
Audit logging and monitoring
Physical security measures
Social engineering vulnerabilities
Backup and recovery procedures
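To make the first of those concrete, here's a minimal sketch of the kind of password-policy check we ended up discussing, written in modern Python purely for illustration; the real NetWare-era controls were server settings rather than code, and the length threshold and banned list below are assumptions, not the school's actual policy.

```python
# Illustrative password-policy check: length, banned list, and
# basic character-class requirements.
import re

MIN_LENGTH = 12
BANNED = {"admin123!", "password", "letmein"}  # the obvious offenders

def check_password(candidate: str) -> list[str]:
    """Return a list of policy violations; an empty list means acceptable."""
    problems = []
    if len(candidate) < MIN_LENGTH:
        problems.append(f"shorter than {MIN_LENGTH} characters")
    if candidate.lower() in BANNED:
        problems.append("on the banned-password list")
    if not re.search(r"[A-Z]", candidate):
        problems.append("no uppercase letter")
    if not re.search(r"[0-9]", candidate):
        problems.append("no digit")
    return problems

print(check_password("admin123!"))  # fails on length, banned list, uppercase
```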
But more importantly, we learned why security mattered. It wasn't about restricting people or making systems difficult to use. It was about protecting information, ensuring systems worked reliably, and building trust.
The Lessons That Stuck
That experience taught me several things that still guide my work today:
1. Curiosity Isn't Criminal
The students currently hacking school networks, the ones behind the 57% of school breaches the ICO attributes to student actions, aren't criminals. They're curious. They're testing boundaries. They're often trying to solve real problems, even if their methods are unauthorized.
The question isn't how to eliminate that curiosity. It's how to channel it constructively.
2. Insider Threats Come from Knowledge and Access
I wasn't a sophisticated hacker. I was a student who paid attention, noticed weaknesses, and had the time and motivation to explore them. That's exactly the profile of most insider threats in business environments.
Your employees have legitimate access, intimate knowledge of your systems, and various motivations to use that access in ways you might not intend. That's not a personnel problem. It's a security design problem.
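The countermeasure the earlier list called "least privilege" is worth seeing in miniature. This sketch is a toy role-permission table in Python, with hypothetical roles and resources invented for illustration; real systems express the same idea through directory groups and ACLs.

```python
# Toy illustration of least privilege: each role is granted only the
# resources it explicitly needs, and everything else is denied by default.

ROLE_PERMISSIONS = {
    "student": {"own_files"},
    "teacher": {"own_files", "class_files"},
    "admin": {"own_files", "class_files", "system_config"},
}

def can_access(role: str, resource: str) -> bool:
    """Grant access only if the role's permission set names the resource."""
    return resource in ROLE_PERMISSIONS.get(role, set())

print(can_access("student", "class_files"))   # False: denied by default
print(can_access("teacher", "system_config")) # False: not in the teacher set
print(can_access("admin", "system_config"))   # True: explicitly granted
```

Under a design like this, a student repairing a classmate's file would need an explicit, auditable grant rather than a borrowed admin password.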
3. The Best Defense Is Understanding the Attack
When organizations treat security incidents as purely disciplinary matters, they miss the learning opportunity. Every breach, whether successful or attempted, reveals something about your security posture. Treating the person who exposed the weakness as the enemy means you'll never understand the weakness itself.
4. Security Must Work for Users
The reason I could access admin systems wasn't sophisticated hacking. It was that the legitimate security measures were so cumbersome that even the administrator was working around them. He used simple passwords because complex ones were too hard to remember. He didn't log out because logging back in took too long.
When security fights productivity, productivity wins. Every time.
What This Means for Your Business
The National Crime Agency reports that one in five British children aged 10 to 16 has engaged in illegal online activity. That means roughly 20% of your future workforce has experience bypassing computer security. Some were malicious. Most were just curious.
The question for business owners isn't how to keep these people out of your organization. Many of them will become your most technically capable employees. The question is how to build security systems that work with human nature rather than against it.
Consider:
Do your employees have legitimate reasons to work around security? If so, they will. Address those reasons; don't just enforce policies.
Are you treating security incidents as learning opportunities? When someone makes a mistake, the response shapes whether future mistakes get reported or hidden.
Is curiosity encouraged or punished? The same employees who might "hack" your systems to solve problems could be your best security advocates if you channel that curiosity properly.
The Current Threat Landscape
The statistics from our podcast episode bear repeating:
82% of K-12 schools in the US experienced cyber incidents recently
A 19-year-old student just extorted $2.85 million from PowerSchool after accessing 62 million student records
The youngest referral to the UK's Cyber Choices programme was seven years old
97% of credential theft incidents in schools were student-led
These aren't isolated incidents. They're symptoms of a larger pattern: insider threats are real, they're growing, and they often come from people who don't see themselves as threats.
From School Network to Career Path
That experience getting caught and being put to work on the school's security didn't just teach me about cybersecurity. It set my entire career path. I went from being a student who'd broken the rules to someone who understood why the rules existed and how to implement them properly.
Years later, working alongside Noel on The Small Business Cyber Security Guy Podcast, I still think about those lessons. The human factors haven't changed. Technology has evolved, but human nature remains constant. People will always choose the path of least resistance. Curiosity will always drive exploration. And security systems that fight human nature will always fail.
Your Action Plan
If you're running a small business and worrying about insider threats, here's what sixteen-year-old me wishes businesses understood:
Assume Curious People Will Test Your Security: Build systems that can withstand curious, persistent people with legitimate access.
Make the Secure Path the Easy Path: If following security policies is harder than working around them, expect workarounds.
Create Channels for Legitimate Curiosity: Some employees will be interested in how systems work. That's valuable. Give them appropriate channels to explore and contribute.
Treat Security Incidents as Data: Every attempt to bypass security, successful or not, tells you something about your security design. Learn from it.
Remember That Knowledge Is Asymmetric: Your users know more about the practical reality of using your systems than you do. Listen to them.
The Bottom Line
I was lucky. I got caught by people who saw potential rather than just problems. I learned that cybersecurity isn't about being the smartest person in the room; it's about understanding human behavior and designing systems that work with it.
The students hacking school systems today and the employees who might bypass your security tomorrow aren't your enemies. They're people with the same curiosity, persistence, and problem-solving drive that make them valuable.
The question isn't how to stop them. It's how to build security systems robust enough that testing them makes them stronger, not weaker.
And if you're a business owner reading this, wondering whether you should worry about insider threats, remember: I wasn't trying to cause problems. I was trying to fix my friend's coursework file. Most insider threats start exactly the same way: with good intentions and poor judgment.
Build your security accordingly.
Sources

| Source | Article |
|---|---|
| Information Commissioner's Office | Insider threat of students leading to increasing number of cyber attacks in schools |
| National Crime Agency | One in five children found to engage in illegal activity online |
| Center for Internet Security | 2025 K12 Cybersecurity Report |
| Reuters | Massachusetts hacker to plead guilty over PowerSchool data breach |
| US Department of Justice | Former student sentenced for damaging University of Iowa computer network |