I got a coffee with Chris Harrison (or tea in his case, as he seems to run on the stuff) to talk about red teaming and physical security. Harrison (as we call him, since we've already managed to accumulate two Chrises) spends most of his time in cybersecurity, but he's got strong opinions about the physical side. Strong enough that on my first day at CyberTeam, he had us picking locks while he showed off his collection of security gadgets.
To kick off our conversation, I asked about his approach to red team engagements, and his answer surprised me: "The first thought that goes through my head is always, how do we interact with the client without hurting anybody?" Starting with ethics wasn't the answer I expected, but framing the conversation from a human standpoint was a very Harrison thing to do.
Culture Over Controls
Harrison's big thing is framing physical security problems as cultural issues rather than technical ones. He gave me an example of clients who start by saying everything's on the table for testing, no limitations. But when he asks, "so it's okay if we smash a window?" they suddenly backtrack.
His point is that most organisations haven't really thought through what they need from security testing. Sometimes clients get too eager to help, offering access cards or credentials. According to Harrison, this defeats the point of an unbiased, realistic assessment: the attack scenarios no longer resemble anything a real attacker would face, and you end up doing security theatre instead of real testing.
This is why Harrison rarely recommends red teaming for organisations just starting out. As he put it, "I would almost never recommend a red team [starting out] for physical security. I just can't see it being practical, you would find a bazillion things that you could have told them just simply by popping down for half an hour."
Plus, red teams can become unnecessarily competitive, with both assessors and clients wanting to “win” rather than focus on realistic security improvements. Clients may impose unrealistic restrictions or monitoring that makes it artificially difficult for assessors to succeed, whilst assessors might pursue elaborate attack vectors that bear little resemblance to real-world threats or withhold information that led to their success because it’s their “special sauce”. This mutual competitiveness can undermine the core purpose of getting an honest assessment of the client's actual security posture. Instead, Harrison prefers purple teams for newer organisations:
"A black hat or a gray hat, talking freely and openly with a white hat."
—Chris Harrison on purple teaming
That's his way of describing collaborative security testing. Rather than trying to outsmart each other and losing sight of the end goal, assessor and client work together in real time to find problems before real attackers do and to improve the client's security posture and culture.
Harrison mentioned that he likes to suggest security controls which give staff 'cultural permission', and I brought up airlocks as an example: secure waiting areas where visitors get processed before entering the main workspace. Harrison jumped on this. Instead of expecting someone to confront a potentially dangerous stranger directly, you give them tools that make challenging easier and less confrontational. You design processes where questioning becomes natural, expected, and safe.
Name tags, visitor management systems, controlled entry points, these aren't just physical security controls, they're cultural tools. They make it socially acceptable for employees to say "I notice you don't have a visitor badge, let me help you get signed in" instead of having to directly challenge someone's right to be there.
Then he gave me an analogy about demanding that a rock fly. You could engineer a way to make it happen, he said, but it's much better to accept that rocks don't fly and plan for it. I quite like this analogy. I know for a fact that, without assistance, I would find it difficult to challenge someone simply standing outside our doors, seemingly waiting to be let in. I would feel as if I were being rude or violating some unwritten social contract. A workplace whose technical controls recognise this failing and support me in conducting myself in a security-aware manner would be not only more comfortable, but more secure to work in.
Why Kind Cultures Are More Secure Than Paranoid Ones
Harrison argues that paranoid, stressed employees make terrible security assets. "You become self-centred and paranoid. You get sick a lot, and you have trouble with mood regulation," he told me. His claim is that stressed people make poor decisions under pressure and avoid taking risks, including good risks.
I'd not thought about it this way before. We're conditioned to think security requires suspicion and vigilance. But Harrison argues that when people feel supported and trusted, they're more likely to speak up about suspicious behaviour and to think creatively about problems.
He was adamant about where this culture starts, stating "If your C-suite people are genuinely living the behaviour that they want from their people, it'll be a thousand times easier." CyberTeam is a smaller company, where I have direct visibility over our leadership's actions and their effects, so this especially rings true for me. If I saw our directors consistently disregarding security protocols, I'd probably be more inclined to do the same. Not out of malice, but because if they don't see it as important or a priority, then why would I? Leadership behaviour sets the tone for what actually matters versus what's just policy on paper.
The Foundation is Human After All
What struck me most wasn't Harrison's solutions, but his framing of the problem. He kept coming back to culture. Culture that encourages people to care about security without being paranoid, gives them tools to challenge behaviour gracefully, and treats mistakes as learning opportunities.
His argument is that you can have sophisticated security systems, but if people don't feel empowered to use them properly, you're just building expensive theatre. The foundation isn't technical at all, it's human.
Alex Keegan, Security Consultant at CyberTeam, and tall person
