
We Need to Change the Psychology of Security

What's wrong in the world of cybersecurity, and how to fix it.

Adrian Sanabria is an industry analyst at 451 Research, where he does his best to make sense of the security industry for clients. After over 15 years as a hacker, security professional, PCI QSA, and incident responder, he still sees the cup as half full.


There are a wide variety of opinions on how to fix security and stop the seemingly endless parade of breaches. Like many, I believe the problem is multi-faceted: it's more than just a lack of encryption, the inability to block malware, or that IT professionals don't do "the basics," though these all contribute to security failures. I believe we have a people problem, but not in the same way that most might think.


The security industry has made a lot of mistakes along the way, and some of these mistakes have made the security professional's job needlessly difficult. In hindsight, one of these mistakes was to separate security from IT. We need to correct that.

Hear me out before you flip any tables—I'm not suggesting we need to fire all the security professionals.

The security problems we're now facing can't be fixed with products alone. We can't fix them with more security analysts any more than a retailer could fix shoplifting by assigning a security guard to watch every shopper as they wander around the store.


A combination of products and people, then? Maybe, but I believe that still doesn't go far enough. We do need more people, but they need not all be security guards. I'm not convinced we need more products at this point. Better products, perhaps, but not more.

What we need is to change the psychology of security. We need to change it for everyone, especially those dedicated to the role of stopping attacks and breaches. If we can believe that, in a for-profit organization, helping the company protect and grow profit is everyone's job, I think we must also believe that security should be everyone's job. For the same reason a retailer can't hire enough security guards to personally watch each customer, we'll also never be able to hire enough security staff to effectively protect the company.


In fact, if security isn't part of someone's job, that doesn't make them neutral. It usually results in them working against us.

Doing (more of) the same thing and expecting a different result

What are we doing that's not working and why isn't it working?

Some believe better products could stop, or at least minimize, data breaches and attacks on corporate networks: that if we throw more machine learning and giant data sets at the problem, we could stop the madness. Others believe we simply need to throw more people at the problem. The truth, however, is that we've locked both into an inefficient cycle. The cycle roughly looks like this:

  • Security products typically generate huge amounts of data.
  • As a whole, the data is usually very low quality — mostly noise.
  • It takes a lot of human capital to pull any signal out of that noise.
  • Most of our people end up focused on sifting through noise instead of doing security work. I'm regularly told "we're lucky if we ever see 2 percent of the alerts we get."
  • The industry creates more products that use machine learning and NoSQL databases to prioritize this mess of noise. What's the result of having one security product clean up the output of another one? A smaller stream of data that's still mostly noise (the sketch after this list puts rough numbers on it). This kind of situation gave birth to one of my favorite terms: expense-in-depth, a play on the security principle of "defense-in-depth."
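To see why layering products doesn't escape the cycle, here's a minimal back-of-the-envelope sketch in Python. Every number in it (alert volume, base rate, detection rates) is an illustrative assumption, not a measurement from any real product; the point is the base-rate arithmetic: when real attacks are a tiny fraction of alerts, even a decent filter's output is still dominated by false positives.

```python
# Toy base-rate arithmetic: why filtering a noisy alert stream
# still yields mostly noise. All numbers are illustrative assumptions.

daily_alerts = 100_000     # raw alerts per day (assumed)
true_attack_rate = 0.001   # fraction of alerts tied to real attacks (assumed)

def filter_stage(alerts, attack_rate, tpr, fpr):
    """Apply one detection/prioritization product.

    tpr: fraction of real-attack alerts the product keeps (true positive rate)
    fpr: fraction of benign alerts it mistakenly keeps (false positive rate)
    Returns (surviving alert count, attack rate among survivors).
    """
    attacks = alerts * attack_rate
    benign = alerts - attacks
    kept_attacks = attacks * tpr
    kept_benign = benign * fpr
    kept = kept_attacks + kept_benign
    return kept, kept_attacks / kept

# Stage 1: the original security product
alerts, rate = filter_stage(daily_alerts, true_attack_rate, tpr=0.95, fpr=0.10)
print(f"after product 1: {alerts:,.0f} alerts/day, {rate:.1%} are real")

# Stage 2: an ML "prioritization" product layered on top
alerts, rate = filter_stage(alerts, rate, tpr=0.95, fpr=0.10)
print(f"after product 2: {alerts:,.0f} alerts/day, {rate:.1%} are real")
```

With these assumed numbers, the first product emits roughly 10,000 alerts a day, of which about 99 percent are noise. Bolting on a second product shrinks the pile tenfold, but its output is still over 90 percent noise: expense-in-depth in action.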

Playing the cyber edition of Where's Waldo isn't a great use of a talented and expensive workforce. Furthermore, slogging through piles of alerts and logs isn't the kind of engaging, creative work that security-minded folks enjoy doing. The result is both a lot of turnover and more interest in jobs on the offensive side — the whitehats that perform penetration tests and do security research. Admittedly, being the one causing security systems to vomit endless streams of alerts is a lot more fun, and most of the folks I know on that side of the security industry aren't terribly interested in switching to defense. I don't blame them.


Why do we put up with all this noise? Why don't we just tune most of it out? FOMO, or Fear of Missing Out: the fear that we might miss something, or the feeling of regret that comes afterward. I don't know if this is the real cause, and I'm no psychologist, but I've seen it anecdotally during penetration tests:

-"What do you want me to look for, or focus on?"
-"Uncover everything. We want to know everything that's wrong."

This seems like a noble approach — responsible, even. In reality, it sets an impossible standard that few, if any, defenders will achieve.

How bad is the problem? Independent analyses of the Target breach discovered that Target's systems successfully detected the criminals several times during the attack. Most of these alerts weren't seen until long after the attackers stole the payment data. None of them were acted upon. This is par for the course as data breaches go.

Better Products, Worse Security: shifting the mindset

Security is a skill, not a role.

Okay, security is a role and a skill. I'm not advocating for security departments to be dissolved entirely. I am advocating for an overwhelming shift of security responsibility and expertise to the subject matter experts.

Wait, what? We're not the subject matter experts? Not really — as a rule, unless you have a huge security workforce, the security role in a company is that of a generalist. Any subject matter expertise we have in security is really a subset of some larger subject. You're unlikely to find a database security expert who wasn't previously a database administrator.


The generalist role, however, turned into an absolute where anything security-related could become the security team's job. Personally, I've been responsible for redesigning access control on mainframes, writing policy for secure software development and talking to new hires about not clicking on links in suspicious emails. Simultaneously. It is important to understand that nine-tenths of security is a secondary layer on top of some other technology or discipline. It generally doesn't exist for its own sake. The subject matter experts are the mainframe administrators, the developers and the training specialists. I have no direct experience as a trainer or developer. I've never even used a mainframe for actual work. The only reason I was stuck with security tasks in areas where I was wholly unqualified is that they happened to be security related.


That brings me to one of the key ways in which we've screwed up security: we isolated security from the rest of IT. Deepening the issue was the development of a culture that prized elitism. As a community and an industry, we've pointed and laughed at how inept IT was at security. With their egos already bruised from the embarrassment of a breach or a discovery by the security team, the "unworthy" were all too ready to accept the pronouncement: "bad at security."


It gets worse. I believe that once we separated security from the rest of the company, psychologically, employees came to see anything associated with security as someone else's job.

"We have a department with security in its title! They keep us secure — we don't bother with it. Surely we'd just screw things up, right?"

In fact, this effect has been seen before in the software quality assurance world, and has been well documented by Elisabeth Hendrickson (@testobsessed) in her paper "Better Testing, Worse Quality." Did you do a double-take on that PDF? Yes, it was written in February 2001, around the time that most of the larger companies were just beginning to establish dedicated security roles. It's not a dry read at all, either — more like a shorter, QA-focused Phoenix Project, as she walks from cause through effect to a proposed solution.

In short, she found that adding a lot of QA resources inadvertently sent the message that quality was now someone else's concern. The result was a marked drop in code quality. To address the problem, the responsibility for getting code out on schedule was shifted to the developers, and QA resources were artificially bottlenecked. Faced with limited time and QA resources, developers were forced to ensure code was good enough to get through QA quickly and predictably. There's much more to it than that, but you'll need to read it to get the full picture.

As I read through "Better Testing, Worse Quality," I couldn't shake the thought that the same psychological effect Hendrickson was describing could apply to security as well. I became increasingly convinced that it did apply. The more I looked outside security, the more I saw the same effect occurring in other industries — even music.

"Some producer with computers fixes all my shitty tracks" — Rockin' the Suburbs by Ben Folds

The security experts we need — even the warm bodies we need just for looking at logs — aren't going to materialize. I don't see machine learning or AI making up the deficit either. We need all the help we can get. We need everyone to worry about security, but the unintentional message we sent was, "don't bother with it — don't even try, we've got this."

We need to deputize everyone. We need to shift responsibility back to the real experts — the folks who write the code, train the new hires and (bless their hearts) keep those mainframes running. The security role doesn't go away, but it does change. We dismantle the paradigm of security as the gatekeeper — a warden that needs to inspect everything that comes or goes. We instead become partners and resources for the business to leverage. Instead of hoarding security knowledge, we share it and teach it, so that we can mount a defense that all employees feel confident taking part in.

The Hacks We Can't See is Motherboard's theme week dedicated to the future of security and the hacks no one's talking about.