As I put the swerve on the storied RSA Security Conference for the first time in 16 years, I am reminded of just how long I have been saying, and hearing, the same lessons around security: “There is no silver bullet,” “You need process, people, and technology,” “Vulnerabilities are on the rise,” “Attackers are getting smarter and more sophisticated.” Given my own 20- or 25-year run in this space, I admit that I have a hard time seeing a fundamental shift in these beloved chestnuts in the next 20 or 25 years either. So, I decided to spin the dial in the opposite direction. Maybe there was someone who had ventured philosophical truths that I could adopt? Maybe waaaay back in the past, there were views that could reinvigorate the way I was thinking about the challenges of protecting our organizations and ourselves.
I figured that there might be something good around the year 0. Why not? I’ve seen Bill and Ted’s Excellent Adventure: Socrates (pronounced “So-crates” for the uninformed) offering philosophical lessons before an audience of rapt acolytes, sharing views on our own evanescence (“… all we are is dust in the wind. Dude …”). He was still topical and relevant, and could relate to both Ted “Theodore” Logan and Kansas …
While Socrates certainly had plenty to offer, I found myself captivated by a different toga-wearer. I spent my time with Seneca the Younger: scholar, humorist and tutor to Nero, who lived from 4 BC to AD 65. Before he offed himself while under charges of treason against the emperor, Seneca said some things that could drive a whole new agenda at the Moscone Center some year. Since I am unwilling to wait, I’d like to share them here:
On the Aspirations of Security
“It is not goodness to be better than the worst.”
This is a lesson we need desperately, as enterprises ask for maturity models and “best practices” become synonymous with sufficiency. Set against its current analog, “I don’t need to run faster than the bear; I just need to run faster than you,” Seneca counsels us to drive toward an appropriate level of security to protect our interests, and not to fall for the mistaken logic that supposes attackers will be so lazy as to pick their targets solely from among the weakest of the lot. In the current environment, attackers choose target and value first, and then derive the attack(s) that will get them in. Being better than somebody else is nothing like being good enough.
On the Virtues of Proactivity
“Fate leads the willing, and drags along the reluctant.”
Nothing screams regret like an announcement that the SEC is going to be auditing your security for the next 10 years, or that a third-party review of your next several security plans is needed to get recertified or reaccredited by some compliance or standards group.
Organizations that wait for the badness to happen in order to understand when they should start to worry are playing a very risky game with their autonomy and their reputation. In a recent meeting with a group of automakers on the challenges of securing the rapidly expanding family of wireless sensors and controls in automobiles, I was describing the well-documented vulnerabilities in several types of automobile communications. I was told that “…until this is done to our customers in the real world…” there was little appetite to address the issues. Because “this” means disabling brakes and door locks, I think there will be plenty of dragging when Fate does come knocking.
Our Insatiable Appetite for New Stuff
“It is easier to exclude harmful passions than to rule them, and to deny them admittance than to control them after they have been admitted.”
When asked by the media and customers what to do first in order to secure their systems, I invariably recommend that before buying new technology or rolling out new plans, the asker should take a step back and look at their assets and their exposure. Do they need all that connectedness? Does the customer database need to be accessible to every requestor? Is there some reason why they have adopted 11 different platforms and 200 different frameworks? I truly believe that, with good strategies to limit what gets stored and hard thinking about how data, access and networks are categorized and partitioned, much of the risk in many environments can be mitigated before adopting a single additional piece of active defensive technology. In the same way, by thinking hard about which services get spun up, which systems get internal access, and which software or service providers are allowed to see private data, risk can be eliminated simply by reducing the introduction of new moving parts.
Our Difficulty in Justifying and Expanding Security
“Our plans miscarry because they have no aim. When a man does not know what harbor he is making for, no wind is the right wind.”
In the ironic icons category, the same security practitioner who will assert modestly that “No security is 100%” will also have a Mission Statement that says “To ensure the security of corporate and customer data against a growing set of threats and attackers.” In the absence of specificity, and with a limited vocabulary to define what is meant by security, the planning, justification and measurement of that security are a very soft art form. The indicators of failure are clear: There is a breach. The indicators of success are almost unknowable. There wasn’t a breach? Well, how do you know? And is it just because the organization didn’t lose the botnet lottery and get picked up as the next target? We need to develop this language, these goals and measurements that can be consistently applied, whether internal to our own organizations or more broadly across our industry. Without them, we can only continue to develop mitigation for the threat or breach we see, and we will never be able to project the path we can justify as we move ahead.
On the Role of Awareness
As a teacher, it is not surprising that there are several applicable Seneca soundbites here:
“Shame may restrain what law does not prohibit.”
A strong awareness plan that helps people to better understand their own roles and responsibilities can go a long way toward inspiring the right behaviors. In my experience, people, even users, want to do the right thing, and helping them to know that they are potentially hurting their organizations or themselves can be just the right message to get them moving.
“That is never too often repeated, which is never sufficiently learned.”
I like this one because it is both true and an interesting exercise in semantic gymnastics. Basically, this tells us that our security awareness is not complete until people are fully aware of security. I extend it to finding new ways to repeat the old message, to drive the habituation of good security behavior. Confirmation dialogs, a forced second click to download content, outbound filters and proxies: all are good. By forcing ourselves to address the absolutely human tendency to backslide with recurring, if redundant, messages, we can keep the priority and visibility of security up.
“The first step towards amendment is the recognition of error.”
A culture of openness in terms of internal disclosure is positively vital at a time when internal attacks and malicious code spread so quickly. Whether it is at the organizational or individual level, rapid acknowledgement of a breach is the most effective way to bolster the accuracy and immediacy of analysis and mitigation. The awareness culture must foster a resistance to attack, but it should also make it clear that things like this happen. If security applies too much pressure on individuals to never be compromised, then there is a vanishing likelihood that the earliest users affected will actually step up to claim responsibility.
On Creating Long-Term Strategic Change
“An unpopular rule is never long maintained.”
Security is usually viewed as gate-keeping, lock-stepping, buzzkill. Nobody likes complicated passwords; log and system monitoring feel kind of creepy; and stopping a product or service release because of a security problem can seriously reduce the invites for lunch. Security, though, is now about much more than disabling user function to protect corporate goodies. With social networking, personal email and the interweaving of public and private lives, security has got to become a more natural and appreciated part of the fabric of our relationship to technology and each other. The kind of cultural change that good security requires needs to be something that fits, and is valued, by virtue of the contributions that it makes. We need to strive to be less disruptive, to be less of a block, and to offer a safer path to getting business goals accomplished. If we can move from Traffic Cop to Mountain Guide, security will become a goal that people are happy to contribute to.
Lastly, On Changing Our Attitude
“If you would judge, understand.”
For most of my career, I have watched many in our industry point and laugh. Expose and condemn. It happens today, as security lapses are viewed as abject failures by companies or as results of systemic neglect. I am not arguing that these lapses are not failures, but I’d ask that we take a more balanced view. There is no acceptable reason for comfort with an insecure status quo, but there are many reasons why this problem will persist for most of my lifetime. The underlying infrastructure, and the education and habits of our developers, are riddled with gaps in terms of understanding and enabling security. Today’s developer or administrator at almost any firm will be dealing with devices and software built by someone else, in a different context, and likely in a less threatening environment. People are loath to engage in security improvement because they are reluctant to look like fools. It is human nature. Many illnesses and injuries are the result of some personal choice, or ignorance, or accident, but if the doctor’s first 15 minutes were always a lecture on what a glutton, or idiot, or clod the patient was, I can guarantee that overall health would decrease pretty rapidly.
Let’s resolve, in support of all of this wisdom from so long ago, to reexamine ourselves, to reapply ourselves to the problems and potential in security, and to aspire again to improvement along all of these paths.
About the Author:
Jack Danahy is the Director for North American Security Consulting at IBM, and has built and sold two successful security software companies; the second, Ounce Labs, to IBM in 2009. Jack is an international speaker and writer on software, system and data security, and he also holds five security technology patents. He is a fellow in the Ponemon Institute, a Computerworld Honors Laureate, and he has contributed to legislation on computer security in both the U.S. House and Senate. He has dedicated his career to making computer security both comprehensible and tractable.