As an early innovator in firewall, VPN and intrusion detection systems, Marcus J. Ranum is considered a pioneer in the security technology field. Ranum is now CSO at Tenable Network Security, Inc., where he continues his groundbreaking computer security research and design work.
Ranum is a huge proponent of system logging, which he calls the most important tool in computer security. In this video interview, conducted at the RSA 2014 Conference in San Francisco, Ranum discusses the security benefits of system logging and offers his thoughts on the evolution of the IT security field.
If you were entering the field now and had a chance to invent or develop something, what would your area of focus be? Why do you think it's so important?
Marcus Ranum: System logging. I've always been interested in system logging. I think it's probably the most important computer security tool. I've been working on system logging for 29 years or so now, and I would still be working on it.
People say the next big thing is intelligence- and data-driven security. Why do you think system logging is so important?
Ranum: Where's the data that people are going to build that intelligence from? I still see system logging as a really interesting piece of it. At Tenable, we're pushing a strong continuous monitoring model: being able to collect information about everything that's going on in your systems and your networks at all times. It's going to shorten your response time in the event of a breach. It's going to allow you to improve your security if you identify things that are going wrong. It's going to allow you to have a clue about how your systems and network are being used. These are core capabilities that I think a lot of organizations don't really have.
Is there a challenge with using an analytics- or intelligence-driven security model?
Ranum: I think there is. The biggest problem is having analysts. A lot of organizations are collecting all of this data and then they say, 'Well, we don't actually have anybody who can review it,' and they turn the collector off. That's a really bad move. If you don't have someone who can review the data, keep it anyway. If something goes wrong, you're going to need the data in order to review it. Ideally, you'd be monitoring your network in real time, continuously looking for anomalies or potentially abusive activity.
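Ranum's two points here — retain everything even when no analyst is available, and flag anomalies in real time as the logs arrive — can be sketched in a few lines. This is an illustrative sketch only, not Tenable's product or Ranum's code; the function name, log format and failure threshold are all assumptions.

```python
from collections import Counter

def monitor(log_lines, fail_threshold=5):
    """Retain every log line; flag any host that crosses an auth-failure threshold.

    Hypothetical log format assumed: '<host> <process> <message>'.
    """
    retained = []          # keep the data even if nobody reviews it today
    failures = Counter()
    alerts = []
    for line in log_lines:
        retained.append(line)              # retention happens unconditionally
        if "auth failure" in line:
            host = line.split()[0]
            failures[host] += 1
            if failures[host] == fail_threshold:
                alerts.append(host)        # anomaly: repeated failed logins
    return retained, alerts
```

In a real deployment the retention step would write to durable storage and the anomaly rules would be far richer, but the shape is the same: collection and retention are decoupled from whether anyone is currently looking at the data.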
Are there emerging threats that people should be aware of, and are there any measures they can take to protect themselves?
Ranum: I don't think that there are any particularly new emerging threats, honestly. Unfortunately, security tends to always fight a rear-guard battle. We're almost always up against the mistakes that were made 10 or 15 years ago. In security, we're really just now starting to cope with the problems that were raised with distributed computing. We haven't even gotten to transitive trust. If the hackers start to understand transitive trust and use transitive trust attacks, we're going to have a serious problem.
I think system designers need to really try to think outside the basic problem set that they're dealing with, which is a big problem. I don't think system designers are even dealing with the problem set that they should be dealing with in a lot of cases.
What has been your most memorable job-related experience so far? What issue excites you?
Ranum: I'm fascinated and I really can't wait to see how the whole issue around Edward Snowden's disclosures sorts itself out. That raises huge public policy issues and it raises huge system integrity issues. I think most of the security practitioners are focused on the public policy issues, but nobody's really talking about what it means for software reliability when you have governments building back doors into everyone's critical infrastructure. Who is liable?
What happens if some hospital's computer gets crashed by some backdoor code that a government agency planted there and someone gets killed as a result? We can't allow government to do the kind of things that we put people in prison for doing when we catch them. That's a huge, very important question. Do you want to be part of the computer security gravy train and support that kind of activity? Or, do you want to be part of the computer security defenders and try to defend against that kind of activity?
I've always stated very clearly and decisively that I think my job as a security practitioner is to defend systems. I don't care whose systems they are, I'll defend them against anyone who's attacking them in any way. My responsibility is to keep bad guys out of systems. I don't have to worry about whether this is justified under this particular situation and under this particular treaty. I think that's a position a lot of security practitioners should be trying to align themselves with.
Are enough people involved in application development, or is that the problem?
Ranum: Ideally, security practitioners would be involved at every level of an application's development. But really, the place where we're not involved where we should be is at the overall system design level. That comes back to the business objectives of whatever the particular piece of software is.
If your objective is that you're going to write an online game where people log in and bash trolls and orcs, that's one thing. If you're trying to write a system where someone's going to remote control a hospital's surgical theater and you're going to have somebody cut open on a table, that's another thing.
When businesses say 'This is what we're going to build,' that's the point when someone has to say 'Okay, the chief architect is putting this system together so you have to put some thought into where you need to architect security into it.' Then the individual project team leaders are going to implement the components of this system. Eventually, it gets down to the programmers who have to understand how to implement the components of the system using good practices, like minimized methods and fail-safe defaults -- all this stuff that we've been talking about since the '70s.
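The "fail-safe defaults" principle Ranum cites dates back to the 1970s design literature: anything not explicitly granted is denied. A minimal sketch of what that looks like at the code level, with a hypothetical role-based policy (the names and policy structure are assumptions for illustration, not anything from the interview):

```python
def is_permitted(role, action, policy):
    """Fail-safe default: access is denied unless the policy explicitly grants it.

    An unknown role or an unlisted action falls through to denial —
    there is no 'allow everything else' branch to get wrong.
    """
    return action in policy.get(role, frozenset())

# Hypothetical allow-list policy: only what is listed is permitted.
POLICY = {
    "admin": {"read", "write"},
    "viewer": {"read"},
}
```

The design choice is that the safe outcome (deny) requires no code at all; mistakes of omission lock things down rather than open them up, which is exactly the property a deny-by-default design buys you.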
The trend in the industry now is to say 'Woo-hoo, rapid application development, we're going to throw stuff over the fence and we'll fix it after we've got 6 million customers driving it.' It's a good thing nuclear and aeronautical engineers don't build power systems and airplanes that way.