It seems the further we get into this game called compliance, the more we see and hear about big security breaches. Regulations such as HIPAA, HITECH, GLBA and PCI DSS have been put into place to improve information security and privacy. We also have widely accepted security resources such as OWASP and ISO/IEC 27002 at our disposal. Furthermore, product vendors have fine-tuned their security technologies.
Given all this, one would think that achieving reasonable compliance and minimizing business risks would be relatively simple. Yet data breaches still occur. Subsequent fines are levied and people get into trouble. But the wheel keeps turning and nothing really seems to change.
Is this just a misconception? Perhaps it's because the increased visibility of security and privacy has created a "switching station" effect similar to when you buy a new car and then all of a sudden everyone else has one. It could be a little of both. We'll never really know. But there's one thing I know with conviction that's creating a lot of the problems we're experiencing today: checklist audits.
By checklist audit, I mean someone coming in, literally with clipboard in hand, and reviewing your information system's security controls at a very high level. I can hear it now:
- Are passwords being used? Check.
- Are patches being applied? Check.
- Are unique user roles required? Check.
- Is a security policy in place? Check.
Such a list of security controls may be based on best practices or something like the PCI Self-Assessment Questionnaire. It may even be something as simple as a security scan. Skimming across the top of a complex information security system, however, is in no way indicative of how things really are.
Just this week, I experienced something that could have created a false sense of security -- and some serious problems -- had a checklist audit been performed. During a Web application security assessment project, the Web vulnerability scanners I used didn't reveal much. Looking at the application, I found that it required strong passwords and even logged users out after a certain period of inactivity. There were different role levels to ensure that users had the proper access privileges. But digging deeper into the bits and bytes of the application revealed a serious authentication issue that allowed a regular user to escalate his privileges and do things that only admin-level users were supposed to be able to do. It was a big vulnerability that would've been completely overlooked by a checklist audit or simple security scan.
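The kind of deeper test that catches a flaw like this can be sketched in a few lines. What follows is a hypothetical illustration, not the actual tool or application from that engagement: the endpoint names and the `fetch` callable are assumptions made up for the example. The idea is simply to replay admin-only requests with a regular user's session and flag any that the server fails to reject.

```python
# Hypothetical privilege-escalation probe: replay admin-only
# endpoints using a regular user's session and flag any that the
# server does not reject. The fetch function is injected so the
# probe can be pointed at any HTTP client or test double.

ADMIN_ENDPOINTS = [  # endpoints a regular user must NOT reach (made up)
    "/admin/users",
    "/admin/audit-log",
    "/admin/delete-account",
]

def find_escalations(fetch, user_session):
    """Return the admin endpoints a regular user can access.

    fetch(endpoint, session) -> HTTP status code. Anything other
    than 401/403 on these endpoints suggests broken access control.
    """
    return [
        ep for ep in ADMIN_ENDPOINTS
        if fetch(ep, user_session) not in (401, 403)
    ]

# Stub server whose /admin/audit-log forgets the role check:
def stub_fetch(endpoint, session):
    if endpoint == "/admin/audit-log":  # missing authorization check
        return 200
    return 403 if session["role"] != "admin" else 200

print(find_escalations(stub_fetch, {"role": "user"}))
# prints ['/admin/audit-log']
```

A checklist audit would note that roles exist and passwords are required; a probe like this actually exercises each privileged operation as a low-privilege user, which is where flaws of this sort show up.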
The same goes for sensitive data stored on network and mobile devices. People assume that all's well just because the data is on a "protected" server that requires a login to gain access. Likewise, it's assumed that a laptop computer's power-on password or Windows login prompt is going to keep everything safe and secure. These scenarios would most certainly pass a high-level audit, but looking at them from a real-world perspective reveals how they can easily lead to pretty serious data breaches.
The lesson to be learned here is that high-level security reviews of your computers and applications are not nearly enough. They may satisfy the regulators, but they won't adequately protect your business. Don't take this the wrong way. I know plenty of IT and security auditors who are fine people and are good at what they do. But the majority of the "audits" looking at your information systems from this perspective are missing the boat. You have to put on your malicious user hat and go about poking and prodding your systems using ethical hacking techniques. Otherwise, you're going to miss the big things that really matter.
Kevin Beaver is an information security consultant and expert witness, as well as a seminar leader and keynote speaker at Atlanta-based Principle Logic LLC. He's the creator of the Security On Wheels information security audio books and blog, providing security learning for IT professionals on the go. He can be reached at www.principlelogic.com.