The Vulnerability Arms Race

11 05 2010

This post was originally published on CSO Online.

If you work in an organization with any sizable technology infrastructure, it has probably become quite apparent that your vulnerability management program has a lot more “vulnerabilities” than “management”. I recently had an email exchange with Gene Kim, CTO at Tripwire, about this issue, and he boiled it down better than anyone I had heard before. To quote Gene, “The rate of vulnerabilities that need to be fixed greatly exceeds any IT organization’s ability to deploy the fixes”. As the conversation continued, Gene expanded on his comment:

“The rate at which information security and compliance introduce work into IT organizations totally outstrips IT organizations’ ability to complete it, whether it’s patching vulnerabilities or implementing controls to fulfill compliance objectives. The status quo almost seems to assume that IT operations exist only to deploy patches and implement controls, instead of completing the projects that the business actually needs.

Solving the vulnerability management problem must reduce the number of calories required to mitigate the risks; this is not a patch management problem. Instead it requires a way to figure out which risks actually matter, and to introduce mitigations that don’t jeopardize every other project commitment the IT organization has, or jeopardize uptime and availability.”

Having lived through this vulnerability arms race myself, I found that this statement really rang true. Gene and I clearly share a passion for solving this issue. In fact, I would extend Gene’s sentiment to the development side of the house, where application security vulnerabilities are piling up across the industry. Great, so now what?

My first reaction, being a bit of an admitted data junkie, was to start pulling some sources to see whether my feeling of being buried was accurate. This post takes an oversimplified approach, but it works for confirming a general direction.

Let’s go to the data!

First, what types of vulnerability information are security teams generally dealing with? I categorized them into the following buckets: Custom Applications, Off The Shelf Applications, Network and Host, and Database. A couple of very large data sources for three of the four categories can be found via the National Vulnerability Database and the Open Source Vulnerability Database. Additionally, during some research I happened upon cvedetails.com. To simplify further, we’ll take a look at OSVDB, which has some overlap with the NVD.

Looking at the first four months of 2010, we can see OSVDB is averaging over 600 new vulnerabilities per month. That’s great, but on average how many of these new vulnerabilities affect a platform in my environment? The cvedetails.com site has a handy list of the top 50 vendors by distinct vulnerabilities. Viewing the list, it’s a fairly safe assumption that most medium and large businesses run products from 70% or more of the top 20 vendors (note: one of several sweeping assumptions, so plug in your own values here). This ends up being quite a large number, even while ignoring the long tail of vulnerabilities that may exist within your organization.
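To make those sweeping assumptions explicit, here is a back-of-the-envelope sketch. The 600-per-month figure comes from the OSVDB numbers above; the vendor coverage and disclosure-share fractions are assumptions of mine and should be replaced with values from your own environment.

```python
# Back-of-the-envelope estimate of monthly vulnerability load.
# The 600/month figure comes from the OSVDB numbers above; the vendor
# coverage and disclosure-share fractions are placeholder assumptions.

MONTHLY_NEW_VULNS = 600             # OSVDB average, first four months of 2010
VENDOR_COVERAGE = 0.70              # assumed share of the top 20 vendors present in your environment
TOP_VENDOR_DISCLOSURE_SHARE = 0.50  # assumed share of new disclosures attributable to those vendors

relevant_per_month = MONTHLY_NEW_VULNS * VENDOR_COVERAGE * TOP_VENDOR_DISCLOSURE_SHARE
print(f"Roughly {relevant_per_month:.0f} potentially relevant new vulnerabilities per month")
```

Even with conservative fractions plugged in, the number lands well beyond what most teams can fully remediate each month.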

One category of vulnerabilities not covered by these sources is custom web applications. These are unique to each company and must be addressed separately. To get a general sense of direction I turned to the 2008 Web Application Security Statistics project from WASC. According to the report, “The statistics includes data about 12186 web applications with 97554 detected vulnerabilities of different risk levels”. That works out to about 8 unique vulnerabilities per application. My experience tells me the actual number varies GREATLY based on the size and complexity of the application. One piece of information not included in the report is the actual number of companies, which would give us a better idea of how many applications each company was managing.

For this data, we can use the recent WhiteHat Statistics Report, “Which Web programming languages are most secure?” (registration required). While the report focused on vulnerabilities in sites written in different programming languages, they were kind enough to include the following: “1,659 total websites. (Over 300 organizations, generally considered security early adopters, and serious about website security.)”. That’s about 5.5 web sites per organization. But wait a minute; if we ignore the various platform data and just look at the total number of vulnerabilities versus the total number of sites, the organizations are averaging over 14 vulnerabilities per site. Of course, this is averaged over a four-plus-year period, so we need to analyze resolution rates to understand what a team is dealing with at any point in time. According to the report, resolution times vary drastically, from just over a week to several months, and some vulnerabilities simply remain unfixed. The organization’s development and build process will influence not only the rate of resolution but also the rate of introduction. Given the limited data sources covered here, it’s easy to assert that the rate of introduction is still greater than the rate of resolution.
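For what it’s worth, the simple ratios above fall straight out of the published figures; a quick sketch using only the numbers quoted in this post:

```python
# Reproducing the rough ratios above from the published figures.
wasc_vulns, wasc_apps = 97554, 12186        # WASC 2008 Web Application Security Statistics
whitehat_sites, whitehat_orgs = 1659, 300   # WhiteHat report: 1,659 sites, "over 300" organizations

print(f"~{wasc_vulns / wasc_apps:.0f} vulnerabilities per application (WASC)")
print(f"~{whitehat_sites / whitehat_orgs:.1f} sites per organization (WhiteHat)")
```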

While I haven’t gone into enough detail to pull out exact numbers, I believe I satisfied my end goal of confirming what my gut was already telling me. Obviously I could go on and on pulling in more sources of data (I told you I’m a bit of a junkie), but for the sake of keeping this a blog post and not a novel, I’ll move on.

So how do we go about fixing this problem?

The vulnerability management problem is evolving. It is no longer difficult to identify vulnerabilities; like many areas within technology, we are now overwhelmed by data. Throwing more bodies at this issue isn’t a cost-effective option, nor will it win this arms race. We need to be smarter. Many security practitioners complain about not having enough data to make these decisions. I argue we will never have a complete set, but we already have enough to make smarter choices. By mining this data, we should be able to create a much better profile of our security risks. We should be combining this information with other sources to match up against the threats to our specific business or organization. Imagine combining your vulnerability data with information from the many available breach statistics reports. Use threat data and statistics that are appropriate for your business to determine which vulnerabilities need to be addressed first.
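As a rough illustration of what that prioritization could look like, here is a minimal sketch. The finding classes, prevalence weights, and asset flag are hypothetical placeholders; the point is simply that severity gets weighted by how often a weakness class actually shows up in the breach data you trust.

```python
# A minimal sketch of threat-informed prioritization (all values hypothetical).
vulns = [
    {"id": "V-1", "cvss": 9.3, "class": "sql_injection", "critical_asset": True},
    {"id": "V-2", "cvss": 7.5, "class": "dos",           "critical_asset": False},
    {"id": "V-3", "cvss": 6.8, "class": "xss",           "critical_asset": True},
]

# Hypothetical prevalence weights drawn from whichever breach report fits your business.
breach_prevalence = {"sql_injection": 0.9, "xss": 0.6, "dos": 0.2}

def priority(v):
    # Severity weighted by real-world prevalence, with a bump for critical assets.
    score = v["cvss"] * breach_prevalence.get(v["class"], 0.3)
    return score * (1.5 if v["critical_asset"] else 1.0)

for v in sorted(vulns, key=priority, reverse=True):
    print(f"{v['id']}: priority {priority(v):.1f}")
```

The exact weights matter far less than the discipline of letting threat data, not just raw scanner output, drive the order of the work queue.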

Also, consider the number of ways a vulnerability can be addressed. To Gene’s point above, this isn’t a patch management problem. Mitigation comes in many different forms, including patches, custom code fixes, configuration changes, disabling of services, and so on. Taking a holistic view of the data, including a threat-based approach, will result in a much more efficient remediation process while fixing the issues that matter most.

Additionally, consider automation. Automation has long been a dirty word in Information Security, but many of these problems can be addressed through it. I have written about SCAP before, which is one way of achieving this, though not the only solution. Regardless of your stance on SCAP, using standards to describe and score your vulnerabilities will give you a better view into them while removing some of the biases that can creep into the process.
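As one concrete example of that kind of standardized scoring, below is a sketch of the CVSS version 2 base score equation (CVSS is the scoring system used within SCAP). The constants come from the published CVSS v2 specification; treat this as an illustration rather than a validated implementation.

```python
# CVSS v2 base score from the published equations -- a sketch, not a validated implementation.
AV  = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector: Local / Adjacent / Network
AC  = {"H": 0.35,  "M": 0.61,  "L": 0.71}   # Access Complexity: High / Medium / Low
AU  = {"M": 0.45,  "S": 0.56,  "N": 0.704}  # Authentication: Multiple / Single / None
CIA = {"N": 0.0,   "P": 0.275, "C": 0.660}  # Confidentiality/Integrity/Availability impact

def cvss2_base(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f_impact = 0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# AV:N/AC:L/Au:N/C:P/I:P/A:P -- a remotely exploitable, unauthenticated vector
print(cvss2_base("N", "L", "N", "P", "P", "P"))  # 7.5
```

Once every finding carries a score computed the same way, the prioritization and automation discussed above become mechanical rather than a matter of opinion.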

In summary, vulnerability management has become a data management issue. The good news is that this problem is already being solved in other areas of information technology; we just need to adapt those solutions to a security context. How are you dealing with your vulnerabilities?





BlackHat Without The Drama

4 08 2009

Well, another BlackHat is in the books, and another round of vulnerabilities has been disclosed and bantered about. I was fortunate enough to attend this year as a panelist on the Laws of Vulnerabilities 2.0 discussion. While I was happy and honored to be invited, I want to draw some attention to another talk.

No, I’m not talking about the SSL issues presented by Dan Kaminsky or Moxie Marlinspike. Nor am I referring to the mobile SMS exploits. Each year you can count on BlackHat and Defcon for presentations showcasing lots of interesting security research and incredibly sexy vulnerabilities and exploits. Every year attendees walk away with that sinking feeling that the end of the internet is nigh and we have little hope of averting its destruction. Despite this, we have not shut down the internet; we continue to chug along and build new applications and infrastructure on top of it.

I was able to attend a session on Thursday that explained, and theorized about, why this all works out the way it does. It was the final session of the conference and, unfortunately, was scheduled opposite Bruce Schneier, which meant a lot of people who should have seen it didn’t. Of course, Bruce is a great speaker and I’m sure I missed out as well, but hey, that’s what the video is for.

David Mortman and Alex Hutton presented a risk management session on BlackHat vulnerabilities and ran them through the “Mortman/Hutton” risk model (clever name indeed). They included a couple of real-world practitioners and walked through how these newly disclosed vulnerabilities may or may not affect us over the coming weeks and months. They were able to quantify why some vulnerabilities have a greater effect, and at what point they reach a tipping point where a majority of users of a given technology should address them.

David and Alex are regular writers on the New School of Information Security blog and will be posting their model in full, with hopes of continuing to debate, evolve, and improve it. Do any of these new security vulnerabilities concern you? Go check out the model and see where they stand.

Note: This post was originally published on CSO Online.





WEIS Call for Papers – 2008

1 11 2007

The Workshop on the Economics of Information Security has just opened its call for papers for 2008. This year it will be held at Dartmouth College in New Hampshire.

I have written about this workshop in the past (here, here and here). The amount of quality content that comes out of it is incredible. As most readers of this blog know, information security is much more than a technical issue. This workshop addresses many of those problems, including the economic incentives of security and privacy, the trade-offs individuals and groups must make to achieve a given level of security, negative externalities, the psychology of security, and more.

If you have an interest in the driving factors behind what makes many of our systems more or less secure, I would highly recommend this workshop. Last year’s workshop generated a tremendous amount of buzz about WabiSabiLabi, the online marketplace for buying and selling vulnerabilities.

I would also recommend reading Economics of Information Security as a great primer on the topics related to the workshop; you can find a link to it on the bookshelf of this site. It contains several papers that came directly from the workshop. This is a great way to dig yourself out of the day-to-day technical details and start thinking about some of the broader decision factors around security and privacy. While many of the papers come from academia, the information is a great way for those leading security programs in the private sector to understand the decision criteria that will ultimately fund, or not fund, their initiatives.







The Security Evangelism Tour Continues

11 10 2007

Fresh off speaking at the Security Trends event in Milwaukee, I will be participating in a keynote panel at the Technology Executives Club Risk Management event in Chicago.

It was a pleasure meeting everyone in Milwaukee, and I want to thank my fellow speakers and our moderator for putting on a good event. As I said before, these events tend to bring together a wide array of backgrounds, and I am always impressed by the “wisdom of the crowd”.

The Risk Management event in Chicago will take place on November 15th. You can get more information on the event here and register here. If you find yourself in Chicago at that time, I’d love to meet you there; I’m looking forward to some lively discussion and to comparing notes on the issues we’re facing.

UPDATE: The tour went through a bit of a shuffle this week. Due to some last-minute commitments I was not able to make it to the Risk Management event this week; however, I have agreed to serve on a panel at the IT Security Best Practices event on January 24th. Hope to see you there.

 






Recent Readings (and listenings)

6 08 2007

I recently finished two books (OK, one of them was audio): The Long Tail: Why the Future of Business is Selling More by Chris Anderson and Silence on the Wire: A Field Guide to Passive Reconnaissance and Indirect Attacks by Michal Zalewski. While the two are very different, both were good reads and very appropriate topics for this blog. I would recommend both to regular readers here.

Ever since I finished The Long Tail a couple of weeks ago, I had been meaning to post about it and what it means for information security. Well, while I was busy with other things, a couple of people went and did just that. Over the weekend Mark Curphey wrote a two-part post that sums up the book and how it relates to our field at a high level. Part 1 is here and Part 2 is here. I encourage you to read his posts if you have an interest in the economics of information security.

Another area that came to mind while reading this book (sorry, listening to this book) was the ever-present topic these days of compliance. Organizations today have a number of regulations and laws that they must comply with in a given industry or geographic region. Some of these requirements make economic sense for the business; others are there to control the negative externalities of security. After reading (argh! LISTENING to) The Long Tail, I spent some time wondering how a set of tools, processes, and so on could make compliance economically sound: a choice organizations would make regardless of outside requirements (laws, regulations, etc.).

I would like to challenge readers of this post to come up with new ideas that take these requirements, which traditionally go against the rules of risk management, and make them economically sound for YOUR organization. The key here is that every organization is different. What makes economic sense within mine may make little to no sense in yours. That’s what makes the “one size fits all” approach of several regulations difficult for most companies today.

Have an idea? Post it here in a comment or send me an email!







ISM Community Top 10

27 06 2007

The ISM Community has published the ten most important things all organizations should be doing regarding information security. Having played a role in this, I am admittedly a bit biased, so I will leave all judgments to the reader.

I especially enjoy the tips and tricks from the field. 🙂







Out of Hiding

7 06 2007

Well it’s been a while since I posted anything here. I have a million excuses, but I’m sure everyone has heard them all.

I have decided to change direction on the PCI standard review that I most recently blogged about. After several conversations with Mark Curphey, I’ve decided the best approach to the issue is working with him and several others on a new OWASP project: The OWASP Web Security Certification Framework.

It is our hope that this will be adopted and used to meet web application security requirements for PCI compliance and any additional regulatory requirements associated with this topic. Look for more on this standard this summer.

For those of you who don’t know Mark, I would highly encourage you to check out his blog. He has a great security background, having worked at places like Foundstone and ISS, and he is the original founder of OWASP. He’s currently working on a new startup that is taking off rapidly. I spoke to him about his new company and the work they are taking on; it’s very ambitious and fills a big gap in information security management software today.

If taking on the OWASP project wasn’t enough, I am also collaborating with Mark and others on something for the ISM Community. We’re creating a list of Tips & Tricks from the Field for the ISM Community Top 10. This will give readers a quick jump start on implementing key concepts for their Information Security Program.

Watch for more frequent updates and publications on these new projects.







Speaking of Risk Management

16 11 2006

I had the opportunity yesterday to participate on a panel discussing risk management at the Technology Executives Club in Chicago. I met a lot of interesting people and wanted to thank the TEC for the invite.

One of the recurring subjects at the event was the prioritization of risks. Of the 100 things you currently have on your plate, how do you decide which issue to work on next? Without trying to downplay or oversimplify the issue, this seems to me to be a basic risk management question. While managing information security risks can be as much art as science, in its simplest form a risk is its potential impact multiplied by its likelihood. Given that result, you can decide to accept, mitigate, or eliminate the risk based on cost (of all kinds). Of course this is a simplified view of things, and each risk certainly contains tough-to-quantify gray areas.
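As a toy illustration of that impact-times-likelihood calculation (every dollar figure and probability below is made up), the decision logic looks something like this:

```python
# Expected loss is impact times likelihood; a mitigation is worth funding
# when it reduces expected loss by more than it costs. All figures invented.

def expected_loss(impact_dollars, annual_likelihood):
    return impact_dollars * annual_likelihood

before = expected_loss(500_000, 0.10)  # 10% chance per year of a $500k incident
after  = expected_loss(500_000, 0.02)  # likelihood after a proposed control
mitigation_cost = 25_000

reduction = before - after
print(f"Risk reduction: ${reduction:,.0f}/year vs. mitigation cost ${mitigation_cost:,}")
print("Mitigate" if reduction > mitigation_cost else "Accept the risk")
```

The arithmetic is trivial; the hard part, as the next paragraph argues, is getting likelihood numbers you can actually trust.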

I think the real issue here is bad data. In industries such as insurance, actuaries have the ability to rely on good data from the past in order to predict the likelihood of certain events in the future. This ‘good data’ doesn’t really exist in information security today. The one report that is continually brought up on this subject is the CSI / FBI Survey. I think Bruce Schneier summed up this report best. Security professionals do not have large amounts of accurate data to rely on, making the likelihood portion of the risk management equation difficult at best.

Updated 2/26: Added a link to the webcast of the panel I participated in.

