Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  Asset information, it’s fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.  (A worked scoring example follows this list.)
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A checklist format that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool because it is.
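To make the CVSS bullet above a bit more concrete, here is a minimal sketch (my own Python, not anything from the SCAP spec suite) that computes a CVSS v2 base score from a vector string.  The metric weights come from the CVSS v2 specification; the sample vector at the bottom is just an illustration and isn't tied to any particular CVE.

```python
# Minimal CVSS v2 base-score calculator -- illustrative sketch only.
# Metric weights are taken from the CVSS v2 specification.

WEIGHTS = {
    "AV": {"L": 0.395, "A": 0.646, "N": 1.0},    # Access Vector
    "AC": {"H": 0.35, "M": 0.61, "L": 0.71},     # Access Complexity
    "Au": {"M": 0.45, "S": 0.56, "N": 0.704},    # Authentication
    "C":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Confidentiality impact
    "I":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Integrity impact
    "A":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Availability impact
}

def cvss2_base_score(vector: str) -> float:
    """Parse a vector like 'AV:N/AC:L/Au:N/C:P/I:P/A:P' and return the base score."""
    m = dict(part.split(":") for part in vector.split("/"))
    impact = 10.41 * (1 - (1 - WEIGHTS["C"][m["C"]])
                        * (1 - WEIGHTS["I"][m["I"]])
                        * (1 - WEIGHTS["A"][m["A"]]))
    exploitability = 20 * WEIGHTS["AV"][m["AV"]] * WEIGHTS["AC"][m["AC"]] * WEIGHTS["Au"][m["Au"]]
    f_impact = 0 if impact == 0 else 1.176
    score = ((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact
    return round(score, 1)

if __name__ == "__main__":
    # A hypothetical network-exploitable vulnerability with partial C/I/A impacts
    print(cvss2_base_score("AV:N/AC:L/Au:N/C:P/I:P/A:P"))  # prints 7.5
```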

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
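As a rough sketch of what that tool-to-tool exchange can look like, here is some illustrative Python that tallies rule results out of an XCCDF results file.  The filename is hypothetical, and the namespace-agnostic tag matching is a deliberate shortcut, since the exact namespace URI depends on which XCCDF version your scanner emits.

```python
# Illustrative sketch: summarize pass/fail results from an XCCDF results file.
# The filename is hypothetical; tag matching ignores namespaces so it works
# regardless of which XCCDF schema version the scanner exported.
import xml.etree.ElementTree as ET
from collections import Counter

def local_name(tag: str) -> str:
    """Strip the '{namespace}' prefix ElementTree puts on tag names."""
    return tag.rsplit("}", 1)[-1]

def summarize(path: str) -> Counter:
    tree = ET.parse(path)
    results = Counter()
    for elem in tree.iter():
        if local_name(elem.tag) == "rule-result":
            rule_id = elem.get("idref", "unknown-rule")
            result = next((child.text for child in elem
                           if local_name(child.tag) == "result"), "unknown")
            results[result] += 1
            if result == "fail":
                print(f"FAILED: {rule_id}")
    return results

if __name__ == "__main__":
    print(summarize("fdcc-winxp-scan-results.xml"))  # hypothetical results file
```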

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =)   ), there are roughly 3.5 Million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by having your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.
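To show why converting an existing STIG or CIS Benchmark item into XCCDF is "fairly trivial," here is a toy sketch that wraps one hardening-guide item as a bare-bones XCCDF Rule.  The rule ID, title, and check text are made up for illustration, and the namespace assumes XCCDF 1.1; a real conversion would also carry OVAL check references, CCE IDs, and profiles.

```python
# Toy sketch: wrap one hardening-guide item as a bare-bones XCCDF Rule.
# The identifiers and check text are made up for illustration.
import xml.etree.ElementTree as ET

XCCDF_NS = "http://checklists.nist.gov/xccdf/1.1"   # assumes XCCDF 1.1
ET.register_namespace("", XCCDF_NS)

def make_rule(rule_id: str, title: str, description: str, severity: str = "medium"):
    rule = ET.Element(f"{{{XCCDF_NS}}}Rule", {"id": rule_id, "severity": severity})
    ET.SubElement(rule, f"{{{XCCDF_NS}}}title").text = title
    ET.SubElement(rule, f"{{{XCCDF_NS}}}description").text = description
    return rule

if __name__ == "__main__":
    rule = make_rule(
        "rule-password-min-length",                        # hypothetical ID
        "Minimum password length",
        "Passwords must be at least 12 characters long.",  # example guide text
    )
    print(ET.tostring(rule, encoding="unicode"))
```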

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX–to be honest here, though, most of the vendors in this space cut their teeth on these standards; it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and applications vendors: These guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the applications vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  Some of them have cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except for people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputation by having the best vulnerability signatures.  In order to get true compatibility across products, standardized SCAP feeds mean that the pure-play security tools have fewer things to differentiate themselves from all the other tools, and they fall into a commodity market centered on the accuracy of their checks, with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP, I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 deer on the NIST campus) all have ear tags.  It brings up all sorts of scientific study ideas.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.



Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »

Get Yer SCAP on with NIST

Posted September 5th, 2008 by

Our friends at NIST are hosting a SCAP conference and workshop from the 22nd to the 25th of September.  Go check it out here.  Registration runs until the 16th.



Posted in NIST | 2 Comments »

Oooh, DITSCAP to DIACAP is SOOO Hard

Posted April 9th, 2008 by

Very nice article in Military Information Technology Magazine (Online edition in case you couldn’t figure it out) about the DITSCAP to DIACAP transition.

Just looking at the concepts behind DIACAP, they’re very sound.  In some places, the article whines a bit too much.  Me, I’m glad to see DITSCAP go the way of all flesh in favor of risk registers and sharing of risk information with “business partners”.

My favorite quote this week:

“The services face a number of other challenges in implementing DIACAP, not least of which is what Lundgren called ‘significant cultural issue’ in moving from the ‘paperwork drill’ characteristic of DITSCAP, to DIACAP, ‘where you’re expected to actually go out and do the testing.'”

How can that NOT be a good thing?

Some other good quotes in the article and my random thoughts:

“Training and education of personnel is another concern faced by DoD components, according to King. ‘They must make sure they have a cadre of information assurance professionals who are in full understanding of what DIACAP is and how it differs from DITSCAP,’ he said. ‘This includes the complete realm of IA professionals, including principle certification and accreditation personnel to program managers and IA managers. There is a significant training and education tail that need to be accomplished for DIACAP to be properly implemented.'”

Well, to be very honest, I think that this was a problem with DITSCAP, is a problem with NIST 800-37, and will continue to be a problem until I work myself out of a job because everybody in the government understands risk management.

“This is going to save money and time because it allows capabilities to be put out to the field without having to be certified and accredited three or four times.”

That’s a happy thing.  Wait until DoD figures out how to do common controls; then they’ll find out how to save scads of money.

Now want to know the secret to why DIACAP will succeed?  This is a bit of brilliance that needs to be pointed out.  DIACAP became the standard in late 2007 after the DoD watched the civilian agencies go through 5 years of FISMA implementation and was able to steal the best parts and ignore the bad parts.

Future state:  civilian agencies borrowing some of the DIACAP details, like scorecards and eMASS.

Future state:  merging of DIACAP, DCID 6/3, and SP 800-37.

Future state:  adoption of the “one standard to rule them all” by anybody who trades data with the Government.



Posted in Army, Risk Management, What Works | 1 Comment »

SCAP for Dummies

Posted October 2nd, 2007 by

SCAP is becoming one of my favorite government acronyms: Security Content Automation Protocol. OK, what does that mean in English? Well, it’s the glue that holds together a whole slew of XML nummie data goodness such as the National Vulnerability Database and a standard for asset inventory management.
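That asset inventory standard is CPE, and the names are simple enough to pull apart by hand. Here is a toy sketch, with hypothetical example names rather than entries from the official CPE dictionary:

```python
# Illustrative sketch: split a CPE 2.2 name into its labeled fields.
# The example names are hypothetical; real names come from the official
# CPE dictionary that NIST publishes alongside the NVD.
FIELDS = ["part", "vendor", "product", "version", "update", "edition", "language"]

def parse_cpe(name: str) -> dict:
    """Parse a 'cpe:/part:vendor:product:...' style name; missing fields stay empty."""
    if not name.startswith("cpe:/"):
        raise ValueError(f"not a CPE 2.2 name: {name}")
    values = name[len("cpe:/"):].split(":")
    values += [""] * (len(FIELDS) - len(values))   # pad the optional trailing fields
    return dict(zip(FIELDS, values))

if __name__ == "__main__":
    # 'o' = operating system, 'a' = application, 'h' = hardware
    print(parse_cpe("cpe:/o:microsoft:windows_xp"))          # hypothetical inventory entry
    print(parse_cpe("cpe:/a:adobe:acrobat_reader:8.1"))      # hypothetical inventory entry
```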

I was pretty skeptical on SCAP (and the Federal Desktop Core Configuration–FDCC) when it was first announced–like wow, we have yet another obscure memo from Karen Evans that we have to address.

I had a change of heart after I heard the magical phrase “We know it’s going to break things, and we don’t care”. That made me take notice. I thought about it all weekend–I was getting really riled up over such an obviously irresponsible security hard line. But then I found the magic in what they were doing and learned to stop fearing SCAP and embrace the love that it brings. I’ll tell you why.

Imagine you’re Microsoft. You can’t harden down your OS because you have all the applications vendors (including the A-V/Malware guys) raising the big anti-trust flag. And they’re right to do so. Maybe at one point you could have made your software “secure by default”, but that was 20 years ago, and if you had done so, you would have been last to market.

But that doesn’t work to plug the holes in the OS. In my opinion, it’s the lesson of Vista: if you make it stronger, it breaks applications. We all know that, so a design choice is to either leave the holes or give you a nag-screen or a combination of the two. Speaking strictly from the security side of things, that–along with continuous OS patching–is just “polishing a turd”. Yeah, you can make it all shiny on the outside, but deep down inside it’s still nothing pretty.

But now put yourselves in the Government’s shoes: you buy an OS and then spend how much time and effort on hardening it? That’s money you could spend elsewhere. The people at the top of the Government understand this; that’s why they’re always looking at ways to simplify.

OMB and others have been pushing SCAP pretty hard. So far, most of the focus has been on the databases that exist (CVE, NVD) and the desktop configuration (FDCC).

Think about a pre-hardened Government OS. What it does is break applications–applications that are poorly designed. If your application is poorly designed and doesn’t work with the FDCC, then you’re squeezed out of the public sector. The true capitalists here would say something like “let the market decide who the winners are.” Realistically, if you want a slice of the federal IT budget, then you need to make your software compatible with their hardening standard. They make it easy to do, with tools to test your software and a certification program.
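As a toy illustration of the kind of check those testing tools automate, here is a sketch that reads a single Windows registry value and compares it against a hardened setting. The key and expected value below (UAC’s EnableLUA flag) are an example I picked, not quoted from the official FDCC content:

```python
# Toy sketch of an automated configuration check (Windows-only).
# The registry key/value below (UAC's EnableLUA flag) is just an example
# setting; it is not quoted from the official FDCC content.
import winreg

def check_dword(hive, key_path: str, value_name: str, expected: int) -> bool:
    with winreg.OpenKey(hive, key_path) as key:
        actual, _type = winreg.QueryValueEx(key, value_name)
    ok = (actual == expected)
    print(f"{key_path}\\{value_name}: expected {expected}, got {actual} -> {'PASS' if ok else 'FAIL'}")
    return ok

if __name__ == "__main__":
    check_dword(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System",
        "EnableLUA",
        1,  # UAC enabled
    )
```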

The part that I like about SCAP is that it’s the Government doing what the OS vendors can’t–put pressure on the applications guys. As usual, this should have a trickle-down effect for the private sector, with the beginning being free hardening guides and the vulnerability databases and the end being a comprehensive information security management toolset.

Check out the presentations from the SCAP conference last month. The Tim Grance presentation (.ppt) alone is worth the price of admission.

Right now SCAP is at the national/CISO level. Give it 6 months and it will be at the forefront of what people are doing.



Posted in DISA, FISMA, NIST, What Works | 5 Comments »

Reinventing FedRAMP

Posted February 15th, 2011 by

“Cloud computing is about gracefully losing control while maintaining accountability even if the operational responsibility falls upon one or more third parties.”
–CSA Security Guidance for Critical Areas of Focus in Cloud Computing V2.1

Now enter FedRAMP.  FedRAMP is a way to share Assessment and Authorization information for a cloud provider with its Government tenants.  In case you’re not “in the know”, you can go check out the draft process and supporting templates at FedRAMP.gov.  So far a good idea, and I really do support what’s going on with FedRAMP, except that somewhere along the line we went astray because we tried to kluge doctrine that most people understand over the top of cloud computing, which most people don’t really understand.

I’ve already done my part to submit comments officially; I just want to put some ideas out there to keep the conversation going. As I see it, these are (or should be) the goals for FedRAMP:

  • Delineation of responsibilities between cloud provider and cloud tenant.  Also knowing where there are gaps.
  • Transparency in operations.  Understanding how the cloud provider does their security parts.
  • Transparency in risk.  Know what you’re buying.
  • Build maturity in cloud providers’ security program.
  • Help cloud providers build a “Governmentized” security program.

So now for the juicy part: how would I do a “clean room” implementation of FedRAMP on Planet Rybolov, where “all the Authorizing Officials are informed, the Auditors are helpful, and every ISSO is above average”?  This is my “short list” of how to get the job done:

  • Authorization: Sorry, not going to happen on Planet Rybolov.  At least, authorization by FedRAMP, mostly because it’s a cheat for the tenant agencies–they should be making their own risk decisions based on risk, cost, and benefit.  Acceptance of risk is a tenant-specific thing based on the data types and missions being moved into the cloud, baseline security provided by the cloud provider, the security features of the products/services purchased, and the tenant’s specific configuration on all of the above.  However, FedRAMP can support that by helping the tenant agency by being a repository of information.
  • 800-53 controls: A cloud service provider manages a set of common controls across all of their customers.  Really what the tenant needs to know is what is not provided by the cloud service provider.  A simple RACI matrix works here beautifully (a toy sketch follows this list), as does the phrase “This control is not applicable because XXXXX is not present in the cloud infrastructure”.  This entire approach of “build one set of controls definitions for all clouds” does not really work because not all clouds and cloud service providers are the same, even if they’re the same deployment model.
  • Tenant Responsibilities: Even though it’s in the controls matrix, there needs to be an Acceptable Use Policy for the cloud environment.  A message to providers: this is needed to keep you out of trouble because it limits the potential impacts to yourself and the other cloud tenants.  A good example would be “Do not put classified data on my unclassified cloud”.
  • Use Automation: CloudAudit is the “how” for FedRAMP.  It provides a structure to query a cloud (or the FedRAMP PMO) to find out compliance and security management information.  Using a tool, you could query for a specific control or get documents, policy statements, or even SCAP assessment content.
  • Changing Responsibilities: Things change.  As a cloud provider matures, releases new products, or moves up and down the SPI stack ({Software|Platform|Infrastructure} as a Service), the balance of responsibilities changes.  There needs to be a vehicle to disseminate these changes.  Normally in the IA world we do this with a Plan of Action and Milestones, but from the viewpoint of the cloud provider, this is more along the lines of a release schedule and/or roadmap.  Not that I’m personally signing up for this, but a quarterly/semi-annual tenant agency security meeting would be a good way to get this information out.
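Here is the kind of provider/tenant responsibility matrix I mean in the 800-53 controls bullet above. The three controls and the RACI assignments are purely illustrative, not a recommended split:

```python
# Purely illustrative provider/tenant responsibility matrix for a handful of
# 800-53 controls; the control picks and assignments are examples, not guidance.
RESPONSIBILITY = {
    # control : (Responsible, Accountable, Consulted, Informed)
    "PE-3  Physical Access Control": ("Provider", "Provider", "Tenant",   "Tenant"),
    "AC-2  Account Management":      ("Tenant",   "Tenant",   "Provider", "Provider"),
    "CP-9  System Backup":           ("Provider", "Tenant",   "Provider", "Tenant"),
}

def print_raci(matrix: dict) -> None:
    """Print the matrix as a simple fixed-width table."""
    header = f"{'Control':<32}{'R':<10}{'A':<10}{'C':<10}{'I':<10}"
    print(header)
    print("-" * len(header))
    for control, (r, a, c, i) in matrix.items():
        print(f"{control:<32}{r:<10}{a:<10}{c:<10}{i:<10}")

if __name__ == "__main__":
    print_raci(RESPONSIBILITY)
```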

Then there is the special interest comment:  I’ve heard some rumblings (and read some articles, shame on you security industry press for republishing SANS press releases) about how FedRAMP would be better accomplished by using the 20 Critical Security Controls.  Honestly, this is far from the truth: a set of controls scoped to the modern enterprise (General Support System supporting end users) or project (Major Application) does not scale to an infrastructure-and-server cloud. While it might make sense to use 20 CSC in other places (agency-wide controls), please do your part to squash this idea of using it for cloud computing whenever and wherever you see it.

Ramp photo by ell brown.



Posted in FISMA, Risk Management, What Works | 2 Comments »

NIST Security Automation Conference

Posted September 13th, 2010 by

It’s at the end of September; check it out.  Even if you’re not in the vulnerability/patch rat race on a daily basis, it would “behoove” you to go check out what’s new.  If you’ve been paying attention to OMB Memo 10-15, you’ll notice that CyberScope takes some SCAP input.



Posted in FISMA, NIST, Technical | 1 Comment »
