Security Automation Developers Conference Slides

Posted July 2nd, 2009 by

Eh? What’s that mean?  Developer Days is a weeklong conference where they get down into the weeds about the various SCAP schemas and how they fit into the overall program of security automation. 

Highlights and new ideas:

Remedial Markup Language: Fledgling schema to describe how to remediate a vulnerability.  A fully automated security system would scan and then use the RML content to automagically fix the finding… say, changing a configuration setting or installing a patch.  This would be much awesome if combined with CVE/CWE so you have a vulnerability scanner that scans and fixes the problem.  It also needs to be kept in a bottle, because the operations guys will have a heart attack if we are doing this without any human intervention (a rough sketch of such a loop follows).
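To make that concrete, here’s a minimal sketch of the scan-then-remediate loop that RML-style content would enable, with the human approval gate the operations guys will insist on.  Everything in it (the scanner, the remediation catalog, the finding IDs) is hypothetical and not drawn from any published RML schema.

```python
# Hypothetical sketch of automated remediation with a human approval gate.
# The functions, finding IDs, and fixes below are illustrative only.

REMEDIATIONS = {
    "CCE-EXAMPLE-PASSWORD-MINLEN": "set minimum password length to 12",
    "CVE-EXAMPLE-MISSING-PATCH": "install the vendor patch",
}

def scan(host):
    """Pretend scanner: returns a list of finding IDs for the host."""
    return ["CCE-EXAMPLE-PASSWORD-MINLEN"]

def approved_by_human(host, finding, fix):
    """The 'keep it in a bottle' step: ops signs off before anything changes."""
    answer = input(f"Apply '{fix}' for {finding} on {host}? [y/N] ")
    return answer.strip().lower() == "y"

def remediate(host):
    for finding in scan(host):
        fix = REMEDIATIONS.get(finding)
        if fix and approved_by_human(host, finding, fix):
            print(f"{host}: applying fix -> {fix}")
        else:
            print(f"{host}: {finding} left for manual follow-up")

remediate("server01.example.com")
```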

Computer Network Defense: There is a pretty good scenario slide deck on using SCAP to automate hardening, auditing, monitoring, and defense.  The key from this deck is how the information flows using automation.

Common Control Identifier:  This schema is basically a catalog of controls (800-53, 8500.2, PCI, SOX, etc.) in XML.  The awesomeness with this is that one control can contain a reference implementation for each technology and the checklist to validate it in XCCDF.  At this point, I get all misty…

Open Checklist Interactive Language: This schema is to capture questionnaires.  Think managerial controls, operational controls, policy, and procedure captured in electronic format and fed into the regular mitigation and workflow tools that you use so that you can view “security of the enterprise at a glance” across technical and non-technical security.

Network Event Content Automation Protocol:  This is just a concept floating around right now on using XML to describe and automate responses to attacks.  If you’re familiar with ArcSight’s Common Event Format, this would be something similar but on steroids with workflow and a pony!

Attendance at Developer Days is limited, but thanks to all the “Powar of teh Intarwebs,” you can go here and read the slides!



Posted in NIST, Technical | 3 Comments »

Your Security “Requirements” are Teh Suxxorz

Posted July 1st, 2009 by

Face it, your security requirements suck. I’ll tell you why.  You write down controls verbatim from your catalog of controls (800-53, SOX, PCI, 27001, etc.), put them into a contract, and wonder why, when it comes time for security testing, we just aren’t talking the same language.  Even worse, you put in the cr*ptastic “Contractor shall be compliant with FISMA and all applicable NIST standards”.  Yes, this happens more often than I care to count, and I’ve seen it from both sides.

The problem with quoting back the “requirements” from a catalog of controls is that they’re not really requirements, they’re control objectives–abstract representations of what you need in order to protect your data, IT system, or business.  It’s a bit like brain surgery using a hammer and chisel–yes, it might work out for you, but I don’t really feel comfortable doing it or being on the receiving end.

And this is my beef with the way we manage security controls nowadays.  They’re not requirements; functionally, they’re a high-level needs statement or even a security concept of operations.  Security controls need to be tailored into real requirements that are buildable, testable, measurable, and achievable.

Requirements photo by yummiec00kies.  There’s a social commentary in there about “Single, slim, and pleasant looking” but even I’m afraid to touch that one. =)

Did you say “Wrecks and Female Pigs”? In the contracting world, we have two vehicles that we use primarily for security controls: Statements of Work (SOW) and Engineering Requirements.

  • Statements of Work follow along the lines of activities performed by people.  For instance, “contractor shall perform monthly 100% vulnerability scanning of the $FooProject.”
  • Engineering Requirements are exactly what you want to have built.  For instance, “Prior to displaying the login screen, the application shall display the approved Generic Government Agency warning banner as shown below…”

Let’s have a quick exercise, shall we?

What 800-53 says: The information system produces audit records that contain sufficient information to, at a minimum, establish what type of event occurred, when (date and time) the event occurred, where the event occurred, the source of the event, the outcome (success or failure) of the event, and the identity of any user/subject associated with the event.

How it gets translated into a contract: Since it’s more along the lines of a security functional requirement (i.e., it’s a specific functionality, not a task we want people to do), we break it out into multiple requirements (a sketch of one such record follows the list):

The $BarApplication shall produce audit records with the following content:

  • Event description such as the following:
    • Access the $Baz subsystem
    • Mounting external hard drive
    • Connecting to database
    • User entered administrator mode
  • Date/time stamp in ‘YYYY-MM-DD HH:MM:SS’ format;
  • Hostname where the event occurred;
  • Process name or program that generated the event;
  • Outcome of the event as one of the following: success, warn, or fail; and
  • Username and UserID that generated the event.
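For illustration, here’s a minimal sketch of a single record that would satisfy the list above.  The field names and the helper function are assumptions made up for the example, not pulled from any product or standard.

```python
# Illustrative audit record covering the fields listed above.
# Field names and the record layout are assumptions for the example only.
from datetime import datetime, timezone

def audit_record(event, hostname, process, outcome, username, user_id):
    assert outcome in ("success", "warn", "fail")   # outcome restricted per the requirement
    return {
        "event": event,                              # what type of event occurred
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S"),
        "hostname": hostname,                        # where the event occurred
        "process": process,                          # source of the event
        "outcome": outcome,                          # success, warn, or fail
        "username": username,                        # who was associated with the event
        "user_id": user_id,
    }

print(audit_record("User entered administrator mode",
                   "barapp01", "barapp-web", "success", "jdoe", 1001))
```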

For a COTS product (e.g., Windows 2003 Server, Cisco IOS), when it comes to logging, I get what I get, and this means I don’t have a requirement for logging unless I’m designing the engineering requirements for Windows.

What 800-53 says: The organization configures the information system to provide only essential capabilities and specifically prohibits and/or restricts the use of the following functions, ports, protocols, and/or services: [Assignment: organization-defined list of prohibited and/or restricted functions, ports, protocols, and/or services].

How it gets translated into a contract: Since it’s more along the lines of a security functional requirement, we break it out into multiple requirements (a sketch of the default-deny pattern follows the list):

The $Barsystem shall have the software firewall turned on and only the following traffic shall be allowed:

  • TCP port 443 to the command server
  • UDP port 123 to the time server at this address
  • etc…..
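Here’s a minimal sketch of the “default deny, explicit allow” pattern behind that requirement.  The addresses and the allowlist format are invented for the example; a real implementation lives in the host firewall configuration, not in application code.

```python
# Default-deny traffic policy: anything not on the allowlist is blocked.
# Protocols, ports, and addresses below are illustrative assumptions.
ALLOWED = {
    ("tcp", 443, "192.0.2.10"),   # command server (example address)
    ("udp", 123, "192.0.2.20"),   # time server (example address)
}

def traffic_allowed(protocol, port, destination):
    return (protocol, port, destination) in ALLOWED

assert traffic_allowed("tcp", 443, "192.0.2.10")
assert not traffic_allowed("tcp", 22, "192.0.2.10")   # SSH is not on the list, so it's denied
```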

If we drop the system into a pre-existing infrastructure, we don’t need firewall rules per se as part of the requirements; what we do need is a SOW along the following lines:

The system shall use our approved process for firewall change control, see a copy here…

So what’s missing, and how do we fix the sorry state of requirements?

This is the interesting part, and right now I’m not sure if we can, given the state of the industry and the infosec labor shortage:  we need security engineers who understand engineering requirements and project management in addition to vulnerability management.

Don’t abandon hope yet, let’s look at some things that can help….

Security requirements are a “best effort” proposition.  By this, I mean that we have our requirements and they don’t fit in all cases, so what we do is throw them out there, and if you can’t meet a requirement, we waive it (live with it, hope for the best) or apply a compensating control (shield it from bad things happening).  This is unnerving because what we end up doing is arguing all the time over whether the requirements that were written need to be done or not.  This drives the engineers nuts.

It’s a significant amount of work to translate control objectives into requirements.  The easiest, fastest way to fix the “controls view” of a project is to scope out things that are provided by infrastructure or by policies and procedures at the enterprise level.  Hmmm, sounds like explicitly stating what our shared/common controls are.

You can manage controls by exclusion or inclusion (a quick sketch follows the list):

  • Inclusion:  We have a “default null” for controls and we will explicitly say in the requirements what controls you do need.  This works for small projects like standing up a pair of webservers in an existing infrastructure.
  • Exclusion:  We give you the entire catalog of controls and then tell you which ones don’t apply to you.  This works best with large projects such as the outsourcing of an entire IT department.
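The same idea in a few lines, as a sketch only, with a toy catalog of control IDs standing in for 800-53:

```python
# Toy catalog of control IDs standing in for a real catalog (800-53, etc.).
CATALOG = {"AC-2", "AU-3", "CM-7", "CP-9", "IA-2", "SC-7"}

# Inclusion: start from nothing ("default null") and explicitly name what applies.
inclusion_scope = {"AU-3", "SC-7"}          # small project: just what the webservers need

# Exclusion: start from the full catalog and explicitly carve out what doesn't apply.
exclusion_scope = CATALOG - {"CP-9"}        # large project: everything except the exceptions

print(sorted(inclusion_scope))
print(sorted(exclusion_scope))
```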

We need a reference implementation per technology.  Let’s face it, how many times have I taken the 800-53 controls and broken them down into controls relevant for a desktop OS?  At least 5 in the last 3 years.  The way you really need to do this is that you have a hardening guide and that is the authoritative set of requirements for that technology.  It makes life simple.  Not that I’m saying deviate from doctrine and don’t do 800-53 controls and 800-53A test procedures, but that’s the point of having a hardening guide–it’s really just a set of tailored controls specific to a certain technology type.  The work has been done for you, quit trying to re-engineer the wheel.

Use a Joint Responsibilities Matrix.  Basically this breaks down the catalog of controls into the following columns:

  • Control Designator
  • Control Title
  • Provided by the Government/Infrastructure/Common Control
  • Provided by the Contractor/Project Team/Engineer


Posted in BSOFH, Outsourcing, Technical | 3 Comments »

NIST Framework for FISMA Dates Announced

Posted April 10th, 2009 by

Some of my friends (and maybe myself) will be teaching the NIST Framework for FISMA in May and June with Potomac Forum.   This really is an awesome program.  Some highlights:

  • Attendance is limited to Government employees only so that you can talk openly with your peers.
  • Be part of a cohort that trains together over the course of a month.
  • The course is 5 Fridays so that you can learn something then take it back to work the next week.
  • We have a Government speaker every week, from the NIST FISMA guys to agency CISOs and CIOs.
  • No pitching, no marketing, no product placement (OK, maybe we’ll go through DoJ’s CSAM but only as an example of what kinds of tools are out there), no BS.

See you all there!



Posted in NIST, Speaking | 1 Comment »

Certification and Accreditation Seminar, March 30th and 31st

Posted March 13th, 2009 by

We’ve got another good US Government Security Certification and Accreditation (C&A) Seminar/Workshop coming up at the end of March with Potomac Forum.

Graydon McKee (Ascension Risk Management and associated blog) and Dan Philpott (Fismapedia Mastermind and Guerilla-CISO Contributor) are going to be doing the core of the instruction, with a couple of others thrown in to round it all out.  I might stop by if I have the time.

What we promise:

  • An opportunity to hear NIST’s version of events and what they’re trying to accomplish
  • An opportunity to ask as many questions as you possibly can in 2 days
  • Good, well-put-together materials
  • An update on some of the recent security initiatives
  • An opportunity to commiserate with security folks from other agencies and contractors
  • No sales pitches and no products

See you all there!



Posted in FISMA, NIST, Speaking | No Comments »

The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009 by

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mitigating some human threats. I’m looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It doesn’t, and this is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately CAG subsequently develops by pairing this first valid premise with a set of false premises.  These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush’s Comprehensive National Cybersecurity Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart too: “The most effective security is not exposed to public criticism.” Since its very public December 11th announcement, no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn’t. An alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach by including some CAG controls into the 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1200 pages of guidance, 15 FIPS docs and the assorted Special Publications not related to FISMA as detriments to security. It’s like complaining that Wikipedia has too many articles to contribute to improved learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST’s extensive guidance, having that documentation greatly improves my ability to efficiently and effectively secure systems.
  5. NIST guidance doesn’t tell me how to secure my systems! NIST’s FISMA guidance doesn’t step you through securing your SQL Server. The Chairman of the Joint Chiefs also doesn’t deliver your milk. Why not? It’s not their job. NIST’s FISMA guidance helps you to assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum of controls are in place and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. Given the premise that all threats are attacks, this is nearly plausible. But not all security is technical. Not all threats come from the Internet. DHS, NIST, MITRE, and their partners have pursued automated security controls to enforce and audit security controls for years, but automated security controls can only go so far. Human errors, glitches, unexpected conflicts and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks to both improve its subset of security and improve audit compatibility. For technical controls this is somewhat achievable using automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable, while security needs to focus on evaluating the whole to ensure the necessary security results. An audit sees if antivirus software is installed; an evaluation sees if the antivirus software is effective.
  8. Metrics, but only these metrics over here. When selecting the current crop of CAG controls, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Only the metrics were cherry-picked. Instead of looking at all realized threats or real threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on, the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people, but penetration testing is to security what HUMINT is to intelligence services. Important players, expert practitioners, but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, “Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits-rather, security controls assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives.” NIST SP 800-37 specifically and repeatedly states, “Security accreditation packages can be submitted in either paper or electronic format.”

CAG is a missed opportunity. A lot of good could have been done by addressing the myriad problems with our current FISMA regime. The problems with guidance have many causes but can be addressed through cooperative development of best practices outside of NIST. The Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems exist and can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices it will likely need to go through a few iterations and lots of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.



Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »

Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  Asset information; it’s fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A data set that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool because it is.

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
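To make the “tools exchanging data” idea concrete, here’s a sketch of the kinds of identifiers the SCAP standards pass around for a single finding on a single asset.  The specific values are illustrative examples, not a real advisory or any vendor’s actual output format.

```python
# Illustrative SCAP-style identifiers for one finding on one asset.
# Values are examples only; the dictionary layout is made up for this post.
finding = {
    "cpe":  "cpe:/o:microsoft:windows_server_2003",   # platform, CPE-style URI
    "cve":  "CVE-2008-4250",                           # the publicly-known vulnerability
    "cce":  "CCE-0000-0",                              # related misconfiguration (placeholder ID)
    "cvss": "AV:N/AC:L/Au:N/C:C/I:C/A:C",              # CVSS v2 base vector for severity
    "xccdf_rule": "rule-disable-unneeded-services",    # made-up XCCDF rule identifier
}

print(finding["cve"], "on", finding["cpe"], "scored with", finding["cvss"])
```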

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =) ), there are roughly 3.5 million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by having your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.
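As a rough illustration of why the conversion is mostly legwork, here’s a sketch that turns a flat list of hardening checks into XCCDF-flavored XML.  The namespace and element layout are approximations from memory rather than a schema-validated benchmark, and the rules themselves are invented for the example.

```python
# Rough sketch: emit XCCDF-style rules from a flat list of hardening checks.
# The namespace and element names are assumptions, not a validated benchmark.
import xml.etree.ElementTree as ET

XCCDF_NS = "http://checklists.nist.gov/xccdf/1.1"   # assumed XCCDF 1.1 namespace

def to_xccdf(benchmark_id, checks):
    ET.register_namespace("", XCCDF_NS)
    bench = ET.Element(f"{{{XCCDF_NS}}}Benchmark", {"id": benchmark_id})
    for rule_id, title, description in checks:
        rule = ET.SubElement(bench, f"{{{XCCDF_NS}}}Rule",
                             {"id": rule_id, "severity": "medium"})
        ET.SubElement(rule, f"{{{XCCDF_NS}}}title").text = title
        ET.SubElement(rule, f"{{{XCCDF_NS}}}description").text = description
    return ET.tostring(bench, encoding="unicode")

print(to_xccdf("example-win2003-benchmark", [
    ("rule-disable-telnet", "Telnet service is disabled",
     "The Telnet service shall be disabled on all servers."),
]))
```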

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX–to be honest here, though, most of the vendors in this space cut their teeth on these standards; it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and Applications vendors: These guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the applications vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  There are some of them that have some cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except for people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputation by having the best vulnerability signatures.  In order to get true compatibility across products, standardized SCAP feeds mean that the pure-play security tools have fewer things to differentiate themselves from all the other tools, and they fall into a commodity market centered on the accuracy of their checks with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP, I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 deer on the NIST campus) all have ear tags.  It brings up all sorts of ideas for scientific studies.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.



Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »
