Random Thoughts on “The FISMA Challenge” in eHealthcare

Posted August 4th, 2009

OK, so there’s this article being bounced all over the place.  Basic synopsis is that FISMA is keeping the government from doing any kind of electronic health records stuff because FISMA requirements extend to health care providers and researchers when they take data from the Government.

Read one version of the story here

So the whole solution is that, well, we can’t extend FISMA to eHealthcare when the data is owned by the Government because that security management stuff gets in the way.  And this post is about why they’re wrong and right, but not in the places that they think they are.

Government agencies need to protect the data that they have by providing “adequate security”.  I’ve covered this a bazillion places already.  Somewhere along the line we let the definition of adequate security come to mean “You have to play by our rulebook”, which is complete and utter bunk.  The framework is an expedient and a level-setting experience across the government.  It’s not made to be one-size-fits-all, but is instead meant to be tailored to each individual case.

The Government Information Security Trickle-Down Effect is a name I use for FISMA/NIST Framework requirements being transferred from the Government to service providers, whether they’re in healthcare or IT or making screws that sometimes end up on B-2 bombers.  It will hit you if you take Government data, but only because you have no regulatory framework of your own with which you can demonstrate that you have “adequate security”.  In other words, if you provide a demonstrable level of data protection equal to or superior to what the Government provides, then you should reasonably be able to take the Government data; it’s a matter of finding the right “esperanto” to demonstrate your security foo.

If only there were a regulatory scheme already in place that we could use to hold the healthcare industry accountable.  Oh wait, there is: HIPAA.  Granted, HIPAA doesn’t really have a lot of teeth and its effects are debatable, but it does fill most of the legal requirement to provide “adequate security”, and that’s the important thing and, more importantly, what is required by FISMA.

And this is my problem with this whole string of articles:  The power vacuum has claimed eHealthcare.  Seriously, there should be somebody in charge of the data who can make a decision on what kinds of protections they want for it.  In this case, there are plenty of people with different ideas on what that level of protection is, so they are asking OMB for an official ruling.  If you go to OMB asking for their guidance on applying FISMA to eHealthcare records, you get what you deserve, which is a “Yes, it’s in scope, how do you think you should do this?”

So what the eHealthcare people really are looking for is a set of firm requirements from their handlers (aka OMB) on how to hold service providers accountable for the data that they are giving them.  This isn’t a binary question on whether FISMA applies to eHealthcare data (yes, it does), it’s a question of “how much is enough?” or even “what level of compensating controls do we need?”

But then again, we’re beaten down by compliance at this point.  I know I feel it from time to time.  After you’ve been beaten badly for years, all you want to do is for the batterer to tell you what you need to do so the hurting will stop.

So for the eHealthcare agencies, here is a solution for you.  In your agreements/contracts to provide data to the healthcare providers, require the following:

  • Provider shall produce annual statements of HIPAA compliance
  • Provider shall be certified under a security management program such as ISO 27001, SAS-70 Type II, or even PCI-DSS
  • Provider shall report any incident resulting in a potential data breach of 500 or more records within 24 hours
  • Financial penalties for data breaches based on number of records
  • Provider shall allow the Government to perform risk assessments of their data protection controls

That should be enough compensating controls to provide “adequate security” for your eHealthcare data.  You can even put a line through any of these that are too draconian or high-cost.  Take that to OMB, tell them how you’re doing it, and ask whether they would like to spend the taxpayers’ money doing anything other than this.

Case Files and Medical Records photo by benuski.



Posted in FISMA, Rants

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 2

Posted April 16th, 2009

Rybolov Note: this is part 2 in a series about S.773.  Go read the bill here. Go read part 1 here. Go read part 3 here. Go read part 4 here. Go read part 5 here. =)

SEC. 7. LICENSING AND CERTIFICATION OF CYBERSECURITY PROFESSIONALS. This section has received quite a bit of airtime around the blagosphere.  Everybody thinks that they’ll need some kind of license from the Federalies to run nessus.  Hey, maybe this is how it will all end up, but I think this provision will end up stillborn.

I know the NIST folks have been working on licensing and certification for some time, but they usually run into the same problems:

  • Do we certify individuals as cybersecurity professionals?
  • Do we certify organizations as cybersecurity service providers?
  • What can the Government do above and beyond what the industry provides? (ISC2, SANS, 27001, etc)
  • NIST does not want to be in the business of being a licensure board.

Well, these are my answers (I don’t claim that they are my opinions):

  • Compulsory: the Government can require certifications/licensure for certain job requirements.  Right now this is managed by HR departments.
  • Existing Precedent: We’ve been doing this for a couple of years with DoDI 8570.01M, which is mandatory for DoD contracts.  As much as I think industry certification is a pyramid scheme, I think this makes sense in Government contracting.  If the Government won’t pay for contractor training (and they shouldn’t) and the contractor won’t pay for employees to get training because their turnover rate is 50% in a year, a certification requirement is the only way to ensure some kind of training and professionalization of the staff.  Does this scale to the rest of the country?  I’m not sure.
  • Governance and Oversight: The security industry has too many different factions.  A Government-run certification and license scheme would provide some measure of uniformity.

Honestly, this section of the bill might make sense (it opens up a bigger debate) except for one thing:  we haven’t defined what “Cybersecurity Services” are.  Let’s face it, most of what we think are “security” services are really basic IT management services… why should you need a certification to be the goon on the change control board?  However, this does solve the “problem” of hackers who turn into “researchers” once they’re caught doing something illegal.  I just don’t see this as that big of a problem.

Verdict: Strange that this isn’t left up to industry to handle.  It smells like lobbying by somebody in ISC2 or SANS to generate a higher demand for certs.  Unless this section is properly scoped and extensively defined, it needs to die on the cutting room floor–it’s too costly for almost no value above what industry can provide.  If you want to provide the same effect with almost no cost to the taxpayers, consider something along the 8570.01 approach in which industry runs the certifications and specific certifications are required for certain job titles.

SEC. 8. REVIEW OF NTIA DOMAIN NAME CONTRACTS. Yes, there is a bunch of drama-llama-ing going on between NTIA, ICANN, Verisign, and a cast of thousands.  This section calls for a review of DNS contracts by the Cybersecurity Advisory Panel (remember them from section 3?) before they are approved.  Think managing the politics of DNS is hard now?  It just got harder–you ever try to get a handful of security people to agree on anything?  And yet, I’m convinced that either this needs to happen or NTIA needs to get some clueful security staffers who know how to manage contracts.

Verdict: DNSSEC is trendy thanks to Mr. Kaminsky.  I hate it when proposed legislation is trendy.  I think this provision could be axed from the bill if NTIA had the authority to review the security of their own contracts.  Maybe this could be a job for the Cybersecurity Advisor instead of the Advisory Panel?

SEC. 9. SECURE DOMAIN NAME ADDRESSING SYSTEM. OK, the Federal Government has officially endorsed DNSSEC thanks to some OMB mandates.  Now the rest of the country can play along.  Seriously, though, this bill has some scope problems, but basically what we’re saying is that Federal agencies and critical infrastructure will be required to implement DNSSEC.

Once again, though, we’re putting Commerce in charge of the DNSSEC strategy.  Commerce should only be on the hook for the standards (NIST) and the changes to the root servers (NTIA).  For the Federal agencies, this should be OMB in charge.  For “critical infrastructure”, I believe the most appropriate proponent agency is DHS because of their critical infrastructure mission.

And as for the rest of you, well, if you want to play with the Government or critical infrastructure (like the big telephone and network providers), it would behoove you to get with the DNSSEC program because you’re going to be dragged kicking and screaming into this one.  Isn’t the Great InfoSec Trickle-Down Effect awesome?

Verdict: If we want DNSSEC to happen, it will take an act of Congress because the industry by itself can’t get it done–too many competing interests.  Add more tasks to the agencies outside of Commerce here, and it might work.

Awesome Capitol photo by BlankBlankBlank.

SEC. 10. PROMOTING CYBERSECURITY AWARENESS. Interesting in that this is tasked to Commerce, meaning that the focus is on end-users and businesses.

In a highly unscientific, informal poll with a limited sample of security twits, I confirmed that nobody has ever heard of Dewie the Webwise Turtle.  Come on, guys, “Safe at any speed”, how could you forget that?  At any rate, this already exists in some form, it just has to be dusted off and get a cash infusion.

Verdict: Already exists, but so far efforts have been aimed at end-users.  The following populations need awareness: small-medium-sized businesses (SMBs), end-users, owners of critical infrastructure, technology companies, and software developers.  Half of these are populations that DHS deals with, and this provision completely ignores DHS’s role.

SEC. 11. FEDERAL CYBERSECURITY RESEARCH AND DEVELOPMENT. This section is awesome to read: it adds to the types of research that NSF can fund and extends funding for the existing types of research.  It’s pretty hard to poke holes in, and based on back-of-the-envelope analysis, there isn’t much missing by way of topics that need to be added to the research priorities.  What I would personally like to see is a better audit system not designed around the accounting profession’s way of doing things.  =)

Verdict: Keep this section intact.  If we don’t fund this, we will run into problems 10+ years out–some would say we’re already running into the limitations of our current technology.

SEC. 12. FEDERAL CYBER SCHOLARSHIP-FOR-SERVICE PROGRAM. This is an existing program, and it’s pretty good.  Basically you get a scholarship with a Government service commitment after graduation.  Think of it as ROTC-light scholarships without bullets and trips to SW Asia.

Verdict: This is already there.  This section of the bill most likely is in to get the program funded out to 2014.



Posted in NIST, Public Policy, What Doesn't Work, What Works

Inside the Obama Administration’s Cyber Security Agenda

Posted January 28th, 2009

Interesting article in Security Focus on President Obama and cybersecurity.  Yes, I complained on twitter because the “document on homeland security” is not really any kind of a solution, more like a bullet list of goals that sound suspiciously like a warmed-over campaign platform.

Guess what?  Every President does this, they put out their agenda for everyone to see.  With the last administration, it was the 5-point President’s Management Agenda.

Let’s be honest here, as Bubba the Infantryman would say, “There are only a couple of ways to suck an egg, and this egg has been around for a long time.”  Any cybersecurity strategy will harken back to the National Strategy to Secure Cyberspace because the problems are the same.  If you remember back to when the NStSC was first released, a horde of critics appeared out of the woodwork to say that there wasn’t enough implementation details and that the strategy wouldn’t be implemented because of that.  Well, they were partly right.

And now the President is stating his agenda with the same ideas that people have been saying for 6 years, in barely more detail, and suddenly it’s new and innovative.  That’s politics for you, folks.  =)  Bubba, in a rare fit of wisdom, would say “The way you can tell the true pioneers is that they have arrows sticking out of their backs”, and it might seem apropos here, if maybe a little bit cynical.

Hidden Agenda Eats Agenda photo by emme-dk.

Let’s go through each of the points with a little bit of analysis from myself:

  • Strengthen Federal Leadership on Cyber Security: Declare the cyber infrastructure a strategic asset and establish the position of national cyber advisor who will report directly to the president and will be responsible for coordinating federal agency efforts and development of national cyber policy.

  • Great idea.  Between OMB, NIST, DHS, DoD, DOJ, and a cast of thousands, there is a huge turf war over who really owns security.  Each of these groups does a phenomenal job doing what it does, but coordination between them is sometimes more like a semi-anarchist commune than a grand unified effort.  I seem to remember saying at one point that this was needed.  Granted, I was specifically talking about the internal side of the InfoSec Equities Issue, so the scope here is a little different.

    The Cyber Czar is currently buried deep down inside DHS with no real authority; a presidential advisor like the one in the agenda would report directly to the President.

  • Initiate a Safe Computing R&D Effort and Harden our Nation’s Cyber Infrastructure: Support an initiative to develop next-generation secure computers and networking for national security applications. Work with industry and academia to develop and deploy a new generation of secure hardware and software for our critical cyber infrastructure.

  • We have a very good R&D plan in place (.pdf caveat), it just needs to be adopted and better funded.  For those of you who need a project, this is like a wishlist on things that some very smart Government guys are willing to fund.

  • Protect the IT Infrastructure That Keeps America’s Economy Safe: Work with the private sector to establish tough new standards for cyber security and physical resilience.

  • Ouch, I cringe when I read this one.  Not that it isn’t needed: when it really comes down to it, every CISO in the US is dependent on the software and hardware vendors and their service providers.

    Something the world outside the Beltway doesn’t understand is that “standards” are roughly equal to “regulation”.  It’s much, much better if the Government goes to industry groups and says “hey, we want these things to be part of a standard, can you guys work to put it all together?”  There might be some regulation that is needed, but it should be kept as small as possible.  Where the Government can help is to sponsor some of the standards efforts and work alongside industry to define them.

    Maybe the best model for this is the age-old “you can lead a horse to water, demonstrate to the horse how to drink, hold the horse’s mouth in the water, and still not get it to drink.”  We’ve tried this model for a couple of years; what is needed now is some kind of incentive for the horse to drink and for vendors to secure their hardware, software, firmware, and service offerings.

  • Prevent Corporate Cyber-Espionage: Work with industry to develop the systems necessary to protect our nation’s trade secrets and our research and development. Innovations in software, engineering, pharmaceuticals and other fields are being stolen online from U.S. businesses at an alarming rate.

  • Maybe this gets down to political beliefs, but I don’t think it is the Government’s responsibility to prevent corporate cyber-espionage, nor should you as a company allow the Government to dictate how you harden your desktops or where you put your IDS.  If you are smart enough to be in one of these high-tech industries, you should be smart enough to keep your trade secrets from going offshore, or else you’ll die from some weird brand of corporate darwinism.

    Government can prosecute evildoers and coordinate with other countries for enforcement efforts, which is exactly what you would expect their level of involvement to be.

    Yes, in some cases when it’s cyber-espionage directed at the Government by hacking contractors or suppliers (the military-industrial complex), then Government can do something about it with trickle-down standards in contracts, and they usually do.  Think ITAR export controls scoped to a multi-national corporation and you have a pretty good idea of what the future will hold.

  • Develop a Cyber Crime Strategy to Minimize the Opportunities for Criminal Profit: Shut down the mechanisms used to transmit criminal profits by shutting down untraceable Internet payment schemes. Initiate a grant and training program to provide federal, state, and local law enforcement agencies the tools they need to detect and prosecute cyber crime.

  • This point is interesting to me.  We already have rules to flag large transactions or multiple transactions; that’s how Eliot Spitzer got caught.  Untraceable Internet payment schemes sound like pulp-fiction stuff and income-tax tracking to me; I would like to know if they really exist.

    On the other hand, law enforcement does need training.  There really is a shortage of people with the law enforcement and technical security backgrounds who can do investigations.

  • Mandate Standards for Securing Personal Data and Require Companies to Disclose Personal Information Data Breaches: Partner with industry and our citizens to secure personal data stored on government and private systems. Institute a common standard for securing such data across industries and protect the rights of individuals in the information age.

  • National data breach law == good, because it standardizes all of the state laws that are such a hodge-podge you need a full-time staff dedicated to breaking down incidents by jurisdiction.  We have something like this proposed, it’s S.459, which just needs to be resurrected and supported by the Executive Branch as part of their agenda.

    A common standard could be good as long as it’s done right (industry standards vs. Government regulation); see my comments above.

Note some key points I want you to take away:

Nothing is new under the sun.  These problems have been around a long time, and they won’t go away in the next 4 years.  We have to build on the work of the people who have come before us, because we know they’ve looked at the problem and came to the same conclusions we will eventually come to.

Partnership is emphasized.  This is because, as much lip service as we give to the Government solving our problems, the American Way (TM) is for the Government not to be your Internet Nanny.  Government can set the environment to support private information security efforts, but it really is up to the individual companies to protect themselves.

Industry needs to solve its own problems.  If you want the Government to solve the nation’s information security problems, it means that we take US-CERT and have them monitor everything whether you want them to or not.  Yes, that’s where things are heading, folks, and maybe I just spilled the beans on some uber-secret plan that I don’t know about yet.  Trust me, you don’t want the transparency that the Government watching your data would provide.

Be careful what you ask for.  You just might get it.  When it comes to IT security, be extra careful, because you’ll end up with regulation, which means more auditors.

Agenda Graffiti photo by anarchosyn.



Posted in Public Policy, Rants

Comments on SCAP 2008

Posted September 24th, 2008

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  Asset information; it’s fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A data set that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool because it is.
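Since CVSS shows up in that list, here’s a toy illustration of what “standardized scoring” actually buys you: the CVSS v2 base-score equations from the public spec, sketched in Python.  This is my own back-of-the-envelope reimplementation for illustration, not any official calculator:

```python
# Toy reimplementation of the CVSS v2 base-score equation from the
# public spec -- illustrative only, not an official tool.

# Metric weights straight out of the CVSS v2 specification
ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
IMPACT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

def cvss2_base(av, ac, au, c, i, a):
    """Compute the CVSS v2 base score from the six base metrics."""
    impact = 10.41 * (1 - (1 - IMPACT[c]) * (1 - IMPACT[i]) * (1 - IMPACT[a]))
    exploitability = 20 * ACCESS_VECTOR[av] * ACCESS_COMPLEXITY[ac] * AUTHENTICATION[au]
    f_impact = 0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# Remote, no auth, total compromise: the classic 10.0
print(cvss2_base("network", "low", "none", "complete", "complete", "complete"))
# Remote, no auth, partial C/I/A: the equally classic 7.5
print(cvss2_base("network", "low", "none", "partial", "partial", "partial"))
```

The point isn’t the arithmetic, it’s that every tool implementing this same equation produces the same number for the same vulnerability, which is exactly the level-setting that the rest of SCAP does for asset and configuration data.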

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
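To make that concrete, here’s a rough sketch of what machine-readable checklist content looks like: a made-up, drastically simplified XCCDF benchmark (element names are from the XCCDF 1.1 namespace; the rules and IDs are invented for illustration) plus a few lines of Python to pull the rules out of it:

```python
import xml.etree.ElementTree as ET

# Toy XCCDF fragment (XCCDF 1.1 namespace).  Real FDCC benchmarks are
# far larger, but they share this basic Benchmark/Rule shape.
XCCDF = """<?xml version="1.0"?>
<Benchmark xmlns="http://checklists.nist.gov/xccdf/1.1" id="toy-benchmark">
  <title>Toy Desktop Hardening Benchmark</title>
  <Rule id="rule-min-password-length" selected="true">
    <title>Minimum password length is 12</title>
  </Rule>
  <Rule id="rule-disable-guest" selected="true">
    <title>Guest account is disabled</title>
  </Rule>
</Benchmark>"""

NS = {"x": "http://checklists.nist.gov/xccdf/1.1"}

def list_rules(xccdf_text):
    """Return (rule id, title) pairs from an XCCDF benchmark."""
    root = ET.fromstring(xccdf_text)
    rules = []
    for rule in root.findall(".//x:Rule", NS):
        title = rule.find("x:title", NS)
        rules.append((rule.get("id"), title.text if title is not None else ""))
    return rules

for rule_id, title in list_rules(XCCDF):
    print(rule_id, "-", title)
```

Because the schema is standardized, any SCAP-aware scanner can consume the same benchmark and any SCAP-aware dashboard can consume the scanner’s results, which is the whole data-exchange story in a nutshell.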

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =) ), there are roughly 3.5 million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by having your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX–to be honest here, though, most of the vendors in this space cut their teeth on these standards; it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and applications vendors: these guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the applications vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  There are some of them that have some cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except for people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputation by having the best vulnerability signatures.  In order to get true compatibility across products, standardized SCAP feeds mean that the pure-play security tools will have fewer things to differentiate themselves from all the other tools, and they fall into a commodity market centered on the accuracy of their checks with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP, I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 deer on the NIST campus) all have ear tags.  It brings up all sorts of scientific studies ideas.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.



Posted in DISA, FISMA, NIST, Technical, What Works

Some Words From a FAR

Posted September 9th, 2008

FAR: it’s the Federal Acquisition Regulation, and it covers all the buying that the government does.  For contractors, the FAR is a big deal–violate it and you end up blackballed from Government contracts or having to pay back money to your customer, either of which is a very bad thing.

In early August, OMB issued Memo 08-22 (standard .pdf caveat blah blah blah) which gave some of the administrivia about how they want to manage FDCC–how to report it in your FISMA report, what is and isn’t a desktop, and a rough outline on how to validate your level of compliance.

Now I have mixed feelings about FDCC, you all should know that by now, but I think the Government actually did a decent thing here–they added FDCC (and any other NIST secure configuration checklists) to the FAR.

Check this section of M-08-22 out:

    On February 28, 2008, revised Part 39 of the Federal Acquisition Regulation (FAR) was published which reads:
    PART 39-ACQUISITION OF INFORMATION TECHNOLOGY
    1. The authority citation for 48 CFR part 39 continues to read as follows: Authority: 40 U.S.C. 121(c); 10 U.S.C. chapter 137; and 42 U.S.C. 2473(c).
    2. Amend section 39.101 by revising paragraph (d) to read as follows:
    39.101 Policy.
    * * * * *

    (d) In acquiring information technology, agencies shall include the appropriate IT security policies and requirements, including use of common security configurations available from the NIST’s website at http://checklists.nist.gov. Agency contracting officers should consult with the requiring official to ensure the appropriate standards are incorporated.

    Translated into English, what this means is that the NIST configurations checklists are coded into law for Government IT purchases.

    This carries a HUGE impact for both the Government and contractors.  For the Government, they just outsourced part of their security to Dell and HP, whether they know it or not.  For the desktop manufacturers, they just signed up to learn how FDCC works if they want some of the Government’s money.

    Remember back in the halcyon days of FDCC when I predicted that one of the critical keys to success for FDCC was being able to buy OEM desktops with the FDCC image preloaded?  It’s slowly becoming a reality.

    Oh what’s that, you don’t sell desktops?  Well, this applies to all NIST configuration checklists, so as NIST adds to the intellectual property in the checklists program, you get to play too.  Looking at the DISA STIGs as a model, you might end up with a checklist for literally everything.

    So as somebody who has no relation to the US Federal Government, you must be asking by now how you can ride the FDCC wave?  Here’s Rybolov’s plan for secure desktop world domination:

    • Wait for the government to attain 60-80% FDCC implementation
    • Wait for desktops to have an FDCC option for installed OS
    • Review your core applications on the FDCC compatibility list
    • Adopt FDCC as your desktop hardening standard
    • Buy your desktop hardware with the image pre-loaded
    • The FDCC configuration rolls uphill to be the default OS that they sell
    • ?????
    • Profit!
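The "review your core applications" step is the one most shops will actually have to script. Here's a minimal sketch of that pre-adoption check; the application inventory and the compatibility list below are entirely made up for illustration, not real FDCC data:

```python
# Hypothetical pre-adoption check: compare a core-application inventory
# against an FDCC compatibility list (both lists invented for illustration).
core_apps = ["Office 2007", "AcmeERP Client", "LegacyTimecard"]

# Apps known (hypothetically) to run cleanly under FDCC settings.
fdcc_compatible = {"Office 2007", "AcmeERP Client"}

# Anything not on the list needs remediation or a documented deviation
# before you adopt FDCC as your desktop hardening standard.
blockers = [app for app in core_apps if app not in fdcc_compatible]
print("Blockers:", blockers)
```

The point of the sketch: adoption is gated on the blockers list going to zero, not on the image being available.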

    And the Government security trickle-down effect keeps rolling on….

    Cynically, you could say that the OMB memos as of late (FDCC, DNSSEC) are very well coached and that OMB doesn’t know anything about IT, much less IT security.  You probably would be right, but seriously, OMB doesn’t get paid to know IT, they get paid to manage and budget, and in this case I see some sound public policy by asking the people who do know what they’re talking about.

    While we have our cynical hats on, we might as well give a nod to those FISMA naysayers who have been complaining for years that the law wasn’t technical/specific enough.  Now we have very static checklists, and the power to decide what a secure configuration should be has been taken out of the hands of the techies who would know and given to research and bureaucratic organizations with no vested interest in making your gear work.

    Lighthouse From AFAR photo by Kamoteus.



    Posted in FISMA, NIST, What Doesn't Work, What Works | 8 Comments »

    More GAO Testimony

    Posted March 14th, 2008 by

    GAO has delivered an updated version of the testimony from February 14th that I talked about here. I’m not going to rehash what I’ve already said, but I want to focus your attention on something I didn’t talk about then: incident statistics.

    According to GAO, the number of incidents that were reported to US-CERT increased 259% from 3634 in FY2005 to 13029 in FY2007 (*cue shock and awe*; to GAO’s credit, the back-of-the-envelope math on a total increase actually checks out: (13029 − 3634)/3634 ≈ 259%). OK, so the number is increasing. But there are several failures in GAO’s logic here that need to be pointed out:
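For the record, here's the back-of-the-envelope arithmetic on those two data points, including what the equivalent average annual growth rate would be:

```python
# BOTE check on GAO's incident numbers: 3634 in FY2005, 13029 in FY2007.
fy2005, fy2007 = 3634, 13029

# Total percentage increase over the two-year span.
total_pct = (fy2007 - fy2005) / fy2005 * 100  # ~258.5%, i.e. the quoted 259%

# Equivalent average annual growth rate, compounded over two years.
annual_pct = ((fy2007 / fy2005) ** 0.5 - 1) * 100  # ~89% per year

print(f"total increase: {total_pct:.1f}%, avg annual: {annual_pct:.1f}%")
```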

    “The need for effective information security policies and practices is further illustrated by the number of security incidents experienced by federal agencies that put sensitive information at risk.”

    In other words, they’re trying to indirectly draw a conclusion that the high number of incidents is directly proportional to their audit findings. While this may be true in some (most?) ways, it’s also a bad comparison in other ways: you would expect the number of incidents to go down over two years as the number of implemented, tested, and integrated security controls goes up.

    So really, what’s the dealio?

    The first thing that I would like to point out is that security policies and practices have an indirect impact on security incidents. You don’t have a solid one-for-one comparison that you can use, so I think GAO is doing itself an injustice by trying to correlate these two things. You can use incident counts as a holistic metric for measuring how well your information security program is doing, but overall it’s a very coarse method.

    The second thing that I need to point out is the trend of the incident number itself. Anybody who starts tracking incident metrics has to ask themselves one question: because we’re now tracking the number of incidents, does it mean that we’ll now notice that there are more incidents simply due to the fact that we’re now measuring them? It’s the incident response equivalent to Schrödinger’s cat and the Measurement Problem. =)

    There are a couple of reasons that the incident count has increased 259% in just two years:

    • First is the awareness of incidents. Government-wide, 2 things have happened in these 2 years that should have increased the number of reportable incidents: maturity of US-CERT to receive and categorize larger amounts of incident data; and the maturity of agencies to have their own incident response and reporting procedures. In short: the infrastructure to respond and report now exists where it really didn’t 3 years ago.
    • A series of high-profile incidents around PII was followed by OMB mandating that all incidents related to PII be reported to US-CERT within one hour. As a result, anything that might possibly be an incident involving PII now gets reported, because it’s the career-safe move: “When in doubt, report it up”. Whether they admit it or not, the people out in the agencies are now what we could call “gun shy” about PII incidents, and that increases the number of reported incidents.
    • The criteria for an incident are very broad and include “improper usage”, “scans/probes/attempted access”, and “investigations”, the last of which is defined as “Unconfirmed incidents that are potentially malicious or anomalous activity deemed by the reporting entity to warrant further review”.

    If this were a SIEM or IDS, I would say that we’re flagging on too many things and need to tune our systems down a little bit. Keep in mind that it’s the nature of Government to underreport (when they’re not required to report) and overreport (when they are required to report).
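To make the "tune it down" point concrete, here's a sketch of what re-scoping the aggregate count to confirmed, impactful categories would look like. The category names loosely follow the US-CERT reporting buckets mentioned above, but the per-category counts are invented (they're only contrived to sum to the reported 13029):

```python
# Invented breakdown of the FY2007 count by US-CERT-style reporting category.
reported = {
    "unauthorized access": 2100,
    "malicious code": 1800,
    "denial of service": 200,
    "improper usage": 3200,
    "scans/probes/attempted access": 4200,
    "investigations": 1529,  # unconfirmed activity "warranting further review"
}

# The broad buckets that inflate the aggregate the way an untuned IDS does.
noisy = {"improper usage", "scans/probes/attempted access", "investigations"}

total = sum(reported.values())
tuned = sum(n for cat, n in reported.items() if cat not in noisy)
print(f"raw count: {total}, after 'tuning': {tuned}")
```

Under this (made-up) split, most of the headline number comes from the broad buckets, which is exactly the tuning problem.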

    You still need to track the aggregate number of incidents reported to US-CERT and in theory this number should trend downward as we get better at governance at the national level as sort of a “trickle-down infosec economy”. Keep in mind that this number should peak within 5-10 years and then slowly be reduced as we fine-tune our reporting criteria and as we get better at securing information. Of course, I won’t be surprised if it doesn’t due to the threat environment, but that’s a conversation for another day.

    However, what I propose is the middle-ground on incident reporting: what we really need to pay attention to for the next couple of years is the number of “severe” incidents. Those are the incidents that actually have an impact we really care about. These are mentioned in the GAO report, and we should all be able to recall a handful of them without even seeing what GAO had to say.

    Knowing this town, I propose we use “Rybolov’s Washington Post Metric”: how many security incidents were significant enough to be deemed “newsworthy” by the Washington Post and mentioned somewhere. For fine-tuning, you could weight, say, the daily front page vs. the Sunday supplement technology section.
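A toy implementation of the metric, with the placement weighting included; the placements and weights here are invented for illustration, not anything the Post publishes:

```python
# Toy scoring for "Rybolov's Washington Post Metric": count incidents
# deemed newsworthy, weighted by where the story ran (weights invented).
weights = {"daily front page": 3, "sunday tech supplement": 1}

# A hypothetical year's worth of WaPo-covered security incidents.
year_of_coverage = [
    "daily front page",
    "sunday tech supplement",
    "sunday tech supplement",
]

score = sum(weights[placement] for placement in year_of_coverage)
print("WaPo metric for the year:", score)  # lower is better
```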

    My parting shot for the FISMA-haters:  in the years of yore before FISMA (or GISRA if you want to go back that far), how many of these incidents would have been reported?  It seems like we’re failing if you take the numbers and the reports at face-value, but as GAO says in their title:  “Progress Reported, but Weaknesses at Federal Agencies Persist”.  What more do you need to know?



    Posted in FISMA | 3 Comments »
