NIST Framework for FISMA Dates Announced

Posted April 10th, 2009 by

Some of my friends (and maybe myself) will be teaching the NIST Framework for FISMA in May and June with Potomac Forum.   This really is an awesome program.  Some highlights:

  • Attendance is limited to Government employees, so you can talk openly with your peers.
  • Be part of a cohort that trains together over the course of a month.
  • The course is 5 Fridays, so you can learn something and then take it back to work the next week.
  • We have a Government speaker every week, from the NIST FISMA guys to agency CISOs and CIOs.
  • No pitching, no marketing, no product placement (OK, maybe we’ll go through DoJ’s CSAM, but only as an example of what kinds of tools are out there), no BS.

See you all there!



Posted in NIST, Speaking | 1 Comment »

In Response to “Cyber Security Coming to a Boil” Comments….

Posted March 24th, 2009 by

Rybolov’s comment: This is Ian’s response to the comments on his post Cybersecurity Coming to a Boil.  It was such a good dialog that he wanted to make a large comment, which, as we all know, eventually transforms itself into a blog post.  =)

You are making some excellent points; putting the leadership of the Administration’s new Cyber security initiative directly in the White House might appear to be a temporary solution or a quick fix. From my point of view, it looks more like an honest approach. By that I mean that I think the Administration is acknowledging a few things:

  • This is a significant problem
  • There is no coherent approach across the government
  • There is no clear leadership or authority to act on the issue across the government
  • Because of the perception that a large budget commitment will have to be allocated to any effective solution, many Agencies are claiming leadership or competing for leadership to scoop up those resources
  • The Administration does not know what the specific solution they are proposing is — YET

I think this last point is the most important and is driving the 60-day security assessment. I also think that assessment is much more complex than a simple review of FISMA scores for the past few years. I suspect that the 60-day review is also considering things like legal mandates and authorities for various aspects of Cyber security on a National level. If that is the case, I’m not familiar with a similar review ever having taken place.

2004 World Cyber Games photo by jurvetson.  Contrary to what the LiquidMatrix Security folks might think, the purpose of this post isn’t to jam “cyber” into every 5th word.  =)

So, where does this take us? Well, I think we will see the Cyber Security Czar propose a unified policy, a unified approach, and probably some basic enabling legislation. I suspect that this will mean that the Czar will have direct control over existing programs and resources. I think the Cyber Security Czar taking control of Cyber Security-related research programs will be one of the most visible first steps toward establishing central control.

From this we will see new organizational and reporting authorities that will span existing Agencies. I think we can also anticipate new policies and a new set of guidelines mandating a minimum level of security capability for all Agency networks (raising bottom-line security). This last point will probably prove to be the most trying or contentious effort within the existing Agency structure. It is not clear how existing Agencies that are clearly underfunding or under-supporting Cyber Security will be assessed. It is even less clear where remedial funding or personnel positions will come from. And the stickiest point of all is…. how do you reform the leadership and policy in those Agencies to positively change their security culture? I noticed that someone used the C-word in response to my initial comments. This goes way beyond compliance. In the case of some Federal Agencies, and perhaps some industries, we may be talking about a complete sea change with respect to the emphasis and priority given to Cyber Security.

These are all difficult issues. And I believe the Administration will address them one step at a time.
In the long term it is less clear how Cyber Security will be managed. The so-called war on drugs has been managed by a central authority directly from the White House for decades. And to be sure, putting together a working national system that protects our Government and critical national infrastructure from Cyber attack will probably take a similar level of effort and perhaps require a similar long-term commitment. Let’s just hope that it is better thought-out and more effective than the so-called war on drugs.

Vlad’s point concerning the Intelligence Community taking the lead with respect to Cyber Security is an interesting one. I think the Intelligence Community will be an important player in this new initiative. To be frank, between the Defense and Intelligence Communities there is considerable technical expertise that will be sorely needed. However, for legal reasons, there are real limits as to what the Intelligence and Defense Communities can do in many situations. This is a problem parallel to treating Cyber Security as a Law Enforcement problem. The “solution” will clearly involve a variety of players, each with their own expertise and authorities. And while I am not anticipating that Tom Clancy will be appointed the Cyber Security Czar any time soon, I do expect that a long-term approach will require the stand-up of either a new organization empowered to act across current legal boundaries (that will require new legislation), or a new coordinating organization like the Counter Terrorism Center that will allow all of the current players to bring their individual strengths and authorities to focus on a situation on a case-by-case basis as they are needed (that may require new legislation).

If you press me, I think a joint coordinating body will be the preferred choice of the Administration. Everyone likes the idea of everyone working and playing well together. And that option also sounds a lot less expensive, which is important in today’s economic climate.



Posted in FISMA, Public Policy, Technical | 2 Comments »

Et Tu, TIC?

Posted October 7th, 2008 by

Let’s talk about TIC today, dear readers, for I smell a conspiracy theory brewing.

For those of you who missed the quick brief, TIC is short for “Trusted Internet Connections” and is an architecture model/mandate/$foo to take all of the Internet connections in the Government (srsly, nobody knows how many of them really exist, but it’s somewhere in the 2,000-10,000 range) and consolidate them into 50.  These connections will then be monitored by DHS’s Einstein program.

No, Not That Kind of TIC photo by m.prinke.

Bringing you all up to date, you’ll need to do some homework:

Now having read all of this, some things become fairly obvious:

  • If you have the following people needing connections:
    • 24 agencies, plus
    • DoD with 2 points of presence, plus
    • the Intelligence agencies with a handful of Internet connections,
  • then basically everybody gets one Internet connection.  This is not good; it’s all single point-of-DoS.
  • Agencies have been designated as Internet providers for other agencies.  Sounds like LoB in action.
  • Given the amount of traffic going through the TIC access points, it most likely is going to take a significant amount of hardware to monitor all these connections–maybe you saved 50% of the monitoring hardware by reducing the footprint, but it’s still hardware-intensive.
  • TIC is closely tied with the Networx contract.
  • In order to share Internet connections, there needs to be a network core between all of the agencies so that an agency without a TIC access point can route through multiple TIC service provider agencies.

And this is where my conspiracy theory comes in:  TIC is more about building a grand unified Government network than it is about monitoring events–Einstein is just an intermediate goal.  If you think about it, this is where the Government is headed.

We were headed this way back in ought-two with a wonderful name: GovNet.  To be honest, the groundwork wasn’t there and the idea was way ahead of its time and died a horrible death, but it’s gradually starting to happen, thanks to TIC, FDCC, and Einstein. 

More fun links:

If you want to get a reaction out of the OMB folks, mention GovNet and watch them backpedal and cringe–I think the pain factor was very high for them on GovNet. So I think that we should, as a cadre of information security folks, start calling TIC what it really is:  GovNet 2.0!  =)



Posted in Technical | 2 Comments »

Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  Asset information, it’s fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.  (There’s a sketch of the base-score arithmetic just after this list.)
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A data set that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool because it is.
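Since CVSS is the one part of SCAP that is pure arithmetic, here is a minimal sketch of the CVSS v2 base-score equation in Python.  This is my own illustration from the public CVSS v2 spec, not anything official from NIST or the conference:

```python
# Minimal sketch of the CVSS v2 base-score equation (my own illustration
# from the public CVSS v2 spec, not from the SCAP 2008 material).
AV = {"L": 0.395, "A": 0.646, "N": 1.0}      # Access Vector
AC = {"H": 0.35, "M": 0.61, "L": 0.71}       # Access Complexity
AU = {"M": 0.45, "S": 0.56, "N": 0.704}      # Authentication
CIA = {"N": 0.0, "P": 0.275, "C": 0.660}     # Conf/Integ/Avail impact

def cvss2_base(av, ac, au, c, i, a):
    """Return the CVSS v2 base score for the given metric values."""
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f_impact = 0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

# The classic remote-root vector AV:N/AC:L/Au:N/C:C/I:C/A:C scores 10.0
print(cvss2_base("N", "L", "N", "C", "C", "C"))
```

Feeding it the classic remote-root vector and getting 10.0 back is the sanity check I would run before trusting any tool’s scoring.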

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
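That hand-off is mostly XML plumbing.  Here’s a rough sketch of the last step, rolling a scanner’s OVAL results up into the kind of summary a CISO dashboard would ingest.  The element and attribute names follow my recollection of the OVAL 5.x results schema, so treat them as assumptions rather than gospel, and the filename is made up:

```python
# Rough sketch: summarize an OVAL results file exported by a compliance
# scanner.  Element/attribute names are assumed from the OVAL 5.x results
# schema; adjust to whatever your tool actually emits.
import xml.etree.ElementTree as ET
from collections import Counter

OVAL_RES = "{http://oval.mitre.org/XMLSchema/oval-results-5}"  # assumed namespace

def summarize_oval_results(path):
    """Count results per OVAL definition and list the ones that failed."""
    tree = ET.parse(path)
    tally = Counter()
    failed = []
    for definition in tree.iter(OVAL_RES + "definition"):
        result = definition.get("result", "unknown")
        tally[result] += 1
        if result == "false":          # "false" means the check did not pass
            failed.append(definition.get("definition_id"))
    return tally, failed

if __name__ == "__main__":
    tally, failed = summarize_oval_results("win2003-scan-results.xml")  # hypothetical file
    print("Result breakdown:", dict(tally))
    print("Failed checks:", failed)
```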

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =)   ), there are roughly 3.5 Million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by having your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.
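To show why I call the conversion trivial, here’s a rough sketch of emitting one hardening-guide check as an XCCDF rule with nothing fancier than ElementTree.  The namespace URI, element names, and identifiers follow my recollection of XCCDF 1.1 and are assumptions for illustration, not validated content:

```python
# Rough sketch of turning one hardening-guide check into an XCCDF rule.
# Namespace URI, element names, and IDs are assumed (XCCDF 1.1-ish), not
# validated against the schema.
import xml.etree.ElementTree as ET

XCCDF = "http://checklists.nist.gov/xccdf/1.1"   # assumed XCCDF 1.1 namespace
ET.register_namespace("", XCCDF)

benchmark = ET.Element("{%s}Benchmark" % XCCDF, {"id": "sample-win2003-benchmark"})
rule = ET.SubElement(benchmark, "{%s}Rule" % XCCDF,
                     {"id": "rule-min-password-length",
                      "severity": "medium", "selected": "true"})
ET.SubElement(rule, "{%s}title" % XCCDF).text = "Minimum password length is at least 12"
ET.SubElement(rule, "{%s}description" % XCCDF).text = (
    "Lifted from a prose hardening guide; the actual test logic lives in OVAL.")
check = ET.SubElement(rule, "{%s}check" % XCCDF,
                      {"system": "http://oval.mitre.org/XMLSchema/oval-definitions-5"})
ET.SubElement(check, "{%s}check-content-ref" % XCCDF,
              {"href": "sample-oval.xml", "name": "oval:example.sample:def:1"})

print(ET.tostring(benchmark, encoding="unicode"))
```

The point isn’t the particular elements; it’s that a STIG or CIS Benchmark is already a list of discrete checks, so the conversion is mostly a loop over that list.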

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX–to be honest here, though, most of the vendors in this space cut their teeth on these standards, so it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and application vendors: These guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the application vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  There are some of them that have some cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except for people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputation by having the best vulnerability signatures.  With standardized SCAP feeds providing true compatibility across products, the pure-play security tools will have fewer ways to differentiate themselves, and they fall into a commodity market centered on the accuracy of their checks, with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP; I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 on the NIST campus) all have ear tags.  It brings up all sorts of ideas for scientific studies.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.



Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »

Current Government Security Initiatives

Posted May 5th, 2008 by

In building slides for our ongoing NIST Framework for FISMA class, I put together a deck of the current Government security initiatives.  It’s plenty of stuff to keep you busy.

“Government Security System” Photo by Kahala

These are some of the more interesting initiatives and a brief description of them:

President’s Management Agenda Scorecard:  This is a quarterly red-yellow-green (hmm, wonder why nobody but the military uses black-red-yellow-green) scorecard on the various aspects of the agenda.  Security is represented in some of the values behind the E-Government score.  More specifically, OMB calls out the following in their FISMA report to Congress:

To “get to green” under the E-Government scorecard, agencies must meet the following 3 security criteria:

  • IG or Agency Head verifies effectiveness of the Department-wide IT security remediation process. (rybolov: Plans of Actions and Milestones)
  • IG or Agency Head rates the agency C&A process as “Satisfactory” or better.
  • The agency has 90 percent of all IT systems properly secured (certified and accredited). (rybolov: C&A does not always equate to “secured”, but is an indicator)

In order to “maintain green,” by July 1, 2008, agencies must meet the following security and privacy criteria:

  1. All systems certified and accredited. (rybolov: same C&A caveat as before)
  2. Systems installed and maintained in accordance with security configurations. (rybolov: lots of wiggle room here since it’s the agency’s standard except for the Federal Desktop Core Configuration)
  3. Has demonstrated that for 90 percent of applicable systems a PIA has been conducted and is publicly posted. (rybolov:  PIA is a Privacy Impact Assessment.  It gets posted in the Federal Register as a public notification of what the Government is collecting and what the use is)
  4. Has demonstrated that for 90 percent of systems with PII contained in a system of records covered by the Privacy Act, a current SORN has been developed, published, and maintained. (rybolov: System of Records Notice, this is what is filed with the Federal Register)
  5. Has an agreed-upon plan to meet communication requirements for COOP and COG. (rybolov: Continuity of Operations and Continuity of Government)

You can view the current scorecard and learn more about it at results.gov.
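For the arithmetic-minded, the three “get to green” security criteria boil down to two yes/no answers from the IG and a 90-percent threshold.  A toy sketch follows; the data structure and field names are mine for illustration, not OMB’s:

```python
# Toy illustration of the "get to green" E-Government security criteria.
# The dataclass and field names are my own shorthand, not an official
# OMB scoring tool.
from dataclasses import dataclass

@dataclass
class AgencySecurityStatus:
    remediation_process_verified: bool   # IG/Agency Head verifies the POA&M process
    ca_process_satisfactory: bool        # IG/Agency Head rates C&A "Satisfactory" or better
    systems_total: int
    systems_accredited: int              # systems certified and accredited

def gets_to_green(status: AgencySecurityStatus) -> bool:
    """True if the agency meets all three security criteria."""
    pct_accredited = status.systems_accredited / max(status.systems_total, 1)
    return (status.remediation_process_verified
            and status.ca_process_satisfactory
            and pct_accredited >= 0.90)

# 185 of 200 systems accredited is 92.5%, so this agency passes.
print(gets_to_green(AgencySecurityStatus(True, True, 200, 185)))
```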

OMB Management Watch List:  This is a list of “at-risk” projects.  Security is one part of the list of risks, but for the most part this is a list of high-risk projects within the context of a program/project manager.  The security criteria for being on the Watch List are based on IG assessments of:

  • Certification and Accreditation
  • Plan of Actions and Milestones
  • Privacy Impact Assessment

 You can check out the most recent Watch List at OMB’s website.

Combined Catalog of Controls:  Superseding DoDI 8500.2 (the DoD catalog of controls) and DCID 6/3 (the intelligence community catalog of controls) with a reinforced SP 800-53.  The process flow would follow SP 800-37.  I’ve talked about this before.

Security Line of Business:  Agencies become subject-matter experts in an area and become a contractor to the other agencies.  Not a new concept, we’ve seen it elsewhere.

Privacy Management:  OMB Memo 07-16 lays out a privacy plan containing the following tenets:

  • Breach Notification:  Requires each agency to have a breach notification policy
  • SSN Reduction:  Each agency reduces the use of Social Security Numbers where not needed
  • PII Reduction:  Restrict the collection of PII where not needed
  • Rules of Behavior:  Rules for employees to follow when they deal with PII

SCAP and FDCC:  I’ve covered these in much detail. 

Trusted Internet Connections: This is a plan to reduce the number of Government Internet connections to 50.  Even the most ardent OMB supporters have to agree that this is a fairly arbitrary number, not achievable in the next several years, and not even really a good idea.  You heard it here first, folks, but conventional wisdom says that 500 is a better, more realistic number for the time being, and that is the “real” number that OMB is considering.  The start of this is OMB Memo 08-05.

Einstein:  Basically a Government-wide IDS and SIEM run by US-CERT.  It’s offered under the Security Line of Business.  The good thing about Einstein is that it allows DHS to correlate events government-wide.

Air Force Cyber Command:  It’s provisional now, doesn’t have a permanent headquarters, and is trying to figure out what its mission is, but it’s here.  Gossip around town is that it’s focused on both defensive and offensive missions, although the pictures are all defensive-based.  There’s some information on their website, but be sure to read between the lines.  =)

Cyber Corps:  Scholarship program for college students (both post-grad and undergrad) with a public service obligation following graduation.  You can find out more here.

SmartBuy:  A GSA-run program to bulk-purchase commercial off-the-shelf software at a high-volume discount.  Think of it as a buyer’s club for software.  SmartBuy has disk-encryption software.  You can get more information on the GSA website.



Posted in FISMA | 2 Comments »

