Some Comments on SP 800-39

Posted April 6th, 2011

You should have seen Special Publication 800-39 (PDF file; also check out more info on Fismapedia.org) out by now.  Dan Philpott and I just taught a class on understanding the document and how it affects security managers out there doing their jobs on a daily basis.  While the information is still fresh in my head, I thought I would jot down some notes that might help everybody else.

The Good:

NIST is doing some good stuff here, trying to get IT Security and Information Assurance out of the “It’s the CISO’s problem, I have effectively outsourced any responsibility through the org chart” mindset and into more of what DoD calls “mission assurance”.  That is, how do we go from point-in-time vulnerabilities (things that can be scored with CVSS or tested through Security Test and Evaluation) to briefing executives on the risk to their organization (Department, Agency, or even business) coming from IT security problems?  It lays out an organization-wide risk management process and a framework (layer cakes within layer cakes) to share information up and down the organizational stack.  This is very good, and getting the mission/business/data/program owners to recognize their responsibilities is an awesome thing.

The Bad:

SP 800-39 is good in philosophy, with its general theme of having the non-IT “business owners” take ownership of risk, but when it comes to specifics, it raises more questions than it answers.  For instance, it defines a function known as the Risk Executive.  As practiced today by people who “get stuff done”, the Risk Executive is like a board of the Business Unit owners (possibly as the Authorizing Officials), the CISO, and maybe a Chief Risk Officer or other senior executives.  But without that context, and without asking around to find out what people are doing to get executive buy-in, the Risk Executive seems like a non sequitur.  There are other things like that, but I think the best summary is “Wow, this is great, now how do I take this guidance and execute a plan based on it?”

The Ugly:

I have a pretty simple yardstick for evaluating any kind of standard or guideline: will this be something that my auditor will understand, and will it help them help me?  With 800-39, I think it is written so abstractly that most auditor-folk would have a hard time translating it into something they could audit against.  This is both a blessing and a curse, and the huge recommendation that I have is that you brief your auditor beforehand on what 800-39 means to them and how you’re going to incorporate the guidance.



Posted in FISMA, NIST, Risk Management, What Works | 5 Comments »

Google Advanced Operators and Government Website Leakage

Posted August 24th, 2010

Ah yes, the magic of Google hacking and advanced operators.  All the “infosec cool kids” have been having a blast this week using a combination of filetype and site operators to look for classification markings in documents. I figure that with the WikiLeaks brouhaha lately, it might be a good idea to write a “howto” for government organizations to check for web leaks.

Now for the search string: “enter document marking here” site:agency.gov filetype:rtf | filetype:ppt | filetype:pptx | filetype:csv | filetype:xls | filetype:xlsx | filetype:docx | filetype:doc | filetype:pdf.  This looks through the typical document formats on the agency.gov website for a specific caveat.  You could easily put in a key phrase used for marking sensitive documents in your agency.  Obviously there will be results from published organizational policy describing how to mark documents, but there will also be other things that should be looked at.

Typical document markings (all you have to do is pick out the key phrases from your agency policy that give the verbatim disclaimer to put on docs):

  • “This document contains sensitive security information”
  • “Disclosure is prohibited”
  • “This document contains confidential information”
  • “Not for release”
  • “No part of this document may be released”
  • “Unauthorized release may result in civil penalty or other action”
  • Any one of a thousand other key words listed on Wikipedia

Other ideas:

  • Use the “site:gov” operator to look for documents government-wide.
  • Drop the “site” operator altogether and look for agency information that has been published on the web by third parties.
  • Chain the markings together with an “or” for one long search string: “not for release” | “no part of this document may be released” site:gov filetype:rtf | filetype:ppt | filetype:pptx | filetype:csv | filetype:xls | filetype:xlsx | filetype:docx | filetype:doc | filetype:pdf

If you’re not doing this already, I recommend setting up a weekly or daily search for documents that have been indexed and following up on any hits as an incident.
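
If you want to script the query generation, here’s a minimal sketch in Python.  Everything in it is an example: the markings are pulled from the list above, and agency.gov is a placeholder for your own domain.  It just prints one search string per marking, which you can paste into Google or feed to whatever scheduled search job you use.

```python
# Minimal sketch: generate one Google search string per document marking.
# MARKINGS and the agency.gov domain are examples; substitute the verbatim
# phrases and the domain from your own agency policy.

MARKINGS = [
    "This document contains sensitive security information",
    "Not for release",
    "No part of this document may be released",
]

FILETYPES = ["rtf", "ppt", "pptx", "csv", "xls", "xlsx", "docx", "doc", "pdf"]

def build_query(marking: str, site: str = "agency.gov") -> str:
    """Quoted marking, site restriction, and the chained filetype clause."""
    filetype_clause = " | ".join(f"filetype:{ft}" for ft in FILETYPES)
    return f'"{marking}" site:{site} {filetype_clause}'

if __name__ == "__main__":
    # Paste each printed line into Google, or feed it to a scheduled job.
    for marking in MARKINGS:
        print(build_query(marking))
```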



Posted in Hack the Planet, Technical, What Works | 2 Comments »

Privacy Camp DC on June 20th

Posted June 11th, 2009

Saturday, June 20, 2009 from 8:00 AM – 5:00 PM (ET) in downtown DC.

I’ll be going.  This will be a “Bar Camp Stylie” event, where you’re not just an attendee, you’re also a volunteer to make it all happen.  You might end up running a conversation on your favorite privacy topic, so you have been warned. =)

*Most* of the folks going are of a civil libertarian slant.  With my background and where I work, I usually “bat for the other team” on this issue.  The organizers have assured me that I’ll be welcome and can play the heretic role.

Some themes that I’ve seen develop so far:

  • How some concepts from the Privacy Act (such as the System of Records) are outdated or at least showing their age
  • How the open government “movement” and the push for raw data mean we need to look at the privacy concerns
  • FOIA and privacy data
  • Ending the political robocalls

See Y’all there!



Posted in Public Policy, Speaking | No Comments »

Ed Bellis’s Little SCAP Project

Posted March 19th, 2009

Way back in the halcyon days of 2008, Dan Philpott, Chris Burton, Ian Charters, and I went to the NIST SCAP Conference.  Just by a strange coincidence, Ed Bellis threw out a tweet along the lines of “wow, I wish there was a way to import and export all this vulnerability data” and I replied back with “Um, you mean like SCAP?”

Fast forward six months.  Ed Bellis has been busy.  He delivered a presentation at SnowFROC 2009 in Denver.

So some ideas I have about what Ed is doing:

#1 This vulnerability correlation and automation should be part of vulnerability assessment (VA) products.  In fact, most VA products include some kind of ticketing and workflow nowadays if you get the “enterprise edition”. That’s nice, but…

#2 The VA industry is a broken market when it comes to workflow compatibility.  Everybody wants to sell you *their* product to be the authoritative manager. That’s cool and all, but what I really need is the connectors to your competitors’ products so that I can have one database of vulnerabilities, one set of charts to show my auditors, and one trouble ticket system. SCAP helps here, but only for static, bulk data transfers, and that gets ugly really quickly.

#3 Ed’s correlation and automation software is a perfect community project because it’s a conflict of interest for any VA vendor to write it themselves. And to be honest, I wouldn’t be surprised if people admit to a dozen skunkworks projects just in the comments section of this post. I remember, 5 years ago, trying to hack together some perl to take the output from the DISA SRR Scripts and aggregate it into a .csv.
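
That kind of glue code is still a decent illustration of the problem.  Here is a minimal sketch in Python of the aggregation step; the scanner names, field names, and input formats are all invented for illustration, since real connectors would have to parse each vendor’s export format (or, better, a common SCAP-expressed one):

```python
# Minimal sketch: normalize findings from two hypothetical scanner exports
# into one CSV. The field names and input formats are invented; real
# connectors would parse each vendor's actual format.

import csv
import json

def load_scanner_a(path):
    """Hypothetical scanner A: a JSON list of findings."""
    with open(path) as f:
        for item in json.load(f):
            yield {"host": item["ip"], "cve": item["cve_id"],
                   "severity": item["cvss"], "source": "scanner_a"}

def load_scanner_b(path):
    """Hypothetical scanner B: a CSV with its own column names."""
    with open(path) as f:
        for row in csv.DictReader(f):
            yield {"host": row["Hostname"], "cve": row["CVE"],
                   "severity": row["Score"], "source": "scanner_b"}

def aggregate(out_path, *sources):
    """Write all findings to one CSV, deduplicating host+CVE pairs."""
    seen = set()
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["host", "cve", "severity", "source"])
        writer.writeheader()
        for source in sources:
            for finding in source:
                key = (finding["host"], finding["cve"])
                if key not in seen:
                    seen.add(key)
                    writer.writerow(finding)

if __name__ == "__main__":
    aggregate("all_vulns.csv",
              load_scanner_a("scanner_a.json"),
              load_scanner_b("scanner_b.csv"))
```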

#4 The web application security world needs to adopt SCAP. So far it’s just been the OS and shrinkwrapped application vendors and the whole race to detection and patching. The interesting part to me is that the market is built around tying vulnerabilities to specific versions of software and a patch, whereas in the web application world it’s more along the lines of one-off misconfigurations and coding errors. It takes a little bit of a mindshift in the vulnerability world, but that’s OK in my book.

#5 This solution is exactly what the Government needs and is exactly why SCAP was created. Imagine you’re the Federal Government with 3.5 million desktops; the only way you can manage all of those is through VA automation and a tool that aggregates information from various VA products across multiple zones of trust, environments, and even organizations.

#6 Help Ed out! We need this.



Posted in Technical, What Works | 4 Comments »

A Perspective on the History of Digital Forensics

Posted January 27th, 2009

Back in 1995, junior high school students around the world were taken in by a sensationalized and carefully marketed hoax film called Alien Autopsy. It was in fact a cheap film purporting to show an actual autopsy of the cadaver of an extraterrestrial, marketed as footage shot during the famous 1947 Roswell incident.

Alien Autopsy photo by jurvetson.

Well, back in 1995 I was in a mood for a good laugh so I popped up some popcorn, chilled a six-pack of Mountain Dew and kicked up my feet for a little silly entertainment. A couple of friends came over just in time for the show. So, I popped more popcorn, chilled more drinks and we all had a great time giggling, guffawing, and generally acting like a bunch of nitwits having some good clean fun.

Then in 2005, my wife asked if I could sit down with her to watch something called Grey’s Anatomy. Thinking that I was about to relive a guilty pleasure from ten years before, I readily agreed. Let’s face it, a show called Grey’s Anatomy could only be a sequel to the 1995 Alien Autopsy.

Well, having been fooled, I shared my mistake and agony with the guys at work the next day. To say the least, they were amused at the story but entirely at my expense. Some mistakes in life should just never be mentioned again.

I hope that is not the case with today’s comments. Today, I’d like to encourage you all to download and read my paper on the History of Digital Forensics (.pdf caveat applies). It is based on a presentation I gave at NIST’s annual digital forensics conference; since briefing slides do not read well on their own, I converted the presentation to prose. Dissect it as you think appropriate. That is to say, let me know what you think.



Posted in NIST, Technical | 2 Comments »

Database Activity Monitoring for the Government

Posted November 11th, 2008

I’ve always wondered why I have yet to meet anyone in the Government using Database Activity Monitoring (DAM) solutions, and yet the Government has some of the largest, most sensitive databases around.  I’m going to try to lay out why I think it’s a great idea for Government to court the DAM vendors.

Volume of PII: The Government owns huge databases that are usually authoritative sources.  While the private sector laments the leaks of Social Security Numbers, let’s stop and think for a minute.  There is A database inside the Social Security Administration that holds everybody’s number and is THE database where SSNs are assigned.  DAM can help here by flagging queries that retrieve large sets of data.

Targeted Privacy Information:  Remember the news reports about people looking at the presidential candidates’ passport information?  Because of the depth of PII that the Government holds about any one individual, it provides a phenomenal opportunity for invasion of someone’s privacy.  DAM can help here by flagging VIPs and sending an alert anytime one of them is searched for. (DHS guys, there’s an opportunity for you to host the list under LoB)

Sensitive Information: Some Government databases come from classified sources.  If you were to look at all that information in aggregate, you could determine the classified version of events.  And then there are the classified databases themselves.  Think about Robert Hanssen attacking the Automated Case System at the FBI; a proper DAM implementation would have noticed the activity.  One interesting DAM rule here:  queries where the user is also the subject of the query.
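
To make that concrete, here is a minimal sketch in Python of what a few DAM-style rules might look like, run over database audit records.  Everything in it is an assumption: the field names (user, subject_id, rows_returned), the threshold, and the watch list are invented; a real DAM product taps the database wire protocol or native audit logs.

```python
# Minimal sketch of a few DAM-style rules run over database audit records.
# Field names, threshold, and watch list are invented for illustration.

BULK_THRESHOLD = 10_000           # flag queries returning huge result sets
VIP_SUBJECTS = {"candidate_123"}  # watch-listed record identifiers

def check_record(rec):
    """Return a list of alert strings for one audit record."""
    alerts = []
    if rec["rows_returned"] > BULK_THRESHOLD:
        alerts.append(f"bulk extraction: query returned {rec['rows_returned']} rows")
    if rec["subject_id"] in VIP_SUBJECTS:
        alerts.append(f"VIP lookup: {rec['user']} queried {rec['subject_id']}")
    # In a real system you'd map the login identity to its record identity;
    # here we pretend they share an identifier.
    if rec["user"] == rec["subject_id"]:
        alerts.append(f"self-query: {rec['user']} looked up their own record")
    return alerts

# Hypothetical audit records:
for rec in [
    {"user": "hanssen_r", "subject_id": "hanssen_r", "rows_returned": 1},
    {"user": "clerk_42", "subject_id": "candidate_123", "rows_returned": 1},
    {"user": "batch_job", "subject_id": "none", "rows_returned": 250_000},
]:
    for alert in check_record(rec):
        print(alert)
```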

Financial Data:  The Government moves huge amounts of money, well into the trillions of dollars.  We’re not just talking internal purchasing controls; it’s usually programs where the Government buys something or… I dunno… “loans” $700B to the financial industry to stay solvent.  All that data is stored in databases.

HR Data:  Being one of the largest employers in the world, the Government is sitting on one of the largest repositories of employee data anywhere.  That’s in a database; DAM can help.

Guys, DAM in the Government just makes sense.

Problems with the Government adopting/using DAM solutions:

DAM not in catalog of controls: I’ve mentioned this before; it’s the double-edged nature of a catalog of controls, in that it’s hard to justify any kind of security that isn’t explicitly stated in the catalog.

Newness of DAM:  If it’s new, I can’t justify it to my management and my auditors.  This will get fixed in time; let the hype cycle run itself out.

Historical DAM Customer Base:  It’s the “Look, I’m not a friggin’ bank” problem again.  DAM vendors don’t actively pursue or understand Government clients; they’re usually looking for customers needing help with SOX and PCI-DSS controls.

London is in Our Database photo by Roger Lancefield.



Posted in Rants, Risk Management, Technical, What Works | 2 Comments »
