The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mitigating some human threats. I'm looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and to assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It isn't, and that gap is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately, CAG then develops by pairing this first valid premise with a set of false ones. These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush's Comprehensive National Cybersecurity Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart as well: "The most effective security is not exposed to public criticism." Since its very public December 11th announcement, no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn't. An alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach, incorporating some CAG controls into the 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1200 pages of guidance, 15 FIPS docs and the assorted Special Publications not related to FISMA as detriments to security. It's like complaining that Wikipedia has too many articles for it to contribute to improved learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST's extensive guidance existed, I can say that having that documentation greatly improves my ability to secure systems efficiently and effectively.
  5. NIST guidance doesn't tell me how to secure my systems! NIST's FISMA guidance doesn't step you through securing your SQL Server. The Chairman of the Joint Chiefs doesn't deliver your milk, either. Why not? It's not their job. NIST's FISMA guidance helps you assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum set of controls is in place, and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. If all threats really were attacks, this would be nearly plausible. But not all security is technical, and not all threats come from the Internet. DHS, NIST, Mitre, and their partners have pursued automation to enforce and audit security controls for years, but automated controls can only go so far. Human errors, glitches, unexpected conflicts and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks both to improve its subset of security and to improve audit compatibility. For technical controls this is somewhat achievable through automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable, while security requires evaluating the whole to ensure the necessary result. An audit checks whether antivirus software is installed; an evaluation checks whether the antivirus software is effective.
  8. Metrics, but only these metrics over here. When selecting the current crop of CAG controls, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Except the metrics were cherry-picked. Instead of looking at all realized threats or actual threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on, the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people, but penetration testing is to security what HUMINT is to intelligence services: important players and expert practitioners, but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, "Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits–rather, security control assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives." NIST SP 800-37 specifically and repeatedly states, "Security accreditation packages can be submitted in either paper or electronic format."

CAG is a missed opportunity. Addressing the myriad problems with our current FISMA regime could achieve a lot of good. The problems with the guidance have many causes but can be addressed through cooperative development of best practices outside of NIST; the Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices, it will likely need to go through a few iterations and a lot of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.




Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »

Beware the Cyber-Katrina!

Posted February 19th, 2009

Scenario: American Internet connections are attacked.  In the resulting chaos, the Government fails to respond at all, primarily because of infighting over jurisdiction issues between responders.  Mass hysteria ensues–40 years of darkness, cats sleeping with dogs kind of stuff.

Sound similar to New Orleans after Hurricane Katrina? Well, this now has a name: Cyber-Katrina.

At least, this is what Paul Kurtz talked about this week at Black Hat DC. Now I understand what Kurtz is saying: that we need to figure out the national-level response while we have time so that when it happens we won't be frozen with bureaucratic paralysis. Yes, it works for me; I've been saying the same thing ever since I thought I was somebody important last year. =)

But Paul… don't say you want to create a new Cyber-FEMA for the Internet. That's where the metaphor you're using fails–if you carry it too far, what you're saying is that you want to create a Government organization that will eventually fail when the nation needs it the most. Saying you want a Cyber-FEMA is just an ugly thing to say after you think about it too long.

What Kurtz really meant to say is that we don't have a national-level CERT that coordinates large-scale incident response between the major players–DoD, DoJ, DHS, state and local governments, and the private sector. What Kurtz is really saying, if you read between the lines, is that US-CERT needs to be a national-level CERT and needs the funding, training, people, and connections to do this mission. In order to fulfill what the administration wants, needs, and is almost promising to the public through its management agenda, US-CERT has to get real big, real fast.

But the trick is, how do you explain this concept to somebody who has neither the security understanding nor the national policy experience to grasp the issue? You resort to Cyber-Katrina and maybe bank on a little FUD in the process. Then the press gets all crazy on it–like breaking SSL means Cyber-Katrina Real Soon Now.

Now for those of you who will never be a candidate for Obama's Cybersecurity Czar job, let me break this down for you big-bird stylie. Right now there are 3 major candidates vying for the job. Since there is no official recommendation (and there probably won't be until April, when the 60 days to develop a strategy are up), the 3 candidates are making their moves to prove that they're the right person to pick. Think of these as their mini-platforms; just look out for when they start talking about themselves in the 3rd person.

FEMA Disaster Relief photo by Infrogmation. Could a Cyber-FEMA coordinate incident response for a Cyber-Katrina?

And in other news, I3P (with ties to Dartmouth) has issued their National Cyber Security Research and Development Challenges document, which um… hashes over the same stuff we've seen from the National Strategy to Secure Cyberspace, the Systems and Technology Research and Design Plan, the CSIS Recommendations, and the Obama Agenda. Only the I3P report has all this weird psychologically-oriented mumbo-jumbo that made my eyes glaze over when I read it.

Guys, I've said this so many times I feel like a complete cynic: talk is cheap, security isn't. It seems like everybody has a plan but nobody's willing to step up and fix the problem. Not only that, but they're taking each other's recommendations, throwing them in a blender, and reissuing their own. Wake me up when somebody actually does something.

It leads me to believe that, once again, those who talk don’t know, and those who know don’t talk.

Therefore, here’s the BSOFH’s guide to protecting the nation from Cyber-Katrina:

  • Designate a Cybersecurity Czar
  • Equip the Cybersecurity Czar with a $100B/year budget
  • Nationalize Microsoft, Cisco, and one of the major all-in-one security companies (Symantec)
  • Integrate all the IT assets you now own and force them to write good software
  • Public execution of any developer who uses strcpy() because who knows what other stupid stuff they'll do (see the sketch after this list)
  • Require code review and vulnerability assessments for any IT product that is sold on the market
  • Regulate all IT installations to follow Government-approved hardening guides
  • Use US-CERT to monitor the military-industrial complex
  • ?????
  • Live in a secure Cyber-World
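
Since strcpy() gets singled out for capital punishment above, here's a minimal, hypothetical sketch of why it earns that: strcpy() copies until it hits a NUL terminator, with no idea how big the destination buffer is. The buffer size, the input string, and the bounded snprintf() alternative below are illustrative assumptions, not anything from the original list.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[8];   /* hypothetical, deliberately tiny destination buffer */
        const char *input = "way more than eight bytes of attacker-supplied data";

        /* strcpy(buf, input);  <-- the classic stack buffer overflow: no bounds check */

        /* a bounded copy that always NUL-terminates, truncating if necessary */
        snprintf(buf, sizeof(buf), "%s", input);
        printf("%s\n", buf);
        return 0;
    }

Nothing exotic here, which is sort of the point: the safe version costs one extra argument.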

But hey, that's not the American way–we're not socialists, damnit! (well, except for mortgage companies and banks and automakers and um yeah….)  So far all the plans have called for cooperation between the public and private sectors, and that's worked out just smashingly because of an industry-wide conflict of interest: writing junk software means that you can sell upgrades or new products later.

I think the problem is fixable, but I predict these are the conditions for it to happen:

  • Massive failure of some infrastructure component due to IT security issues
  • Massive ownage of Government IT systems that actually gets publicized
  • Deaths caused by massive IT Security fail
  • Osama Bin Laden starts writing exploit code
  • Citizen outrage to the point where my grandmother writes a letter to the President

Until then, security issues will always play second fiddle to wars, the economy, presidential impeachments, and a bazillion other things.  Because of this, security conditions will get much, much worse before they get better.

And then the cynic in me can't help but think that, deep down inside, what the nation needs is precisely an IT Security Fail along the lines of 9-11/Katrina/Pearl Harbor/Dien Bien Phu/Task Force Smith.




Posted in BSOFH, Public Policy, Rants | 6 Comments »

Cyber-FEMA LOLCATS Prepare for Cyber-Katrina

Posted February 19th, 2009

Yes, I understand what Paul Kurtz is saying: we need a single command structure for large-scale IT security incident response before we end up with bureaucratic paralysis like the previous administration's response to Hurricane Katrina. But the metaphor is way ugly–too ugly to just let it go without IKANHAZFIZMA getting involved.  =)

More serious commentary if I ever get done with the “death by work” that the last 2 weeks has been.

(lolcat image)




Posted in IKANHAZFIZMA | 1 Comment »

Your Friendly Neighborhood C&A Podcast Panel

Posted February 17th, 2009

This weekend, Joe Faraone (Vlad the Impaler), Graydon Mckee, and I teamed up to be a guest panel for Michael Santarcangelo's Security Catalyst podcast.  We wax esoteric on the fine points of certification and accreditation and the kind of value it brings to an agency or company that does it right.

You can check it out here.




Posted in Speaking, What Works | No Comments »

Everything I know about security, I learned from Ghostbusters…

Posted February 17th, 2009

(Well maybe not everything…)
I've been the de facto security officer at a government agency for going on two years now; it's been quite a challenge. Without getting too deeply into how this happened (since I'm a contractor), I'd like to share some of the insights, horror stories, tips, and interesting anecdotes I've gathered over the past 22+ months.

If nothing else, many of my “preconceived notions” about managing an effective security program at a federal agency have been confirmed. Many others have been changed in ways I would never have suspected. I’m going to attempt to explain these in what I hope is an insightful, if not humorous way.

Ghostbusters works for me… At the time (1984), it was, hands-down, the funniest movie I had ever seen–it left its mark. It sure beats "Dude, Where's My Car?" for quotes that can be applied to security. But then some may say I've either set the bar a bit low or I need to expand my movie-viewing habits. Hey, work with me on this one, people!!!

So, here are several quotes from the movie and their application to my philosophy on information security. I hope you enjoy it!


Ecto-1 photo by chad davis.

I’m from security, and I’m ready to believe you.
Listen. Foster discussion. Then draw upon your experience and make your decision. Do not enter into a discussion with a mandate (unless it comes from above). Mandates do not foster discussion, especially in areas where policy is absent or not-so-explicit. Most importantly, this is an invitation for the person you're talking with to begin their side of the story.
Important Safety Tip: As the security professional, remember – this is the time for you to begin listening!

“Next time, if <someone> asks whether you’re a GOD, you say YES!”
Face it. Many of us security folks are humble. We all may even know what it is we don’t know. We might be a little gun-shy in our first few weeks on the job. However, don’t let your humility or shyness overcome you…

Like it or not, you are your organization’s security expert. “The Shell Answer Man,” the “Pro from Dover,” the “Go-to Guy/Gal.” While you may not have committed the processes contained within the IKE negotiation phases to memory, and may not be able to quote RFC 3514 off the top of your head, you probably DO know where to find the information… “I don’t know,” should never roll off your lips.

When you're hired as the subject matter expert on security, you need to be confident–whether you're knocking a soft-toss out of the park or, especially, when you're telling folks that you'll research the topic and get back to them. Come back with the facts, and your credibility will be strengthened.

Likewise, when you have reservations about a particular situation, let folks know why you're not jumping on board their crazy train. Invite discussion. State your case plainly and propose solutions, or if you can't suggest an alternative, discuss it offline in another meeting focused on solutions. While your mission is to guard the organization's interests, you can't do so at the expense of the organization's mission. Working closely with client service or engineering teams shows that security can be an integral part of solution development, and not an impediment. Think of this as guiding others to the solution – without telling them the "right" answer. This allows others to "own" the solution – their help may be valuable, if not necessary, when you socialize a potentially contentious (or expensive) solution.

“Don’t cross the streams…”
I love this one. I get to use it at least twice a day while speaking to engineering, operations, management or other folks at my agency. People have heard it so many times that they've started using it themselves. Best part is, they're using the phrase correctly!

So what does this mean exactly? Generally/normally, the following things should never be directly connected to one another:

  • Classified and Unclassified Networks
  • The Internet and a Classified Network
  • Networks classified at different levels
  • Development, Test, and Production Networks/Environments
  • Accredited/trusted networks and less-trusted networks
  • Management and Production Networks

“Wait! I thought you said crossing the streams was BAD?!”
So, what does this Ghostbusters quote mean to us security folk?
Every policy, however rigidly enforced, needs a waiver process.

So what do I really mean? When you understand and can quantify the risk of a particular practice or action, you can develop compensating controls to make otherwise unthinkable practices (e.g., connecting unclassified networks to classified networks) less risky. In this example, it can be done using one-way guard technology or some other similarly trusted, manual process.

Face it, jumping off a bridge can be dangerous, if not suicidal. However, when the jumper attaches themselves to a bungee cord or uses a parasail, the act of jumping off a bridge can be reduced from a Darwin-qualifying stunt to thrilling fun or an awesome opening movie scene (like the opening of the first XXX movie, starring Vin Diesel as Xander Cage). It may not be for everyone, but given the right safety equipment, some of us might even consider taking the leap.

There’s an even better example. Let’s say your network security policy forbids use of USB memory devices. Anyone seen with one is given a stern talking-to, if not killed outright. Well, maybe not killed… the first time. Let’s say a virus or worm gets into your network. Hey – it happens. As a precautionary measure, your response to this type of incident requires you to sever your network connections to your business partners as well as the Internet. So… How do you get the new virus definition file and virus engine from your Platinum Support Provider and install it on your server? It just so happens that in this case, you downloaded a copy using your uninfected laptop via your home internet connection… onto a USB memory stick. So, how do you reconcile what needs to be done against your policy? Obviously, an exception to the policy needs to be made.

As a matter of fact, every organization needs a policy that allows exceptions to be made to existing policy. This may sound like doublespeak, and the above may not be the best example, but it certainly does illustrate the point.

“What about the Twinkie?  Tell him about the Twinkie?!”
Never hide stuff from superiors. They don’t like surprises.
Never hide stuff from auditors. They have less of a sense of humor than your superiors.

“Human sacrifice, dogs and cats living together… MASS HYSTERIA.”
FUD doesn’t work. Don’t try it!

I hope these good-natured examples have gotten you to laugh (minimally), or possibly gotten the aspiring CISOs among you to think about how you might use humor in your day-to-day existence. I’d like to leave you with one more thought:
If you’re not having fun, you’re doing it wrong!

Cheers,
Vlad

FUD Fighter photo by cote.




Posted in BSOFH | 4 Comments »

The Authorizing Official and the Lolcat

Posted February 12th, 2009

Hot on the heels of our DAA presentation, the Guerilla CISO is proud to present our lolcat Authorizing Official.

Yes, it’s all about the budget. If you do not have a budget, you do not have the ability to change things. We have a word for people who have all the authority of an Authorizing Official but none of the budget: scapegoat.

(lolcat image)

And since I’m in Toronto for an extended stay thanks to the weather, today is a 2-fer:
(lolcat image)




Posted in IKANHAZFIZMA | No Comments »


