Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 3

Posted April 30th, 2009 by

Rybolov Note: this is part 3 in a series about S.773.  Go read the bill here. Go read part one here. Go read part two here. Go read part four here. Go read part five here. =)

SEC. 13. CYBERSECURITY COMPETITION AND CHALLENGE. This section of the bill creates a series of competitions for a range of ages and skills… with cash prizes!  Mostly it’s just the administration of competitions–cash prizes, no illegal activities, etc.

This goes back to the age-old debates about glorifying illegal activities and handing tools to people who are too young to know how to stay out of jail.

But then again, I know why this section of the bill is in there.  If we want to grow enough security professionals to even remotely keep up with demand, we need to do a much better job of recruiting younger techies to the “security dark side”.  Competitions are a start; the next step is to get them into formal education and apprenticeships to learn from the gray-hairs who have been in the industry for a while.

Once again, the same verbiage about tasking Commerce with leading this effort… I’m not sure they’re the ones to do this.

Verdict: Already happening, although in an ad-hoc fashion.  I’m not sold on teaching high school kids to hack, but yes, we need to do this.

SEC. 14. PUBLIC-PRIVATE CLEARINGHOUSE. Although the title of this sounds really cool, like super-FOIA stuff, it’s really just information-sharing with critical infrastructure owners and operators.

One interesting provision is this:

“The Secretary of Commerce–

(1) shall have access to all relevant data concerning such networks without regard to any provision of law, regulation, rule, or policy restricting such access”

In other words, all your critical infrastructure information belong to Feds.  This is interesting because it can run the gamut from the Feds asking power grid operators for information and taking whatever they get, to being stretched into justification for audits of privately-owned critical infrastructure.  I’m pretty sure they mean the former, but I can see the latter being used at a later stage in the game.

One thing I thought was interesting is that this section only covers information sharing with critical infrastructure.  There is a big gap here in sharing information with state and local government, local (ie, non-Federal) law enforcement, and private industry.  I think other sections–most notably section 5–deal with this somewhat, but information dissemination has always been a problem: how do you get classified data down to the people who need it to do their jobs but have no clearance or trustability beyond the fact that they won an election to be sheriff of Lemhi County, Idaho (population 5,000)?  Also reference the Homeland Security Information Network to see how we’re doing this today.

Verdict: Really, I think this section is a way for the Feds to gather information from critical infrastructure owners.  I don’t see much information flowing the other way, since the means for that flow already exists in HSIN.

Capitol photo by rpongsaj.

SEC. 15. CYBERSECURITY RISK MANAGEMENT REPORT. This small section calls for some investigation into something that has been bouncing around the security community for some time now: tying security risk into financial statements, cyberinsurance, company liability, etc.

Verdict: Seems pretty benign; I just hope it’s not another case where we report on something and nothing actually happens.  This has the potential to be the big fix for security because it deals with the business factors instead of the symptoms.

SEC. 16. LEGAL FRAMEWORK REVIEW AND REPORT. This section requires a review of the laws, national-level policies, and, basically, our national-level governance for IT security.  As weird as this sounds, it needs to be done: once we have a national strategy that aligns with our laws and policies and is then translated into funding and tasks for specific agencies, we might have a chance at fixing things.  The one caveat is that if we don’t act on the report, it will become yet another National Strategy to Secure Cyberspace: lots of ideas that were never fulfilled.

Verdict: Some of this should have been done in the 60-day Cybersecurity Review.  This is more of the same, and is a perfect task for the Cybersecurity Advisor when the position is eventually staffed.

SEC. 17. AUTHENTICATION AND CIVIL LIBERTIES REPORT. This section is really short, but read it verbatim below; you need to, because this one sentence could change the game considerably.

“Within 1 year after the date of enactment of this Act, the President, or the President’s designee, shall review, and report to Congress, on the feasibility of an identity management and authentication program, with the appropriate civil liberties and privacy protections, for government and critical infrastructure information systems and networks.”

So my take on it is something like REAL-ID and/or HSPD-12 but for critical infrastructure.

My personal belief is that centralized identity management runs contrary to civil liberties and privacy protections: the power of identification lies with the group that issues the identification.  Hence the “rejection” of REAL-ID.

If I operated critical infrastructure, I would definitely protest this section because it gives the Government the decision-making authority on who can access my gear.  Identity and access management is so pivotal to how we do security that there is no way I would give it up.

On the bright side, this section just calls for a feasibility report.

Verdict: Oh man, identification and authentication nation-wide for critical infrastructure?  We can’t even do it in the semi-hierarchical, top-down world of Government agencies, much less across privately-owned critical infrastructure.



Posted in Public Policy | 1 Comment »

Database Activity Monitoring for the Government

Posted November 11th, 2008 by

I’ve always wondered why I have yet to meet anyone in the Government using Database Activity Monitoring (DAM) solutions, and yet the Government has some of the largest, most sensitive databases around.  I’m going to try to lay out why I think it’s a great idea for Government to court the DAM vendors.

Volume of PII: The Government owns huge databases that are usually authoritative sources.  While the private sector laments the leaks of Social Security Numbers, let’s stop and think for a minute.  There is A database inside the Social Security Administration that holds everybody’s number and is THE database where SSNs are assigned.  DAM can help here by flagging queries that retrieve large sets of data.

Targeted Privacy Information:  Remember the news reports about people looking at the presidential candidates’ passport information?  Because of the depth of PII that the Government holds about any one individual, it provides a phenomenal opportunity for invasion of someone’s privacy.  DAM can help here by flagging VIPs and sending an alert any time one of them is searched for. (DHS guys, there’s an opportunity for you to host the list under LoB)

Sensitive Information: Some Government databases are built from classified sources.  If you were to look at all that information in aggregate, you could determine the classified version of events.  And then there are the classified databases themselves.  Think about Robert Hanssen attacking the Automated Case Support system at the FBI–a proper DAM implementation would have noticed the activity.  One interesting DAM rule here:  queries where the user is also the subject of the query.

Financial Data:  The Government moves huge amounts of money, well into the trillions of dollars.  We’re not just talking internal purchasing controls; it’s usually programs where the Government buys something or… I dunno… “loans” $700B to the financial industry to stay solvent.  All that data is stored in databases.

HR Data:  Being one of the largest employers in the world, the Government is sitting on one of the largest repositories of employee data anywhere.  That’s in a database; DAM can help.
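To make the rule ideas above a little more concrete, here is a minimal sketch of what DAM-style alerting logic could look like if you were evaluating a feed of audited query events yourself.  The event fields, thresholds, and VIP list are all illustrative assumptions on my part, not any particular vendor’s schema or API.

```python
# Hypothetical sketch of DAM-style rules over a feed of audited query events.
# Field names, thresholds, and the VIP list are illustrative assumptions,
# not any particular product's schema.

VIP_SUBJECTS = {"candidate_a", "candidate_b"}   # e.g., the passport-file VIPs
LARGE_RESULT_THRESHOLD = 10_000                 # rows returned in one query


def evaluate_query_event(event: dict) -> list[str]:
    """Return a list of alert reasons for a single audited query event."""
    alerts = []

    # Rule 1: bulk extraction -- queries that return unusually large result sets.
    if event.get("rows_returned", 0) > LARGE_RESULT_THRESHOLD:
        alerts.append("large result set")

    # Rule 2: VIP lookups -- someone searched for a flagged individual.
    if event.get("subject_id") in VIP_SUBJECTS:
        alerts.append("VIP record accessed")

    # Rule 3: self-queries -- the querying user is also the subject of the query
    # (the Hanssen-style "am I under investigation?" check).
    if event.get("user_id") and event.get("user_id") == event.get("subject_id"):
        alerts.append("user queried their own record")

    return alerts


# Example event:
print(evaluate_query_event(
    {"user_id": "jsmith", "subject_id": "jsmith", "rows_returned": 12}
))  # -> ['user queried their own record']
```

In a real deployment these rules would live in the DAM product’s policy engine rather than in a script, but the point is the same: watch the queries themselves, not just the perimeter.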

 

Guys, DAM in the Government just makes sense.

 

Problems with the Government adopting/using DAM solutions:

DAM is not in the catalog of controls: I’ve mentioned this before; it’s the double-edged nature of a catalog of controls that it’s hard to justify any kind of security that isn’t explicitly stated in the catalog.

Newness of DAM:  If it’s new, I can’t justify it to my management and my auditors.  This will get fixed in time; let the hype cycle run itself out.

Historical DAM Customer Base:  It’s the “Look, I’m not a friggin’ bank” problem again.  DAM vendors don’t actively pursue/understand Government clients–they’re usually looking for customers needing help with SOX and PCI-DSS controls.

 

 

London is in Our Database photo by Roger Lancefield.



Posted in Rants, Risk Management, Technical, What Works | 2 Comments »

When the Feds Come Calling

Posted October 21st, 2008 by

I’ve seen the scenario about a dozen times in the last 2 months–contractors and service providers of all sorts responding to the Government’s security requirements in the middle of a contract.  It’s almost reached the stage where I have it programmed as a “battle drill” a la the infantryman’s Battle Drill 1A, and I’m here to share the secret of negotiating these things.

Without naming names, let’s look at where I’ve seen this come up:

  • Non-Government Organizations that assist the Government with para-Government services to the citizens
  • Companies doing research and development funded by the Government–health care and military
  • Universities that do joint research with the Government
  • Anybody who runs something that the Government has designated as “critical infrastructure”
  • State and local governments who use Federal Government data for their social programs (unemployment systems, food stamps, etc.) and homeland security-esque activities (law enforcement, disaster response)
  • Health Care Providers who service Government insurance plans

For the purposes of this blog post, I’ll refer to all of these groups as contractors or service providers.  Yes, I’m mixing analogies, making huge generalizations, and I’m not precise at all.  However, these groups should all have the same goals and the approach is the same, so bear with me while I lump them all together.

Really, guys, you need to understand both sides of the story, because this is a cause for negotiation.  I’ll explain why in a minute.

On the Government side:  Well, we have some people we share data with.  It’s not a lot, and it’s sanitized so the value of it is minimal except for the Washington Post Front Page Metric.  Even so, the data is PII that we’ve taken an anonymizer to so that it’s just statistical data that doesn’t directly identify anybody.  We’ve got a pretty good handle on our own IT systems over the past 2 years, so our CISO and IG want us to focus on data that goes outside of our boundaries.  Now I don’t expect/want to “own” the contractor’s IT systems because they provide us a service, not an IT system.  My core problem is that I’m trying to take an existing contract and add security requirements retroactively to it and I’m not sure exactly how to do that.

Our Goals:

  • Accomplishing the goals of the program that we provided data to support
  • Protection of the data outside of our boundaries
  • Proving due-diligence to our 5 layers of oversight that we are doing the best we can to protect the data
  • Translating what we need into something the contractor understands
  • Being able to provide for the security of Government-owned data at little to no additional cost to the program

On the contractor/service provider side:  We took some data from the Government and now they’re coming out of the blue saying that we need to be FISMA-compliant.  Now I don’t want to sound whiny, but this FISMA thing is a huge undertaking, and I’ve heard that for a small business such as ours, it can cripple us financially.  While I still want to help the Government add security to our project, I need to at least break even on the security support.  Our core problem is to keep security from impacting our project’s profitability.

Our Goals:

  • Accomplishing the goals of the program that we were provided data to support
  • Protection of the data given to us to keep the Government happy and continuing to fund us (the spice must flow!)
  • Giving something to the Government so that they can demonstrate due-diligence to their auditors and IG
  • Translating what we do into something the Government understands
  • Keeping the cost of security to an absolute minimum, or at least getting funded for what we do add, because it wasn’t scoped into the SOW

Hmm, looks like these goals are very much in alignment with each other.  About the only thing we need to figure out is scope and cost, which sounds very much like a negotiation.

Hardcore Negotiation Skills photo by shinosan.

Little-known facts that might help in our scenario here:

  • Section 2.4 of SP 800-53 discusses the use of compensating controls for contractor and service-provider systems.
  • One of the concepts in security and the Government is that agencies are to provide “adequate security” for their information and information systems.  Have a look at FISMA and OMB Circular A-130.
  • Repeat after me:  “The endstate is to provide a level of protection for the data equivalent or superior to what the Government would provide for that data.”
  • Appendix G in SP 800-53 has a traceability matrix through different standards that can serve as a “Rosetta Stone” for understanding each other.  Note to NIST:  let’s throw in PCI-DSS, Sarbanes-Oxley,  and change ISO 17799 to 27001.

So what’s a security geek to do?  Well, this, dear readers, is Rybolov’s 5-fold path to Government/contractor nirvana:

  1. Contractor and Government have a kickoff session to meet each other and build rapport, starting from common ground such as how you both have similar goals.  The problem really is one of managing each other’s expectations.
  2. Both Government and Contractor perform internal risk assessment to determine what kind of outcome they want to negotiate.
  3. Contractor and Government meet a week later to negotiate on security.
  4. Contractor provides documentation on what security controls they have in place.  This might be as minimal as a contract with the guard force company at their major sites, or it might be just employee background checks and
  5. Contractor and Government negotiate for a 6-month plan-of-action.  For most organizations considering ISO 27001, this is a good time to make a promise to get it done.  For smaller organizations or data , we may not even

Assumptions and dependencies:

  • The data we’re talking about is low-criticality or even moderate-criticality.
  • This isn’t an outsourced IT system that could be considered government-owned, contractor-operated (GO-CO)


Posted in FISMA, Outsourcing | 1 Comment »

Effective Inventory Management

Posted August 20th, 2008 by

So what exactly is a “system”?  After all this time, it’s still probably one of the most misunderstood ways that we manage security in the Government.

The short answer is this:  a system is what you say it is.  The long answer is that it depends on the following factors:

  • Maturity of your agency
  • Budget processes and Exhibit 300s
  • The extent of your common controls
  • Political boundaries between inter-agency organizations
  • Agency missions
  • Amount of highly-regulated data such as PII or financial

Yes, this all gets complicated.  But really, whatever you say is a system is a system; the designation is just there so you can manage the enterprise in pieces.  There are three main techniques that I use to determine what a system is:

  • As a budget line-item: If it has an Exhibit 300, then it’s a system.  This works better for Plans of Action and Milestones (POA&Ms), but in reality there might not be a 1:1 correlation between systems and Exhibit 300s.
  • As a data type: If it has a particular type of data, then it’s a system.  This works well for special-purpose systems or where a type of data is regulated, such as PII or financial data.
  • As a project or program: If it’s the same people who built it and maintain it, then it’s a system.  This dovetails nicely with any kind of SDLC or with any kind of outsourcing.


Inventory photo by nutmeg.

Inventory management techniques that work:

  • Fewer systems are better.  Each system incurs overhead in effort and cost.
  • More systems work when you have no idea what is out there, but they will cripple you in the long term because of the overhead.
  • Start with many systems, assess each as its own piece, then consolidate them into a general support system or common controls package.
  • Set a threshold for project size in either pieces of hardware or dollar value.  If the project exceeds that threshold, then it’s a system (see the sketch after this list).
  • Determine if something will be a system when the budget request is made.  Good CISOs realize this and have a place on the investment control board or capital planning investment board.
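Here’s a minimal sketch of how the designation heuristics above (Exhibit 300, regulated data types, a size or dollar threshold) could be strung together as a checklist.  The thresholds and the regulated-data list are made-up illustrations, not anything out of official guidance.

```python
# Illustrative sketch of the "what is a system?" heuristics described above.
# Thresholds and the regulated-data list are assumptions, not official guidance.

REGULATED_DATA_TYPES = {"PII", "financial"}
SERVER_THRESHOLD = 5            # pieces of hardware
DOLLAR_THRESHOLD = 250_000      # annual cost


def is_a_system(project: dict) -> bool:
    """Decide whether a project gets tracked as its own system."""
    # Technique 1: budget line item -- it has its own Exhibit 300.
    if project.get("has_exhibit_300"):
        return True

    # Technique 2: data type -- it holds highly regulated data.
    if REGULATED_DATA_TYPES & set(project.get("data_types", [])):
        return True

    # Technique 3: size threshold -- big enough in hardware or dollars.
    if (project.get("server_count", 0) >= SERVER_THRESHOLD
            or project.get("annual_cost", 0) >= DOLLAR_THRESHOLD):
        return True

    # Otherwise, roll it up into a general support system / common controls.
    return False


print(is_a_system({"server_count": 2, "data_types": ["public web content"]}))  # False
```

Anything that falls through all three checks is a candidate for rolling up into a general support system or a common controls package.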

Guerilla CISO war story time:

Way back when all this was new, one of the agency CISOs would have a roundtable every quarter or so.  I won’t name who, but some of my blog readers know.  Almost every meeting devolved at some point into the time-honored sticking point of “what is a system?”  Everybody wanted to know whether “2 servers, 3 PCs, a database, a dog, and a dickfore” counted as a system.  After one too many iterations, the gray-hair in the group would put up “Exhibit 300=System” on the whiteboard before every meeting.  Then when the inevitable “what is a system?” conversation came up, he would just point to the board.

And another story:

Several years ago I was working an IT outsourcing contract with an inventory that had been determined using the budget line-item technique.  It turned out we had all sorts of systems, some of which didn’t make sense, like the desktop client used to manage the local admin account.  One of my first priorities was to consolidate as many systems as I could.  Not that I was altruistic about saving money or anything; it was that the fewer systems I had, the less paperwork needed to be generated. =)  Most of the systems I rolled up into a general support system aimed at basic user connectivity.



Posted in FISMA | No Comments »

New SP 800-60 is Out, Categorize Yerselves Mo Better

Posted August 18th, 2008 by

While I was slaving away last week, our friends over at NIST published a new version of SP 800-60.  Go check it out at the NIST Pubs Page.

Now for those of you who don’t know what 800-60 is, go check out my 3-part special on the Business Reference Model (BRM), a primer on how SP 800-60 aligns FIPS-199 with the BRM, and a post on putting it all together with a catalog of controls.

And oh yeah, the obligatory press reference: Government Computer News.

Data Release Show photo by Discos Konfort.

So deep down inside, you have to be asking one question by now:  “Why do we need SP 800-60?”  Well, 800-60 does the following:

  • Level-sets data criticality across the Government:  Provides a frame of reference for determining criticality–ie, if my data is more important than this but less important than that, then it’s moderate criticality (see the sketch after this list).
  • Counters the tendency to rate system criticality higher than it should be:  Everybody wants to rate their system as high criticality because it’s the safe choice for their career.
  • Protection prioritization:  Helps us point out at a national level the systems that need more protection.
  • Is regulations-based:  The criticality ratings reflect laws and standards.  For example, Privacy Act Data is rated higher for confidentiality.
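As a quick illustration of that level-setting, here’s a sketch of the FIPS-199 “high water mark”: each information type on a system gets provisional confidentiality, integrity, and availability impact levels (SP 800-60 is the catalog of those provisional levels), and the system’s security category is the highest rating for each objective across all of its information types.  The example information types and levels below are made up for illustration, not copied out of SP 800-60.

```python
# Sketch of FIPS-199-style categorization using the high-water mark.
# The example information types and provisional levels are illustrative only,
# not the actual values from SP 800-60.

LEVELS = {"low": 1, "moderate": 2, "high": 3}

# (confidentiality, integrity, availability) provisional impact levels
INFO_TYPES = {
    "public web content":  ("low", "low", "low"),
    "privacy act records": ("moderate", "moderate", "low"),
    "benefit payments":    ("moderate", "high", "moderate"),
}


def high_water_mark(info_types: list[str]) -> tuple[str, str, str]:
    """System security category = highest impact level per objective."""
    def pick(idx: int) -> str:
        return max((INFO_TYPES[t][idx] for t in info_types), key=LEVELS.get)
    return pick(0), pick(1), pick(2)


print(high_water_mark(["public web content", "privacy act records", "benefit payments"]))
# -> ('moderate', 'high', 'moderate')
```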

All things considered, it’s a pretty decent system for Government use.

Now this is where I have a bit of heartburn with GRC tools and data classification in general in the private sector–they classify the wrong things.  How the vendors (not all of them, there is a ton of variation in implementation) want you to categorize your data:

  • HIPAA-regulated
  • PCI-DSS-regulated
  • SOX-regulated
  • All other data types

How your CISO needs to categorize data to keep the business afloat:

  • Data that gets you paid:  If you’re a business, your #1 priority is getting money.  This is your billing/AR/POS data that needs to keep going.
  • Data that keeps you with a product to sell over the next week:  usually ERP data, the stuff that slows down the production line when it’s unavailable.
  • Data that people want to rip off your customers:  hey, almost all the regulated data (PCI-DSS, HIPAA, etc) fits in here.
  • Data where people will rip you off:  ie, your internal financial systems.  Typically this is SOX country.

I guess it really comes down to the difference between compliance and risk, but in this case, one version will keep you from getting fined; the other will keep your business running.



Posted in FISMA, NIST | No Comments »

Civilians Ask “What’s With All the Privacy Act Kerfluffle?”

Posted June 26th, 2008 by

And by “kerfluffle”, I mean these articles:

Well, let’s talk about how privacy and the Government work, with Uncle Rybolov (please hold the references to Old Weird Uncle Harold until we’re through with today’s lesson).

We have a law, the Privacy Act of 1974.  Think about it: what significant privacy-wrenching activities happened just a couple of years prior?  Can we say “Watergate Scandal”?  Can we say “Church Committee”?  Suffice it to say, the early 1970s were an era filled with privacy issues, and that era is where most of our privacy policy and law comes from.  Remember this for later:  this was the 1970s!

Alongside the Privacy Act, particular data types get their own statutory protections.  For instance, Title 13 covers data collected by the Census Bureau when they go count everybody in 2010.

The Privacy Act talks about the stuff that everybody in the Government needs to know about:  how you’re going to jail if you disclose this information to a third party.  For those of you who have ever been in the military or had to fill out a government form that required your social security number, the light in the back of your head should be going off right now because they all have the warnings about disclosure.

Remember to respect the privacy of the beach huts and chairs photo by Joe Shlabotnik

When it comes to IT security, the Privacy Act works like this:

  • You realize a need to collect PII on individuals.
  • You do a privacy impact assessment to determine if you can legally collect this data and what the implications of collecting the data are.
  • You build rules about what you can do normally with the data once you have collected it.  This is called the “routine use”.
  • You write a report on how, why, and about whom you’re collecting this information.  This is known as the “System of Records Notice”.
  • You file this report with the Federal Register to notify the public.
  • This IT system becomes the authoritative source of that information.

In other words, no secret dossiers on the public.  We’ll suspend our disbelief about FISA for a minute; this conversation is about non-intelligence data collection.

Now, the problem with all this is that, if you stop and think about it, I was 1 year old when the Privacy Act was signed.  Our technology for information sharing has gone way beyond what existed then.  We can exchange data much, much more quickly than the Privacy Act originally anticipated.  As a result, we have PII everywhere.  Most of the PII is needed to provide services to citizens, except that it’s a royal PITA to protect it all, and that’s the lesson of the past 2 years in Government data breaches.

Problems with the Privacy Act:

  • The SORN is hard to read and is not easy to find.
  • Privacy Act data given to contractors or “business partners” (aka, state and local government or NGOs) does not have the same amount of oversight as it does in the Government.
  • Data given to the Government by a third party is not subject to the Privacy Act because the Government did not collect it.  Wow, lots of room for abuse–waterboarding-esque abuse.
  • Privacy Act procedures were written for mainframes.  Mainframes have been replaced with clusters of servers.  It’s easy to add a new server to this setup.  Yes, this is a feature.
  • If you build a new system with the same data types and routine uses as an already existing SORN, you can “piggyback” on that existing SORN.
  • It’s very easy to use the data in a way that isn’t on your “routine use” statement, thus breaking the entire privacy system.

Obviously, at this point, you should have gotten the hint that maybe we need to revise the Privacy Act.  I think GAO and OMB would agree with you here.

So, what alternatives do we have to the existing system?

  • Make blanket data types and do a PIA and SORN on them regardless of where that data lies.
  • Bend the Paperwork Reduction Act and OMB guidance so that we don’t collect as much information.
  • Make the Privacy Act more specific on what should be in SORN, PIA, and routine use statements.

To be honest, it seems like most of this is already in place; it just needs to get tuned a little bit so we’re doing the right things.  Once again, the scale of the Government’s IT infrastructure is keeping us from doing the right thing: there isn’t enough time in the day to do PIAs on a per-server basis or to keep track of every little bit of data.  We have to automate our privacy efforts in some fashion.

And this is why, dear readers, I think the Government needs DLP solutions more than the private sector does.  Too bad the DLP vendors are stuck on credit cards and social security numbers.



Posted in FISMA, Rants, What Doesn't Work | No Comments »
