C&A: The Square Peg

November 24, 2011 by Len Marzigliano

This C&A related call for help is from Latonya in Washington, DC:

Need help! I am desperately searching for an instruction that will exempt a legacy fielded system from going through the C&A process. This system lacks interoperability testing and current requirements documentation. This system was fielded back in 1950 and there have been absolutely NO upgrades; it does not have an operating system, and I need to find an instruction or criteria that will waive the C&A process.

Fear not, Latonya – this happens much more often than one would think, and although there is no magic policy citation that can fix your problem, I can offer you some specific help for your case along with three phases of general advice on this subject:

You’re facing a common square peg scenario in C&A, which I like to call the “Not-a-System System”.
Military environments often have these because they deploy highly specialized information systems on vehicles or aircraft, in space, and even built into weapons. Although these meet the classic definition of an information system in that they process or store data, they sometimes differ from the intent of security controls and C&A process authors to such a degree that perhaps the C&A process shouldn’t apply to them at all – hence the concept of a C&A waiver. This is tricky business, however. I know there are misguided folks out there who, through misinterpretation of even the best C&A policies, will try to scoot a system into a waiver bucket where it doesn’t belong, or conversely, force programs to C&A a file cabinet as an information system. Whether intentional or not, these actions (and the precedents set by them) can cause unnecessary costs and headaches, weaknesses in security, or both.

A dearth of testing or documentation would probably not be acceptable rationale for a waiver. But because this legacy ‘system’ lacks a typical OS and presumably has no semblance of modern networking capability either, your Korean War era relic could likely be categorized along with specialized “Platform IT” type systems, or fall into a bucket where the full C&A treatment doesn’t apply yet other measures can still be taken to perform the task that C&A fulfills: assessing and addressing risk.

Phase I: Research
Look in the policy documents for your specific civilian agency or branch of military service. It’s highly unlikely that you’ll find anything this specific in the overarching national/department level policies and processes. If there is a written exception, it will be found in the more interpretive policies at the agency/bureau or service/office level. Talk with the more veteran folks at your agency, or those who might have been through a similar experience, and glean some advice or a reference to someone who can help.

Phase II: Determination
Take your case to the C&A gods for your agency. The highest level Senior IA Officer (SIAO) or equivalent would be the best fit, but don’t rock the boat – if your manager or a senior leader instructs you to take it to a certain officer or group, there is risk involved in going over someone’s head, which might send them chasing after yours. That being said, you must also fight to gain acceptance for your request. Explain the unique circumstances, perhaps use some content you might have found in policy documents to bolster your position, and obtain a concise and definitive answer. Do this all in writing (email or memorandum), so that you have a record of the entire exchange, which might be called upon later.


Phase III: Execution
If you do garner an exception, it is vital that your waiver be accepted by all of the entities and stakeholders involved in the system, its interconnections, and its dependencies. This means not only being ‘cool’ with the operations folks implementing or maintaining the system, but also with any external entities that might be impacted as well. It will look very bad for everyone if you hoist your waiver up on the yardarm and start sailing off into the sunset, only to find out too late that an authority you didn’t consider (or defer to) blockades you into the bay.

If you end up being forced to do the C&A, just suck it up and grab a shovel – any further resistance will paint you and your team as either lazy whiners who can’t handle hard work, or sneaky devils intent on subverting the C&A process – or both. You still have room to maneuver, however, if you work smart and avoid related pitfalls: use the established features of your C&A program that address exceptions at the more granular control level – Inheritance, Reciprocity, and Applicability – to slay those specific controls that just don’t fit your system. Keep in mind that just because a control is difficult for you or your team doesn’t mean it deserves to be blotted out – a proposed ‘N/A’ must still be approved by the certifying authority. Also, leveraging Inheritance and Reciprocity can saddle you with dependencies on other systems and entities that must be managed, and can cause your ATO to be yanked due to factors beyond your control.


Parting Thoughts

As a final note – it’s important for Information Assurance program developers and policy makers to pay attention to the action that happens at the system level and even a bit below. Much treasure can be found by studying not only how system level security officers are operating their C&A shop, but also by witnessing how the actual developers and engineers are performing the task of implementing security controls.

Keeping a formalized policy for exceptions and waivers to the C&A process is a quality example and a quick win. Designing alternatives to the full C&A process for popular categories of specialized systems goes a step further by leveraging mitigation effectively and upholding the integrity of the program. Any agency or enterprise that forgoes these details will unwittingly foster a hidden cancer in the IA program that breeds inefficiency, ineffectiveness, and subversion. Applying the proper policy and guidance elements up front prevents the rogue behavior and waste that ensue when folks make up their own rules, misinterpret existing rules, weasel around for an informal waiver, or waste time and money on a risk management process that doesn’t fit their systems.

Len Marzigliano is an Information Assurance Manager with defense contractor BAM Technologies in Arlington, Virginia and a researcher for InfoSec Institute. With over 20 years experience as an IT contractor and consultant, Len has worked with hundreds of organizations and project teams in commercial, civilian federal, and defense environments worldwide. His certifications include (ISC)2 CISSP, NSA IAM/IEM, and EXIN ITIL. Len’s information security blog can be found at