I received an email from a colleague last week regarding my recent post about the BITS Shared Assessments Program. In that entry I offered my high opinion of the framework but went out of my way to point out that, by itself, the assessment is not a vendor management program. The subject line of the email was “Why not?”
Semantics aside, there’s an important distinction between assessing and managing. Assessing how a vendor conducts business within their own infrastructure is not the same as monitoring the contractual obligations and service-level agreements that govern your relationship with that vendor. That the vendor has an information security program is a good thing; that the vendor’s information security program supports what regulations require you to do is a better thing.
The Shared Assessments Program does a great job of providing a consistent set of measurements by which every vendor can be assessed. But it does not offer a determination as to whether or not what the vendor does is sufficient for your own needs or purposes. It’s still incumbent upon your institution to form that opinion and act accordingly. And even so, that’s still just one piece of the vendor management puzzle.
FFIEC guidance breaks vendor management out into several separate and distinct parts, with the assessment piece being only one element of the ongoing monitoring phase. Assuming you’ve conducted the necessary and expected steps to enter into a contract with a vendor, you still need to review the vendor’s performance against specific contractual obligations and measure it against the various elements of the service-level agreement. And this needs to occur annually, with a determination as to whether the contract is being adhered to and, if not, what remedial steps are required to continue forward with the vendor. The Shared Assessments Program doesn’t do that for you.
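As one hedged illustration of that annual cadence, the overdue-review check might be sketched as follows. The vendor names, dates, and the `overdue` helper are all hypothetical; they aren’t part of the FFIEC guidance itself.

```python
# A minimal sketch of an annual contract/SLA review check, assuming a
# simple mapping of vendor name to last review date. Everything here
# (names, dates, the one-year threshold expressed as 365 days) is
# illustrative only.
from datetime import date, timedelta

last_reviewed = {
    "Core processor": date(2008, 11, 2),
    "Item imaging": date(2009, 2, 14),
}

def overdue(today: date, reviews: dict) -> list:
    """Return vendors whose last review is more than a year old."""
    cutoff = today - timedelta(days=365)
    return sorted(name for name, d in reviews.items() if d < cutoff)

print(overdue(date(2009, 12, 1), last_reviewed))  # -> ['Core processor']
```

The point of the sketch is simply that the monitoring phase is a recurring obligation with a due date, not a one-time assessment.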
One of my reasons for beating this drum is the popular misconception circulating in the marketplace as to what’s needed to address vendor management. There are people I’ve worked with who offer themselves as industry experts, pushing the Shared Assessments approach on even the smallest financial institutions and trying to convince them that they need one completed for all of their high-risk vendors in order to appease the examiners. This is simply not true and is really, in my opinion, just a ploy to generate revenue within an industry that’s hyper-sensitive to regulatory scrutiny. What’s basically required is that the vendor, where applicable, provide its customers with something akin to a SAS 70 report demonstrating that its infrastructure is properly secured and managed. And that’s the point where I’m predicting the Shared Assessments framework will become the standard: that it will replace SAS 70s and generic audit/assessment programs as the truest way to measure a technology service provider. And again, this represents only a portion of the work required to address vendor management. But rest assured, if all you have to show your examiner/regulator/auditor the next time they ask to see your vendor management program is a few recently completed shared assessment reports, you’re asking for trouble.
Which leads to another reason for beating this drum: the Shared Assessments Program applies to only a small percentage of vendors. When reviewing my clients’ vendor management programs I’m often confronted with “high risk vendor” logic in which some arbitrary algorithm is applied to determine which vendors are included in and which are excluded from the program. There’s almost no evidence of this assessment having been conducted, and it never holds up under scrutiny. But with regard to which vendors should be required to provide an external and independent assessment of their infrastructure, I could easily make the case for limiting it to only “high risk vendors.” As a matter of fact, I can even offer a viable rule by which to make that determination. Does the vendor process, store or transmit non-public, personal information (NPPI) within their infrastructure? If the answer is “yes,” demand a SAS 70 or equivalent. Otherwise you’re free to decide your threshold for pain and make the rules accordingly. But rest assured, if you have recently completed shared assessment reports for your high-risk vendors to show your examiner/regulator/auditor as part of your vendor management program the next time they ask, you’re in for a lot less trouble.
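To make that rule concrete, here’s a minimal sketch. The `Vendor` record and the `require_independent_assessment` function are my own invention for illustration, not part of any regulation or of the Shared Assessments Program.

```python
# Illustrative sketch of the NPPI rule described above: if a vendor
# processes, stores or transmits non-public personal information,
# require a SAS 70 (or equivalent) independent assessment.
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    handles_nppi: bool  # processes, stores, or transmits NPPI

def require_independent_assessment(vendor: Vendor) -> bool:
    """Return True when the vendor should provide a SAS 70 or equivalent."""
    return vendor.handles_nppi

vendors = [
    Vendor("Core processor", handles_nppi=True),
    Vendor("Landscaping service", handles_nppi=False),
]
in_scope = [v.name for v in vendors if require_independent_assessment(v)]
print(in_scope)  # -> ['Core processor']
```

The virtue of a rule this simple is exactly what the paragraph argues: it’s defensible under scrutiny, unlike an arbitrary scoring algorithm nobody can reproduce.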
I’m a fan of diversification. Professionally and personally I strive to mix and match and switch things around to avoid falling into a rut and to keep things fresh; I’m hopeful the contents of my blog reflect that. And yet here I go again about PCI, simply because one of my current projects has pulled it into scope and dropped it dead center on my radar.
I participated in a conversation with a client late last week in which the standard was discussed, and I’ve thought about little else since. Why the dramatics, you ask? Because it brought to light another issue with the beleaguered framework that, for me, comes at a time when it’s already teetering on the edge of credibility.
For all but Level 1 candidates, PCI allows an organization to conduct its own self-assessment and determine whether or not it’s compliant. So you don’t really know if the quality of the assessment is up to snuff, and even beyond that you don’t really know if it covers enough of the landscape to be truly meaningful. Compounding the issues around the self-assessment process is that the people typically responsible for PCI are focused primarily on trying to achieve compliance, as if that by itself were the goal. For them it’s a series of steps to go through and, if no significant issues are noted, they can lay claim to the grand prize commonly referred to as being PCI compliant. But is that really what was intended when the credit card companies formed the PCI Council?
If you go to the website (https://www.pcisecuritystandards.org) one of the first things that you notice is where it states that the “PCI Security Standards Council’s mission is to enhance payment account data security by driving education and awareness of the PCI Security Standards.” Where is the awareness when only a select few within a company are even aware of what is required by the standard? And how aware are even they if the focus is on doing what’s necessary to “pass” rather than working to push the related controls deeper into the infrastructure?
It’s not just that so many in-scope organizations fail to apply the standard properly (e.g. TJ Maxx, Hannaford, Heartland, etc.), it’s that they don’t see it as the blueprint that it’s intended to be. Being PCI compliant simply because a small (but representative) subset of your infrastructure passed the test is not the goal, having the proper controls effectively deployed across the entire infrastructure is. The standard was created so that you know where you need to be looking, what you need to be looking at and what the people within your organization need to be aware of in order to support the standard. It truly is intended to be more about education and awareness and less about acing an exam.
I sliced into Visa a few weeks back for their finger-pointing posture related to Heartland, and in the past I’ve been known to criticize the PCI Council as well for similar reasons. Stop playing the blame game! Go back to the beginning, back to the basics, and remember that the purpose of the exercise is to provide a basic set of controls required, at a minimum, to protect card member data. It’s up to each organization to make adjustments that reflect its own unique infrastructure. And ultimately that infrastructure shouldn’t be measured by PCI-DSS but rather viewed through its lens.
The credit card companies and their PCI Council should be incenting businesses toward a time and place where they can claim they manage in a style consistent with the basic tenets of PCI, and away from striving to achieve a point-in-time designation whose value expires the moment anything within their infrastructure changes. To quote Michael Douglas in the movie “The American President”: “We have serious problems to solve, and we need serious people to solve them.” We need those in positions of influence not to pull back on PCI but rather to push forward, make the necessary course corrections and continue getting the word out and providing support to those who need it. When the next breach is made public (because it’s already happened, only we don’t know about it yet) I want to see the people in charge come out and tell us they’re committed to understanding what happened, why it happened and how they can adjust PCI to help reduce the risk of it happening again.
And really, in the end, what I want personally is the chance to obsess over something else and send PCI to the back of the blogging line.
About thirty seconds after I posted my last blog, an item on the SearchFinancialSecurity.com homepage caught my eye. It was an interview conducted by Marcia Savage with Michelle Edson and Charlie Miller of the Santa Fe Group about the Shared Assessments Program.
For those of you who aren’t already familiar with it, the Shared Assessments Program is a framework for assessing third-party service providers that has been gaining popularity over the past few years. Created by BITS, “a non-profit industry consortium whose members are 100 of the largest financial institutions in the United States,” it’s fast becoming synonymous with vendor management. I’m hard pressed to recall a recent conversation with someone in the industry where the subject of vendor management came up without a reference to Shared Assessments.
I’ve grown fond of saying that what CobIT became to SOX, the Shared Assessments Program is becoming to vendor management. Now, I’m an experienced hand with vendor management, and some might even consider me an expert (though it’s not for me to say), and I’m hard-pressed to think of a better framework or approach to use when actually trying to determine what controls are in place and functioning at someone else’s company. Conceptually it presents itself like a SAS 70 process, but unlike a SAS 70 it has clear, concise and repeatable steps that remove any ambiguity from the process. While it’s certainly true that the results are only as good as the people using it, the Shared Assessments approach at least serves as a relevant and comprehensive baseline.
I’m not going to offer a deep dive into the components of the program; feel free to check it out yourself. What I will tell you is why you should be at least familiar with it and likely be using it for your own purposes.
First, it covers everything you’d want or need to cover within the virtual four walls of any company. Take a look at what’s covered via the FFIEC guidance and related handbooks, take a look at any reasonable IT general controls audit program, read any SAS 70 report done for a technology service provider, and map it back to the related Shared Assessments Program elements; it’s all in there.
Second, the language is clear and concise with almost no room for misinterpretation (I say “almost” only because I’ve learned never to underestimate or overestimate people’s ability to complicate simple things). Anyone can pick up the templates and start using them immediately with no direction or instruction.
Third, it’s great to use as a self-assessment guideline. If you work for any organization in any business vertical and want to quickly get a snapshot of where your infrastructure is in terms of controls and their related activities, open up the Standard Information Gathering questionnaire (SIG spreadsheet) and use the Lite version. As a matter of fact, pass out copies to stakeholders in other areas of your infrastructure, ask them to fill them out, and see what things look like from multiple perspectives.
Fourth, if you’re in an industry with strict regulatory oversight, particularly within the banking sector, this will help you standardize not only what information you’ll need from your vendors but also what you want to be able to share with those external to your own organization. When the examiners or external auditors show up to conduct their work and find that you’re using the Shared Assessments Program to measure and test yourselves, it should engender confidence and reduce the amount of time necessary for them to conduct their own fieldwork. That’s sort of what CobIT did during the early and insane days of SOX. When the auditors showed up and discovered that you documented your controls so that they aligned with CobIT, they tended to ease up a bit and place a greater dependency on management testing, thus reducing time and (billable) expenses. This is a similar opportunity, and in this economy who can easily ignore the chance to potentially lower costs?
To be clear, Shared Assessments is not a vendor management program, it’s part of one. You still have to conduct all of the other related activities involved (e.g. due diligence, contract compliance, etc.). But for that all-important element where you need to obtain proof that the necessary controls are in place and functioning effectively at your third-party service providers (HEY BANKING COMMUNITY, PAY ATTENTION ‘CAUSE THIS IS IMPORTANT FOR GLBA) this is what you should require the vendor to be using.
Oh, and did I mention it’s free?
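The multiple-perspectives idea behind the SIG Lite suggestion above can be sketched in a few lines. The question IDs, respondents and answers below are invented (a real SIG questionnaire is far larger); the point is simply to surface questions where stakeholders disagree.

```python
# A minimal sketch of comparing self-assessment answers gathered from
# several stakeholders. All data here is hypothetical; only the idea
# (flag questions with conflicting answers) matters.
from collections import defaultdict

responses = {
    "IT Ops":   {"A.1": "yes", "B.3": "yes", "C.7": "no"},
    "Security": {"A.1": "yes", "B.3": "no",  "C.7": "no"},
    "HR":       {"A.1": "yes", "B.3": "no",  "C.7": "yes"},
}

# Group answers by question ID across all respondents.
by_question = defaultdict(set)
for answers in responses.values():
    for qid, answer in answers.items():
        by_question[qid].add(answer)

# Any question with more than one distinct answer is worth a conversation.
disagreements = sorted(q for q, a in by_question.items() if len(a) > 1)
print(disagreements)  # -> ['B.3', 'C.7']
```

Where everyone agrees, you have a snapshot; where they don’t, you’ve found exactly the kind of gap the self-assessment exercise is meant to expose.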
I once heard a parent say that they wished they had a dollar for every time their teen-aged child rolled their eyes at them. I’m a parent so I get it. But what I really wish for is to have a dollar for every time a client rolls their eyes at me when I tell them they need to have all their policies and procedures documented.
It happened twice last week, two different clients in two different states. And it was a slow week.
A corporate lifetime ago policies and procedures were a nuisance put in place by management as a way to standardize business practices and attempt to use a single set of rules for everything everywhere they did business. And it was a drag. I have clear memories of my formative years on Wall Street with a seemingly endless row of binders on my cubicle shelf that appeared best suited to gather dust rather than provide anyone direction because in the end, well, all they did was gather dust. So the irony isn’t lost on me that here I am a decade or two later standing on my soapbox explaining why having things documented is a good thing.
Twenty years ago there really weren’t enforceable regulatory standards such as SOX or GLBA. Frameworks and assessment guidelines such as CobIT and NIST and ISO 17799 were either in their infancy or not yet developed. And so outside of a very few pockets of industry there wasn’t a whole lot of good reason to have to put down on paper what you did, why you did it and how you got it done. Sure there were the auditors that came around every now and again but things were simpler in those days and much of what they needed could either be found in the occasional dusty binder or grabbed from the data center operations library.
Today we live in a different world. There are a seemingly endless number of regulations in place that are tested monthly, quarterly, semi-annually and annually. There are rules as to how you must configure your network, your applications and your data (electronic and hard-copy), and how you must secure your facilities, your desktops, your laptops and your handhelds. The only thing left is the kitchen sink, and technically even that’s covered if the kitchen is located within the secured perimeter of a data center. The amount of work that must be done to be in compliance, to properly configure and secure your infrastructure, is maddening. And so on top of all that work you’re now being told that doing the right things isn’t enough; you also need to document what you’re doing as well.
And so I get that eye-rolling look, which is often accompanied by the question, “Why do I have to document everything I do even if I can prove I’m doing the right things?”
I’ll tell you why: examiners and auditors are human. Some are smart and savvy humans, some are sensible and knowledgeable humans and some are just humans. Or rather, not everyone does their work the same way. The only way to ensure you get credit for doing the right things is to document what you’re doing, so that anyone coming in and trying to gain an understanding of how things run within your four walls has it laid out for them. Here’s the problem: if you let an examiner/auditor wander logically and physically through your infrastructure, they’re going to look for what they’d expect to find, leaving you and your organization open to greater scrutiny. They pull out lists that include everything they could ever hope, expect or dream to find and start asking for items on that list. If instead you give them your road map, explaining at the policy level what your organization is committed to doing, followed by supporting procedures breaking out in detail exactly how those policies are supported, you’re paving the path to be followed. You get to steer the examiner in the direction you want them to go, the direction your organization follows.
Last week I had one client operating under two regulatory frameworks, another operating under three frameworks plus PCI; that’s a whole lot of audit activity to have to deal with. Do you really want to have to repeatedly answer the same questions, conduct the same walk-throughs and explain yourself over and over and over again? Wouldn’t it simplify your life if you set aside the time to document everything so that anyone can walk in, be handed the (gulp) binders and figure out for themselves how things work within your world?
I’ll admit, this concept is a bit self-serving though sincere. If everyone had their documentation in order my job would be that much easier when I’m conducting the fieldwork. But if you knew what I looked for and often found you’d also see where you’d benefit; I’m a former technologist who used to break every rule in the book and figure out how to circumvent every control that was thrown at me and so I’m the last person you’d want left up to his own devices while conducting an audit.
Oh and one more reason why you should do it; GLBA and SOX both require you to do so, so there!
Let me kick this off by clearly stating that I have never met Adrian Phillips, Visa International’s Deputy Chief Enterprise Risk Officer and Regional Head of Risk for North America. As a matter of fact, I had never even heard the name until earlier this month. I know so little about Mr. Phillips that until this morning I even thought “he” was a “she” based on the name alone (with all due apologies to Mr. Zmed and Mr. Dantley).
But here I am, barely one month later, and I can’t seem to shake this guy. He’s being quoted all over my working world on websites, in print articles and even in the mainstream media. And his comments and quotes all sound eerily similar, as if they’re rehearsed.
I first crossed Mr. Phillips’s path in an online article in which he defended the PCI standard in the matter of the Heartland breach. He talked about how the PCI-DSS standard “didn’t fail,” and how it’s “been largely touted as one of the best tools to protect cardholder data and fight breaches.” He then offered this caveat: “The reality is that fighting payment fraud is complex and multidimensional; there’s simply no single solution to make fraud go away.”
I liked the rhetoric, to be honest. No standard is by itself the solution to the problem it’s aligned against. I’ve worked with all of them, from SOX to HIPAA to GLBA and, yes, even PCI. All of them have merit and none of them actually solve the problems they are designed to address. They are all a good place to start in terms of scoping out what to do, but in the end it’s the same as handing someone the blueprints to a house while offering no supplies, no tools and no real guidance, and expecting them to actually build the house. So for Mr. Phillips this was a nice, confident step forward.
Then, further into the same article, he suddenly and very subtly switched gears when he said, “no compromised entity to date has been found to be in compliance with PCI DSS at the time of the breach. In all cases, forensic investigations have concluded that compliance deficiencies have been a major contributor to the breach.” In other words, it wasn’t that the PCI standard failed; it just wasn’t implemented properly, and thus the breach. Interesting point, Mr. Phillips, though a step backwards and away from the real issues.
Upon examination, this assertion sort of falls apart. Heartland was PCI-compliant and certified at the time the breach occurred. As we all know, a business entity cannot certify itself and must rely upon the opinion of an independent, PCI-approved entity to do so. So Heartland can’t be held responsible if in fact there were issues that went unnoticed and/or undocumented. Why wasn’t that part of the rhetoric being offered? And who was the certifying firm for Heartland, and what’s happening to them? Although I’m willing to bet they did their job exactly as they’re supposed to, they still certified a company that Visa has since determined was out of compliance. To me it came across as a bit of an avoidance strategy on Mr. Phillips’s and Visa’s part, and so a step to the wrong side of the issue.
Then he wrapped up the article by pointing out that “there’s no silver bullet when it comes to protecting consumer data” and that “as criminals get better at what they do, our efforts to stop them must keep pace.” Excellent points, both. And really, in the end, the true key to all of this. Every decent cybercriminal out there clings tight to the belief that when one door closes another one opens, and so they go and find it. The operative word for any manner of security is vigilance; without it you’re doomed. And for Mr. Phillips this was a step to the right side of the issue, thus bringing us back to where we first started.
Let’s recap the way this worked: step forward, step back, step to the wrong (left), step to the right. Put this to some techno-beat driven music and I think we may have a suitable replacement to the Cha-Cha Slide line dance.
Sadly, what Mr. Phillips and Visa need to do is come straight out and say what everyone already knows: that the PCI standard, though applied correctly according to its own rules, failed to detect and prevent this breach from occurring. It’s a poorly held secret that those in the field who conduct PCI work are the most aware of how flawed the standard is in terms of providing reasonable assurance that the appropriate controls are in place and functioning as expected. Despite that, it’s still a viable framework; it just needs to be pushed further and deeper into the infrastructure, and the sampling requirements need to expand to be more than just “representative.” Yeah, I know, this isn’t going to be cheap and it’s certainly not going to be easy, but it’s either that or we keep waiting for the next breach (and the next and the next and, and, and…).
Honestly, all we have at this point in the lifespan of the PCI standard is a good idea, huge residual risk to our credit card data and a snappy little dance routine. I think we can do better; I hope we can do better.
With all due respect to the brilliant novelist Charles Dickens, as I was sitting down to write this, my first blog post for TechTarget, I came to think of one of the most brilliant openings to a novel perhaps ever.
“It was the best of times; it was the worst of times.”
For in the space that I ply my trade there is no more accurate line to describe where it is we are.
As a full-time practitioner, a sometimes commentator and a perpetual student of regulatory compliance these are the very best of times. We have historic conditions in the banking industry, epic failures on Wall Street, record-setting breaches in information security, the emergence of new regulations and the growing likelihood of even more to follow.
As a provider for my family, a home owner, a business owner and a consumer these are the very worst of times. The economy has collapsed from right underneath our feet, the value of my home has dropped, companies in all business verticals are either going out of business or freezing any manner of spending and my ability to obtain credit has been severely stunted.
But there’s a silver lining to all of this. When our country struggled through the Great Depression eighty years ago, no one really knew it was a “Great” anything. Times just grew remarkably tough in record time, and by the time the first cloud of dust settled there was a stunned nation hanging on by a thread. It was only in hindsight, many years into the future, that any meaningful perspective was formed, lessons learned and systems either modified or created to protect ourselves from it ever happening again. This time, though, is different, and thus the silver lining.
We know this is an unprecedented period in our history. We know there were failures, we know there are ways to restrict the damage we suffer, we know there is valuable information to be gleaned from analyzing and understanding what got us here and we’re determined to act on all of this information and insight. I’m fond of telling my children that no mistake is completely regrettable if a lesson is learned and a rule developed to avoid making that same mistake again. And so here I am with my fingers crossed hoping that we’re collectively smart enough to do just that.
One thing is for certain as I begin sharing my stories and observations from the field: there will be no shortage of topics to discuss. For when you mix the best of times and the worst of times together, you end up with the most interesting of times.
Check back next week when I discuss the latest dance craze to hit the scene, the “Visa PCI Shuffle”.