Regulatory Reality


September 1, 2009  3:29 PM

IT audits versus reviews

Profile: David Schneier

I mentioned in my last post a recent conversation with my partner regarding a proposed IT general controls (ITGC) audit. My primary role in our practice is to head up regulatory compliance services, which includes audits, assessments and program development; my partner’s primary role is head of sales and business strategy. However, there’s a significant amount of overlap between our two sides, and I sometimes forget that I’m the compliance expert when we’re discussing the industry. His knowledge of the myriad regulations is impressive, and there are times when I’ll vet ideas through him to validate my own thinking.

Anyway, he’d mentioned a conversation with the client around the proposed ITGC audit in which the project sponsor asked what the difference is between an audit and a review. I knew right away where the question stemmed from because of my experience in the industry. Many firms we compete with (and some I’ve even worked for in another time and place) don’t conduct audits, they conduct reviews. Sometimes they don’t even conduct a review or an audit but rather an assessment. I’ve struggled with this blurred use of terms because in my mind there are very clear delineations. The lifecycle of the governance, risk and compliance (GRC) domain goes like this: identify and assess risk, design controls to mitigate those risks, and test to validate that those controls are functioning as expected. And so a risk assessment is conducted, governance elements are introduced in the form of policies and procedures, and regularly scheduled tests occur to make sure that the whole enchilada is actually getting the job done. So when practitioners offer services that aren’t specific to one of the key areas of the GRC spectrum, it bothers me.

See, an audit is an audit is an audit: you determine the control objectives that the entity is supposed to support and that are in scope for the type of audit, identify the related control activities that either are or should be in place, and then design a series of test steps to determine whether those activities are occurring and tie back to the overall objective. The auditor has some leeway when it comes to offering an opinion as to what the results actually mean (one auditor’s pass is oftentimes another auditor’s finding), but fundamentally the audit results tend to be binary: you either pass or fail. It’s all fairly straightforward.
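
To make that structure concrete, here’s a minimal sketch in Python of the objective-activity-test hierarchy just described. All of the class names, the example objective and the test steps are invented for illustration; this doesn’t reflect any particular firm’s methodology:

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    description: str
    passed: bool  # audit results tend to be binary

@dataclass
class ControlActivity:
    name: str
    test_steps: list = field(default_factory=list)

    def effective(self) -> bool:
        # An activity holds up only if every test step tied to it passes.
        return all(step.passed for step in self.test_steps)

@dataclass
class ControlObjective:
    statement: str
    activities: list = field(default_factory=list)

    def satisfied(self) -> bool:
        # The objective is met when all supporting activities test clean.
        return all(a.effective() for a in self.activities)

# Hypothetical example: a change-management objective with one activity.
objective = ControlObjective(
    statement="Changes to production are authorized and tracked",
    activities=[
        ControlActivity(
            name="Change tickets approved before deployment",
            test_steps=[
                TestStep("Sample 25 changes; verify approval predates deploy", passed=True),
                TestStep("Verify emergency changes have retroactive approval", passed=False),
            ],
        )
    ],
)
print("Pass" if objective.satisfied() else "Fail")  # -> Fail
```

The point of the shape is the roll-up: a single failed test step surfaces all the way to the objective, which is why the results read as pass or fail.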

Now a risk assessment is not an audit; it’s a bit more arbitrary. Management is generally polled to determine what areas of the infrastructure (including finance, operations and technology) concern them most, regulatory and industry requirements are factored in, and a plan for conducting the risk assessments is drawn up. As these assessments occur, what they reveal is factored into the overall audit plan to make certain that the areas presenting the greatest risk to the business are examined closely and in a timely fashion.
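
As a rough sketch of how assessment output might feed the audit plan, here’s a toy ranking in Python. The areas and the simple likelihood-times-impact scoring are my own invention for illustration, not anything prescribed by a regulator:

```python
# Hypothetical risk ratings gathered by polling management (1-5 scales).
areas = {
    "wire transfer operations": {"likelihood": 4, "impact": 5},
    "branch physical security": {"likelihood": 2, "impact": 3},
    "core banking system access": {"likelihood": 3, "impact": 5},
    "marketing website": {"likelihood": 3, "impact": 1},
}

# Rank by likelihood x impact; the highest-risk areas get audited first.
plan = sorted(areas, key=lambda a: areas[a]["likelihood"] * areas[a]["impact"],
              reverse=True)
for priority, area in enumerate(plan, start=1):
    score = areas[area]["likelihood"] * areas[area]["impact"]
    print(f"{priority}. {area} (score {score})")
```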

So here’s my question: what exactly is a review? If you’re conducting tests and examining evidence, you’re conducting an audit. If you’re interviewing stakeholders and determining what controls are in place but not testing their effectiveness, then you’re conducting a risk assessment (assuming you’re asking the right sort of questions – a post for another day). Is there some odd dimension between the risk assessment and the audit I’m unaware of where this review occurs? And what exactly are you expecting from the results of this review? Because I’ll tell you this much: examiners only recognize risk assessments and audits. You can present them with a report that has “review” in the title and tell them it’s actually an audit, but you’d better be prepared to produce work papers because they’re going to ask you for them, trust me on this point. In my experience, though, reviews don’t produce work papers because evidence is generally not formally collected in support of the report, which is also why a review requires significantly less effort to conduct. Which is why, when we discussed the semantics of our industry, my partner is fond of saying that the difference between an audit and a review is about 50% of the proposed amount.

And as long as I’m ranting on the topic, how many of you out there include work papers as a deliverable when you contract with external entities to conduct audits? I’ve long been amazed at how many audit projects I’ve managed where work papers weren’t required or provided because they weren’t included in the statement of work. I’m of the opinion that a summary audit report by itself is useless without the supporting documentation, because that documentation is the only true way to confirm that the testing occurred and to support the findings. And of course it’s important to loop back to my earlier comment: examiners will absolutely ask you for the work papers. I routinely receive phone calls from clients I’ve worked with at my previous firms asking if I still have access to their work papers because their examiner is asking for them. It’s awful for me because I have to tell them that while the evidence is still available (I keep everything for seven years for fieldwork I’ve conducted), there are no formal work papers to provide. That’s the reason that when we started our firm and developed our methodologies, I made certain that part of the scoping process included asking the client whether work papers were to be included in the deliverables. It adds time to the project and so there’s a cost implication, but it’s the right way to run an audit and so really a need-to-have. You can take a shortcut where applicable and only document findings, but even so, how can you prove that a successful test was actually successful? For those practitioners looking for shortcuts it provides the wrong incentive.
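
To sketch why the summary alone proves nothing, here’s an invented work-paper record layout in Python that ties each test step to its evidence. The ticket IDs and file names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class WorkPaper:
    test_step: str
    evidence_refs: list  # e.g., ticket IDs, report extracts, screenshots
    result: str          # "pass" or "finding"

papers = [
    WorkPaper("Verify terminated users removed within 24 hours",
              ["HR-roster-2009-07.xls", "AD-export-2009-07-15.csv"], "pass"),
    WorkPaper("Verify firewall rule changes are approved",
              ["CHG-1042", "CHG-1077"], "finding"),
]

# The summary report is just a roll-up; an examiner can walk any line
# item back to its evidence. Without the papers, a "pass" is an assertion.
for wp in papers:
    print(f"{wp.result.upper():8} {wp.test_step} <- {', '.join(wp.evidence_refs)}")
```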

If you want/need an audit, schedule an audit. If you want/need a risk assessment, schedule a risk assessment. If you’re considering a review or a generic assessment, reconsider; decide which of the two proper categories your work fits into and then schedule accordingly. This isn’t rocket science; the FDIC and NCUA have been quite clear as to what you’re required to do, so keep it simple. And if the resources you’re using aren’t speaking in the common audit and compliance vernacular, force their hand and make them do so. Audit or risk assess, pick one.

August 18, 2009  8:05 PM

Is perspective on our regulatory landscape a blessing or a curse?

Profile: David Schneier

I was away from the office last week trying to squeeze in a family vacation before the kids head back to school. Despite taking the occasional phone call and replying to a number of emails, there was still plenty waiting for me today when I returned to my normal schedule.

It wasn’t until somewhere mid-morning, after catching up with my partner, that the incongruity of my professional life was revealed in an odd pattern. I’d read about a number of bank closings having been announced on Friday (sort of becoming a weekly ritual at this point) and two newly reported credit card breaches (also fast becoming a same old, same old scenario) by the time I called into the office to touch base. Turns out we had a busy week beyond what I already knew about, and we were discussing one proposal in particular to conduct an IT general controls audit (more on that in a few weeks) when the strangeness of the morning finally dawned on me.

Everyone is still working on trying to keep up with their regulatory compliance obligations, companies that participate in credit card processing are still pushing to obtain or maintain PCI compliance, and it just doesn’t seem to be making much of a difference. Despite our practice being busier than ever and a heightened sense of regulatory awareness out on the street, there’s a general lack of evidence that any of it is working.

I’ve already beaten the PCI horse to death with regard to how the PCI-DSS by itself does not really go far enough (nor was it intended to be a be-all, end-all solution). I’ve long griped about how so much of what matters is missed by regulators due to too few budgeted hours and a lack of appropriately skilled and trained resources. So really, nothing new about any of this.

But still, with a reasonably fresh perspective and a clear head on this, my first day back to reality, it all seems that much more… I’m not sure what the right word would be: depressing? Frustrating? Baffling?

How important can GLBA compliance be to a bank that’s just about out of financial options and on the verge of closing? And really, how much money should a company spend to be PCI compliant if that compliance doesn’t go far enough to actually mitigate the associated risks? I was just reading a story about how Intel turned things around in the 1980s because its two most senior executives (Andy Grove and Gordon Moore) got together, stepped outside of their roles and imagined what someone new, with a fresh perspective, would do with their company to address increasing competition and decreasing market share. Forcing themselves to obtain that perspective led the way to a change in direction that would transform not only Intel’s fortunes but drive an entire industry into the future. So why can’t we do something similar for our financial institutions?

The short answer is that we can but it would require an act of bipartisan politics typically only observed during a true crisis such as acts of war and natural disasters. Of course it wouldn’t be too hard to make the argument that our banking crisis is a disaster, man-made or otherwise, but somehow when one party can blame the other there’s little chance of forging a common peace even if it benefits the citizens.

I’ll likely lose this perspective as the week moves ahead and get back to less of the “Big Picture” thinking and more of the nuts and bolts focus typically required of me, but still, I’m hoping someone, somewhere is reading this and thinking I’m right.


August 8, 2009  3:31 AM

How to combat the insider threat

Profile: David Schneier

I was reading an article last week about a recent increase in the number of reported security breaches caused by insiders.  The insider threat is not a new one; corporate espionage is as old as civilization. But it certainly is getting more press lately as patterns shift and criminal activity adapts with the times.

But what was really eye-opening for me was the conversation I had later that day while onsite at a client.  I was sharing some of the details of the story with an associate, and the person sitting across the aisle was one of those freaky smart network people: the sort who speaks IP as well as English, who can read a network topology as though it’s the morning paper and who is often consumed with getting every component plugged into his (HIS) network configured and secured perfectly, much like Claude Monet wanted to get the tone and dimensions of every flower just right on the canvas.  When he heard what we were talking about, he perked up and joined in on the conversation.

I was sharing my amazement at how technology has made it so much easier to collect sensitive information in unobtrusive ways.  One of my favorites was the availability of an audio recording device embedded within a network cable.  It has a transmitter built into it that broadcasts up to 160 feet away, so someone can record boardroom conversations with relative ease.  And of course it not only looks like a network cable, it is a network cable and functions normally, so there’d be no reason to be suspicious.  I’ve long advocated outside-the-box thinking regarding how foreign devices can (and do) circumvent the very best security controls, and here was a great example as to why.  So the network guy chimed in that there are so many ways to gain access to sensitive information without detection.  He described how there are many vulnerabilities that none of the frameworks or regulations come close to addressing (I’m not offering examples; no need to give anyone any ideas).  But what struck me was how impassioned and concerned he was in discussing this subject.  It occurred to me that he couldn’t just turn this off; that this was something he had on his mind somewhere close to all of the time.  And so I asked him if he ever shared this during any of the various risk assessment activities conducted during the year, and he couldn’t recall being included or asked.

How do you support activities focused on protecting your infrastructure without including your experts in the dialogue?  How do you know where the risks really are if you’re not asking the people who are charged with the responsibility of mitigating them?

I’m a bit biased based on my own personal experience.  I came into the audit and compliance domain in the mid-90s in a somewhat less than flattering way.  I had been an application project manager with a well-earned reputation for breaking just about every change management control in place.  When I had something that needed to get moved into production, I lacked the patience to work through the required processes and figured out how to get around almost all of them.  The architect of many of those controls was himself an auditor (and still one of the very best I’ve worked with), and he and I forged a healthy respect for one another along the way.  When Y2K rose to prominence and the client was concerned about maintaining their remediated production environment, I was recruited by the same person to help guard the gate, sort of like hiring the bank robber to protect the bank.  But it turned out to be a great idea: I sniffed out a number of undocumented backdoors and workarounds that were being used, which ultimately led to my then-new and now-current career path.  And it also proved to be an important lesson that’s served me well all these years: if you want to find out what you need to be worried about, ask the people who have the necessary skills to make you worry.

If you want to identify as many vulnerabilities and potential exposures to your infrastructure from the insider threat as possible, start by engaging the people who can speak IP like a second language, who can read a network topology like it’s a treasure map and who know they can easily download an entire database without detection or insert devices in the data center that capture non-tokenized, unencrypted data streams as they travel through the pipelines.  I’ve always found that for the most part these people are eager to share what they know because it’s their best chance to effect any sort of change.  While conducting an IT general controls audit recently, I had someone coax me into asking a question about the physical design of their data center because there was a structural issue that deeply concerned him and he was desperate to see it show up on my report.

Times are tough, people are desperate and there’s no telling what some people are capable of or willing to do in order to survive.  Make sure you’re doing what you can to protect yourself and your customers’ sensitive information, and make sure you’re including the right people in the conversation.  The best defense against the insider threat may very well be the people sitting alongside them.


July 30, 2009  6:26 PM

Reports: MasterCard institutes new PCI fines

Profile: Marcia Savage

MasterCard apparently is continuing to up the ante when it comes to PCI compliance.  There are reports this week that the company has instituted new fines for merchants that are non-compliant with the PCI Data Security Standard. Branden Williams, PCI practice director at VeriSign, wrote about MasterCard’s new PCI fines on his blog Monday. According to Williams, MasterCard has been much quieter than Visa on the PCI enforcement front — until now. Robert Vamosi at Javelin Strategy and Research followed up with confirmation from MasterCard and some clarification on the new fines in a blog post Tuesday. Alas, I have not heard back from MasterCard on this subject.

The tougher stance on non-compliant merchants comes on the heels of MasterCard increasing PCI requirements for some merchants, including Level 2 merchants, which must now hire a PCI-approved auditor to complete an annual onsite data security assessment by Dec. 31, 2010.

Acquiring banks should likely be prepared for questions from their merchants on the new MasterCard rules.


July 27, 2009  8:56 PM

Let the FDIC lead the way!

Profile: David Schneier

I can’t think of any more telling comment about where I am in my professional life than what I’m about to offer:

Sheila Bair rocks!

If you don’t know who she is, well, shame on you.  Because over the past year or so as the banking world has been in a near free-falling, tail-spinning heap of confusion, the chairman of the Federal Deposit Insurance Corporation (FDIC) remains perhaps the only reason why we haven’t been experiencing pure panic in the banking sector.  We’ve all watched as she calmly navigates from bank failure to bank failure, never losing her composure or allowing the dire circumstances to consume her or the FDIC.  She routinely offers sound and sensible insight and perspective, framing what’s happening in the banking world and making sure that everyone knows that the FDIC continues to have our back.   From the very first publicized collapse last year (IndyMac) straight through to last week’s speech before the Senate Committee on Banking, she has proven that there’s no substitute for having the right person in the right job.

As to why I’m waving my Sheila Bair banner so vigorously this week you need only read the transcripts from her aforementioned Senate testimony last week.

She was among the very first and remains one of the very few industry leaders to rail against the idea that any financial institution is “too big to fail.”  Last week, she expanded on that considerably.  She discussed how the “notion of too big to fail creates a vicious circle that needs to be broken” or rather, “we need to end too big to fail.”  She highlighted how so much of what’s caused this nightmare stems from “the presence of significant regulatory gaps within the financial system” and followed that up by suggesting that “we need to develop a resolution regime that provides for the orderly wind-down of large, systemically important financial firms, without imposing large costs to the taxpayers.”

Wow!  I mean, like, wow!

So really what she’s saying is that if you’re, say, Citigroup or Bank of America, and you’ve managed to paint your institution into a financial corner from which you can’t legitimately escape, the only thing to do is go out of business.  Y’know, sort of like the core principles of a free market economy would dictate, or so we all believed until this past year.  None of this government bailout activity, which essentially transferred risk from for-profit institutions to us, the taxpayers, would be allowed.  You mismanage your bank, you run out of options, you close; simple and fair.

Chairman Bair further expanded on her proposal by explaining that with a resolution regime “losses would be borne by the stockholders and bondholders of the holding company, and senior management would be replaced.”  Or rather, in my own words, accountability would be enforced; those who made the decisions that caused the problem would be forced out, and those who were banking on a windfall that until now was almost guaranteed would have to accept the unfortunate risk side of their investment (no more “sure things”).  And towards that end, she suggested that “each bank holding company with subsidiaries engaged in non-banking financial activities would be required to have, under rules established by the FDIC, a resolution plan that would be annually updated and published for the benefit of market participants and other customers.”   I’ve come to think of this as a disaster recovery plan of an entirely different nature.

Think about what’s being proposed: accountability, acceptance of risk and the need to plan for all potential outcomes, favorable or otherwise.  What a concept!  And what a breath of fresh air!

Chairman Bair also offered the concept of forming a Financial Services Oversight Council that effectively “should be able to harmonize rules regarding systemic risks to serve as a floor that could be met or exceeded, as appropriate, by the primary prudential regulator.”  But wait, there’s more.  Of the council she also suggested that “primary regulators would be charged with enforcing the requirements set by the Council. However, if the primary regulators fail to act, the Council should have the authority to do so.”  This would eliminate the current design restriction in which individual oversight agencies can pursue punitive and/or corrective actions only up to a point; once their jurisdiction ends, so too does their ability to take the additional and often necessary steps to address the issues at hand.  Generally speaking, this would eliminate a number of loopholes that currently exist in the system.

I find all of this remarkably refreshing.  It’s so simple and straightforward, it’s all but impossible to reject or ignore (but I’m sure our politicians will try just the same).  And to a very large degree, these proposed changes would work, maybe not completely but certainly enough so that it would be worth our time to at least attempt implementing them.

But does everyone think so highly of Ms. Bair and her proposal?  It’s received pitifully little coverage in the press (I couldn’t find anything on two of the major news sites, and on the third it was skewed to look like partisan politics), and none of my contemporaries were even aware that she had spoken.  Frankly, I don’t understand why.

I’ll put it out there right now: If I have a vote that can be cast in support of her plan it’s hers; there’s no need to ask me twice.  And if I need to poke a senator or two from my home state to help inspire them to support her plan, someone only has to let me know and I’ll happily go call on them (at home or in DC, it’s close enough to drive).


July 17, 2009  1:58 PM

Does compliance equate to secure?

Profile: David Schneier

Despite earning a living in the space, I often question the value of regulatory compliance.

How is it that a business can be PCI-compliant but still have glaring vulnerabilities?  How is it that despite layer upon layer of controls it’s still entirely possible for an executive to fudge numbers in a spreadsheet and alter a company’s financial reports?  How is it possible that a financial institution undergoes an annual exam and, despite not adhering to the most basic tenets of FFIEC guidance, still receives a favorable report?  And how is it that there’s a regulation that made an entire industry jump all at once but has never actually been enforced (can I see a show of HIPAA hands)?

And don’t think these statements are pure hyperbole; these all come directly from the field and from engagements I’ve been on in the last few years.

Why, you may ask, am I feeling a bit down on the regs this week? A couple-three reasons:

It started on Monday when I was catching up on my industry reading.  There was an article about data leak prevention (DLP) software and how sales have been heating up lately.  Of the reasons given by survey respondents as to why they were considering purchasing a DLP solution, the top two were pretty much pointing the finger at either industry or regulatory demands.  The third reason was to avoid damage to the company brand/reputation, the fourth was to avoid lawsuits and finally, all the way down at number five on the list of reasons: to prevent the theft of proprietary information.  That’s just Depressing (note the capital “D”).  I thought it was embarrassing that the vast majority of survey respondents were looking to prevent data theft not because it was the right thing to do or to protect customers’ or employees’ sensitive data but rather because they’re being made to do so.  And so maybe you can make the case that regardless of the reason, at least companies are being forced to do something about protecting their information.  Sadly, that’s exactly my problem.  When it comes to doing things for the sake of compliance most companies only take things as far as they need to in order to achieve/maintain compliance.  The people on the front lines sort of lack enthusiasm for doing these things and figure their job ends once the auditors and examiners are happy.

My week of regulatory woe continued on Tuesday when, while reviewing key activities aligned against one of the aforementioned frameworks, I identified a potentially significant gap, not in how the client was conducting their work, but rather in what the regulation specifically required.  In other words, despite my client being completely compliant with this stringent, well-respected framework, there was still the very real possibility that a vulnerability could exist.  I dug a bit deeper, made some phone calls to associates whom I often consider to be way smarter than I, and the result was that I was right: the gap existed.  One of my associates pointed out that in a well-run shop with a hardened infrastructure you would expect the situation I identified to be managed properly, but the reality is that unless they have to, few managers have the ability to go beyond what’s required (either by the business or by regulations).  I suppose if the day ever comes when an IT department has finally cleared out its project queue and has money left in the budget, they may very well get around to it, but I’m not volunteering to hold my breath.

And finally, my week is closing with news that a former client of mine is on the financial ropes and very likely about to declare bankruptcy.  Really, in the end it’s just a sign of the times and the sad state of our economy.  They appeared to be making the necessary adjustments over the past few years by trimming back staff and scaling back on non-critical projects, but they’re a half-inch to the left of the epicenter of this whole financial mess, and in the end I guess there was no way to avoid the inevitable.  But still, I think of all the money they’ve spent on compliance-based initiatives since SOX first hit the scene and I can’t help but wonder if all of that spend could’ve been put to better use.  In the end, despite all of the great work that was done, they still weren’t going to be able to prevent someone from massaging the numbers in a spreadsheet (a personal pet peeve of mine).  Thinking about the number of people they’d brought in to size up and conduct the work to bring their controls up to the necessary levels, and the fees they’ve paid to their external auditors to conduct the SOX audits, is just plain depressing.  Maybe if they’d used that money to fund a project to offer a new product line or enhance an existing one, they’d have found additional streams of revenue that could’ve helped them through this mess.

I suppose it comes down to this: anything worth doing is worth doing right.  But in the regulatory space that’s not the general rule and I’m thinking that until the oversight bodies figure out a way to provide the proper incentives, the work will always be lacking if not deficient.  Until being compliant also means being secure the job isn’t truly getting done.

Along those lines check back next week; I have an idea I’d like to share with you about how to make things better for all of us in the regulatory domain and turn things around.


July 8, 2009  3:45 PM

How’s about a federally mandated Information Security Assessment?

Profile: David Schneier

I had a eureka moment recently that I’d like to share.

In considering the implications of the recently announced changes by MasterCard that will now require PCI Level 2 merchants to be assessed by a Qualified Security Assessor (QSA), it occurred to me that they may be onto something. Why would the credit card industry restrict who needs to be assessed based on size? Why not simply require any business entity that issues, accepts or processes credit cards to be regularly assessed against the PCI standard by a properly trained practitioner? The size factor could come into play in the frequency of these assessments, but in general everyone would need to have one conducted.

That wasn’t the eureka moment.

It wasn’t until a day or two later, while reading about newly emerging state data privacy laws, that the clouds parted and the sun shone through. With the MasterCard news kicking around in the back of my mind, I started thinking about how these state-based laws were going to come into play, and when I tried to tie all of this back to the Obama administration’s cybersecurity plan, it happened.

What if all business entities that issue, accept or process personal information, regardless of their vertical, were required to have an information security assessment conducted (think GLBA meets NERC CIP meets PCI) by a Certified Information Systems Auditor (CISA)? Think about it: ISACA could be broken up, with the subset that oversees the CISA process becoming federally chartered to both manage the framework and issue the certification (think PCI on steroids). The framework would include portions that are of the one-size-fits-all variety and others that are specific to an industry, and would be scalable based on the size of an entity. The CISA practitioners would all be trained on the framework and how to apply it properly and would need to attend agency-sponsored seminars at least annually.
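
Purely as a thought experiment on how that scalability might work, here’s a sketch in Python that composes a one-size-fits-all baseline with industry-specific modules and scales by entity size. Every control name and threshold below is invented for illustration:

```python
# Hypothetical common baseline every entity would be assessed against.
BASELINE = {"access control", "incident response", "data encryption"}

# Hypothetical industry-specific add-on modules.
INDUSTRY_MODULES = {
    "banking": {"privacy notices", "wire transfer controls"},
    "energy":  {"critical asset identification"},
    "retail":  {"cardholder data handling"},
}

def assessment_scope(industry: str, employee_count: int) -> set:
    """Common baseline plus industry-specific controls, with extra depth
    for larger entities (the size threshold is arbitrary)."""
    scope = BASELINE | INDUSTRY_MODULES.get(industry, set())
    if employee_count > 1000:
        scope |= {"dedicated security officer", "annual onsite assessment"}
    return scope

print(sorted(assessment_scope("retail", employee_count=250)))
```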

Rather than having multiple frameworks to wrestle with, business entities would be able to distill information security regulations down to a single, stronger framework (and reduce all the redundant activities that so many of my clients are forced to struggle with). It would bump the IT general controls audit up a level to encompass more than just bits and bytes and allow the entity to tie together related activities that are assessed in a single pass. And the icing on the cake is that the resulting report could also be used in place of a SAS 70 (and finally provide a modicum of consistency to the SAS 70 process as well).

But the best part of my idea is that the business entity could staff up with their own certified assessors that would not only conduct the required work, but also serve as internal advisors year-round. They’d still need to be properly certified and maintain that certification, but there would be no need to constantly pay premium prices for external firms and/or resources.

Maybe the idea was inspired by the fact that I’m just burned out a little from working on multiple compliance initiatives or maybe it stems from my concerns that true IT governance is a generation away. However, after my eureka moment and after sharing the idea with a few associates of mine I’m still liking it.

Does anyone have a direct line to the White House I can use?



July 2, 2009  2:53 AM

2 for 1 sale: How governance leads to compliance.

Profile: David Schneier

A while back I wrote about the Unified Compliance Framework from Network Frontiers, which takes quite literally every regulation and framework within the IT domain and maps them in such a way that you can identify how a single control addresses multiple requirements. In this day and age, the era of regulatory overload, with even more regulations heading our way, I consider the product an essential tool in managing the required work. However, there’s an important caveat to throw out there: the benefits of the UCF product can only be fully realized if it serves as the underpinnings of an IT governance program.

Ah yes, IT governance, a favorite topic of mine and one that’s a sure-fire way to get me to whip out my soapbox and fire up the accompanying rhetoric. I’m a practitioner first and a theorist second, and the combined perspective provided by both has forced me to become a huge advocate of governance as not only the best way to achieve regulatory compliance but perhaps the only way. I’ve reached the end of my rope when it comes to the currently popular way to pursue compliance, which is to build silos and assign each its own regulation or industry framework. How does it make sense to have, for example, two or more groups of people testing user account provisioning when a single test can be used to satisfy both? It doesn’t, and doing so wastes time, resources and money.
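
To illustrate in miniature what that consolidation looks like, here’s a sketch in Python. The control names and requirement IDs are made up, and UCF’s real data model is far richer than a dictionary:

```python
# Map each internal control to the external requirements it satisfies.
control_map = {
    "user account provisioning review": {"SOX-ITGC-4", "GLBA-501b-3", "PCI-DSS-7.1"},
    "quarterly firewall rule review":   {"PCI-DSS-1.1.6", "GLBA-501b-7"},
    "backup restore test":              {"SOX-ITGC-9"},
}

# One test of each control produces evidence for every mapped requirement,
# instead of two or three silos re-testing the same thing.
for control, requirements in control_map.items():
    print(f"Test '{control}' once -> covers {len(requirements)} requirements: "
          f"{', '.join(sorted(requirements))}")
```

The design point is the many-to-one direction of the mapping: the control, not the regulation, is the unit of work, which is what lets a single pass satisfy multiple frameworks.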

And so now I’m getting to do something about it.

My current “big” project has multiple parts. The client is managing the consolidation of two business entities, including their regulatory compliance initiatives. It’s resulted in their needing to build out a plan to merge four sets of existing regulatory compliance frameworks as well as taking over responsibility for another that’s brand new to their mix. Beyond the doubling up of the required work, it’s also resulted in a sizable new compliance team that uses headcount within an IT organization to do work that’s not really IT-specific. That’s the bad news.

The good news is that the client empowered the team responsible for managing compliance to switch to a governance approach a few years back. Rather than serve as an after-the-fact function that tests to make sure controls are working effectively, this group has served as an adviser to IT, helping strengthen controls, and has streamlined the testing process so that stakeholders pass along evidence of their daily activities, reducing the need for the typical testing-cycle fire drill that most of us know. It’s served two purposes for the IT organization: it eased their burden in the compliance process and made them more trusting of the audit and assessment function.

But in the short term, the consolidation has dramatically increased their workload and at a time when management is looking for ways to reduce expenses and get more for less. How do they proceed? How do they consolidate the related frameworks, assume oversight for the new ones and continue delivering the value and efficiencies that they’ve come to be known for? There’s only one way: by taking IT governance to the next stage of its evolution.

They already understand and practice the basic elements of IT governance, so the foundation has been laid. Now it’s time to take it to the next level; thus the tie-back to the UCF approach. If you have multiple frameworks to comply with, the commonalities to be found between them are significant. I know this based on my own research and analysis and can now prove it courtesy of UCF. The manager of the IT governance function is also a believer in this approach, and the plan is to build out a true IT governance program so that all in-scope frameworks are managed via a consolidated approach. All current and effective frameworks will be supported through the end of 2009, but along the way each control and related activity is being reviewed to identify opportunities for consolidation. Once done, all IT-based activity will be viewed through the lens of the new governance framework so that compliance is maintained and changes to the infrastructure are evaluated for any potential regulatory impact. And the best part is that all of this will likely be done with less effort, thus freeing up resources to focus on more IT-centric tasks.

Imagine that, a world where compliance is achieved through a coordinated proactive governance approach and IT resources are free to focus on technology-based activities. It’s like solving two problems for the price of one with the added benefit of actually spending less money overall.   What CIO/CTO wouldn’t like that?


June 22, 2009  3:46 PM

Financial regulations and my crystal ball.

Profile: David Schneier

I had a great piece lined up for this week about a governance project I’m working on but was waylaid by all the news that hit the radar around regulatory reform.

In what may be the understatement of the year, the plans revealed last week by President Obama and his administration to overhaul the financial regulatory domain are stunning.  They were equal parts common sense (dissolution of the Office of Thrift Supervision), politics as usual (government intervention for distressed larger institutions) and forward thinking (creation of a consumer oversight body).  But for practitioners in the regulatory space such as myself, the news was a warning that we had all better pay close attention to what’s about to happen.

The largest percentage of the work my practice does has less to do with making sure our clients are in compliance with the broad range of regulations they operate under and more to do with educating them on what that means and how best to achieve it.  The very first step our practice takes with our clients is understanding their profile, size and risks, and then setting about designing or assessing their programs based on what makes sense.  Take, for example, vendor management: not every vendor needs to be part of your vendor management program, but because so many institutions form a baseline from the vendors in their accounts payable system, they tend to add an enormous amount of work that’s just not necessary (a particular sticking point for my partner).  Regulatory compliance is not a one-size-fits-all exercise, and after nearly a decade of dealing with the regulatory alphabet soup of GLBA, SOX and PCI (in varying lengths of time), it’s amazing how little is truly understood about each framework and how best to apply their principles.

And now it’s all about to change… again.

Much like what occurred with the last major regulatory step forward with the Identity Theft – Red Flags law that went into effect in 2008, we’re going to need to work hard to get out in front and understand the new rules as they’re being rolled out.  Traditionally, much of what’s necessary to comply with any regulation already exists in large part within any organization.  The work that’s typically required is in identifying where it is and making sure it’s documented sufficiently so the work can be measured and assessed properly.   I’m sure that much of the work that’s going to result from the proposed changes will align with quite a bit of what’s already in place (or should have been in place).  But understanding the new rules is going to be a huge amount of work for those needing to comply and will require time and effort.  And all this at a time when headcount has already been thinned out and staff is working extra time to keep up with their day-to-day workload.

So for my fellow practitioners I’m putting it out there that we need to step it up too.  We need to make sure that we’re engaged in the dialogue early on and that we’re working quickly to interpret the new rules as they’re working their way through the system.  The current regulatory burden has proved to be challenge enough and with the likely musical chairs scenario that’s going to ensue as the rules shift around, it’s incumbent upon us to be prepared to ease the burden, flatten the learning curve, and help the affected institutions fall into line while keeping up with the speed of business.

The sad irony for me in all of this is that despite all the work that’s about to ensue, I’m somewhere close to certain that very little will improve as a result of the exercise.  I was looking through all of what’s been proposed, and I mapped it back to the issues I’ve encountered over the years I’ve been toiling in the regulatory space, and there’s still a gap.  The biggest problems originated from a lack of proper regulatory oversight resources, in terms of both the hours and the skills to conduct the necessary work.  You can have a strong set of rules that need to be followed, but if the people assessing your performance against those rules either don’t understand what to look for or don’t have the time to conduct the necessary steps, what’s the point?  And consider what happened in the credit union space this year where, due to the one-time assessment, many CUs fell below required reserve amounts and thus were considered to be at risk.  The NCUA instructed their examination teams to still assign an appropriately adjusted rating but to go easy on the report because there was a new normal (I’m paraphrasing a bit, but that was clearly the gist of their message).  The rules were there for a good reason and the measurements tried and true, but when circumstances called for it they were pushed to the back burner; how is that going to change?   And finally, I offer my favorite broken control and one that’s potentially at the heart of this economic crisis we’re struggling with: real estate valuation.  When I bought my last house in New York, the appraiser conducted all his required steps (physical survey, square footage, finding recent comparable sales and so on) and when all was said and done he declared the house was worth the purchase price we’d offered.  I asked our real estate agent how it happened that his appraisal and our offer were identical, and she told me that with the market so volatile it was impossible to conduct a meaningful appraisal, so they typically just went with the offer price.  How did that add any value to the process?  Will any of the new laws implement the proper checks and balances to assign accountability to lenders and their agents in the field?

Ultimately, I’m thinking the problem hasn’t been with the current regulatory rules but rather their inconsistent application and enforcement.  Regardless, change is a comin’ and it’s going to be an interesting and bumpy ride as we wend our way through it all so strap yourself in and hold on tight.


June 12, 2009  8:49 PM

Risk is at the heart of what matters most.

Profile: David Schneier

I had two great conversations this week regarding risk assessments (jeez, does that ever sound geeky).

The first conversation centered on what an associate was expecting  to accomplish via the risk assessment process and the second one was a general conversation about the proper approach to conducting one in support of PCI.  Fundamentally, at some higher level a risk assessment looks like a risk assessment regardless of its intended purpose.  But at the ground level, there are huge swings and variances between different types of assessments and how they’re conducted.

The first go-round centered on GLBA and was with a fellow practitioner I’ve worked with before on audits.  The conversation started because my associate was preparing to conduct an information security risk assessment later this month and was talking about all of the planning effort required before, during and after the fieldwork.  I commented that it sounded more like he was conducting an audit rather than an assessment; his reply: “same difference.”  And I thought to myself, “Funny, it sounded like he just said that an audit and an assessment were the same thing.”  So I asked him to clarify, and he did, stating that the only difference between the two was whether or not you needed to create work papers.  I was stunned by this shocking admission and obvious lack of understanding of our trade.  An audit tests in-place controls to determine whether they’re working as designed and are sufficient to address the related risks those controls are intended to manage or mitigate.  An assessment examines objectives or assets, identifies risks to either, and then looks to see if there are controls in place to manage those risks.  I can get more granular than that, but at its most basic level, that’s how it’s supposed to work.  Risk assessments identify risks and related control activities; audits test to see if the necessary controls are present and working properly.  He also shared with me that he uses the same general outline of questions and topics for both, that they’re both based on FFIEC guidance and that it’s exactly what the examiners are expecting.

OK, first of all, the examiners are not expecting that the same work is being done for both an audit and an assessment.  I’ve spoken with enough of them to know that plus I’ve heard several of them stand before large audiences and explain the nuances of both and so I know they get the differences.  Second of all, the entire purpose of a GLBA-based assessment is to ensure that at some acceptable frequency (which secretly means annually) each financial institution should conduct a thorough assessment of their infrastructure (not just IT, by the way) and identify and measure risks and threats to the customer data they manage and store.  If you move right past the assessment and conduct an audit you’ll never truly know if you’re missing something because you’re only testing what you can see.  I’ve participated in and conducted enough of these to know this to be true.  And if you conduct an assessment like it’s an audit, you have the same basic issue.  Thus the reason why you typically schedule an assessment first and an audit second.  I was about to explain this to my associate but decided against it; I somehow thought it would’ve fallen on deaf ears.

The second conversation was all about PCI.  This one was much easier because the team running the assessment had built a high-level approach based on guidance from the PCI Security Standards Council itself and displayed a clear understanding of what they were to do.  While they hadn’t developed any of the templates to be used (the assessment isn’t due to be conducted until next month), they were well into the planning stages.  My role was to simply listen and provide any meaningful feedback.   The one potential gap in their intended approach is that they were taking an asset-based approach to the assessment.  What that effectively means is that they only look for and attempt to measure risks around what they can see, be it a piece of equipment or a documented process.  But if there’s something that doesn’t appear on an inventory or a network diagram or exist in a binder somewhere, they’ll potentially miss it.  A great PCI for-instance: what happens when a customer service rep writes down sensitive information on a pad because their system is hanging and they’re trying to keep the call within acceptable time frames?  It’s not supposed to happen, but I’ve been in a dozen call centers over the past few years and have personally witnessed it happening in almost all of them.  If you were to rely exclusively on what’s documented, you would note that erasable whiteboards are typically to be used for such situations, and so you would consider this low risk.  But the risk increases significantly if you move past the “thing” you’re looking at and poke around a bit.  Assuming that they do use a scratch pad on occasion, what happens to that piece of paper after the transaction is completed?   Is it thrown out, and if so, is it placed in a secured bin or in the regular trash?  And so I advised the client to approach the assessment like it’s an Easter egg hunt.  Deviations and violations are always swirling about (we’re human, we make mistakes), and as part of the assessment process you should be looking for where that might be happening.  But the best part of the Easter egg approach is that it gets everyone involved in the conversation thinking a bit outside the box, and that’s where the really neat information is found.  And really, in the end, that’s what a well-run risk assessment should be all about: seeking out and measuring risk wherever it may be hiding.
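
A sketch of the difference in scope, with entries invented from the call-center example above: an asset-based register only scores what appears on the inventory or in the binder, while an “Easter egg” pass also records what you actually observe in the field:

```python
# Asset-based view: risk is scored from what's documented about each item.
documented = {
    "whiteboard for call notes": "erased after each call (low risk)",
}

# Easter-egg view: also record deviations witnessed during walkthroughs,
# even when they involve something no inventory lists.
observed = {
    "scratch pad with card numbers": "pages tossed in the regular trash (high risk)",
}

# A complete register merges both; the field observation trumps the binder.
register = {**documented, **observed}
for item, finding in register.items():
    print(f"{item}: {finding}")
```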

When the PCI conversation concluded, I was feeling a bit energized because in brainstorming with the client I had the ideas flowing freely.   And then I recalled the first conversation, and I could almost hear the screeching tires and smell the burning rubber as my creativity came to a complete halt.  It’s amazing to me how the term “risk assessment” can mean completely different things to different people.  I was as impressed with my client for asking for guidance and wanting to get it right as I was disappointed in my colleague for having it all wrong.  This wasn’t an issue of principle but rather of results; how can you properly manage this thing called risk if you don’t even know how to begin looking for it?

Speaking of risk, check back next week when I jump back onto the GRC train of thought and bring you up to speed with something I’m working on.

