Regulatory Reality


April 8, 2010  2:24 PM

Online identity theft: One victim’s story

David Schneier

Last month I blogged about a phishing attempt that landed in my inbox.  The email account belonged to someone named Rebecca Keen who I had never heard of before (or so I believed at the time).  As I was finishing writing that post, I received a follow-up email from the same person indicating that all was well, that her account was hacked and asked that no one respond to the original phishing email.  As it turned out, Rebecca Keen was actually someone in my extended network, courtesy of a PTA email thread that I was part of.  Because she used Yahoo mail and went with their default settings, all of her outbound email addresses were added to her address book and so I was one of her contacts.

Ms. Keen was kind enough to share her story with me so that I in turn could share it with you.

Her bad day started with the most basic error in judgement: She responded to a Yahoo-branded email requesting that she confirm her account information or else her account would be closed.  She said that “despite my initial instincts, I fell for it.”  It’s not hard to understand why.  Like most parents with school-age children, she has too much going on and depends on email to keep things moving; if she is anything like my wife, she is of a mind to address things as they arise.  She was a perfect target for a hacker.

Ms. Keen first became aware that she was about to have a bad day when she received an early morning phone call from a friend indicating they’d received an email from her asking for help.  She attempted to sign on to her Yahoo account to see what was going on but the hackers had changed her password and she was locked out.  She explained what happened next:

“I had to wait for Yahoo to open at 9:00am to resolve the issue and regain access to my account.  Yahoo was extremely helpful and we were able to take the account back quite easily.  The representative I spoke with knew to advise me to confirm if any of my personal information had been changed, which it had.  An alternate email address had been added by the hacker as a way to retain control of my account even after I had gotten back in.  And my understanding is this is how they would continue to log in and check to see if anyone was actually trying to send me money.  If I did not know to delete this alternate email, the hackers could continue to monitor the account and target anyone asking me where to send the money.”

I asked her if anyone actually attempted to send money or otherwise respond favorably to the hacker’s phishing attempt; fortunately, no one had.  She did receive a few calls and emails from people trying to confirm whether the request was legitimate, because, as Ms. Keen explained, “They did indeed want to help me if I really needed it,” but no one took further action.  Apparently the majority of people who received the phishing attempt knew it was a hoax and ignored it (score one for security awareness in the private sector).

Was there a lesson learned from all of this for Ms. Keen to share?

“Do not respond to emails requesting personal account information, no matter how reputable they may seem,” she said.  “As Yahoo explained, they would never request that sort of account information from me (they already have it and there is no need for it to be confirmed).”

To which I would add that you could easily replace the Yahoo name with literally any reputable business with which you have an online account.  I would also recommend that you print Rebecca Keen’s advice and tape it to your monitors and keyboards at both work and home for all to see.  Because whether it be the result of a successful phishing attempt, poor judgment or sloppy controls (e.g. sticky notes under the keyboard/phone/stapler, etc.), the number one entry point used by hackers to gain access to sensitive information remains password sharing.
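
The telltale mismatches in these emails can even be checked mechanically.  As a toy illustration (the phrases, names and domains below are entirely made up, and no real filter is this simple), a script could flag messages whose display name claims one company while the sender address resolves to a different domain:

```python
# Toy phishing heuristic: flag a message when the brand claimed in the
# display name doesn't appear in the sender's domain AND the body asks
# for account credentials. Illustrative only -- not a real filter.

SUSPICIOUS_PHRASES = (
    "confirm your account",
    "verify your password",
    "account will be closed",
)

def looks_phishy(display_name: str, sender_address: str, body: str) -> bool:
    """Return True if the message trips both simple heuristics."""
    domain = sender_address.rsplit("@", 1)[-1].lower()
    # Heuristic 1: display name claims a brand the domain doesn't contain.
    brand = display_name.split()[0].lower() if display_name else ""
    brand_mismatch = bool(brand) and brand not in domain
    # Heuristic 2: body uses classic credential-harvesting language.
    asks_for_credentials = any(p in body.lower() for p in SUSPICIOUS_PHRASES)
    return brand_mismatch and asks_for_credentials

print(looks_phishy(
    "Yahoo Support",
    "helpdesk@mail-verify.example",
    "Please confirm your account information or it will be closed.",
))  # → True
```

A legitimate notice from the real domain (say, `no-reply@yahoo.com`) would pass both checks and come back `False`; the point is only that the mismatch Ms. Keen missed is a pattern, not a one-off.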

Check back here next week. I have an interesting (if not scary) story to share about how some financial institutions are (mis)managing regulatory requirements.

March 22, 2010  3:20 PM

Information security awareness begins at home

David Schneier

Sometimes the best blog ideas just fall into my lap.

I was greeted by this status the other day on Facebook:  “Today’s game – PLACE OF BIRTH! Everyone please play! You will find it interesting to know where your FB friends birthplaces are. Copy & paste this on your profile, then put your place of birth at the end of this sentence…. Brooklyn, NY”.

Really?  I mean, really?

What’s next, have everyone post their Social Security number and date of birth to see how similar the numbers are?  Or even better: I suggested to someone that everyone post their Social Security numbers under the guise of seeing if people can guess where and when they were issued (that someone actually liked the idea).

So there I was, dumbstruck and amazed, trying to figure out how to prevent this sort of personal data exposure from happening in my own home.  I checked all of my PCs to see if the anti-virus software was up-to-date and functioning; it was.  I checked to make sure that all critical software updates were installed; they were.  I verified that each machine had a unique and strong password; they did.  And after conducting this basic sanity check it occurred to me that there’s still no automated solution to prevent ignorance or – dare I say it – stupidity.
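
The password portion of that sanity check is the one piece that’s easy to automate.  A minimal sketch might look like the following – the length threshold and character-class rules here are my own illustrative choices, not any official standard:

```python
import string

def is_strong(password: str, min_length: int = 12) -> bool:
    """Rough strength check: minimum length plus character-class variety.
    Thresholds are illustrative, not an official standard."""
    if len(password) < min_length:
        return False
    classes = [
        any(c.islower() for c in password),    # lowercase letters
        any(c.isupper() for c in password),    # uppercase letters
        any(c.isdigit() for c in password),    # digits
        any(c in string.punctuation for c in password),  # symbols
    ]
    return sum(classes) >= 3  # require at least three of the four classes

def all_unique(passwords) -> bool:
    """Verify no password is reused across machines."""
    return len(set(passwords)) == len(passwords)

print(is_strong("Tr0ub4dor&3xy"))   # → True
print(is_strong("password123"))     # → False (too short)
```

Of course, no script catches the real problem described above – a strong, unique password posted alongside your birthplace on Facebook is still a strong, unique password handed to a stranger.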

Despite technology doing its best to prevent malicious or unwanted activity from occurring on your machine, there’s nothing short of web-filtering to prevent people from doing what people do best: act human.

When my family first became Facebook aware, I immediately instructed those who use it to avoid those lists that capture intimate details about your life (e.g., 20 things no one would ever guess about you) and display them to all with access to your profile.  My family thought I was being paranoid but I explained to them how someone can take that information and guess password challenge questions or gain the trust of those who know you by making references to some of those details.  They weren’t happy with me because it all seemed to be in good fun, but I assured them that at some point, somewhere, it was a hacker’s mentality that came up with the idea.  You have to know, I’m the guy who refuses to use non-bank ATMs, probes the card reader to see if it’s a permanent part of the ATM and checks the area for possible spy cameras that might capture my keypad input (no joke).  That same paranoia carries over to the online world we all spend so much time in these days.

It’s like the Trip-It application a number of my connections use on LinkedIn.  Here’s a great idea: Let’s advertise to hundreds of people when I plan to be away from home and for how long.  And while I’m at it, I’ll post some sensitive information about myself on Facebook (because so many people mix their personal and professional networks) so that you could also potentially guess my alarm system access code or the challenge question should the monitoring company call the house.

Really?  I mean, really?

Oh and hey, check back next week because I actually spoke with Rebecca Keen (see my March 2nd post) and will have an interesting update to share.


March 14, 2010  3:59 AM

Muddy waters: Governance, risk and compliance

David Schneier

I had an email exchange with a colleague last week in which GRC (governance, risk and compliance as a unified methodology) was central to the discussion.  She felt that there’s been a blurring of the lines in how people view GRC versus ERM (enterprise risk management) as disciplines and wanted to know my take on the two.  I fear my reply was more blog post than email but GRC is a topic I have strong opinions about and which has long been a favorite theme in my blogs through the years.

First of all, it’s mind-boggling to me that anyone who earns a living anywhere in the GRC spectrum could ever confuse ERM for GRC.  My colleague clearly understood the differences but was overwhelmed by content provided by many GRC voices in the community and was wondering if maybe something had changed.  It hadn’t.  I too have noticed some subtle changes over the past year or so as many of the GRC thought leaders appear to be positioning themselves to pursue a broader range of services and expand their audiences and customer base.  With the economy in a shambles, it’s not hard to understand why.  But in doing so they may have diluted things to the point where GRC becomes a catch-all phrase for anyone with skills or offering services in audit, compliance or risk management.

What happened to the underlying premise of GRC though?

What happened to applying an integrated approach to how risk and regulatory requirements are managed?  What happened to all that promise I first started hearing about nearly a decade ago in which all three disciplines worked together so that risk was properly managed and compliance was achieved without duplication of effort or wasted activities?  A few years back, I ranted about how GRC was reduced to either being a complicated software solution or a dense, formulaic methodology that (almost) no one had an appetite for.  Now it’s not even that clear.

Maybe it’s just me, or maybe it’s because so much of the work I’ve been involved in over the past few years is driven by GLBA (which really demands a basic GRC approach), but I see the simplicity that’s been lacking.  This thought came to mind very recently when reviewing content on the basic tenets of GLBA compliance, which my practice uses to educate our clients.  Truth be told, much of what’s required can be clear as mud to busy executives who generally want to do the right things to both comply with the various regulations and also protect their customer/member data but simply don’t know where to start or why.  So we lay it out for them in such a way that they can quickly understand the work that needs to be done and make informed decisions.

Think about it though.  GLBA requires governance in the form of board of director oversight supported by a framework including a wide range of policies and procedures (e.g. information security, BCP, vendor management, etc.), which needs to be supported by a regularly occurring risk assessment and validated periodically via audits and vulnerability assessments.  We’ve got the G, we’ve got the R and we’ve got the C and it’s all wrapped up in one encompassing regulation.  And the best part is that for those institutions that are fully compliant, it works.  The regulation, if properly implemented and supported, does the job it was intended to do; I’ve witnessed it myself time and again.

What does it mean that GLBA works?  It means that GRC as a concept also works.  Central to its viability is understanding what you have to do, why and how you have to do it, and when to get it done.  You can’t just plug something in or buy a template and fill in the blanks; you have to work through it in a logical sequence.  And the work being done should absolutely make sense to everyone involved from senior management on down to the line people.

Our practice is fortunate because we’re validated by what our clients hear from their examiners, but in the GRC space it’s not quite so clear-cut.  Not that it can’t be, only that as it stands today it isn’t, which is a shame considering that the original intention of the discipline was to simplify things.


March 2, 2010  8:18 PM

Something smells phishy

David Schneier

I received an email from Rebecca Keen this morning asking for help.  You see, Rebecca took an unexpected trip to the UK and while there lost her wallet and all of her financial resources and was hoping I could help.  She asked if I could float her a temporary loan of $1,540 so she could settle her hotel bill and make it back home safely.  It turns out that all of her other possible avenues for assistance have failed her and I’m something of a last resort.

Of course I don’t know anyone by the name Rebecca Keen and knew instantly that it was a phishing scam. It’s not the email by itself that made this a blog-worthy item.  What made Rebecca’s email this week’s topic was the reaction of someone close to me and their attitude about how to handle it.

At the risk of embarrassing anyone, I won’t go into specifics as to who the person is, but when I told them about the email as a way of educating them on how to identify and manage phishing attempts, they asked me how I knew it wasn’t legitimate.  Beyond the obvious fact that I don’t know now and have never known anyone by that name, I’m not sure what else I’d need as proof that this was a scam.

Here was the ensuing exchange:

“That may be true but what if they sent you the email by accident?  What if they misspelled the email address?”

To which I replied, “Still not my problem and I won’t respond because that establishes a dialogue which will only encourage the person further.”

“But shouldn’t you at least let the person know they reached the wrong person?” I was asked with a tinge of real concern.

“If I reply, that will send the message that they reached the right person.  They’ll think I care, which will only open me up to additional pressures from the scammer.”

“People are so mistrusting these days.  I’d at least want to make sure this wasn’t someone who needed my help.”

And therein lies the problem: Despite this being a very obvious phishing attempt, it was only obvious to me.  Despite the endless stories about people being exploited and robbed by an endless array of online and email scams, there are still people who respond favorably to these sorts of things because of their basic decency.  The person to whom I was talking wasn’t lacking in intelligence and isn’t typically naive, but when presented with these situations uses a different set of rules.

To make matters worse, the email from Rebecca Keen was properly formatted without spelling errors and actually looked like something I might receive from a legitimate source.  As a matter of fact, it presented itself so well that I actually opened it, which is a step further along than these things usually get.  But of course I knew instantly that it was just the latest example of how people are using the Internet to try and steal money.  And while the scam was obvious to me, there is at least one person I know who might actually have taken action upon receiving something similar.

You know what occurred to me today?  The reason that scammers continue to send out phishing emails is because they still generate the desired results.  Despite the endless marketing campaigns by a wide range of financial institutions to educate online users, there are still a large enough number of people who are victims waiting to happen.   And as long as even one person responds favorably to a phishing campaign, it’s considered a success.

I’m thinking that as a former New Yawker I should create a program for the FDIC based on my experiences growing up in New York City.

  • Do not engage in any dialogue with anyone you don’t know about money in an unusual or inappropriate setting (e.g. street corners, subway platforms, etc.).
  • If someone is selling something, offering to buy something or trying to distract you somehow when in an unusual or inappropriate setting (e.g. stopping you on the street, walking up to your table at a restaurant, etc.) immediately disengage and continue on your way or return to what you were doing without allowing the conversation to develop and/or continue.
  • And if at any time your instincts tell you that something is wrong, amiss, out of place or odd, err on the side of caution and do anything and everything to remove yourself from that situation.

P.T. Barnum was often credited with having said that “There’s a sucker born every minute” and apparently online there are somewhere between two and too many scammers waiting to take ‘em.

P.S. As I was about to publish this post I received an email update from Rebecca Keen letting me know that someone temporarily stole her email account and that there’s no emergency whatsoever.   Glad to hear it but I still have no clue who she is.


February 23, 2010  4:17 AM

Rethinking compliance software

David Schneier

Here’s me about to eat crow.

After nearly a decade of railing against software as a solution to address the challenges of regulatory/industry compliance, I’m being forced to reconsider my position.

I’ve long advocated that an institution or organization could just as easily develop manual policies and procedures that would pass examiner/auditor scrutiny without having to spend additional money on a related software-based solution.  I’d encountered too many instances where an application was purchased to automate the related steps and it either fell short, was too complicated or was really designed to do something else and was poorly retrofitted just so the vendor could pursue a new market.

But my experiences this past year have revealed another side to this story that I hadn’t really considered.  In the spirit of full disclosure, I need to share with you that my company has developed a regulatory compliance solution and I’ve been heavily involved in its design and development.  Because of this experience and because of the myriad conversations and meetings I’ve participated in, I have a newfound perspective on all of this.

First and foremost, it turns out that the foundation of my bias was formed on solid logic.  Many of the solutions that started popping up nearly a decade ago were born from desperation as an endless number of vendors watched their Y2K-centric solutions pushed to the edge of extinction as the new millennium came and went.  Many of these products morphed into network utility-ish software, some reinvented themselves and created a new market (e.g. portfolio management) and many others unfortunately died from the effects of irrelevance.  But some languished on the sidelines, barely existing and surviving on small crumbs of opportunity.

Then along came SOX and PCI and suddenly there were new revenue stream opportunities.  Vendors rushed to reinvent their products by re-branding, adding some cosmetic enhancements and the full force of a marketing campaign.  As I encountered many of these solutions out in the field, I formed a bias because so many of these “solutions” weren’t really solving anything, at least not the way they promised they would.  After developing less expensive and more effective manual compliance programs, I decided that was the better way to go.

But I sort of threw out the baby with the bath water.  What I didn’t consider were solutions that were designed by people who understood the purpose of the exercise and knew where and when to automate.  I hadn’t encountered very many, if any at all, that I felt were tightly aligned with the work that really needed to be done.  Those that I did like were often bloated, complicated and required a more advanced skill set than most of my clients possessed (e.g. disaster recovery).  At some point, I stopped even paying much attention to the details and more or less put all of them into a single category.

This past year, though, has served as a slap upside the head.  It turns out that there are some fantastic compliance-centric software solutions kicking around addressing a wide range of challenges that my clients are confronted with.  In hearing from the banking community what they liked, what they didn’t like and more importantly, what they needed versus what they didn’t, they shared real-world examples based on existing products.  It seems that some of the solutions I’d shunned since hanging my shingle in the banking sector years ago were making a difference for the better.

It didn’t take long to figure out what separated these vendors from the rest of the pack though.  Most of them were run by people who understood the underlying business drivers that created the market.  They also understood the behaviors of the user community and how they’d use the software.  Perhaps the most notable trait was that they were almost laser-like in what requirements they addressed and how that fit into an institution’s overall framework.  They were designed to address specific needs and solve specific problems.  They didn’t just solve part of a problem, they solved the entire problem, all of it with one well-designed solution.

Before recommending that anyone run out and purchase a “silver-bullet” solution to appease their examiners, I offer the following advice:

  • Know what you need a product to accomplish in order to meet regulatory demands.
  • Qualify a solution based on real results from recent customer experiences; did the software live up to expectations?
  • Test the vendor’s knowledge of the related regulation and gauge the depth of their expertise in complying with same.
  • Find out what the examiners thought of the solution.  While they won’t endorse a product/solution they are quick to let an institution know when they like something.
  • Be realistic in terms of how much effort is required to get a product installed and running right.  Even the best software needs to be populated with data and that takes time.
As for the possibility that this post may be perceived as self-serving, I would share with you that when conducting audits and assessments, I still advocate using manual programs to address gaps and deficiencies.  Oh, and another thing: don’t get used to my admitting I may be wrong; it’s a very rare event indeed.


February 12, 2010  11:38 PM

IT audit reports: Why you can’t handle the truth

David Schneier

I was reading the local newspaper this morning and was surprised to find a front page story ripped from the headlines of my professional life (ironic, I know).

Right there on the front page of today’s News and Observer was a story about how a recent audit claimed corruption at a local college (North Carolina Central University).  I’m sort of trained in a Pavlovian sort of way to notice anything having to do with audit and so I gave it a cursory read.  Cursory turned into focused when I reached the part about how the school’s chancellor Charlie Nelms called the report draft “sloppy” and went on to say that some of its harshest accusations might not be true.

One of the oldest tricks in the book when it comes to audits is to start questioning the quality and veracity of reports that are perceived as not being favorable.  Instead of focusing on the audit’s findings and trying to validate them (because a good audit is your best friend if you really want to do things right), the auditee goes into a series of tactical maneuvers to deflect attention away from the report’s contents and feigns disgust and outrage.

The school chancellor went on to say that, after firing the auditor who produced the report, he “ordered his staff to gather more information before he releases a final version to the public.”  He went on to say that the “draft audit was so poor that he doesn’t trust it, and he does not want to damage the reputations of people who might not have done anything wrong.”

A few years ago, I conducted a risk assessment for a client with an odd configuration of infrastructure pieces that clearly defied anything close to typical, so it was difficult to measure them against the norm.  Just the same, I tried.  I took a step back after conducting all of my interviews and gathering as much information as was available and filtered it through the lenses of an examiner.  I surfaced gaps and issues that were likely to be viewed in a negative light, explained why that was and offered clear and concise remedial steps.  Senior management went bonkers (for lack of a better word) when they received the report.

They were outraged because the report was delivered a week late (which was true), they were insulted that there were typos (not factual errors, just a few grammatical/spelling hiccups which are common in draft versions) and charged that some of the issues listed were completely false.  In summary, they called into question the accuracy and reliability of the entire report.  It was startling for me because in my more than two decades working in the business world, with more than 10 years conducting audits and assessments, I’d never had a client react anywhere near this way before.

But it was really more about using diversionary tactics intended to gain a negotiating advantage.  Their end game was to soften the report’s contents so that it looked better when the examiners came back around; by pushing us back into a defensive position, they were almost successful.  Fortunately, I’m stubborn when it comes to standing behind my findings and need incontrovertible proof that I was wrong about something before changing or removing things.  I may not be the best auditor but I have well-honed instincts around IT and the myriad processes necessary to support the infrastructure, and I know good from bad.  I never put anything into my reports that doesn’t resonate with me and my peers (and typically the report’s audience).

So you can imagine where my head was at while reading the story today.  Mr. Nelms also said, “I want to see the source documents, and I want to see the field notes from the audit, because I want it to be accurate.  I don’t want it to be hearsay, because some of the allegations are just mind-boggling.”

Well that’s good to hear because any audit worth its weight in paper needs to be supported by solid work papers.  But considering that he fired the auditor, I’m hoping someone in his office thought to secure that beforehand.  And I’d need to understand why he’s gathering more information when all he really needs to do is use the work papers and have another independent auditor re-perform the tests.

Oh and another thing, who hired the auditor to begin with?

Also, now that the report’s findings are semi-public (it’s available despite not having been formally released), where’s the value in conducting a follow-up audit?  Anyone involved with any alleged wrongdoings now has a clear roadmap in front of them on how to cover their tracks.

Here’s my thinking on all of this: The audit is likely somewhere close to 100% accurate but far from perfect (I know that’s a contradiction).  If the chancellor was really interested in handling this properly, he’d quietly set about having independent people digging into the findings, not as a CYA exercise but simply to get to the bottom of things and deal with whatever is found.  I’m not saying that where there’s smoke there’s always fire but unless Mr. Nelms can offer a credible explanation why he would think that the fired auditor would fabricate stories or offer poorly formed conclusions I’d have no choice but to question his position on all of this.  I guess what I’m asking for is a credible explanation as to where the smoke is coming from and an explanation why he thinks it’s benign.

What I’d like is to hear the auditor’s side of the story.  I’m betting that would be an enlightening conversation.  But if Mr. Nelms was successful in his very public tongue lashing of this auditor, he/she will do anything and everything to avoid having their name outed.  And so the diversionary tactics score another point.

And the best part of this?  I almost never read the paper.


February 5, 2010  3:57 AM

How security aware is your organization?

David Schneier

Consider this post to be something of a (banking) community service announcement.

It’s February 2010, do you know when the last time was that your organization conducted a social engineering exercise?

I come across instances all the time where financial institutions have obvious issues with regards to their staff and how they handle sensitive information.  I almost always find non-public personal information (NPPI) left unsecured on desktops, in printer/fax queues and displayed on computer monitors.  I can recall at least a half-dozen instances during the past year where I personally witnessed person-to-person exchanges where proper protocol was not followed in handling situations that are now supposed to be governed by the Red Flags rule.

It’s not hard to understand why these things happen; it involves human nature and that’s a wild card element that can’t be easily managed or controlled.  People are busy, people are inherently trusting and in their haste to help a customer or get their work completed, they lower their guard.  But it’s in those moments when good judgement is pushed to the side that an institution is most vulnerable.

A password is shared, a sensitive document is left exposed or a file loaded with account information is carted off on a USB storage device.  Ultimately, how it happens is never really the big story though, at least not for those impacted by the breach.  For the affected, it’s all about the damage, both potential and realized, that they’re confronted with.  And of course it then also becomes about the tarnished reputation of the institution connected to the breach.

Do something about it.

Schedule a social engineering exercise; it’s easy, it’s affordable and it works.  It tests the effectiveness of your security awareness program(s), illuminates what’s working and what’s broken, and allows you to adjust your training accordingly.  And you can vary the angles taken from year to year.  Start by seeing what happens when someone places calls to your staff trying to get them to share sensitive information.  Follow that up by doing the same with emails.  When you conduct your next internal vulnerability assessment, have the project include having the testing resources access secured facilities.  You can also mix in dumpster diving (a favorite of mine), fax phishing and eavesdropping (don’t laugh, it’s a great way to skim NPPI and impossible to trace).
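
Tracking the results of such an exercise is just as simple as running it.  Here’s a sketch of how a tester might tally disclosure rates per channel – the channel names and the sample outcomes below are invented purely for illustration:

```python
from collections import Counter

# Each record is (channel, outcome), where outcome is "resisted" or
# "disclosed".  This sample data is invented for illustration only.
results = [
    ("phone", "disclosed"), ("phone", "resisted"), ("phone", "resisted"),
    ("email", "disclosed"), ("email", "disclosed"), ("email", "resisted"),
    ("dumpster", "resisted"),
]

def failure_rates(records):
    """Return {channel: fraction of attempts where staff disclosed info}."""
    attempts, failures = Counter(), Counter()
    for channel, outcome in records:
        attempts[channel] += 1
        if outcome == "disclosed":
            failures[channel] += 1
    return {ch: failures[ch] / attempts[ch] for ch in attempts}

for channel, rate in sorted(failure_rates(results).items()):
    print(f"{channel}: {rate:.0%} disclosed")
```

A per-channel breakdown like this is exactly the “what’s working and what’s broken” view mentioned above: if email disclosures run double the phone rate year over year, you know where the next round of training belongs.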

The results will prove to be revealing – both good and bad – and will serve as remarkably effective fodder for your next round of training.  And the testing itself becomes an important tool because once people are aware that these tests are occurring, they begin to pay greater attention to their actions.  No one wants to be tagged as being on the wrong side of the testing – trust me on this, I’ve seen this dynamic in play and it’s real.


January 27, 2010  12:13 AM

Banking regulatory reform is a comin’

David Schneier

I was scanning through emails the other day and almost missed a good one.  It was from the FDIC on Friday, January 22.  As we’ve all come to know, Friday is the FDIC’s equivalent of “bring out the dead” day, when they almost always announce the most recent slate of bank closings, so I didn’t pay it any attention at first.  But it was issued early in the day and the barrage of bad news announcements typically doesn’t arrive until sometime around Happy Hour (you decide for yourself if that’s coincidence).

And so I popped it open for a read.

It was an announcement about how the FDIC and the Bank of England signed an agreement (they called it a Memorandum of Understanding or MOU) to cooperate with one another in the dissolution of cross-border institutions. Forgive a compliance geek his potentially misplaced enthusiasm, but I thought this to be a neat and somewhat intriguing bit of news. The biggest banks all operate on a global level and I know from first-hand experience that in many instances they do so not so much to tap into new markets but rather to exploit competitive and legal advantages (think Switzerland and their very favorable rules). One of the distinctive advantages of doing business this way is that what might go wrong in one marketplace is often insulated from the rest of their organization, thus reducing their risk; you may blow up one business unit but legally it doesn’t expose the remainder of the company. But regardless of the reasons, what this business model almost always creates is the overly complex monolithic banking monsters that have commonly been thought of as “too big to fail.”

This MOU is an important step towards simplifying the global banking world. It potentially lays the groundwork for the oversight agencies that are often responsible for cleaning up the mess made by the banking giants to have wider authority to do what’s necessary to protect depositors (and taxpayers) from absorbing the brunt of the blow. It is also the first salvo resulting from the recommendations of the Cross-border Bank Resolution Group (which operates as part of the Basel Committee), headed up by my favorite banking superstar, FDIC Chairman Sheila Bair. You might recall that she railed against the concept of “too big to fail” and, as a result of her involvement with this group, put herself in the position to do something about it.

I’m not sure how much further this sort of thing will extend itself, being a bit of a cynic when it comes to banking oversight on an international level. You see, back in 2008, I conducted a fairly exhaustive amount of research trying to identify the FDIC’s counterparts around the world and was amazed and dumbfounded by what I didn’t find. Outside of the U.S. and UK there really wasn’t anything even close on a functional level. Sure, there were some government agencies in place, but their role and powers weren’t anywhere close to what we have here. And I was all the more amazed that despite having the Euro currency in place and an organization to oversee its management, there really wasn’t a related banking oversight group. When you think about Europe and how it’s laid out, and how simple and obvious it would be for banks to operate across borders, you’d think they’d be among the first to coordinate efforts, but it simply wasn’t there. So what we have at this point is an important agreement between the two nations with the most mature and best-organized banking oversight. But this one was relatively easy; what remains to be seen is how the rest of the civilized world addresses this issue.

I’m hopeful that in light of the mess the global economy has been in due to mistakes made in the banking industry, that the various governments will move to get on board quickly but there’s little in the way of historical precedent to make anyone think that’s likely. Still, with Sheila Bair involved, you never know.



January 15, 2010  6:05 AM

The best part of audit (yes, I mean audit)

David Schneier David Schneier Profile: David Schneier

A jobs survey released last week indicated that less than 50% of the workforce is satisfied with their job. Me, I’m a lucky guy, as I genuinely like what I do for a living. It’s funny in a way, because over the first decade or so of my career, I held people like me in very low regard; I just didn’t much care for or respect auditors.

One of the key considerations in sorting through the irony that’s my place in this world is that I’m nothing like the auditors I used to deal with in my application development days on Wall Street. What I audit, how I examine related controls and activities and review supporting evidence is heavily biased by my first-hand knowledge of the IT infrastructure. I understand technology and how it’s used, and so when I’m conducting fieldwork, I’m able to see things from a blended perspective. Most of the auditors I dealt with understood audit way better than they understood technology and so they’d ask question after question, not really knowing if the answers made sense, only if they matched expected results. For me, if the answer doesn’t make sense or is the wrong one, I immediately switch gears and seek out compensating controls because they’re often there if you know where to look.

Audit is heavy on my mind this week because I’m in the process of wrapping up a report for a client ahead of the exit meeting. It’s interesting how the names and faces change from engagement to engagement but the script rarely varies. You’d think it would get old or boring, but curiously it never does. The client never likes to see anything negative in print, and it usually sets off a flurry of activity from report issuance to the first review meeting. There are almost always a series of requests to move things around, change the way things are worded and occasionally to reevaluate ratings. And I can’t recall a single audit where additional evidence wasn’t submitted for review after the initial draft was distributed to offset findings – artifacts that often have that “new car” sort of smell. But that’s actually a good thing, and I’ll explain why.

An auditor’s job is to find control gaps and weaknesses. I’ve often compared what we do to fishing: You cast your line, see what you can catch, and keep at it until you either fill up your basket or have exhausted all available time and resources. Sometimes the bounty is rich and sometimes not so much. But there are always things to catch (I’ve never been shut out yet), even in the very best managed IT shops. The payout for the auditor is to identify legitimate issues that resonate with the client. You want those who own the controls to understand what the issues are and take swift action to remediate. I know some auditors take offense at after-the-fact evidence being provided because they perceive it as implying that they missed something. Not me. When the client comes back quickly with viable solutions to make the findings go away, I consider that a bonus, even if those solutions didn’t exist a week earlier. That means that real risk is being further mitigated and managed, and that’s the only reason to ever conduct an audit, in my opinion.

The client I’m working with, as it turns out, has fast become a favorite of mine. They’ve made great strides over the past year or so in enhancing their security posture and have gone a very long way towards putting in place effective controls to protect themselves, which ultimately results in their better protecting their customers. They take this sort of thing very seriously and as such, they have earned my respect. So when they come back to me with newly available information to offset findings in the draft report I’m happy to factor that into my findings. I did my job, they did theirs and in the end, the world is a little more secure.

So I guess I’m a minority on a couple of fronts: I’m more than satisfied with my job and I’m an IT auditor who genuinely understands the technology infrastructure. So much for there being strength in numbers.



December 29, 2009  5:30 PM

Was 2009 the year regulatory compliance became a good thing?

David Schneier David Schneier Profile: David Schneier

When I sat down to write my last blog post for 2009, I was planning to write either about my predictions for 2010 or a retrospective of 2009. But that’s just so clichéd; everyone does that or tries to. And as I wrote in a recent post about the Verizon report on security threats, sometimes there’s not a whole lot that changes year-over-year, and so my bold predictions for 2010 would likely look almost exactly like those I made for 2009.

Instead what’s on my mind isn’t so much a revelation or prediction but rather an observation.

When I first started focusing on the banking vertical a few years back, I was often depressed by how many institutions did what they did because they had to and not because they saw any inherent value in the exercise. Information security policies were dust-collecting binders, business continuity plans (BCP) were a collection of basic documents that described what needed to be done but rarely provided direction on how to do it, and vendor management was typically just a spreadsheet with some key data elements. Rarely did any of these artifacts provide real value or come anywhere near close enough to address the spirit of the regulations that required them to exist.

That’s definitely changing.

I’m currently working on two projects that deal directly with business continuity planning. For the first project, I’m rewriting a client’s plan, taking it from the aforementioned collection of basic documents and remodeling it into an actionable plan that will serve as a road map to steer them through various types of business disruptions (they’re not all disasters; sometimes it’s just a power outage or a burst water pipe). They knew they needed to make changes to their existing plan, and management saw the inherent value in having something that would actually drive the decision-making process when necessary and increase their chances of successfully navigating a crisis.

The second project is actually an IT general controls audit in which the project sponsor has requested increased scrutiny be placed on BCP-related activities. They wanted to make sure that their plan was well designed and properly tested and would at least meet, if not exceed, regulatory requirements. The amount of work their management team poured into the plan and its related activities was clearly reflected in the evidence I reviewed. They not only have a viable plan but have tested it from several entry points, identified potential glitches and fixed them, so that should they need to implement the plan, they have a high degree of confidence that it will work.

Both of these projects underscore something significant: financial institutions are embracing the value of complying with the regulations. Having a formal BCP is not just a way to make the examiners happy; it’s the best way to successfully manage the unexpected.

And this more than subtle shift has been noted in other key regulatory areas.

Vendor management has moved from a clerical exercise into something that’s more day-to-day, where contracts are being scrutinized and performance measured. We have an ever-growing client list on that front where they aren’t concerned about the regulators so much as making sure that they’re protecting themselves and their customers/members from unnecessary risks and exposure. We’re seeing it with Red Flags – Identity Theft and Incident Response plans, where the banks and credit unions want to leverage the “have to” and convert it into something that helps them manage risk better. Regarding Red Flags, we’ve found that most institutions have their plans up and running but haven’t implemented them properly, using them more as a Suspicious Activity Reporting (SAR) mechanism rather than for their intended use. But when we bring that up in our reports, the response has been refreshingly positive: management wants to get it right. That’s what’s different from previous years.

I’ve long advocated for doing it right as long as you have to do it, and now it appears that management is coming around to embracing that idea.  Perhaps that’s what 2009 will reveal itself to be in the rear-view mirror: the year when our industry matured from “have to” to “want to” when addressing information security. A year when it wasn’t about having a board-approved document but rather a viable plan or program. Now, that would have to be considered a good year.

Happy New Year. May it be a good one filled with an improved economy, fewer breaches and an increase in management oversight.

