I just finished reading through the most recent report from Verizon Business, which offers a deeper dive into the most common security breaches identified during 2008, and quite frankly, I’m concerned. Turns out there’s very little new to worry about beyond what we already know – and that concerns me greatly.
I am a bit relieved that the threats we already know about are still pretty much those that we’re dealing with; we know how they happen, why they exist and what to do about them. But that’s also why I’m worried.
If we know about these threats and have at our disposal a wide range of techniques and tools to prevent them, why are they still finding any measure of success?
For example, take a personal experience I had while using Facebook. Shortly after becoming an active user on the popular social networking site, I fell prey to a virus delivered by way of a URL that presented itself in the form of a video link sent from a friend. The link appeared suspicious and though I attempted to close the message without clicking on the link, something went awry and I navigated right into the steely, sticky jaws of a truly annoying virus. Fortunately, I was able to clean my machine and eventually eradicate the virus (many thanks to Trend Micro for some pretty good software on that front). But the experience served as a booster shot of sorts for my overall online strategy. Now, I won’t even open a message unless it presents itself correctly (e.g. proper spelling, contextually appropriate, etc.). It took me all of one bad experience to realize I had to use the same level of vigilance on Facebook as I did in the rest of my digital world.
In other words, I learned the lesson and have taken steps to not make the same mistake again. Why can’t the business world do the same thing?
Of the threats detailed in the Verizon report, the vast majority can be addressed via proper system configuration and basic monitoring techniques. We’re not talking rocket science here. And the remaining threats – the ones involving the human element – can be greatly reduced by proper and consistent security awareness training. Honestly, if I can get my almost octogenarian mother to screen emails and only open those that come from trusted sources, I’m thinking corporate America can train its employees to do the same. If I can educate my wife on the dangers of skimming and give her the basic tools necessary to avoid suspicious ATMs (e.g. only use bank-branded devices in well-lit areas; always cover the keypad when entering PINs, etc.), I’m certain financial institutions can do so with their customers.
The criminal element can be a pretty sharp group, and its members are always, always thinking of new ways to get to other people’s money. Why make it easier for them by leaving the same doors unlocked and windows open? As I’ve already pointed out, it’s good that we have identified the threats, but it’s not so good that we haven’t done enough to stop them.
And here’s a neat little addendum: I wrote this post earlier today while traveling, and when I returned home this evening and sorted through my mail, I found a brochure for the SANS event scheduled for March 2010 in Orlando. While flipping through the pages, I saw session after session aligned quite nicely against the threats detailed in the Verizon report. Again, successfully dealing with this ain’t exactly rocket science.
I want to play a game with you, sort of like the compliance equivalent of the Rorschach inkblot test. I’m going to throw out a phrase and I want you to write down the first acronym that comes to mind.
Ready? Here we go…
1) Credit card numbers
2) Social security numbers
3) Bank account information
I’m betting most of you came up with the following answers:
As a regulatory compliance professional, I’ve been fascinated for a few years now by how little attention one of the most significant risks remaining in the digital world receives anywhere in my industry. Everyone has gone absolutely nuts when it comes to credit card numbers. There’s a significant groundswell around personally identifiable information, as evidenced by the growing number of state laws being passed to oversee such things. But try to find anything in the pipeline seeking to protect your bank account information in the public domain and there’s “ “ (aka crickets).
Earlier this month we celebrated a milestone event in our family, and a fair number of the gifts we received were presented in the form of personal checks. I had in my hands the full names and addresses, bank routing details and individual account numbers of dozens of people, at my disposal to do with as I pleased. And while I was honorable and simply deposited them (for some of my friends I apparently did so a little too quickly), the potential for fraud was huge. Consider that of the nearly two dozen companies I conduct business with each month (e.g. mortgage, car loans, utilities, etc.), more than two-thirds accept online checks in lieu of credit cards. If you’ve never paid via an online ACH payment or check, all you need to do is provide the bank routing number (got it right there on the check), the individual account number (got that too) and on occasion the name of the institution (but again, you have that right there in front of you). When you boil it down to bare essentials, it’s somewhat the equivalent of handing someone your credit card information on a piece of paper that then gets circulated through multiple touch points before it’s completely processed.
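To underscore just how mechanically simple this data is, here’s a short sketch (Python, purely my own illustration – nothing from any bank’s systems) of the public ABA check-digit rule that validates the nine-digit routing number printed on every check. Anyone holding the check can confirm the number is well formed in a few lines:

```python
def is_valid_routing_number(rn: str) -> bool:
    """Check the public ABA check-digit rule for a 9-digit routing number.

    The rule: weight the digits 3-7-1, 3-7-1, 3-7-1; the weighted sum
    must be divisible by 10. This only proves the number is well formed,
    not that the account behind it exists.
    """
    if len(rn) != 9 or not rn.isdigit():
        return False
    weights = (3, 7, 1, 3, 7, 1, 3, 7, 1)
    total = sum(int(d) * w for d, w in zip(rn, weights))
    return total % 10 == 0
```

The point isn’t the code; it’s that everything an online ACH form asks for sits in plain text on the face of the check, already validated and ready to be copied.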
It’s insane, it’s unnecessary and most of all it’s unacceptable, and I’m betting it’s the next big information security flashpoint in the banking industry.
One of my colleagues considers checks to be archaic and thinks they should be eliminated altogether. Another suggested that they should be protected exactly the same way credit card numbers are handled these days courtesy of the PCI standard. I was thinking of something a little simpler to start with. Why not simply stop printing the routing and account details and rely exclusively on barcode technology as a Phase 1 sort of exercise? It’s an embedded technology, and while far from tamper-proof, it would certainly eliminate the biggest exposure currently in existence. I’d like to think my money should be a little harder to access than it is today, when simply writing a check to a local service company means someone in their office can copy down the details and use them to make an unauthorized purchase. It’s not likely they’d have a barcode reader, and so that would be the first logical step to take.
But whatever steps the industry takes to remedy this problem, one thing’s certain: Something has to be done. At this point in the evolution of identity theft and online fraud, it’s not as if the oversight bodies can claim ignorance. They know that the criminal element continually pursues the lowest-hanging fruit, and right here and now the exposure provided courtesy of the printed bank check is just about dragging on the ground.
It’s time for the industry to change and hopefully before this reaches epic proportions. Because if bank customers become afraid or reluctant to use personal checks as a method of payment, it’s going to become a huge problem for commerce in general. Considering where our economy is at the moment, I doubt we can easily withstand such an event.
I was talking with a client last week about a perceived gap in their organization. Despite having to address multiple regulations cutting across several oversight bodies, they were lacking a single point of contact or central coordinator for all information security related activities. Their sense was that they were long overdue for some form of a chief information security officer (CISO) and I had to agree.
The same point was underscored earlier this week during a kick-off meeting with a client regarding a pending audit. Almost all of the requests for information, including policy and procedure documentation, were redirected to their most senior IT person. As we wended our way through the items on the list and they kept verbally pointing to the IT person, I started wondering how he could be responsible for all of these information security related items and still perform his regular IT duties. The answer, of course, is that he can’t – not effectively, anyway.
There’s a discipline involved with regulatory and industry compliance that requires someone be committed to both understanding what needs to be done and then making sure that it’s happening. This isn’t a new consideration; I’ve blogged in the past about how we’ve moved from an age where you simply needed documentation to one where actionable steps are required. It’s not enough to have an information security policy in place; you also need to comply with it and then be able to prove that fact upon request. You can’t talk about how you restrict access to systems and information and not be able to provide a recent access review/report.
I’m routinely amazed by how few of my clients understand the growing need for the role of a CISO despite their awareness and sensitivity to the increasing regulatory burden. Many financial institutions will offer up that they have a BSA officer and some will introduce a compliance “person” who is almost always focused on AML/Patriot Act activities and not much else. I’ve interviewed several dozen people over the years who were included in the audit or assessment process because I asked to speak to their head compliance person and it turned out that they had very little if anything at all to do with information security and GLBA-related activities. How is that possible?
How can you expect someone who is an expert in technology to also be an expert in information security and GLBA?
The answer is obvious: you can’t. First, there’s a very real conflict of interest in asking the person who owns many of the required controls to also monitor themselves. Second, I’ve yet to meet a technology person in all but the largest institutions who didn’t end the day with more to do than when they started it. Third, it’s very unlikely that a technologist will correctly interpret and apply the myriad rules around information security for all in-scope regulations. I’ve been doing this sort of work for more than a decade and it’s a full-time job just keeping up with the changes, let alone figuring out how to properly comply.
There needs to be an assigned gatekeeper for information security, plain and simple. And the size of your institution doesn’t matter. I’ve worked with very small financial institutions (under $100m in assets) that had a single, non-IT person in charge and it worked out quite well. In one case the individual was also responsible for business continuity and vendor management, which oddly enough isn’t so odd. Both of those require a certain degree of expertise that exceeds what you’d expect a technology person to have and more importantly, both of those activities need to cover the entire organization, not just what runs on the network. When I worked within the technology infrastructure, I never understood why these things always got dumped there and now that I’m on the other side of things I know that it doesn’t make sense.
When the examiners or auditors ask to speak to your CISO, ISO, head security person, compliance officer or compliance manager, you need to have a name to give them, not some vague answer or explanation about how it’s done piecemeal. This is 2009 and the demands of compliance are great and they’re real. Ignoring the obvious or incorrectly assuming that this is a part-time job is no longer acceptable.
I was shocked and saddened today to learn of the unexpected passing of David Taylor, founder of the PCI Knowledge Base. My deepest sympathy goes to his family.
Dave founded the PCI Knowledge Base, a research community that shares information to help organizations achieve PCI compliance, after a 14-year tenure as an analyst at Gartner Inc. He was a wealth of information for my articles on PCI and provided his expert advice in an article we published this summer. Dave was always ready to help and never shied away from expressing his opinions. I really enjoyed working with him and will miss him, as will the industry.
Many years ago I found myself in one of those awkward moments where I needed to pay for something but didn’t have enough cash on hand to cover the bill. Rather than do the smart thing and find an ATM I instead elected to rip through my car and dig up all of the change that had been accumulating over the months and miles. After about five minutes and some disturbing encounters (food can morph into some bizarre forms when left under a car seat for too long) I somehow managed to come up with enough change to cover the shortfall. It’s amazing what you can pull together when you scavenge around and piece together disparate parts into one coordinated effort.
And so it goes with this week’s post. Here are some nuggets that I’ve gathered over time:
Policy and procedure: I was talking to a client today about password expiration intervals. Turns out that for one of their products they changed the password expiration to 1,000 days. Their logic was that it was low risk because the application didn’t store NPPI and the security was really only necessary to ensure proper segregation of duties. So I asked them if they had a password policy (they did) and if so, were they in compliance with that policy (they weren’t). After a momentary silence, their quiet reply was “good point.” Being the auditor that I am, I couldn’t help but point out that the worst thing any institution can do is deviate from a documented policy or procedure, regardless of the reason. Once an examiner discovers something like that, they figure it’s an indication of related issues and wind up digging a bit deeper. Document what it is you do and then make sure you’re doing it; it may seem simple enough, but you’d be surprised how many companies fail on that point.
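A check like the one this client needed is almost trivially scriptable. Here’s a hypothetical sketch (the system names and the 90-day maximum are my own inventions, not the client’s actual policy) of comparing deployed expiration settings against the written policy:

```python
# Hypothetical policy value, for illustration only; substitute your own.
POLICY_MAX_PASSWORD_AGE_DAYS = 90

def find_policy_violations(deployed, policy_max=POLICY_MAX_PASSWORD_AGE_DAYS):
    """Return the systems whose password expiration exceeds the written policy."""
    return {system: days for system, days in deployed.items()
            if days > policy_max}

# Made-up inventory mirroring the story above:
deployed = {
    "core_banking": 90,
    "loan_portal": 180,
    "reporting_app": 1000,  # the 1,000-day setting that contradicts the policy
}
violations = find_policy_violations(deployed)
# 'violations' flags loan_portal and reporting_app for follow-up
```

Run something like this on a schedule and the gap between what you document and what you deploy surfaces before the examiner finds it for you.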
Pandemic planning: There’s still heightened concern regarding the swine flu, and my industry continues to beat the drum about needing to have a pandemic response plan in place. While it’s a valid point, I’ve been polling my clients over the past few months regarding their firsthand experiences with the flu epidemic. Only a few have been confronted with any legitimate outbreaks, and none of them have experienced an absentee rate that required unusual planning or intervention. While I’m not suggesting that a pandemic response plan is superfluous, I am questioning my peers who are pushing this as a top-of-the-list agenda item. For my money, I’d rather spend time making sure that a properly vetted and tested business continuity plan is in place and spend less time and effort getting caught up in the hype.
SOX: Banks that are required to be SOX compliant need to take some time to make sure that they’re thinking things through. GLBA is a fairly rigorous and encompassing regulation and extends deeply into a financial institution’s infrastructure. To a certain extent, it serves to drive a bank’s general controls framework, be it informal or otherwise, and as a byproduct goes a long way towards establishing controls typically associated with SOX. So when I encounter clients who are tackling SOX as though it’s its own separate set of requirements, I throw up the caution flag and try to force a reset. While it may be true that larger institutions need to extend significantly from GLBA to controls around financial reporting within the infrastructure, that would only represent a subset. Before doing anything different, the bank should bring in someone who has experience working with both SOX and GLBA to identify the (many) commonalities and produce a consolidated framework so that efficiencies are both identified and realized.
Year-end activities: In my last post I discussed how there’s an uptick in services work this time of year, when many banks and credit unions remember that they still need to conduct a wide range of audits and assessments in support of GLBA/NCUA regulations. If you spend some time reading through FFIEC guidance (seriously, it’s not nearly as dry and boring as you might think), you’ll find multiple references to “your most recent audit or assessment.” For those of you who think this work is suggested rather than required, consider how it looks to your examiner(s) when they discover that your most recent risk assessment was either conducted several years ago or not at all. Do you really think it reflects well on your institution that you haven’t taken a serious look at the myriad risk factors swirling about your infrastructure for any considerable length of time? In a day and age when new threats emerge almost daily, if not hourly, how can you justify neglecting such a critical task? The examiners expect a current set of reports not only because it’s required but also because it’s a clear indication of solid management and oversight activities.
And on a final note, I’d like to share this link to the FDIC website. You’ll find a video message from Chairman Bair on the current state of both the FDIC and the banking industry. It’s really more of a “happy recap” (with all due respect to Mets fans) of similar messages she’s released over the last year. But I think it’s worth your time (about four minutes total) to hear it for yourself and gain a sense of calm about the security of your own deposits. And for those of you who might think I’m keeping to some sort of schedule regarding Sheila Bair references, as long as she keeps doing the right things I’m going to keep bringing her name up.
A favorite cliché of mine is “if it wasn’t for the last minute, nothing would ever get done.” Personally it’s sort of the way I’m wired, and in my industry it’s an unwritten rule when it comes to many annual activities. There’s an appreciable uptick in services work each year beginning in early fourth quarter, as banks and credit unions wake up to the realization that the audits and assessments they are committed to conduct have yet to be done. And examiners typically don’t pay much attention to when during the year the work gets done; they only care that it happened within the expected time frame, so oddly enough this approach works.
But this leads to another interesting quirk about how the examiners often operate. Generally speaking, if the reports are available, they don’t dig much (if at all) beyond the reports’ contents. And so the information security and IT components of many exams become more about inventorying recent reports and not much else. We see evidence of this all of the time when we conduct a first-year audit or assessment and discover gaps or issues that have been in place for years and which the exams never picked up on.
I’ve written in the past about how surprisingly few institutions maintain a current business continuity plan and even fewer properly test that plan. But what surprises me more is that these conditions have existed for years spanning many exam cycles. How is that even possible?
I’ll tell you how: There’s a documented plan that is provided upon request, and by and large the examiner conducting the fieldwork checks off that they received it and voila, you have a non-issue. And because the people in the field are typically given too few hours to cover too much landscape, they don’t have enough time to dig in deeper. Occasionally an examiner will actually open the document and vet it for key details – every now and again we come across a DoR or an MoU where the absence of a recent business impact analysis was tagged – but that almost never happens.
I’m fond of advising clients that you conduct much of the required compliance work for one of two reasons: You do it because it’s the right way to manage your institution and reduce your risk or because you have to. Because of the approach taken by examiners, way too many institutions lean towards the latter and simply want to have a report available to hand over when asked. But is this really the right way to run a financial institution?
And when you consider that the value of the report is largely defined by its contents and the competency of the practitioners conducting the fieldwork behind it, isn’t there an increased likelihood that there are important issues that go undetected? If all you do is pay for the report (often issued by the firm submitting the lowest bid) and all the examiners do is check off that the report was available and issued during the appropriate time frame, is there any real value in even bothering with this process?
I’m a bit biased regarding the value of reports. My firm is on a constant hunt for real risk, not simply working our way down a checklist to kick out a document and collect our money. We tend to examine our clients’ infrastructure as though we have our own money deposited with them and tie what we see straight back to GLBA and NCUA requirements. The value in this approach is that we produce a report that the board of directors can relate to, not just the IT folks.
But again, if no one really even cares about the content of the report and only that it exists, why bother doing a good job?
Maybe our industry needs to adopt an approach similar to the PCI folks. Maybe the FDIC and NCUA should issue certifications to practitioners validating them as properly trained and educated experts with regards to GLBA. There would still be a variance from firm to firm to a certain degree, but at least there would be a recognized standard and an increased likelihood that if an examiner is going to rely on the competency and completeness of a report there’s some justification behind that decision.
Something’s going to have to change though and hopefully sometime soon. Because using the “last minute” logic is flawed and only serves to reinforce my own bad habits.
I have an associate who has an addiction to certifications. He’s one of those “too smart for his own good” geniuses who often decides to change his career course and starts by obtaining whatever accreditation or cert is needed to do so. When he lists all of these accreditations and certs after his name, it looks as though someone tossed their alphabet soup lunch. But his logic is that having the appropriate governing body’s seal of approval is akin to knowing the secret password needed to gain access to the right job.
Sometimes I think COBIT is used much the same way.
For those of you who aren’t familiar with COBIT, it’s a framework that has revolutionized the world of governance and compliance for the better. It was the only beacon in the vast, dark ocean of SOX insanity a few years back, providing much needed guidance for corporate America to follow and continues to serve as the best source when designing controls within the infrastructure. It’s comprehensive, well organized and when understood and applied properly, it can be very effective.
But it’s not akin to the Bible and it’s definitely not an IT audit framework or program.
And yet I often hear fellow practitioners dropping COBIT references like it somehow validates them as legitimate members of the IT audit club (which by the way is called ISACA and only requires an annual membership fee).
Just this week, I heard that someone discussed conducting a COBIT-based audit when asked about their approach to conducting an IT general controls (ITGC) audit. Two weeks ago, my partner asked me about an RFP we received in which the institution wanted to know if we based our ITGC audit on COBIT or any other recognized framework. It’s gotten to the point where the term “COBIT-based” has become ubiquitous within the IT audit domain. Years ago during the aforementioned SOX insanity, there was a running joke with a client in which every sentence was laced with a SOX reference (e.g. Good SOX morning, Happy SOX New Year, etc.). Now it seems as though COBIT has replaced SOX in that regard.
Um, has anyone actually read the framework? I mean actually sat down and read it, from the executive summary through to ME4 (the last of the control objective areas in the PDF). And how many people have actually tried to implement COBIT as it’s intended to be used? It’s a mountain of information that requires a ton of analysis and customization prior to being implemented. And it isn’t meant for organizations of every size. For many of the community banks and similarly sized credit unions that I commonly work with, it’s simply overkill.
But again, it’s not an audit framework and it’s not an audit program. And it’s entirely possible to build out an IT controls framework and never once rely upon COBIT to do so.
By the way, for those of you who aren’t familiar with the IT Governance Institute (ITGI), it’s a research think tank that exists to be the leading reference on IT governance for the global business community. In the time since COBIT made its inroads into corporate America and the audit vernacular, ITGI has amped it up a notch. Now they also publish Val IT and more recently Risk IT.
So now I’m bracing for the onslaught of risk assessments that are “Risk IT” based. But I never had a problem conducting a risk assessment before this standard existed and I doubt I’ll crack it open when conducting one in the near future. Did we really need this? And how will this drive the audit and compliance industry?
Frameworks have a place in this world, don’t get me wrong. But it’s like when I bought my Roto Zip hand saw a few years back; I walked around my house looking for things I could use it for rather than simply using it when it made sense. COBIT is awesome and it’s helped provide clarity in many, many ways. But it isn’t the official book of record for audit and compliance within IT; it’s just another tool in the toolbox. I realize that on the planet of ISACA that’s akin to blasphemy, but I offer no apologies. I refuse to build an audit program for a community bank that’s supported by two IT resources based on the 200 plus control objectives in COBIT.
And on that note I bid you a good COBIT day.
Every day, I receive a semi-deluge of industry related emails. Between the various agencies, media sites, organizations and associations, I tend to receive more communiqués than I know what to do with. But I developed an interesting habit last year when the banking industry first went into its tailspin: making certain to read every single issuance from the FDIC.
Going back to at least last September I have read and saved each and every one of them (several hundred I might add). I’m sure some of my peers will beg to differ, but for me this is where anyone in the industry should’ve been looking during the crisis for the best indicators of what’s going on.
Yesterday, I was glad for this somewhat addictive habit of mine. For what may be the very first time since Lehman went belly-up, I found a true, concrete piece of evidence that we’re on the road to recovery, if only in some small way.
The FDIC agency alert yesterday announced plans to bolster the Deposit Insurance Fund (DIF) by requiring insured institutions (mostly the banks you and I know) to prepay on their quarterly premiums so that the fund remains viable and liquid through the still unfolding resolution of the banking mess. And that’s significant because unlike a year ago, this time around the plan calls for the industry to take responsibility for itself and not go running to Capitol Hill for help, an option FDIC Chairman Sheila Bair has denounced on several occasions.
Here’s what Bair had to say in the announcement:
“The decision today is really about how and when the industry fulfills its obligation to the insurance fund. It’s clear that the American people would prefer to see an end to policies that look to the federal balance sheet as a remedy for every problem. In choosing this path, it should be clear to the public that the industry will not simply tap the shoulder of the increasingly weary taxpayer. This proposal is a vote of confidence for the banking industry’s resilience, and it will continue to recover its strength as we work through the significant challenges ahead.”
The reason for my optimism is that this action shifts control back to the banking sector to fix its own mess. It puts greater emphasis on each individual institution to fulfill its obligations to the DIF in advance of using those same funds for more traditional activities commonly associated with generating profits. I think accountability is necessary, if not essential, to repairing the damage inflicted on the industry and repairing its reputation with depositors, investors and borrowers (something the NCUA had figured out much sooner). And so I’m feeling a little better about where we’re heading, economically speaking.
Oh, and Comptroller of the Currency John C. Dugan (that’s the OCC head honcho in case you didn’t recognize the handle) agrees with me. Mr. Dugan said of the FDIC plan: “The actions we are taking today represent a balanced approach to raising needed money for the deposit insurance fund without impairing the ability of our banks and thrifts to support economic recovery.” He added, “I think this is a very positive proposal. The staff did an excellent job, and I support the way you handled it.”
I’d like to chalk it up to “great minds think alike.”
By the way, if anyone knows of a Sheila Bair Fan Club or is thinking of starting one, I’d appreciate it if you would let me know. She won my admiration last year (no surprise to my regular readers) and has routinely found ever more ways to score points with me. She continues to step up and talk straight, smart and to the point about what’s going on with the banks and what to do about it. I look forward to the President acting on the banking reform plans announced earlier this year, and I sincerely hope he puts Bair in charge of the new entity.
For now, though, I have to go; seven more FDIC email alerts have landed in my inbox and I need to check ‘em out.
I had one of those odd moments yesterday regarding the banking industry that I wanted to share with you.
On the homepage of a major news website were two headline stories. The first was about how Ben Bernanke believes the recession we’re in is coming to an end. Immediately to the left of that story was the following headline: “Don’t be surprised to see more bank failures.” I don’t know if the site editors were funnin’ with us or just simply didn’t realize the irony in how they stacked the items, but it certainly caught my eye.
How can the recession be ending while more banks are expected to fail?
I’m not an economist, but I’m reasonably certain I don’t need to be in order to grasp the financial fundamentals of the situation. If the banking crisis is far from over, if there are still significant cash shortfalls that need to be flushed out of the banking system, how can we begin a recovery? And as though the contradictory stories weren’t enough to make me rush to my digital soapbox, there was another headline a short while ago that read “Banks’ commercial real estate exposure probed” with the subhead, “Delinquency rates on commercial loans have doubled in the past year.”
More bank failures expected, commercial real estate portfolios tanking at an accelerating rate… sure sounds like we’ve turned the corner to me, wouldn’t you agree?
I’m onsite at clients all of the time and one of my favorite pastimes is to spend time with the people who pretty much run their institutions be it from the front or backseat position and get their take on both the banking industry and state of the economy. These are the people who understand how a fractional increase in an interest rate can make or break an institution and see in the dense pile of numbers a pattern that must be very much like tea leaves. They know what they know and don’t much care for the headlines or industry pundits who tell us what to think. And so I look to them for guidance on what to expect and gauge where we are based on what they see.
They’re still freaked out.
One recent conversation was a mini-dissertation on the looming collapse of the commercial real estate market. There are empty storefronts everywhere you look and even emptier office buildings. How many construction sites sit idle with partially constructed buildings waiting for an infusion of cash to get them finished? What happens to the banks that provided the loans for these empty or incomplete structures? You now hold paper on structures that are worth much, much less than what you estimated and there’s no market to sell that paper or move those properties. What do you do next?
Another conversation was with someone who is about as expert as you get on residential real estate, and they shared their opinion that the worst is far from over. Too many saturated markets have failed to yield sufficient price reductions to bring things back into alignment, and that needs to happen before the healing can begin. That means there are more foreclosures looming on the horizon, which will only grease the slippery slope the banking industry is currently on. And when you factor in that President Obama has said there will not be any more bailout activities beyond what’s already been made available, you have to assume that we’re in for even more tough times ahead.
Again, I’m no economist but I get to shoot the breeze with some fairly bright bulbs and they’re not lining up behind Mr. Bernanke.
I’ll admit that I’m ready to see the light at the end of the recession tunnel. I’m ready to stop reading about bank failures and predictions of how many more are going to fail (is that even newsworthy anymore?) and start reading about how the industry is going to be regulated in the future to prevent this from happening again. Because the real story to me is that over a year has passed since this financial free-fall first started, and nothing has changed to keep it from happening again.
I suppose you can say I’m looking for closure of a different variety.
The recent news about a social engineering exercise gone awry serves as a lesson in how not to conduct these kinds of tests. An information security firm sent a credit union NCUA-branded media to install, to test whether employees would react appropriately and first attempt to validate that the request was legitimate. The problem was that no one had notified the NCUA of its role in the exercise, so when the institution contacted the agency, it assumed the threat was legitimate and issued a warning to all of its member credit unions.
At first blush, this story might appear amusing. Beyond some embarrassed people at the security firm conducting the social engineering exercise and some likely annoyed folks at the NCUA (a federal security alert is not a common or simple occurrence), it would appear you could chalk this up to “no harm, no foul.” From certain vantage points you might even consider it a wildly successful test. After all, the client reacted appropriately by contacting the NCUA, and the NCUA reacted appropriately by identifying the actions as unsanctioned and potentially harmful and alerting its member institutions. But for those of us who practice in the industry, this is far from amusing and actually somewhat disturbing.
When one of our fellow practitioners or firms does something that brings negative attention to the industry or conducts themselves in a way that results in a black eye, it extends to a certain degree to all of us. At some point in the process, I would have thought that someone at the offending firm would have reviewed a draft of the plan and flagged the part about involving an uninformed third party as unnecessary or inappropriate (and quite possibly illegal). And besides, grand and elaborate schemes aren’t really necessary. Most breaches that occur aren’t of the James Bond variety, so subtle tactics work best.
I’ve managed and conducted social engineering tests many times in the past, so I speak from experience. On one project, I had a renegade auditor who wanted to test data center physical security by trying to either force or talk his way into the facility. He was told in no uncertain terms that doing so was neither authorized nor acceptable, and that if he did it and was arrested (a very likely outcome), we would not bail him out and he would be fired immediately. It was just a bad idea and not even remotely necessary to test the related controls. And then I was reminded of a story in which a well-known national security firm had its practitioners dress up in firefighting gear and arrive at a bank branch claiming there was a possible fire/smoke condition, to see if they would be allowed access to private/protected areas of the bank. Of course they were granted access, and as a result they were written up in an industry magazine as innovative and imaginative.
Social engineering is intended to examine how the human element reacts to a variety of scenarios designed to gain access to sensitive information or secured areas. There are many, many simpler and less obvious techniques available to poke and prod and test the effectiveness of related controls. So why was this test even necessary?
The short answer is that it wasn’t. It was a bad idea in design and execution.
At a basic level, I don’t understand how this test was even conducted. A common element of any security engagement is to inform the key stakeholders of the plan so that things like this don’t happen. When the appropriate party was notified about the suspicious material received from the NCUA, they should have known what to do (beyond escalating to the NCUA). We inform the primary security contact of our activities so that they know not to escalate outside of their own institution. We provide specific start and end times, all key details, and status updates along the way.
The wrong messages are sent by these wayward tests. I’m thinking that credit unions will now require all sorts of crazy validation before trusting anything from the NCUA. I’m also concerned that the bank involved in the firemen scenario may not properly evacuate the facility in the event of a real fire because employees will wait for confirmation that it isn’t another test. Is that really the desired outcome of these exercises? Last year, I managed an exercise involving a phone-based phishing test. Two days after we concluded the fieldwork, I received a message from our client sponsor asking if we were still executing the test. It turns out they were the target of a legitimate phishing attempt, and because our activities had raised awareness, the situation was escalated appropriately. Doesn’t that make a bit more sense?