Regulatory Reality

Aug 16 2010   2:43PM GMT

Data security risks in the new age of banking

David Schneier

Earlier this month, I blogged about my concerns regarding a drop-off in information security oversight by banking regulators. In this age of safety-and-soundness-first supervision, everything else comes second, if it comes at all. It’s more than a week later and I’m not feeling any better about things; as a matter of fact, I’m feeling measurably worse.

I participated in several conversations in which a recurring theme was the challenges presented by a surge in merger and acquisition activity.  It’s the other side of the banking crisis that doesn’t get as much press as it probably should.  Think about how this plays out: An institution acquires the assets of another institution and, in a remarkably short period of time, has to absorb that information into its own infrastructure so that it can properly service the accounts.  In a normal merger, this is an activity that would be planned out over several months, with all forms of testing involved before the official cut-over.  But we’re in an age where on Friday your account belongs to Bank A and by Monday it’s being managed by Bank B.  How much time is allowed to cut things over between the two separate infrastructures?  And when you consider that it’s rare for the two institutions involved to share a common banking platform, how do you seamlessly and accurately convert the customer data?
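To make the conversion problem concrete, here’s a minimal sketch of the kind of weekend cut-over script I’m describing. The pipe-delimited layout, the field names and the FIELD_MAP are all invented for illustration; no real banking platform’s format is assumed. The point is the reconciliation at the end: if row counts and balance totals don’t tie out between the two extracts, nothing goes live.

```python
import csv
from decimal import Decimal

# Hypothetical mapping from Bank A's core-platform export fields
# to Bank B's import layout -- names invented for illustration.
FIELD_MAP = {
    "CUST_NO": "customer_id",
    "ACCT_NO": "account_number",
    "CUR_BAL": "current_balance",
}

def convert_accounts(source_path, target_path):
    """Convert Bank A's pipe-delimited extract into Bank B's layout,
    reconciling row counts and balance totals along the way."""
    rows_in = rows_out = 0
    total_in = total_out = Decimal("0")
    with open(source_path, newline="") as src, \
         open(target_path, "w", newline="") as dst:
        reader = csv.DictReader(src, delimiter="|")
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for row in reader:
            rows_in += 1
            total_in += Decimal(row["CUR_BAL"])
            converted = {new: row[old] for old, new in FIELD_MAP.items()}
            writer.writerow(converted)
            rows_out += 1
            total_out += Decimal(converted["current_balance"])
    # The cut-over only counts if both sides reconcile exactly.
    assert rows_in == rows_out, "row count mismatch"
    assert total_in == total_out, "balance totals do not reconcile"
    return rows_in, total_in
```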

Back in my infrastructure days, I recall all too well the various activities involved with figuring, configuring and reconfiguring elements from disparate systems in order to determine the best way to bring them together.  There were delimited files extracted, spreadsheets created and all manner of repositories generated to analyze the data.  Back then we didn’t have CD/DVD burners as standard equipment to easily offload full repositories (we were handcuffed by 3.5″ floppies with a max of just over one megabyte of storage), or USB storage devices attached to our key chains.  Laptops weren’t yet pervasive, and it just wasn’t as easy to walk entire databases of customer data out the door without detection.  Circa 2010, it’s just so darn easy to take huge digital piles of sensitive information outside of the secured infrastructure.  Be it the result of overworked IT workers trying to meet deadlines, careless employees not realizing the sensitivity of the data on their laptops or people with actual ill intent, it’s rather simple for non-public personal information (NPPI) to find its way into the wrong hands.  And with the remarkable spike in all the merging and acquiring going on, the likelihood of a breach or data theft skyrockets.
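One cheap mitigation for exactly this exposure is to mask the NPPI before it ever lands in one of those working extracts. Here’s a rough sketch of the idea; the column names and the salt are made up for illustration, and a real deployment would manage the secret properly rather than hard-coding it.

```python
import csv
import hashlib

# Columns that carry NPPI in this hypothetical extract.
NPPI_COLUMNS = {"tax_id", "date_of_birth", "account_number"}

def mask(value, salt="per-project-secret"):
    """Replace an NPPI value with a salted one-way hash so analysts
    can still join and de-duplicate on it without seeing the real data."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def sanitize_extract(source_path, target_path):
    """Copy a delimited extract, masking every NPPI column on the way out."""
    with open(source_path, newline="") as src, \
         open(target_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in NPPI_COLUMNS & set(row):
                row[col] = mask(row[col])
            writer.writerow(row)
```

If the analysis repository then walks out the door on a laptop or a key-chain drive, what walks out with it is a pile of hashes, not customer data.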

And that’s only one part of the risk equation.

Every week, the industry publications are full of stories about cloud computing: the pooling of multiple computing resources, of which you use only the slices you need.  In this new age of mass storage and processing, you don’t build out an isolated subset of your infrastructure to handle specific processes; rather, you plug your process into the cloud and it simply uses what it needs.  I remember back in the ’90s, working for MetLife when they launched their first true e-commerce sites, how the company struggled to find ways to monitor all of the components necessary to deliver secure content to its customers.  There were typically a half-dozen handshakes required to process a request in either direction, and they all existed on different platforms running different software.  It was impossible to accurately measure each transaction, estimate load and response time, and calculate capacity needs.  At that time I wasn’t yet much concerned with security, but that would’ve been equally impossible to manage.  But at least you could isolate each tier in the infrastructure and identify where the transaction was flowing.  Now, with the cloud, you don’t even have that degree of control.  And when you consider that almost everyone I talk to about technology within the banking sector wants everything to run on the Web, even for applications that require only internal users, the risk factors increase exponentially.
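What made those multi-tier transactions at least traceable was that you could stamp a request at each hop it passed through; in the cloud, the hops may not even map to infrastructure you control. A toy sketch of that stamping idea, with invented tier names standing in for the half-dozen handshakes, looks something like this:

```python
import time
import uuid

def traced(tier_name):
    """Decorator that stamps each tier a request passes through, so you
    can reconstruct the path afterwards. Each tier's recorded time is
    cumulative: it includes everything at and below that tier."""
    def wrap(handler):
        def inner(request):
            start = time.monotonic()
            response = handler(request)
            request["trace"].append((tier_name, time.monotonic() - start))
            return response
        return inner
    return wrap

# Invented tiers standing in for the real handshake chain.
@traced("web")
def web_tier(request):
    return app_tier(request)

@traced("app")
def app_tier(request):
    return db_tier(request)

@traced("db")
def db_tier(request):
    time.sleep(0.01)  # stand-in for real work
    return {"status": "ok"}

request = {"id": str(uuid.uuid4()), "trace": []}
web_tier(request)
for tier, elapsed in request["trace"]:  # innermost hop prints first
    print(f"{tier}: {elapsed * 1000:.1f} ms")
```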

So now I’m wondering just how secure all this NPPI really is, with the constant rush to merge account information combined with corporate America’s push to move things onto the Web and into the cloud.

When I first started out in corporate IT well over 20 years ago, one of the managers had a sign hanging in his office that read, “If you don’t have time to do it right, when will you have time to do it over?”  Fast forward to 2010 and the same logic applies.  The only difference is that this isn’t about application programming but about the loss of data, and once that cat’s out of the bag, you can’t simply put it back in.
