IT Compliance Advisor


August 18, 2009  4:53 PM

3 social media questions for compliance officers to consider

Guy Pardon

My recently published series on online privacy and social media compliance is resulting in some feedback from our audience, as you might imagine. Scott Crawford, managing research director for Enterprise Management Associates, posed three questions that I believe are useful for anyone working and using social media to consider.

Navigating these boundaries will be a tricky dance for all as advice and professional services are offered over social media platforms, whether they are Twitter, Facebook, LinkedIn or any other social network. Crawford's comments follow.

“I personally think one of the biggest issues with social networking will not be as cut-and-dried as a lot of these recommendations make it sound, however – namely:

  • What clearly distinguishes personal from professional information shared via social networking sites?
  • What are the boundaries of personal expertise and corporate IP? Can those necessarily be deduced in all cases where social networking is the vehicle?
  • How much of the individual’s personal identity overlaps with corporate identity?

This is likely a particular concern where personal expertise is the primary stock-in-trade of the enterprise, as with consulting organizations.

Despite the apparent conflict, social networks are popular outlets for consultants, for example, since they not only help promote personal expertise but showcase it in a very personal way.

Although you’ve offered some excellent examples of common sense distinctions between personal and acceptable corporate use, I suspect a number of cases will come forward that are not so clear-cut — and even where the case appears to be clear-cut, I would fully expect legal counsel to vigorously exploit any lack of clear distinctions that may be found in a particular case.”

If you have answers to Scott’s questions, please leave a comment, send feedback to editor@searchcompliance.com or @reply to @ITCompliance on Twitter.


August 17, 2009  9:22 PM

201 CMR 17 FAQ: Updates to Massachusetts data protection law

Guy Pardon

Earlier today, the Massachusetts Office of Consumer Affairs and Business Regulation (OCABR) issued an update to 201 CMR 17.00, the Massachusetts data protection law. The deadline for implementation has now been extended until March 1, 2010.

In an interview with SearchCompliance.com, Undersecretary Barbara Anthony stated that “consumer protections have not been weakened in this amendment. Monitoring, reviewing the scope of security measures – and encryption is still required if you are going to transmit resident PII over public networks. What we’ve tried to do here is to not impose additional burdens which weren’t involved in the consumer protections.”

With the permission of the OCABR, we are posting the FAQ released by the office in full and unedited below. We will post the rest of Anthony’s comments tomorrow, along with further analysis of the changes referenced below.

What are the differences between this version of 201 CMR 17.00 and the version issued in February of 2009?

There are some important differences between the two versions.

First, the most recent regulation issued in August of 2009 makes clear that the rule adopts a risk-based approach to information security, consistent with both the enabling legislation and applicable federal law, especially the FTC's Safeguards Rule. A risk-based approach is one that directs a business to establish a written security program that takes into account the particular business's size, scope of business, amount of resources, nature and quantity of data collected or stored, and the need for security. It differs from an approach that mandates every component of a program and requires its adoption regardless of size and the nature of the business and the amount of information that requires security. This clarification of the risk-based approach is especially important to those small businesses that do not handle or store large amounts of personal information.

Second, a number of specific provisions required to be included in a business’s written information security program have been removed from the regulation and will be used as a form of guidance only.

Third, the encryption requirement has been tailored to be technology neutral and technical feasibility has been applied to all computer security requirements.

Fourth, the third party vendor requirements have been changed to be consistent with Federal law.

To whom does this regulation apply?
The regulation applies to those engaged in commerce. More specifically, the regulation applies to those who collect and retain personal information in connection with the provision of goods and services or for the purposes of employment.

The regulation does not apply, however, to natural persons who are not in commerce.

Does 201 CMR 17.00 apply to municipalities?
No. 201 CMR 17.01 specifically excludes from the definition of “person” any “agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof.” Consequently, the regulation does not apply to municipalities.

Must my information security program be in writing?
Yes, your information security program must be in writing. The scope and complexity of the document will vary depending on your resources, and the type of personal information you are storing or maintaining. But, everyone who owns or licenses personal information must have a written plan detailing the measures adopted to safeguard such information.

What about the computer security requirements of 201 CMR 17.00?
All of the computer security provisions apply to a business if they are technically feasible. The standard of technical feasibility takes reasonableness into account. (See definition of “technically feasible” below.) The computer security provisions in 17.04 should be construed in accordance with the risk-based approach of the regulation.

Does the regulation require encryption of portable devices?
Yes. The regulation requires encryption of portable devices where it is reasonable and technically feasible. The definition of encryption has been amended to make it technology neutral so that as encryption technology evolves and new standards are developed, this regulation will not impede the adoption of such new technologies.

Do all portable devices have to be encrypted?
No. Only those portable devices that contain personal information of customers or employees, and only where technically feasible. The "technical feasibility" language of the regulation is intended to recognize that at this period in the development of encryption technology, there is little, if any, generally accepted encryption technology for most portable devices, such as cell phones, BlackBerrys, netbooks, iPhones and similar devices. While it may not be possible to encrypt such portable devices, personal information should not be placed at risk in the use of such devices. There is, however, technology available to encrypt laptops.

Must I encrypt my backup tapes?
You must encrypt backup tapes on a prospective basis. However, if you are going to transport a backup tape from current storage, and it is technically feasible to encrypt (i.e. the tape allows it) then you must do so prior to the transfer. If it is not technically feasible, then you should consider the sensitivity of the information, the amount of personal information and the distance to be traveled and take appropriate steps to secure and safeguard the personal information. For example, if you are transporting a large volume of sensitive personal information, you may want to consider using an armored vehicle with an appropriate number of guards.

What does “technically feasible” mean?
“Technically feasible” means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used.

Must I encrypt my email if it contains personal information?
If it is not technically feasible to do so, then no. However, you should implement best practices by not sending unencrypted personal information in an email. There are alternative methods to communicate personal information other than email, such as establishing a secure website that requires safeguards such as a username and password to conduct transactions involving personal information.
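For readers who want a concrete starting point, here is a minimal sketch of what encrypting personal information before it leaves your systems can look like, using the open source Python "cryptography" package's Fernet recipe. It is an illustration only, not something the OCABR FAQ prescribes; the sample record and the ad hoc key generation are assumptions for the example.

    # Minimal sketch: symmetric encryption of PII before transmission or storage.
    # Requires the third-party "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    # In practice the key would come from a managed key store rather than being
    # generated on the fly; key management is part of the safeguard.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Hypothetical record containing resident personal information.
    record = b"name=Jane Doe;ssn=000-00-0000;account=12345678"

    token = cipher.encrypt(record)    # ciphertext, safe to transmit
    original = cipher.decrypt(token)  # recoverable only with the same key
    assert original == record

The point of the sketch is simply that the regulation's technology-neutral language leaves the choice of tool to the business, so long as the personal information is rendered unreadable without the key.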

Are there any steps that I am required to take in selecting a third party to store and maintain personal information that I own or license?
You are responsible for the selection and retention of a third-party service provider who is capable of properly safeguarding personal information. The third party service provider provision in 201 CMR 17.00 is modeled after the third party vendor provision in the FTC’s Safeguards Rule.

I have a small business with ten employees. Besides my employee data, I do not store any other personal information. What are my obligations?
The regulation adopts a risk-based approach to information security. A risk-based approach is one that is designed to be flexible while directing businesses to establish a written security program that takes into account the particular business’s size, scope of business, amount of resources and the need for security. For example, if you only have employee data with a small number of employees, you should lock your files in a storage cabinet and lock the door to that room. You should permit access to only those who require it for official duties. Conversely, if you have both employee and customer data containing personal information, then your security approach would be more stringent. If you have a large volume of customer data containing personal information, then your approach would be even more stringent.

Except for swiping credit cards, I do not retain or store any of the personal information of my customers. What is my obligation with respect to 201 CMR 17.00?
If you use swipe technology only, and you do not have actual custody or control over the personal information, then you would not own or license personal information with respect to that data, as long as you batch out such data in accordance with the Payment Card Industry (PCI) standards. However, if you have employees, see the previous question.

Does 201 CMR 17.00 set a maximum period of time in which I can hold onto/retain documents containing personal information?
No. That is a business decision you must make. However, as a good business practice, you should limit the amount of personal information collected to that reasonably necessary to accomplish the legitimate purpose for which it is collected and limit the time such information is retained to that reasonably necessary to accomplish such purpose. You should also limit access to those persons who are reasonably required to know such information.
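If you do adopt a retention period as a matter of business practice, a simple sweep for records held past that period can help enforce it. The sketch below is a hypothetical illustration only; the directory name and the seven-year period are assumptions, and the regulation itself sets no maximum.

    # Hypothetical sketch: list files held longer than a chosen retention period.
    import time
    from pathlib import Path

    RETENTION_DAYS = 7 * 365  # a business decision, not a regulatory mandate

    def overdue(directory: str):
        cutoff = time.time() - RETENTION_DAYS * 86400
        return [p for p in Path(directory).rglob("*")
                if p.is_file() and p.stat().st_mtime < cutoff]

    for path in overdue("customer_records"):
        print(f"Past retention period: {path}")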

Do I have to do an inventory of all my paper and electronic records?
No, you do not have to inventory your records. However, you should perform a risk assessment and identify which of your records contain personal information so that you can handle and protect that information.
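As a rough illustration of what that identification step might look like for electronic records, here is a short Python sketch that flags text files containing patterns resembling Social Security or payment card numbers. The directory, file types and patterns are placeholders, not OCABR guidance, and a real risk assessment would also cover paper records, databases and access controls.

    # Rough sketch: flag files that appear to contain personal information.
    # Directory path, file extension and patterns are hypothetical examples.
    import re
    from pathlib import Path

    PII_PATTERNS = {
        "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "possible card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    }

    def scan(directory: str) -> None:
        for path in Path(directory).rglob("*.txt"):
            text = path.read_text(errors="ignore")
            for label, pattern in PII_PATTERNS.items():
                if pattern.search(text):
                    print(f"{path}: {label} found -- cover in security program")
                    break

    scan("shared_drive/exports")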

How much employee training do I need to do?
There is no basic standard here. You will need to do enough training to ensure that the employees who will have access to personal information know what their obligations are regarding the protection of that information, as set forth in the regulation.

What is a financial account?
A financial account is an account to which, if access is gained by an unauthorized person, an increase of financial burden or a misappropriation of monies, credit or other assets could result. Examples of a financial account are: checking account, savings account, mutual fund account, annuity account, any kind of investment account, credit account or debit account.

Does an insurance policy number qualify as a financial account number?
An insurance policy number qualifies as a financial account number if it grants access to a person’s finances, or results in an increase of financial burden, or a misappropriation of monies, credit or other assets.

I am an attorney. Do communications with clients already covered by the attorney-client privilege immunize me from complying with 201 CMR 17.00?
If you own or license personal information, you must comply with 201 CMR 17.00 regardless of privileged or confidential communications. You must take steps outlined in 201 CMR 17.00 to protect the personal information taking into account your size, scope, resources, and need for security.

I already comply with HIPAA. Must I comply with 201 CMR 17.00 as well?
Yes. If you own or license personal information about a resident of the Commonwealth, you must comply with 201 CMR 17.00, even if you already comply with HIPAA.

What is the extent of my “monitoring” obligation?
The level of monitoring necessary to ensure your information security program is providing protection from unauthorized access to, or use of, personal information, and effectively limiting risks will depend largely on the nature of your business, your business practices, and the amount of personal information you own or license. It will also depend on the form in which the information is kept and stored. Obviously, information stored as a paper record will demand different monitoring techniques from those applicable to electronically stored records. In the end, the monitoring that you put in place must be such that it is reasonably likely to reveal unauthorized access or use.
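To make "reasonably likely to reveal unauthorized access or use" slightly more concrete for electronic records, here is a hypothetical sketch that reviews an application access log for accounts reading an unusually large number of customer records. The log format, field names and threshold are all assumptions for illustration; the regulation does not prescribe any particular tooling.

    # Hypothetical sketch: flag accounts reading an unusual number of customer
    # records, from a headerless CSV access log (timestamp,user,record_id).
    import csv
    from collections import Counter

    THRESHOLD = 500  # records per day; tune to your own baseline

    def review(log_path: str) -> None:
        reads = Counter()
        with open(log_path, newline="") as handle:
            reader = csv.DictReader(handle,
                                    fieldnames=["timestamp", "user", "record_id"])
            for row in reader:
                reads[row["user"]] += 1
        for user, count in reads.items():
            if count > THRESHOLD:
                print(f"Review access by {user}: {count} customer records read")

    review("access_log_2009-08-17.csv")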

Is everyone’s level of compliance going to be judged by the same standard?
Both the statute and the regulations specify that security programs should take into account the size and scope of your business, the resources that you have available to you, the amount of data you store, and the need for confidentiality. This will be judged on a case by case basis.

This FAQ was posted with the direct permission of OCABR. For more information on the regulation, visit Mass.gov/consumer.



August 13, 2009  6:38 PM

Standards aren’t security: PCI compliance and Heartland’s data breach

GuyPardon Guy Pardon Profile: GuyPardon

As of Aug. 10, the Identity Theft Resource Center had reported 333 data breaches in 2009, exposing over 13 million records in the process. Given that context, it’s no wonder that information security professionals and compliance officers are receiving increased pressure and scrutiny from their executive teams about whether IT systems are truly secure.

As several recent essays on PCI compliance and security suggest, however, no one should be looking to standards or compliance audits alone to certify that an organization is protected against a data breach.


CSO senior editor (and former TechTarget-er) Bill Brenner's interview with Heartland CEO Robert Carr drove home precisely this issue. In the interview, Carr asserts that "the audits done by our QSAs (Qualified Security Assessors) were of no value whatsoever." That assessment, echoed elsewhere in the industry, is reflected in Eric Ogren's column from earlier this year, "Heartland breach highlights PCI limitations."

Rich Mogull, founder of Securosis, posted an "Open Letter to Robert Carr, CEO of Heartland Payment Systems" on his blog that questions the blame Carr places on external auditors. As Mogull points out, "PCI compliance means you are compliant at a point in time, not secure for an indefinite future … standards like PCI merely represent a baseline of controls." He recommends that executives "treat their PCI assessment as merely another compliance initiative — one that does not, in any way, ensure their security. As an industry professional I see all too many organizations do the minimum for PCI compliance."

Mogull's critique was followed by a more incensed response from security expert Mike Rothman, SVP of strategy at eIQnetworks and chief blogger at Security Incite. Rothman observed that "you cannot outsource security. An auditor or assessor is only there to substantiate the technical controls implemented to meet a regulation … any regulation is only the beginning of a comprehensive security program, and PCI is no exception."

All of this was preceded by a much-discussed essay by security consultant Nick Selby at Fudsec.com, "Showing the Oblomovs the door." Selby posits that the PCI Data Security Standard (PCI DSS) is a "Pyrrhic victory," suggesting that "well-intentioned businesspeople at PCI, seeing their money walk out the door at an exponentially increasing rate, thought they'd 'raise the bar' by setting forth some highly specific tasks. Unfortunately they were specific to a paradigm gone by, and those who don't comply get their credit card privileges popped. Thus have they managed not only to not raise the bar but in fact to substantially lower the ceiling — PCI is not the minimum standard, it's the maximum effort that many organizations make."

His "anti-compliance" rant drew substantive comments on the post from security analysts and professionals, including Verizon's Alex Hutton, on whether PCI DSS holds any value.

Are there any conclusions to draw from the discussion, timely as it may be given our recent publication of a PCI DSS FAQ?

Here’s a shot:

1. PCI compliance should be a baseline for security, not a ceiling.

2. PCI compliance does not equate to being secure.

3. PCI compliance does provide a checkpoint for auditors and an organization at a given point in time.

4. That checkpoint does not substitute for a holistic approach to risk management and security.

If you have more to add to the discussion, please comment, @reply to @ITcompliance on Twitter or send your thoughts to editor@searchcompliance.com.



August 5, 2009  2:13 PM

Compliance officers discuss business, IT alignment at ISACA conference

Guy Pardon

This guest post is from Joe Hewitt, an IT compliance specialist for American Honda Finance Corporation.  His views do not represent those of Honda, any of its divisions, or employees.

The 2009 ISACA International Conference held in Los Angeles had a much different feel than those of the past. While IT controls were consistently a primary talking point, the emphasis was on how to better align business and IT goals. Even though theoretical concepts like risk and the value of information technology were discussed at length, many of the presenters addressed real-world issues with respect to advancing along the compliance spectrum.

Oracle representatives Mark Sunday, CIO and SVP, and Gail Coury, VP of risk management, kicked off the festivities with a detailed and insightful keynote address that outlined the challenges of compliance amid heavy acquisition periods.  Attendees then proceeded to presentations along one of four tracks:

  1. IT governance
  2. IT compliance audit practices
  3. Information security management
  4. IT risk management and compliance

While useful information was abundant and widespread, here are some of the more interesting discussion points:

  • Risk is often counter-intuitive
  • Privacy regulations are here to stay…and will only become more strict
  • Reputation risk is increasing for all businesses
  • Financial return and value of governance is realized across silos, not from within them
  • IT should be used to reduce business costs, not IT costs
  • Acceptance of authority in younger generations has gone down, increasing the need for control automation
  • The current economic environment emphasizes the need for controls over fraud at every level
  • Business = Demand; IT = Supply
  • ACCOUNTABILITY IS KEY!

If controls are the key, governance is the lock

Much of the discussion centered on moving beyond creating a control environment and toward overall governance. With compliance budgets decreasing at a record pace, governance is the only way that auditors will be able to show the value of audit activities.

Risk was the real elephant in the room. Discussions concluded that, while we cannot fully eliminate risk in a cost-effective manner, implementing a monitoring or review process provides an eye-opening set of data for many businesses.

Even though attendance appeared to be down, the group was very diverse and included representatives from all over the globe.  ISACA members from international companies enlightened the group with unique and challenging regional issues.

Overall, the conference delivered as promised.  It had legacy theory, risk management theory, international diversity, and real-world solutions for almost any IT compliance issue.  ISACA continues to be on the cutting edge of IT governance.


August 4, 2009  2:55 PM

What online privacy expectations exist for social media use at work?

Guy Pardon

If you read Professor Jonathan Zittrain’s rebuttal on cloud computing to Bernard Golden at CIO.com today, you know that both agree that privacy is the No. 1 concern for cloud computing. Compliance officers have to worry about more than just privacy, of course, but protecting the private information of employees and customers alike is a crucial component of any enterprise-class security regimen.

Given, say, Twitter security risks, I knew the premise of SearchCompliance.com contributor Andrew Baer's recent tips on social media use in the enterprise held considerable merit: Social media platforms demand a clear employee Internet use policy.

[Image: "privacy is dead," by striatic via Flickr]

When it comes to the details, however, I was left with more questions than answers. I understand that as a lawyer and e-discovery expert, Baer is naturally risk-averse. Moreover, I recognize that he’s forgotten more about e-discovery and the law than I currently know as a journalist.

That said, Baer’s position on online privacy and the rights of the employer to access the online activity or posts of employees veers into more ambiguous territory. Baer writes that a “policy should also state prominently that employees have no expectation of privacy in anything they store or transmit using corporate IT resources or post on the Internet, and that the enterprise reserves the right to monitor all usage of IT resources and Internet postings without notice and does so periodically.”

I imagine most observers can agree that enterprises need to create a Web 2.0 usage policy that extends existing rules and reminds employees of established guidelines for electronic communications and expectations for online privacy. Such guidance is even more crucial in regulated environments, as explained in "Compliance concerns dog enterprise 2.0 collaboration software."

Baer acknowledges the privacy issue: “Monitoring employee Web 2.0 use and terminating or disciplining an employee based on that use can raise legal privacy issues if an enterprise’s Web 2.0 strategy is not well planned and administered.”

The bottom line, however, is that Baer’s advice to compliance officers would appear to extend far beyond IT compliance into something else that he appropriately calls “Big Brother”-like action. As Baer observes, “Some employers may not want to go this far, since policing what employees say outside of work may seem Orwellian and lead to image problems.”

Image problems may just be the tip of the iceberg. I’m left wondering what other e-discovery experts, attorneys, security experts and compliance officers think about online privacy in this context.

George Moraetes, an independent security consultant for Securityminders Inc. in Illinois, agreed via email with Baer that “employees should have no expectation of privacy in anything they store or transmit using corporate IT resources.”

Moraetes wrote “that is a correct assumption, most companies treat email the same way. Employees have separate accounts using own resources. The only way to assure privacy is to encrypt your transmissions, in addition to using aliases. Most users are not techies and lack sophistication. Many companies do not implement DLP and NAC systems, although this in itself will not stop it.”

Moraetes went on to describe the issue further:

“I demonstrated to the IRS a project back in 2004, the ability to leak information and not be caught. They told me they would catch anyone — or so they thought.

“In my demonstration to them, I advised that perimeter firewalls all must have ports 80 and 443 open bi-directionally. Otherwise, how would your staff and external users access resources? Obviously, when someone goes to Gmail or even Playboy their network captures and blocks them, reporting them to security — which is a serious offense. In saying that, I launched OpenVPN, communicating directly to my proxy/VPN server from Washington, D.C., to Chicago. I went anywhere that was prohibited and the internal traffic from their DLP systems could not detect or see me. There was nothing they could do about it. There are more ways to skin a cat to breach and leak out information, including Web 2.0 and using TweetDeck, email and the Web. Funneling encrypted traffic can bypass the majority of corporate systems.”

I'm writing an article about online privacy that will capture more viewpoints from other IT practitioners and e-discovery experts. If you have opinions about the use of social media on corporate systems and the online privacy expectations that surround them that you'd like to share, please comment here, @reply to @ITcompliance on Twitter or send them directly to ahoward@techtarget.com with instructions on whether you're willing to see them published.



July 29, 2009  9:59 PM

Government bodies’ dueling legislative answers to data protection laws

Sarah Cortes

When it comes to data security legislation, do you prefer the perspective of the White House, Capitol Hill or Beacon Hill? This is not a trick question.

While the White House refined its philosophy in the Cyberspace Policy Review (CPR) released in May, legislators in Washington had already introduced draft legislation in April embodying different approaches to data security.

The House of Representatives’ version, H.R. 2221, also known as the Data Accountability and Trust Act, appears to be a vehicle with which the executive and legislative branches of government will debate their differing cybersecurity philosophies. How those approaches differ could have a big impact on state laws.

The Cyberspace Policy Review focuses on long-term security policy and strategy rather than immediate solutions. We recently wrote about several significant recommendations from the report, which include:

  • A proposal to consider federal issuance of national authentication credentials, similar to a passport.
  • Increasing liability for failing to implement level-playing-field security controls.
  • A recommendation to align federal and state laws to eliminate confusion and contradiction.

The White House report, overseen by Melissa Hathaway, states that government legislation has been “focused on the particular issue or technology of the day” and that current law and policy is a “complex patchwork,” while recommending an “integrated approach that combines … flexibility … and the protection of civil liberties.”

Prescribing specific technical approaches and technologies such as encryption has already generated controversy in data privacy and security laws, including Massachusetts' 201 CMR 17.

One aspect that makes the Massachusetts regulations in their current form the most onerous or far-reaching in the U.S., depending on your point of view, is mandated 128-bit encryption. However, mandating specific methods and technologies could prove inflexible and rapidly become obsolete.

The White House report did not take a hard and fast position one way or the other, but its inclination is suggested in the CPR: "Privacy enhancing technologies such as encryption or controlled access authentication could ameliorate some risks in sharing information."

Meanwhile, HR 2221 defines encryption as:

“data in storage or in transit using an encryption technology that has been adopted by an established standards setting body which renders such data indecipherable in the absence of associated cryptographic keys necessary to enable decryption of such data. Such encryption must include appropriate management and safeguards of such keys to protect the integrity of the encryption.”

What are your views and concerns about state data protection laws vs. federal legislation or polices from the executive branch? Do you think encryption should be included? If so, what kind? I’d like to hear. Write to editor@searchcompliance.com or reply to @SecuritySources on Twitter.


July 29, 2009  2:27 PM

Cloud computing data security creates challenges for compliance officers

Scot Petersen

Cloud computing is just another form of outsourcing, and like outsourcing, it comes with its own set of risks and compliance challenges. As the data center begins to disappear into the cloud, data security tops the list.

But is encryption, specifically public key infrastructure, up to the task of protecting data that could reside anywhere? Will standards emerge that will govern the relationship between data owners and cloud service providers?

In this Compliance Advisor podcast, security expert Steven Ross discusses the compliance issues of the “disappearing data center” with SearchCompliance.com Executive Editor Scot Petersen.
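As a rough sketch of how public key infrastructure can protect data headed for a provider's cloud, the example below (using the open source Python "cryptography" package) encrypts a record with a one-time symmetric key and then wraps that key with the recipient's RSA public key, the so-called envelope pattern. It illustrates the mechanics behind the question raised above, not the podcast's conclusions, and the key sizes and sample record are assumptions.

    # Sketch of envelope encryption: data is encrypted with a one-time symmetric
    # key, and that key is wrapped with an RSA public key. Sizes are illustrative.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # The data owner would normally hold only the provider's (or its own) public key.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(b"customer record bound for the cloud")

    wrapped_key = public_key.encrypt(
        data_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # Only the holder of the private key can unwrap data_key and decrypt.

Whether a scheme like this satisfies a given regulation still depends on who controls the keys, which is exactly the governance question cloud contracts have to answer.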


July 22, 2009  2:29 PM

Compliance resources: Tips and news from around TechTarget

Guy Pardon

Did you know that TechTarget now has more than 60 different websites, each of which focuses on a different form of technology? You can find compliance resources on nearly every one of them.

As a former editor at WhatIs.com, I’m familiar with the thousands of tips, news stories and learning resources around the network. For the time-starved reader, especially a busy compliance professional, simply being aware of what compliance resources are available can be a challenge. Here’s the best of what you’ll find on our sister sites from the past months:

CIOs and compliance

On SearchCIO.com, senior news writer Linda Tucci writes that according to research consultancy Gartner, IT security jobs will morph into risk management. The work of our contributors and the IT practitioners we talk to here at SearchCompliance.com confirm this trend. The staff at SearchCIO.com also put together a briefing on enterprise risk management solutions for CIOs and a selection of information security and IT governance guides for CIOs.

SearchCIO-Midmarket.com's associate site editor, Kristen Caretta, also recently interviewed iRobot CIO Jay Leader. During the video interview, Leader addresses the importance of a solid IT strategy – no small issue for a midsized company that must maintain a high level of security and secrecy given its defense contracts.

Compliance in the cloud

Tucci is similarly focused on the compliance issues that are presented to the enterprise CIO considering cloud computing for data backup and storage. In addressing compliance requirements in cloud computing contracts, as Tucci makes clear, regulatory compliance requirements must be both expressly defined and then addressed – “or the data brought back down to earth.”

One of TechTarget’s newest websites, SearchCloudComputing.com, naturally has published stories on similar issues. In “Cloud computing skepticism: IT security and compliance,” research director Andi Mann explores whether security and compliance concerns in the cloud can be reconciled.

Compliance and Security

Over at SearchSecurity.com, you’ll find dozens of resources in its audit, compliance and standards topical section. You can watch instructional videos about testing PCI compliance requirement 11 or using IAM tools to improve compliance.

Recent news included coverage of MasterCard's increase in PCI compliance requirements for some merchants (Visa says it won't follow suit) and the growing risk of identity theft, in "Researchers predict SSNs, crack algorithm putting identities at risk."

Security expert David Mortman recently addressed the recent changes to HIPAA regulations that resulted from the HITECH Act in “HIPAA compliance: New regulations change the game.” Enterprise security teams charged with safeguarding PHI will find his insights useful. Mortman has also written this month about how to find virtual machines for greater virtualization compliance.

We’ve also partnered with SearchSecurity.com to produce both events and in-depth content like the recent log management e-book. Download the e-book (free registration required) to learn how automation can reduce the operational burdens of regulatory compliance.

SearchFinancialSecurity.com, given its focus on the financial industry, naturally features content to help security officers in that highly regulated vertical manage compliance. For instance, in “Tokenization and PCI compliance,” Ed Moyle explains what this relatively new technology may mean for the protection of sensitive credit card data. Our sister website also includes a video on Red Flags Rule compliance featuring John Carlson, senior vice president of regulatory affairs for BITS, a division of the Financial Services Roundtable.

Compliance and the channel

Our colleagues at SearchSecurityChannel.com are also covering the security aspects of compliance. As Neil Roiter writes in “Vulnerabilities, regulatory compliance drive data protection market,” while risk and vulnerability management are the two headings under which security spending often falls, the ultimate goal of both is data protection.

SearchSystemsChannel.com also features compliance coverage, in particular the specific U.S. laws and regulations that present compliance and security concerns for Microsoft Office SharePoint.

Compliance and storage

Over at SearchSMBStorage.com, contributor Kevin Beaver recently wrote about making sense of regulatory compliance and data storage for SMBs.

Feedback

If you found this roundup useful, please let us know at editor@SearchCompliance.com or at @ITCompliance on Twitter. If so, I’ll do it again in August.



July 21, 2009  5:58 PM

Freerisk financial risk modeling services challenge S&P, Moody’s

Scot Petersen

In the wake of the financial meltdown triggered by the subprime mortgage crisis in the fall of 2008, credit ratings agencies like Moody's and Standard & Poor's became the focus for some of the blame. Did they ignore key risk indicators that would have alerted investors much earlier to the house of cards that would come crashing down? In this Compliance Advisor podcast, Jesper Andersen, co-founder with Toby Segaran of Freerisk.org, discusses their open financial services project, which will offer data, algorithms and tools to perform financial risk modeling.

Find out the origins of Freerisk and its philosophy, its position on XBRL and how it plans to work with Moody’s and S&P to create a more transparent ratings process.


July 20, 2009  7:26 PM

Managing e-discovery and compliance: What would Eliot Spitzer do?

Sarah Cortes

E-discovery – or electronic discovery – has many technical aspects. Questions of available tools, case law, regulations and scope are critical. One of the most important and often overlooked elements, however, is managing e-discovery and compliance.

My tenure as a senior manager at Putnam Investments was marked by bizarre coincidences and a convergence of fate with the soon-to-be famous. Few chapters embodied all these elements as thoroughly as the following e-discovery anecdote, for reasons that are obvious now, but were less so in 2003.

On Monday, Nov. 3, 2003, Putnam Investments fired its CEO, Larry Lasser, following a probe into market timing. Eliot Spitzer, New York’s attorney general, and William Galvin, the Massachusetts state regulator, had brought significant pressure to bear regarding market timing charges.

Two weeks later, Spitzer issued a subpoena for Putnam documents, indicating in the process that criminal charges were being considered. From that day onward, senior managers at Putnam had a critical new IT project: managing e-discovery and compliance.

Unlike other IT projects, which include a feasibility analysis, budgeting and a decision-making process prior to kickoff, e-discovery really starts from subpoena receipt. Spitzer's reputation for a "take-no-prisoners" approach to investigations and prosecutions – not atypical of the situations many firms face during litigation – had implications for IT.

From the moment a subpoena is received, senior technology managers should be called in. From IT’s viewpoint, e-discovery then becomes a new IT project on the list that requires reprioritization of existing resources.

The first step in managing e-discovery is to assign an IT project manager. Given that this will be a high-risk project, a seasoned individual is required. That means either hiring a backfill candidate for an existing project, or canceling or delaying existing work. E-discovery is a good example of a project that has no real, measurable ROI. This is a handy data point for all those IT projects that you, the IT manager, have to argue for each year during the budgeting process. That process demands an ROI even for operating system, database and other major software upgrades, which are also projects that evade calculating an ROI.

The next step in managing e-discovery is stakeholder and requirements identification. While vendor or tool selection usually comes later in the process, for a specialized project like e-discovery, identifying requirements should be fast-tracked from Day One. Firms and experts specializing in e-discovery are crucial for this type of project, which typically will be handled only once in a company's lifetime – if you're lucky. Your staff is likely to lack experience with e-discovery, a reality best addressed by selecting an advisor immediately after selecting a project manager.

In the next post, I will address how to adapt standard project management techniques to the e-discovery project.

Questions? Write to editor@searchcompliance.com or reply to @SecuritySources on Twitter.


