IT Compliance Advisor


July 30, 2010  1:35 PM

When compliance-related best efforts for data archiving aren’t enough

Frank Ohlhorst

When it comes to compliance, regulations often dictate that an organization must demonstrate “best efforts” for archiving data. The term best efforts is vague, at best, and can mean different things to different people.

But for regulators, the term best efforts has its roots in the ability to retrieve and audit data. For CTOs, it means a backup and archiving platform. For CFOs, it means the lowest-cost solution that meets the minimum requirements. Defining best efforts in a meaningful fashion is usually a task that IT managers responsible for compliance technology find themselves assigned.

Luckily, those IT managers can dissect the term best efforts to figure out an applicable definition by keeping one other technology term in mind: e-discovery. The requirements behind e-discovery make it easy to see that best efforts must go beyond merely storing relevant information. The e-discovery process dictates that data must be archived securely in a protected fashion that supports auditing — the key word here being auditing.

For all intents and purposes, best efforts means much more than just archiving data. It also means the ability to retrieve the data in a relevant fashion, and that is where things start to get complicated. Retrieving the data, especially if it is years old, often requires access to the applications that can report on the data. This, in turn, means old email clients, accounting systems and other relevant applications must be maintained, as well as the platforms that support those applications.

This is a major challenge when one considers that audit windows can range from a few months to 20 years or more, depending on the type of data and the regulations that apply. So what does all of this mean?

Simply put, IT managers need to plan for the retrieval of data, not just its archiving. Luckily, technologies such as virtualization make the process a little easier today. When creating an archive, IT managers can do a physical-to-virtual conversion and store all of the needed elements as a virtual machine, which can be accessed at a later date using a hypervisor.
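Because auditors must be able to verify that archived data has not changed, a retrieval-ready archive should carry integrity metadata alongside the virtual machine images. Here is a minimal sketch in Python (file and path names are hypothetical) of building and later verifying a checksum manifest for an archive set:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def build_manifest(paths, manifest_path):
    """Record a SHA-256 checksum and size for each archived file,
    so a later audit can verify the archive has not been altered."""
    entries = []
    for p in map(Path, paths):
        digest = hashlib.sha256(p.read_bytes()).hexdigest()
        entries.append({"file": p.name, "sha256": digest, "bytes": p.stat().st_size})
    manifest = {"created": datetime.now(timezone.utc).isoformat(), "entries": entries}
    Path(manifest_path).write_text(json.dumps(manifest, indent=2))
    return manifest

def verify_manifest(manifest_path, directory):
    """Re-hash each archived file and compare against the stored manifest."""
    manifest = json.loads(Path(manifest_path).read_text())
    for entry in manifest["entries"]:
        actual = hashlib.sha256((Path(directory) / entry["file"]).read_bytes()).hexdigest()
        if actual != entry["sha256"]:
            return False
    return True
```

The manifest travels with the P2V image; at audit time, a clean verification run demonstrates the archived virtual machine is bit-for-bit what was stored.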

Frank Ohlhorst is an award-winning technology journalist, professional speaker and IT business consultant with more than 25 years of experience in the technology arena. He has written for several leading technology publications, including Computerworld, TechTarget, PCWorld, ExtremeTech and Tom’s Hardware, and business publications including Entrepreneur and BNET. Ohlhorst was also executive technology editor at eWEEK and director of CRN Test Center.

July 23, 2010  3:14 PM

How to meet compliance regulations with Windows Active Directory

Frank Ohlhorst

IT Compliance Advisor welcomes our new blogger, Frank Ohlhorst:

Meeting the needs of compliance regulations effectively means that IT staffers must be able to monitor and report on any activity traversing the network. Luckily for many Microsoft shops, the compliance beast has been tamed with the help of Microsoft’s Active Directory (AD), which can be extended to store many of the data elements associated with compliance requests.

What’s more, dozens if not hundreds of compliance tools that integrate with AD are readily available on the market. While that is good news for Microsoft Windows administrators, it is bad news for those looking to innovate. Simply put, AD plus compliance kills innovation.

How can this be? It’s simple: Many administrators are using compliance as an excuse to not deploy alternate capabilities. For example, take a moderately sized organization that wants to add a dozen Macs to the network to support the art department. The request goes in, and is immediately shot down because of a compliance issue — namely, the inability to apply policies to the Mac systems and report on activity, configuration and so on.

Pretty much the same thing can be said about Linux. Organizations looking to save bucks and deploy Linux are finding that compliance has become a powerful tool to prevent a deployment. Nowhere is this more true than on Windows Server networks using Active Directory.

The basic argument goes like this: “We can’t deploy the new desktop OS, because we are unable to monitor logons, apply policies, audit and report on compliance.” So, does that mean the end of non-Windows systems being attached to the network? Well, not exactly. AD proves to be extensible, allowing new leaf objects and data elements to be added. It takes a bit more than modifying AD, however, to handle compliance for non-Windows systems. In fact, it will take thinking outside of the Windows box.

Let’s look a little more closely at the problem. For a desktop PC to be compliant under the rules of PCI DSS, SOX and HIPAA, you will need to do a few things. At the top of that to-do list is authentication: You will need a way to maintain logon security whether or not the machine is connected to the corporate network. Next on the list is applying policies to the system, policies that enforce rules about access and the types of data available. After that, you will need methods to inventory, monitor and report on the system. Finally, you will need to audit the system, which includes looking at usage and history over a period of time.
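The four elements above can be sketched as a simple checklist evaluation. The field names below are hypothetical and purely illustrative; the point is only that missing any single element fails the whole check:

```python
# The four compliance prerequisites described above, applied to a host record.
REQUIRED_CAPABILITIES = ("authentication", "policy_enforcement", "monitoring", "auditing")

def is_compliant(host):
    """A host is compliant only if every required capability is in place;
    return the overall verdict plus whichever capabilities are missing."""
    missing = [cap for cap in REQUIRED_CAPABILITIES if not host.get(cap, False)]
    return (len(missing) == 0, missing)

# A fully managed Windows PC versus a Mac that was only joined for logon.
windows_pc = {"authentication": True, "policy_enforcement": True,
              "monitoring": True, "auditing": True}
bare_mac = {"authentication": True}
```

Running the check on `bare_mac` shows why AD-based authentication alone is not enough: policy enforcement, monitoring and auditing all come back as gaps.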

Miss any of these elements, and you will not be compliant. AD proves to be the perfect tool for backing compliance, and those leveraging AD will never want to see a non-Windows system on their networks. Where does that leave those non-Windows systems? Unfortunately, out in the cold.

But it doesn’t have to be that way. There is a solution to the problem, and we can once again thank Active Directory. There are a few products on the market that bring AD-based authentication to Linux, Unix and Macintosh systems, solving one of the biggest security issues of those systems (under the eyes of compliance). This is a good start.

However, authentication is only part of the puzzle. You will also need to enable policy enforcement and implement change management. In some instances, some of those same products will provide the answer. Finally, you will need to audit and report on those systems, and that is where a third-party product really pulls its weight.
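For the audit-and-report piece, the raw material on Linux and Unix systems is typically the system auth log. A toy sketch of summarizing logons per user for an audit report (the log lines are made up for illustration, in the style sshd writes on Linux/Unix; real reporting tools handle many more formats):

```python
import re
from collections import Counter

# Simplified syslog-style lines of the kind sshd writes on Linux/Unix hosts.
SAMPLE_LOG = """\
Jul 14 09:01:02 host1 sshd[201]: Accepted password for alice from 10.0.0.5 port 52111 ssh2
Jul 14 09:05:40 host1 sshd[244]: Failed password for bob from 10.0.0.9 port 52300 ssh2
Jul 14 10:12:01 host1 sshd[310]: Accepted publickey for alice from 10.0.0.5 port 52999 ssh2
"""

LOGON_RE = re.compile(r"(Accepted|Failed) \S+ for (\S+) from (\S+)")

def summarize_logons(log_text):
    """Count successful and failed logons per user for an audit report."""
    summary = Counter()
    for result, user, _src in LOGON_RE.findall(log_text):
        summary[(user, result)] += 1
    return summary
```

A commercial product layers retention, tamper protection and scheduled reporting on top of exactly this kind of aggregation.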

So the moral of the story is to not let IT staffers pooh-pooh the possibility of integrating Linux, Unix and Mac systems into the enterprise, and to begin researching products such as Likewise Enterprise, Quest Authentication Services and Centrify’s DirectControl. Currently, Likewise Enterprise appears to have all the bells and whistles anyone could need, and it includes compliance reporting built right into the product.



July 14, 2010  6:29 PM

Security professionals: How will Mass. data privacy law be enforced?

Paul F. Roberts

IT Compliance Advisor welcomes our newest blogger, Paul F. Roberts:

I recently had the pleasure of speaking to a group of security professionals in New York about Massachusetts’ toughest-in-the-nation data privacy and protection law. It was one of those mutually beneficial events that sometimes comes along: New York security professionals learned a little more about the guts of the Massachusetts law, and I got to pick their brains about what the law means for their employers, which rank as some of the largest IT shops in the nation.

My takeaway: Folks are only now starting to pay attention to this law and are very anxious about one big question — its enforcement.

There’s good reason for this concern. While the data protection law has been on the books for a couple of years, specific guidance on implementing it (201 CMR 17.00) just took effect at the beginning of March. The law’s passage was the culmination of a long and contentious fight among business leaders, state legislators and regulators over the scope and provisions of the law.

But now that 201 CMR 17.00 is “live,” the focus has shifted to the question of enforcement, as organizations with customers in Massachusetts try to divine how this law is different from all other laws. The questions and comments I fielded from top IT security practitioners in New York suggested there is lots of grey area. Here are some areas where enforcement actions by the Massachusetts AG can add some color.

Will there be any enforcement of this law, and if so, what for?

This is the big question. Word is that the state attorney general’s office is looking into violations of Massachusetts General Law (M.G.L.) 93H, but no actions against specific organizations or individuals have yet been taken. One likely possibility is that enforcement will follow disclosure of a breach, in accordance with M.G.L. 93H, or after details of a breach have been made public. Failure to comply with 201 CMR 17.00 could then be used to punish firms retroactively. The Massachusetts Office of the Attorney General declined to comment on the question of enforcement.

Who is covered by the Massachusetts data protection law?

The guidance offered by 201 CMR 17.00 is pretty clear about the fact that this law applies to both individuals and corporate entities that manage data concerning Massachusetts residents, including both employees and customers. But legal experts who follow the law say there’s still considerable uncertainty about which entities will be the focus of enforcement actions — companies that manage consumer data, or just their own employees’ data, or both? According to one attorney at a prominent Boston law firm, “we still see the basic ‘We don’t have consumers — do we really have to comply with this?’ question.”

A key question is what kinds of data will get the attention of law enforcement. Mega breaches affecting consumers, like the breach at TJX, are at the root of M.G.L. 93H. There is no reason, however, to think that regulators won’t take an equally tough stand on companies that are loose with employee data.

Also unclear is whether those charged with enforcing the requirements in 201 CMR 17.00 will focus on large corporations with customers in Massachusetts, or on smaller in-state firms first. The attorney I spoke with said that if a case involving an out-of-state entity presents itself (such as a major data breach), the AG has made clear that she will enforce the regulations in order to protect the interests of the affected residents of the commonwealth. This means that out-of-state firms are at risk of making Massachusetts’ law a de facto national standard — at least until a tougher state law comes along.

What about mobile devices?

Of the eight IT-focused requirements in 201 CMR 17.00, one of the most contentious involves the security of wired and wireless devices (including mobile devices) that contain information on Massachusetts residents. The IT pros I spoke with were understandably nervous about this one, and for good reason.

Many large enterprises are in the early stages of tracking and managing employee mobile devices. Yes, there are systems in place to enforce basic policies, but it’s an imperfect art and nobody I spoke with would say for sure they know what devices employees are using to check their email, or to log into work applications. With poor visibility into their mobile infrastructure, it’s hard to say which devices do and don’t contain personal information covered under M.G.L. 93H.

To ease tensions with the private sector, legislators in Massachusetts inserted the idea of “technically feasible” into the wording of the Massachusetts data privacy law concerning the security of data on mobile and wireless devices. This means that if there is a “reasonable means through technology to accomplish a required result, then it must be used.” What is a “reasonable means through technology?” That’s right, you ask the attorney general.

Is there any safe harbor for companies?

There was much disagreement on this among members of the New York audience, with IT security pros relating different messages from their own corporate counsel. In some cases, the opinion seems to be that encryption of personal information constitutes safe harbor from prosecution. In others, there’s a belief that if organizations take reasonable steps to protect customer data, such as layered security protections, they’ll have shown due diligence.

The attorney I spoke with said that companies can get safe harbor from M.G.L. 93H by encrypting covered data, and by complying with the many requirements of 201 CMR 17.00. But, like other regulations, organizations can have no “safe harbor” from the law itself. They can only be in compliance or out of compliance with it.

Paul F. Roberts is a senior analyst at The 451 Group in New York. Let us know what you think about the post; email editor@searchcompliance.com.


May 24, 2010  7:25 PM

Paychex risk management analysis method shoots and scores

Scot Petersen

Final Four bracket pools are not just for basketball fans anymore.

In an unusual risk management analysis methodology, payroll and human resource services provider Paychex breaks down its risk factors into four regions and pits them against one another.

Frank Fiorille, director of risk management for Paychex, presented an overview of the company’s risk analysis process at the 18th Edition SOX Compliance & Evolution to GRC conference in Boston on May 17.

In this version of March Madness, the Final Four consists of financial risks, hazard risks, strategic risks and operational risks. In the example Fiorille presented, each region is assigned 16 risks, which then compete against one another in live votes among the company’s leaders to determine the risk champion in each region.

The brackets are whittled down to a Sweet 16, four in each region, before being put through more rigorous tests, vetting and quantitative analysis, and then ranked again on heat maps.

“Risk management is part art and part science,” Fiorille said, “but the business units know their risks the best.”


May 17, 2010  8:04 PM

Using personally identifiable information is gonna cost you

Linda Tucci

The era of businesses playing fast and loose with people’s personally identifiable information (PII) has passed — and not because of standards like PCI DSS or compliance mandates. The public at large is awakening to the reality that information is currency.

This is something that CIOs, of course, have known for a long time. IT executives owe their livelihoods to the fact that there is barely a company in the world that doesn’t do business in this material known as information.

Now the rest of us — from computer cave dwellers like me to the oversharing Facebook generation — are on to the fact that our PII comes at a price. And, one way or another, companies will pay up. The uproar over Facebook’s shape-shifting privacy “rules” and the anger in Europe over Google’s collection of private data are two current and noisy examples.

To get a sense of the change in public attitude in a few short years, consider the evolution of the Netflix Prize. Back in 2006, the company that changed the way people consume movies announced an open competition to improve Netflix’s algorithm for predicting which movies its customers might like to watch based on their past viewing habits. In September 2009, to breathless media reviews about tapping into the wisdom of the smart crowd, Netflix awarded the $1 million prize to BellKor’s Pragmatic Chaos, a seven-man (yes, man) multinational team of computer scientists and machine learning experts, and promptly announced a second contest. By March, the Netflix Prize 2 was called off. Netflix’s chief product officer, Neil Hunt, reported that the company had decided not to pursue round 2 after reaching “an understanding” with FTC investigators and settling a class action suit on whether the contest violated customer privacy. The investigation and suit were prompted by a research study by two University of Texas at Austin scientists showing that the anonymized Netflix Prize data set was not so anonymous.

Gabriel Helmer, an attorney and privacy expert at Boston firm Foley Hoag, said the reaction to the second contest points to two important issues related to data privacy: re-identification and a company’s data privacy policy. For the second contest, Netflix promised contestants access to “a lot more demographic information,” and that information would be hosted in the cloud.

“You can take somebody’s name off their personal data, but the more personal information you provide, the easier it is to re-identify that person. Anonymized is never truly anonymous,” Helmer said. The FTC started to investigate because it wanted to know what Netflix told its customers their personal data would be used for when they turned it over.

“Most likely, Netflix did not say they would take the name off and give that personal information to the entire world in order to create a better algorithm,” Helmer said.
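Helmer’s re-identification point is easy to demonstrate with a contrived example: joining “anonymized” records to a public dataset on shared quasi-identifiers recovers the names that were stripped out. All of the data below is made up:

```python
# "Anonymized" ratings: name removed, but ZIP code and birth year retained.
anonymized = [
    {"zip": "02139", "birth_year": 1970, "rating": "5 stars"},
    {"zip": "10001", "birth_year": 1985, "rating": "1 star"},
]

# A separate public dataset (e.g., a voter roll) with the same quasi-identifiers.
public_records = [
    {"name": "J. Smith", "zip": "02139", "birth_year": 1970},
    {"name": "A. Jones", "zip": "10001", "birth_year": 1985},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on (zip, birth_year) to recover identities."""
    index = {(p["zip"], p["birth_year"]): p["name"] for p in public_rows}
    result = {}
    for r in anon_rows:
        key = (r["zip"], r["birth_year"])
        if key in index:
            result[index[key]] = r["rating"]
    return result
```

The more attributes the released data retains, the more unique each record becomes, and the easier this join gets; that is exactly the weakness the Texas researchers exploited.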

Helmer finds the case a “fascinating example” of the strengths and weaknesses of cloud computing — of the enormous gains that can be realized by making real data available for analysis to large groups of people, along with the obvious dangers of doing that. As a consumer, he said he likes that companies will do the work to tell you which media or products you might like to consume. But the people who brought the class action suit against Netflix are realizing that those services “come with a price.” They are demanding that the price for personally identifiable information be borne by the business, whether it means paying for personal information in exchange for a service, or guaranteeing the data will remain private, or ceasing and desisting. But the lawyer says it is still early days for knowing how such transactions will play out.

“The reason why it is such an exciting time is that people really have not decided what they will put up with, and what they like and don’t like about personally identifiable information,” he said.

His bet? Just as technology got us into this quandary, technology will quickly point us the way out. It will likely do so in the form of biometric scans or ways to identify people other than using a Social Security number, an address or your mother’s maiden name, much of which is already widely available on the Internet.


May 8, 2010  7:50 PM

Financial reforms won’t fix the computer terrorism on Wall Street

Linda Tucci

I am not the only one who wondered if the stock market “jitterations” Thursday were caused by an act of computer terrorism. Like a lot of people apparently, I pondered whether the theoretical fat-fingered trader sitting at his desk deliberately applied a fat finger to the wrong key to cause mayhem in the markets.

But it doesn’t really matter whether fat fingers, clumsy or deliberate, were the cause of the 1,000-point stock plunge. The threat to capitalism is upon us, and it doesn’t have much to do with a typo. When the anomaly occurred, some of the super-fast computer systems that pounce on such deviations from the norm whirred into action, impervious to electronic checks and balances. Billions of dollars were lost and made in a matter of minutes.

The Obama administration is investigating the “unusual market activity” with a focus on the disparate rules of the various trading platforms. Congress also wants a review of the selloff. The event may hasten action on financial reforms. The government powers investigating the event might find out that the computer trading systems were in fact tricked into responding by a big bank or hedge fund or some other financial terrorist looking to make a killing.

But whatever the discovery — or remediation — that comes about as a result of the May 6 stock crash will pale beside what could be in store. At some point, it will occur to many people that the super-fast computer systems that pounce on anomalies, whether governed by effective or ineffective rules, are in the hands of a few. The means of production of very big profits are controlled by a very rich few. And when that reality sinks in, Marx my words, many middle-class working schmoes like me who pay taxes, keep up with mortgage payments and invest a portion of their bonusless salaries to fund their precarious retirements will decide that this capitalist game has run its course.


May 3, 2010  4:05 PM

Getting serious about PCI DSS compliance

Linda Tucci

A survey of Qualified Security Assessors (QSAs) on how businesses are dealing with the 12 mandatory requirements of the Payment Card Industry Data Security Standard (PCI DSS) contains a number of interesting nuggets. For example, according to assessors:

  • The largest merchants spend an average of $225,000 just on auditor expenses to comply with PCI DSS standards.
  • Only a tiny percentage of organizations (2%) fail a PCI audit, but 41% rely on compensating controls, alternative mechanisms that satisfy the intent of a PCI DSS requirement when the prescribed control can’t be implemented, to pass the audit.
  • Restricting access to card data (Requirement No. 7) is the most important PCI DSS compliance requirement, but also the most difficult to achieve.
  • Firewalls and encryption trump website sniffers, credentialing systems and intrusion detection/prevention systems for achieving compliance.
  • Encryption is the best way to ensure end-to-end protection of card data (60%), but tokenization is gaining ground (35%).

QSAs are the certified auditors who validate PCI DSS compliance at large merchants and service providers. PCI DSS is the set of baseline security requirements introduced in 2006 for businesses that accept and process credit cards. The findings are from PCI DSS Trends 2010: QSA Insights Report.

But what is the message for IT security organizations in the survey?

Four years after the standard was introduced and many credit card security data breaches later, a great many businesses are still operating under the delusion that data security is something for the IT security detail to take care of, rather than a core business initiative.

In fact, according to assessors, while business units (40%) or legal departments (28%) most often own the budget for annual compliance assessments, IT security is most often responsible for ensuring compliance (30%). The organizations the assessors deal with are not making data security a strategic priority (42%); not proactively managing data privacy and protection (51%); are overwhelmed by the cost of compliance (54%); and don’t believe PCI DSS compliance improves data security (44%).

Maybe it’s time to do the equivalent of what women did way back in the last century (burning bras) to make the business understand that data security is their business. IT security liberation!

A note on the survey: The 155 QSAs participating in the study were culled from 3,005 respondents who had to be certified or working toward a certification. On average, the QSAs surveyed had participated in eight PCI DSS assessments over the past 12 months. Some 59% of the assessments involved Tier 1 merchants and another 28% involved Tier 1 service providers, or the largest acceptors and processors of card transactions. (Tokenization refers to the process of replacing sensitive data with unique identification symbols that retain all the essential information without compromising its security; e.g., replacing all but the last four digits of your credit card number with alphanumeric symbols representing miscellaneous cardholder information and details about the current transaction.)
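The tokenization process described in that note can be sketched as a toy token vault. This is not a production design (real systems keep the vault encrypted under strict access controls, and the format shown is hypothetical), but it shows the core idea: the token preserves the last four digits while the vault alone can map it back to the real card number.

```python
import secrets

class TokenVault:
    """Replace a card number (PAN) with a random token; only the vault
    can map the token back to the real number."""

    def __init__(self):
        self._vault = {}  # token -> real PAN (in practice, encrypted storage)

    def tokenize(self, pan):
        # Random token that retains only the last four digits of the PAN.
        token = "tok_" + secrets.token_hex(8) + "_" + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token):
        return self._vault[token]
```

Downstream systems can store and display the token freely; a breach of those systems exposes no usable card data, which is why assessors report tokenization gaining ground.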


April 27, 2010  3:58 PM

Principal agent risk needs to be in your risk management model

Linda Tucci

The courts will eventually determine whether the profiteers at Goldman Sachs who spun toxic securities into gold were extremely skilled players in the legal gambling dens of Wall Street, or whether they rigged the house (mortgages). Meantime, even companies that are not being sued for fraud by the U.S. Securities and Exchange Commission might want to bone up on principal agent risk — and waste no time in making it part of their risk management models.

I got a primer on principal agent risk a couple months ago from risk expert Ali Samad-Khan, the CEO of Stamford Risk Analytics and a keynote speaker at the Risk Management Association’s GCOR IV operational risk seminar in Boston. Principal agent risk speaks to the reality that employees of a company are sometimes in the position to do things that are not in the best interest of the organization. Managers who are agents don’t always do things that are in the best interest of the stakeholders or the principals.

For example, if an employee’s bonus is based on the amount of money he makes, as opposed to the amount of money to be made on a risk-adjusted basis, Samad-Khan explained, that employee can end up making a huge amount of money by taking a huge amount of risk. When thinking about principal agent risk in terms of payout metrics, he advised his audience of risk managers to ask two questions: Who is the intended beneficiary, and who is the intended loss sufferer? In a criminal event, where the perpetrator is intent on benefiting himself, there is a clear intended loser. It’s a zero-sum game. So, how is that situation different from principal agent risk? When an employee of a firm takes excessive risk, the intent is usually not to harm the firm, he said. The intent is not to harm anyone but to benefit the firm and himself.

“But if you look at the distribution of all potential outcomes of this excessive risk-taking, the expected value of all the outcomes is negative. What does this mean? It means that you know full well that what you are doing is not in the best interest of the firm on a risk-adjusted basis. It means this transaction you are engaging in destroys value, and yet you go ahead and do it because it benefits you,” Samad-Khan said.
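Samad-Khan’s expected-value argument can be made concrete with toy numbers. The probabilities and payoffs below are entirely hypothetical; the structure is what matters: the trader’s expected payout is positive even though the firm’s expected outcome is negative.

```python
# Hypothetical payoff distribution for an excessively risky trade.
# Each outcome: (probability, firm_pnl, trader_bonus)
outcomes = [
    (0.60,  10_000_000, 1_000_000),  # trade pays off: firm gains, trader gets a bonus
    (0.40, -25_000_000, 0),          # trade blows up: firm eats the loss, trader loses nothing
]

# Expected value for the firm is negative: the trade destroys value
# on a risk-adjusted basis...
firm_ev = sum(p * firm_pnl for p, firm_pnl, _ in outcomes)

# ...while the trader's expected bonus is positive: heads the trader wins,
# tails somebody else loses.
trader_ev = sum(p * bonus for p, _, bonus in outcomes)
```

With these numbers the firm’s expected value works out to about -$4 million while the trader’s expected bonus is $600,000, which is exactly the misalignment the payout metric questions are meant to surface.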

“In an environment where compensation is frequently on a ‘heads-I-win-tails-somebody-else-loses’ basis, you wind up with an environment where there is a lot of principal agent risk,” Samad-Khan said. Principal agent risk, which is not currently even part of the standard risk taxonomy, must be factored into risk management models.

Making piles of money on a risk-adjusted basis is fine, provided the risk model is not intent on doing harm. Making piles of money by taking excessive risk that could harm the principals and the shareholders is not okay.

Of course, when the agents are the principals, and the shareholder being harmed is the American taxpayer, that’s a matter for the courts.


April 16, 2010  7:58 PM

Follow the money in GRC management platforms

Linda Tucci

I’ll start with the possibly infuriating hypothesis: There’s money to be made from governance, risk and compliance (GRC) software by vendors, of course, but also for enterprise IT shops. And it is probably not in the standalone GRC management platforms that focus on documentation, but in platforms that focus on automated controls and continuous monitoring of risk.

SAP and Oracle have sniffed out where the money is in GRC, and maybe IBM. (Thomson Reuters, the global information company that acquired Paisley in 2008, and Wolters Kluwer, which bought Axentis last year, are also betting on a GRC jackpot, but it’s a different pot from the one pursued by the ERP players.)

To back up: The marketplace for GRC management platforms is premised on the belief that the problem with enterprise governance, risk and compliance is that the three areas are typically handled in separate parts of the enterprise and shouldn’t be. Coordinating the many compliance obligations that an enterprise faces — to reduce redundancy, improve efficiency, etc. — is hard. Coming up with an effective governance structure for the legions of corporate employees who have some hand in mitigating risk and meeting compliance requirements is also hard. GRC management platforms promise to identify, coordinate and document all the IT, operational and financial functions involved in GRC.

Compliance is good. Documentation is good. Fixing is even better. Back to the money part.

ERP vendors Oracle and SAP are upping their stakes in the GRC market. Oracle, which has offered GRC software for a number of years, announced in December its Oracle Enterprise GRC Manager (the acquired Stellent product re-architected to run on Fusion middleware) and its latest release of Oracle Enterprise GRC Controls for, if it should say so itself, “a unique, closed-loop approach to regulatory compliance, risk management and controls automation.” SAP’s focus in GRC has always been on automated controls rather than documentation, claiming that extensive manual reporting is unnecessary if problems are corrected as they arise. When this kind of continuous controls monitoring is applied to enterprise performance for the purpose of detecting and preventing violations of business rules — duplicate or late payments or out-of-warranty claims, for example — companies are not only staying in compliance, they are saving and making money. If, as some analysts believe, this is where the GRC market is heading, SAP and Oracle have a leg up — at least among their customers.
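The continuous controls monitoring idea, catching business-rule violations such as duplicate payments as they occur, can be sketched as a simple detector. The payment records below are hypothetical; a real controls product applies hundreds of such rules against live transaction streams:

```python
from collections import defaultdict

def find_duplicate_payments(payments):
    """Flag payments that repeat an earlier (vendor, invoice, amount) key,
    the kind of business-rule violation continuous controls monitoring
    is meant to catch before money goes out the door twice."""
    seen = defaultdict(list)
    duplicates = []
    for pay in payments:
        key = (pay["vendor"], pay["invoice"], pay["amount"])
        if seen[key]:
            duplicates.append(pay)
        seen[key].append(pay)
    return duplicates

payments = [
    {"id": 1, "vendor": "Acme", "invoice": "INV-100", "amount": 5000},
    {"id": 2, "vendor": "Acme", "invoice": "INV-101", "amount": 7500},
    {"id": 3, "vendor": "Acme", "invoice": "INV-100", "amount": 5000},  # duplicate
]
```

Blocking payment number 3 at submission time is where the money-saving claim comes from: the violation is prevented rather than documented after the fact.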

That’s the hot air. Here’s the plan to bring the argument down to earth.

Starting soon, SearchCompliance.com will devote a section to new products and product developments in GRC, not just from the names mentioned here but also from the flotilla of vendors (about 40) that address the many aspects of this $30 billion spend. (Suggestions for a catchy name for our product news section, preferably with a double entendre, are welcome.) As we follow the products, we hope it will become clearer to you (and us) where GRC technology is headed and how these products can not only help keep your company out of jail but also contribute to its bottom line.

Meantime, rebuttals to the follow-the-money hypothesis are welcome. As well as any inside tips on which GRC vendors will end up owning the space. (I already got one from someone who’s followed Oracle for two decades. To paraphrase: Once they’ve made up their mind to go after this space, they’ll crush everyone in their path.)


March 31, 2010  3:52 PM

Coalition pushes ECPA update for online privacy in cloud computing age

Guy Pardon

A powerful collection of organizations has formed a new coalition to push for an update to the Electronic Communications Privacy Act (ECPA). Members of the coalition include Google, Microsoft, AT&T, AOL, Intel, the ACLU and the Electronic Frontier Foundation. The guidance from the coalition would enshrine principles for “digital due process,” online privacy and data protection in the age of cloud computing within an updated ECPA.

“The traditional standard for the government to search your home or office and read your mail or seize your personal papers is a judicial warrant,” said Jim Dempsey, vice president for public policy at the Center for Democracy & Technology, who has led the coalition effort. “The law needs to be clear that the same standard applies to email and documents stored with a service provider, while at the same time be flexible enough to meet law enforcement needs.”

Updates to ECPA may also be relevant to the public sector, as CIOs answer the call to arms issued by U.S. CIO Vivek Kundra for cloud computing services. Cloud computing compliance figures to be a growing concern as enterprises turn to rapidly maturing Google Apps.

The coalition has set up a website, DigitalDueProcess.org, containing its proposals for updating ECPA in the face of new cloud computing security and online privacy challenges. Google Public Policy released a video describing the concept of “digital due process”: http://www.youtube.com/v/AYYjr3XNaGs

At issue is the reality that for many consumers and enterprises, sensitive data in email and other electronic communications no longer resides solely on a hard drive at home or in the office. “The statute was passed in 1986 and doesn’t reflect how people use online services today,” said Microsoft spokesman Mike Hennessey. “The U.S. Constitution protects data at home and on your computer at a very high standard. We don’t believe that that should be turned on its head.”

As more and more people embrace the benefits of cloud computing, there are challenges in terms of compliance, as well as friction in terms of law enforcement access. Congress has heard testimony on location-based services and online privacy as usage of mobile social networks has exploded.

“The majority of court decisions have found that the government needs to get a warrant if it needs to track citizens in real time,” said Kevin Bankston, a senior staff attorney at the Electronic Frontier Foundation.

The coalition is pushing for a set of simplified standards that defines legal protection for data in the cloud, and that also addresses the increasingly blurred distinctions among data provided through GPS, cell sites and network triangulation.

“The reality is that technology has advanced over the last 20 years, so that better technology and intrusive technology has become available to both government and consumers,” said Catherine Sloan, vice president of government relations at the Computer & Communications Industry Association. “When you’re talking about the government, folks don’t have a choice of whether to deal with the government or not. They can’t just change a provider. That’s why the statute, in terms of government, needs to be updated.”

The coalition has issued principles for updating the ECPA that would define the rules for government access to email and other files stored online. These include requirements that:

  • “A governmental entity may require an entity covered by the ECPA (a provider of wire or electronic communication service or a provider of remote computing service) to disclose communications that are not readily accessible to the public only with a search warrant issued based on a showing of probable cause, regardless of the age of the communications, the means or status of their storage or the provider’s access to or use of the communications in its normal business operations.”
  • “A governmental entity may access, or may require a covered entity to provide, location information regarding a mobile communications device only with a warrant issued based on a showing of probable cause.”
  • “A governmental entity may access, or may require a covered entity to provide, prospectively or in real time, dialed number information, email to and from information or other data currently covered by the authority for pen registers and trap and trace devices only after judicial review and a court finding.”
  • “Where the Stored Communications Act authorizes a subpoena to acquire information, a governmental entity may use such subpoenas only for information related to a specified account(s) or individual(s).”

The complete list of coalition members includes: ACLU, American Library Association, Americans for Tax Reform, AOL, Association of Research Libraries, AT&T, Center for Democracy & Technology, Citizens Against Government Waste, Competitive Enterprise Institute, Computer and Communications Industry Association, eBay, Electronic Frontier Foundation, Google, Information Technology and Innovation Foundation, Integra Telecom, Intel, Loopt, Microsoft, NetCoalition, The Progress & Freedom Foundation and Salesforce.com. The coalition will continue to add new members.
