As the effective date of Jan. 1, 2010, approaches for Massachusetts’ data protection regulation, business owners and information security managers are getting a little bit edgy about compliance with MA 201 CMR 17.
Witness this week’s Compliance Decisions conference. There were two main questions on the minds of the attendees: enforcement (how strict?) and encryption (what to encrypt, and how?). However, no easy answers are available — yet.
The answers that were given by a pair of experts — Gerry Young, secretariat chief information officer for the Massachusetts Executive Office of Housing and Economic Development, and David Murray, general counsel of the Massachusetts Office of Consumer Affairs and Business Regulation — provided a wealth of information about compliance, but in the case of enforcement and encryption, not quite enough information.
It’s not really their fault. While Young and Murray helped craft the data protection regulations, promulgated under Massachusetts General Law Chapter 93H, enacted in 2007, enforcement will fall to the Massachusetts Attorney General, Martha Coakley, and, lawyers being lawyers, Murray could not speculate as to how Coakley will seek to prosecute data breach violations. Will she come down hard on all businesses, or will small businesses be spared? What will she consider to be the “reasonable” steps — a term cited four times in the regulation — needed to comply?
Coakley’s office has not been available for comment on this topic, and likely no one will know for sure what will happen until the first data breach of 2010 occurs.
That leaves business owners of all sizes with no choice but to comply with the letter of the law (or make their best attempt to). But even what that means is not clear. Young and Murray have been on the road for months, talking up the regulation to business and consumer groups, and have a well-rehearsed presentation with slides. But when asked about what data needs to be encrypted, they said everything — “data at rest” and “data in motion.”
Now, MA 201 CMR 17 is clear about portable devices and data in motion, mandating “encryption of all personal information stored on laptops or other portable devices” and “encryption of all transmitted records and files containing personal information that will travel across public networks, and … transmitted wirelessly.”
This all makes sense so far. The parties responsible for the infamous TJX data breach in 2007, which gave rise to 93H, exploited the weak WEP encryption protocol for wireless networks, not TJX company servers or databases. Currently, WPA and WPA2 are considered the minimum security standard, but even that has to be implemented correctly, with strong passwords.
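Getting WPA2 right in practice comes down largely to the pre-shared key. A minimal Python sketch of the checks an IT shop might apply is below; the formal constraints (8-63 printable ASCII characters) come from the 802.11i passphrase rules, while the strength thresholds are purely illustrative, not any standard’s requirement.

```python
# Minimal sketch: validate a WPA2-PSK passphrase. The formal check follows
# the 802.11i passphrase constraints (8-63 printable ASCII characters);
# the "strength" heuristic below is illustrative only.

def is_valid_wpa2_passphrase(passphrase: str) -> bool:
    """Formal WPA2-PSK constraints: 8-63 printable ASCII characters."""
    if not (8 <= len(passphrase) <= 63):
        return False
    return all(32 <= ord(c) <= 126 for c in passphrase)

def is_strong_passphrase(passphrase: str, min_length: int = 16) -> bool:
    """Illustrative heuristic: long enough, and mixes character classes."""
    if not is_valid_wpa2_passphrase(passphrase) or len(passphrase) < min_length:
        return False
    classes = [
        any(c.islower() for c in passphrase),
        any(c.isupper() for c in passphrase),
        any(c.isdigit() for c in passphrase),
        any(not c.isalnum() for c in passphrase),
    ]
    return sum(classes) >= 3

print(is_valid_wpa2_passphrase("short"))                  # False: too short
print(is_strong_passphrase("correct Horse 99 battery!"))  # True
```

The point of the heuristic is simply that a formally valid eight-character passphrase can still be trivially crackable offline once a WPA handshake is captured.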
As far as data at rest is concerned, there’s no such language in the Code of Massachusetts Regulations or the Massachusetts General Law, a fact pointed out by a third participant in the conference, consultant Richard Mackey, vice president at SystemExperts. Young then responded: “There is a requirement for encryption of data at rest in 93H that radiates forward [to MA 201 CMR 17].”
After poring over the text of M.G.L. 93H during lunch, Mackey confirmed that the law contains no such requirement, and later in the day, Young and Murray retracted their statement, saying encryption of data at rest should be considered a “best practice” only.
Attendees were relieved to hear this. But the concept of a “best practice” opens up even more issues. Encryption of data at rest — in databases, backup tapes, servers, SANs, etc. — is no simple task. Key management, disaster recovery and application performance pose difficult problems for even large companies, let alone small businesses. The best practice of storage encryption may be a worthwhile goal, something to be phased in over time, but it shouldn’t get in the way of the immediate requirements of compliance.
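To make the key-management point concrete, here is a minimal standard-library sketch of one small piece of the problem: deriving a 128-bit storage-encryption key from a passphrase with PBKDF2. A real deployment would pair this with a vetted cipher library and plans for key rotation, escrow and disaster recovery; all names here are illustrative.

```python
import hashlib
import os

# Sketch of one piece of the key-management problem: deriving a 128-bit
# storage key from a passphrase with PBKDF2 (Python stdlib). Losing the
# salt or the passphrase means losing the data, which is exactly why
# key management dominates data-at-rest encryption projects.

def derive_storage_key(passphrase: str, salt: bytes,
                       iterations: int = 200_000) -> bytes:
    """Derive a 16-byte (128-bit) key; the salt is stored with the data."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"),
                               salt, iterations, dklen=16)

salt = os.urandom(16)  # unique per encrypted object
key = derive_storage_key("a long passphrase kept in a vault or HSM", salt)
print(len(key) * 8)    # prints 128
```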
It is not surprising, then, that enforcement of MA 201 CMR 17 was delayed from its original May 2009 date to Jan. 1, 2010. Nor that a bill, No. 173, was introduced in the Massachusetts Senate earlier this year seeking to amend the underlying M.G.L. 93H law. Senate Bill 173 would change the law to say that businesses will not be required “to use a specific technology or technologies, or a specific method or methods for protecting personal information.” In addition, it would “create separate regulations for small businesses … that reflect said small businesses’ unique situation and resources.”
So where are we? No one really knows for sure. What we do know is that Massachusetts employers, and out-of-state businesses that employ Massachusetts residents, must have a compliance plan well under way by January 1.
I received a range of answers, depending upon whether I talked to vendors, end users or analysts. Later today, I’ll be publishing a feature that examines precisely this issue.
When asked whether CIOs should worry about “implementing Web 2.0 tools in the enterprise because of security and compliance,” Professor Andrew McAfee said he didn’t have any horror stories to relate – and that he asks for them whenever he talks to big business. His “quick and dirty explanation” for that is:
“People know how to do their jobs. By this point, none of these tools are a week old, so the rules for using them aren’t unclear. We know the stuff that will get us fired if we talk about it. If you work in an investment bank, for example, you have it drummed into you, before any enterprise 2.0 tools even showed up, what you can and can’t talk about, and to whom.”
I asked McAfee a similar question: “Where do you see the intersection between enterprise 2.0 and regulatory compliance?” His answer:
I do not think these tools substantially alter the compliance risk profile of organizations. Employees today are acutely aware of compliance issues, and I don’t see that they’ll be tempted to disobey policy or break the law simply because 2.0 tools become available.
There may be some slight risk of inadvertent noncompliance, but the fact that contributions to 2.0 environments are so visible means that any such breaches are likely to be detected quickly.
When it comes to enterprise 2.0, I agree heartily with Thomas Jefferson, who wrote, “I know of no safe repository of the ultimate power of society but people. And if we think them not enlightened enough, the remedy is not to take the power from them, but to inform them by education.”
After reporting on the story for a week, it’s clear to me that CIOs, privacy and security professionals need better tools to monitor, log and filter communication with external social networking platforms. Data loss prevention (DLP) will be a line item in enterprise security budgets, driven by the need to reduce new risks posed by social messaging.
Even if political gaffes on social networking sites don’t cease — like Battle Creek Mayor Mark Behnke accidentally tweeting Social Security numbers or continued Congressional missteps on Twitter — compliance concerns about the use of enterprise 2.0 platforms are likely to increase with continued data leaks, from whatever vector they take.
Insider threats are a significant concern, given increased economic pressures stemming from the recession. As Forrester senior analyst Andrew Jacquith observed earlier this year, “as auditors have gained more experience assessing compliance with Sarbanes-Oxley and other statutes, they have become increasingly aware of the perils of excessive entitlements. Greater awareness has led to tougher audits. Now enterprises must be prepared to explain who got access to what application features, and why.”
What Professor McAfee’s answer reveals to me, primarily, is that the people aspect of compliance is a crucial consideration. The technology matters but, in the end, your security and ability to meet regulatory requirements rests on the mind-set and education of those entrusted with the sensitive data of an enterprise or its customers. Thanks to the good professor for his answer.
Last week, a collection of trade organizations announced the release of a set of privacy principles for the use and collection of behavioral data in online advertising. The public adoption of these principles moves the industry toward self-regulation, though whether that adoption will substantially improve protections for consumers’ online privacy remains an open question.
Given the vast amount of data that is being collected online daily, the move can’t come soon enough. Whether the move is enough to head off regulation from Congress is likely a moot point; as my colleague Linda Tucci blogged last week, a national data privacy law is coming. As she noted, “the proposed federal electronic data privacy bill, known as H.R. 2221, was introduced in April with little fanfare but is generating a bit more buzz in the wake of recent hearings on Capitol Hill.”
Concern over online privacy is reflected by the relevant regulatory bodies, particularly at the federal level. To wit, as Pamela Jones Harbour, commissioner of the Federal Trade Commission, notes in the release:
“Consumers deserve transparency regarding the collection and use of their data for behavioral advertising purposes. I am gratified that a group of influential associations – representing a significant component of the Internet community – has responded to so many of the privacy concerns raised by my colleagues and myself. These associations have invested substantial efforts to actually deliver a draft set of privacy principles, which have the potential to dramatically advance the cause of consumer privacy. I commend these organizations for taking this important first step. I am hopeful that successful implementation will follow. In the meantime, I encourage the entire privacy community to continue a dialogue that places the interests of consumers first.”
According to the announcement on AAAA.org, these principles were developed by a “cross-industry self-regulatory task force” that included the American Association of Advertising Agencies, the Association of National Advertisers, the Direct Marketing Association (DMA) and the Interactive Advertising Bureau. The Council of Better Business Bureaus has agreed, along with the DMA, to implement accountability programs “to protect consumer privacy in ad-supported interactive media that will require advertisers and websites to clearly inform consumers about data collection practices and enable them to exercise control over that information.”
Such protections are commendable, if perhaps less than altruistic in the context of looming regulation. Releasing such guidance for the protection of online privacy may help head off potential sanctions or fines under new legislation by demonstrating some good faith on the industry’s part. Adoption is another matter. Electronic publishers accustomed to collecting reams of data about Internet audiences are likely to have a new kind of compliance to address in 2010: privacy.
President Obama met with business leaders on July 2 to discuss not only how businesses can reduce their carbon footprints and energy consumption, but also how these efforts can benefit businesses financially and help the economy by creating new jobs.
In this Compliance Advisor podcast, Hara CEO Amit Chatterjee, who was at the Obama meeting, discussed why sustainability is important now and how businesses can use sustainable business practices to improve operations and fuel growth.
Momentum seems to be growing for a federal electronic data privacy law that would pre-empt the 44 state data breach notification laws already on the books and bring the U.S. more in line with European data privacy laws.
“If you work for an information broker, you definitely should be paying attention to this,” said Miriam Wugmeister, who chairs the global privacy and data security practice at law firm Morrison & Foerster. “But if you’re just a CIO at a national retail chain or at a financial institution, then this really is not that different.”
With this important caveat: The bill, like laws in states such as Massachusetts and Oregon, is moving toward what Wugmeister calls the next evolution in data privacy — a preventative approach with specific requirements for protecting data in the first place.
The proposed federal electronic data privacy bill, known as H.R. 2221, was introduced in April with little fanfare but is generating a bit more buzz in the wake of recent hearings on Capitol Hill.
Last week, representatives of the nation’s biggest brokers of online information — Google, Yahoo — appeared before House subcommittees on communication and consumer protection to answer questions about behavioral targeting, the tracking of users’ online behavior for various kinds of gain. Debate focused on the conflict between the individual’s right to privacy online and the advertising industry’s ability to make money.
Privacy advocates argued that most Internet users don’t understand the extent to which their online behavior is being monitored or how much electronic personal identifying information (PII) is being collected by large data brokers, such as Yahoo and Google. Nor are users aware of their ability to opt out of these data collection systems. Therefore, users need regulations that would require their consent to be tracked — or an opt-in (not opt-out) provision.
Advocates for the advertising industry argued these provisions would upend an industry already seriously weakened by the economic recession.
Another aspect of the law, if passed, would strengthen consumers’ ability to access and correct any personal information collected by businesses.
“In the U.S., unlike in the European Union, we don’t typically have the right to call up Amazon and say, ‘Tell me everything about me,’” Wugmeister said.
For CIOs at businesses that do not collect PII for sale to others, Wugmeister has two pieces of advice.
“If I were a CIO, I would read Massachusetts,” she said. The law is among the nation’s most stringent for data protection and is proactive, requiring a comprehensive written security program and employee training. It also applies to any business, in or out of the state, that collects personal identifying information from a Massachusetts resident.
“The other thing you could read is the federal safeguards rule of the FTC,” she said. The rules are forming the consensus used by enforcement authorities, including the drafters of this bill, she said.
As for the increasingly anxious discourse on online behavioral tracking by data brokers, Wugmeister is a bit more mystified. “Those profiles of us for our offline behavior already exist. Every time you walk with your cell phone you are constantly transmitting your location. Your cell phone carrier has a log of every place you’ve been. Every time you use your credit card, there is a record of every place you’ve been and every place you’ve shopped.” In other words, Big Brother is already here.
In the coming months, I’ll be writing a lot more about H.R. 2221 and other IT compliance and security issues in weekly news articles for SearchCompliance.com. Let me know what compliance issues you’re grappling with and what kinds of information would be useful.
Last week’s 140 Characters Conference presented dozens of examples of how people are using Twitter creatively, effectively and disruptively. What didn’t get as much attention are the security risks and compliance challenges Twitter presents as the wildly popular microblogging platform continues to see adoption by enterprise users.
I talked with Erin Jacobs, chief security officer for UCB Inc., about Twitter security. If you haven’t found her on Twitter yet, she tweets as @SecBarbie. She emailed us her list of top Twitter information security threats, which is published below.
Corporate networks try to guard against information leaving via email, IM and other channels. New services for updating Twitter pop up daily, so it is impossible at this time to completely block access to Twitter. Network security professionals are constantly racing to fill the holes to ensure that information cannot be leaked. Leaks could include:
- Identity information from inside organizations
- Identity theft
- Credit card fraud
- Account numbers
- Business IP leakage
- Business plans
- Code leakage
- Copyright infringement
- Facility information
- Business operating hours (useful in targeted physical theft attacks)
- Personnel locations or schedules
Since Twitter communicates over ports 80 and 443, there really isn’t much to protect users from inadvertently bringing malicious code into the network. Bit.ly and other URL shorteners can easily send users to addresses other than those they expect.
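Because a shortened link hides its real destination, one common mitigation is a gateway-side check that flags links pointing at known shortener hosts so they can be expanded and inspected before a user clicks. A minimal sketch, with an illustrative (and deliberately incomplete) shortener list:

```python
from urllib.parse import urlparse

# Minimal sketch of a gateway-side check: flag links whose host is a
# known URL shortener so they can be expanded and scanned before users
# click. The list below is illustrative, not exhaustive.

KNOWN_SHORTENERS = {"bit.ly", "tinyurl.com", "is.gd", "ow.ly"}

def needs_expansion(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    return host in KNOWN_SHORTENERS

print(needs_expansion("http://bit.ly/abc123"))      # True
print(needs_expansion("https://example.com/page"))  # False
```

A production filter would also resolve the redirect chain and scan the final destination, but even this cheap host check gives security teams a hook for logging and policy enforcement.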
Improper use of Twitter
Direct messages are not secure email. Education about potential vulnerabilities is essential for executives and top-level management to understand that they must keep business off of Twitter. Issues around human resources and online harassment are also a consideration.
After Erin wrote in, I used Twtpoll to ask my followers on Twitter the same question, using her list and adding a few other options.
You can vote on what your primary Twitter security concern is on Twtpoll. The results, as of today, are embedded below:
As you’ll see, insecure third-party apps leading to stolen accounts is (currently) the top answer – it’s an issue of natural concern to Twitter users. Coming in second, however, was Erin’s concern over data leaks of confidential or proprietary information. Information security threats are at the top of any CISO’s list; add Twitter security to it.
Each of these information security threats applies to other social networking platforms and services as well, like LinkedIn and, in particular, Facebook. Issues around Twitter security and social media in general were frequently discussed at this past week’s Enterprise 2.0 Conference in Boston and at the RSA Conference earlier this year, where Web application security topped the information security threats list.
Booz Allen Hamilton won the Open Enterprise Award for 2009 at the Enterprise 2.0 Conference in Boston today for its innovative internal collaborative environment. The Open Enterprise research project, led by Stowe Boyd and Oliver Marks, conferred the award on a company that was “truly transforming their organization at its core through deep, enterprise-wide adoption.” Walton Smith, a senior associate at the Virginia-based consulting firm, presented “hello.bah.com” to the crowd.
Smith described how Hello was built around people, focusing on connecting associates to each other and activity streams to profiles. According to Smith, more than 40% of the firm has added content to the system, rapidly forming connections with one another. Booz Allen Hamilton used agile development to create their Enterprise 2.0 platform, a methodology that now allows the team to roll out a new function every two weeks. Smith said that “functionality is driven by the users.” One upcoming feature, for instance, will allow users to rank and rate the quality of content entered into the system.
One initial roadblock that Smith noted was human resources, which viewed itself as the “official source” of data. In fact, the new intranet actually allowed employees to clean up bad data entered by HR into PeopleSoft on the back end.
When asked about security and compliance concerns – critical to a consulting firm that deals with government data or works with corporations with sensitive intellectual property – Smith noted several aspects of the system that are designed to prevent data leaks. First, only Booz Allen employees are allowed on Hello – not contractors. Second, data that comes under regulatory compliance actually resides in SharePoint, which Booz Allen uses for document-based collaboration for restricted content. Users can link to content from blogs, Confluence wikis or other pages but are confronted with an access control layer. Within the restricted environment, familiar compliance tools used in knowledge management are employed, like access management, monitoring and logging.
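The pattern Smith describes, open linking with gated retrieval, can be sketched generically. Nothing below reflects Booz Allen’s actual implementation; every name and the ACL structure are hypothetical, and a real system would back the check with a directory service and audit logging.

```python
# Generic sketch of the pattern described above: pages may freely link to
# restricted documents, but an access-control layer gates retrieval.
# All names and the ACL structure here are hypothetical.

RESTRICTED_ACL = {
    "doc-123": {"alice", "bob"},  # document id -> allowed user ids
}

def fetch_document(doc_id: str, user: str) -> str:
    allowed = RESTRICTED_ACL.get(doc_id)
    if allowed is None:
        return f"<public content of {doc_id}>"
    if user in allowed:
        # A real system would also log this access for compliance audits.
        return f"<restricted content of {doc_id}>"
    raise PermissionError(f"{user} may not view {doc_id}")

print(fetch_document("doc-123", "alice"))  # restricted content, allowed
```

The design choice worth noting is that links never leak content by themselves: the wiki can point anywhere, and authorization is enforced only at retrieval time.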
Smith is aware of the possibilities for a data breach, noting that “our weakest link is our people – we spend a lot of time making sure they know which tools to use.” He’s also cognizant of potential regional compliance issues, such as European Union laws that require employees to opt in before sharing information like pictures or work history with others.
The creators of Hello also had thought through employee departures. Smith allowed that departures weren’t “so much of an issue, given the economy,” but that there is a process in place. When someone moves on, a banner is added to the top of his or her profile page indicating the departure. That person won’t show up on the dropdown menu, which only includes active employees for searchers, but the profile page itself, including connections and intellectual property created for Booz Allen, remains.
The proliferation of data security and privacy laws from state and federal agencies has created challenges and complexities for all entities that store and use data. One of the most controversial areas for these laws is whether or not they should specify data encryption as a requirement.
- Which laws currently specify encryption and which do not? What, exactly, do they specify?
- Should encryption be included at all in these laws?
- If so, what, exactly, should be specified?
- If not, what should the laws require?
One viewpoint holds that data encryption is a fundamental protection and strengthens consumer protection and privacy. From this viewpoint, laws that fail to specify encryption are weak, overly slanted toward business’ interests and inadequately protective of consumers and individuals’ privacy rights.
The counterpoint to that view, held by others, is that:
- Encryption as specified in current laws is a vague term, and thus somewhat meaningless.
- Specifying current encryption standards more concretely likely ensures the laws will quickly become outdated as technology advances.
- Mentioning encryption vaguely, without clear standards, creates business risk and uncertainty for those doing business in the commonwealth.
- Deviating so far from legislation in other states and federal approaches, in areas such as encryption and certification of third-party vendors, creates a situation where those third-party vendors may find it not worth implementing these capabilities just to do business in Massachusetts, leaving organizations at a competitive disadvantage without providing real benefit to consumers and individuals.
The underlying statute, M.G.L. 93H, defines the term: “Encrypted” means the transformation of data through the use of a 128-bit or higher algorithmic process into a form in which there is a low probability of assigning meaning without use of a confidential process or key, unless further defined by regulation of the department of consumer affairs and business regulation.
However, this definition does not set forth any circumstances under which data must actually be encrypted. When detailed regulations were issued in the form of 201 CMR 17.00: Standards for The Protection of Personal Information of Residents of the Commonwealth, regulators further specified that:
Every person that owns, licenses, stores or maintains personal information about a resident of the Commonwealth and electronically stores or transmits such information shall include in its written, comprehensive information security program the establishment and maintenance of a security system covering its computers, including any wireless system, that, at a minimum, shall [include] the following elements: Encryption of all transmitted records and files containing personal information, including those in wireless environments, that will travel across public networks.
An amendment currently under consideration in the Massachusetts Senate, SB 173, would seem to reverse that:
The department shall not in its regulations, however, require covered persons to use a specific technology or technologies, or a specific method or methods for protecting personal information.
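However the legislative debate resolves, the 93H definition’s “128-bit or higher” bar translates naturally into a simple compliance inventory check: record each system’s cipher and key length, and flag anything below threshold. A hedged sketch follows; the system names, ciphers and key sizes are illustrative.

```python
# Sketch of a compliance inventory check against the statute's
# "128-bit or higher" encryption bar. All entries are illustrative.

MINIMUM_KEY_BITS = 128

inventory = [
    {"system": "laptop-fleet",  "cipher": "AES", "key_bits": 256},
    {"system": "legacy-backup", "cipher": "DES", "key_bits": 56},
    {"system": "vpn-gateway",   "cipher": "AES", "key_bits": 128},
]

def below_threshold(entries, minimum=MINIMUM_KEY_BITS):
    """Return the systems whose key length falls below the statutory bar."""
    return [e["system"] for e in entries if e["key_bits"] < minimum]

print(below_threshold(inventory))  # prints ['legacy-backup']
```

Key length alone is a crude proxy (a strong key in a broken cipher still fails), but it matches how the statute itself frames the requirement.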
What do you think? Should data security and privacy laws specify data encryption?
Melissa Hathaway spoke to a crowd of over 1,000 at a lunchtime address during the Symantec Government Symposium last week in Washington, D.C. President Obama appointed Hathaway on Feb. 9 as White House Acting Senior Director for Cyberspace for the National Security Council (NSC), and, until it was merged out of its painful existence on May 26, the Homeland Security Council (HSC), a Bush-era creation.
Obama directed Hathaway to conduct a comprehensive 60-day Cyberspace Policy Review, which was released on May 29. Obama is expected to name a permanent “cybersecurity czar” to implement the report’s recommendations.
The White House quelled turf speculation over the reporting structure for the impending U.S. cybersecurity position by quietly “merging” the HSC into the NSC on May 26, just three days before releasing the cybersecurity policy review.
The CSIS cyberspace review group, which was commissioned in August 2007 during the Bush presidency, delayed publication of the review until immediately after the 2008 presidential election. As readers of the document know, it contains significant criticism of the Bush-era DHS.
Hathaway’s report was critical of the Homeland Security Council, again echoing the December 2008 CSIS report, which, among many other criticisms, faulted the DHS. The HSC, whose staff of 250 mirrored the NSC’s “twin” staff of about 250, produced almost identical “directives” and seemed to many a duplicative, redundant Bush-era institution.
In her remarks, Hathaway raised several key issues with the audience, including:
- Private-sector data sharing: required to effectively detect and combat cybercrime, but sometimes wrongly seen, in her view, as an antitrust violation.
- Cloud computing and the Fourth Amendment: whether an organization that puts its data in the cloud gives up its privacy rights.
- Legislative reform: the unfinished legislative review work cited in a footnote of the 60-day cybersecurity review, and the need for comprehensive reform, which can be read as a signal to backers of evolving state and federal legislation that their initiatives may be superseded.
- Public awareness: a national ad campaign on cybersecurity, like the Smokey the Bear campaign.
- Immediate priorities: a national incident response plan to be completed by the end of the year.
- International cooperation: government also needs to work with the international cybersecurity community.
Hathaway, a top contender for the permanent White House post, confirmed that she is currently “in the interview process” for that position, which, she stated in an interview Tuesday, she hopes “will conclude in the next few weeks … and be resolved favorably.”
The daylong symposium consisted of 20 separate breakout sessions led by over 100 panelists, a veritable “who’s who” of highly influential cybersecurity-related officeholders in the current administration and Congress, plus a few luminaries in the world of IT security.
As a measure of industry optimism regarding future government spending on cybersecurity, Enrique Salem, CEO of the $5 billion Symantec, was among the symposium speakers, who also included:
- Steven Shirley, executive director, Department of Defense Cyber Crime Center
- Eran Feigenbaum, director of security, Google Apps
- Mischel Kwon, director, United States Computer Emergency Readiness Team (US-CERT), National Cybersecurity Division, Department of Homeland Security
- Jeremy Warren, chief technology officer, Department of Justice
- Peter Mell, senior computer scientist, National Institute of Standards and Technology
- Jacob Olcott, subcommittee director, U.S. House of Representatives Homeland Security Committee
- Jim Jaeger, director, cyber defense and forensics, General Dynamics
Other panels included key contributors to the highly influential December 2008 CSIS report on securing cyberspace. Hathaway’s White House Cyberspace Policy Review footnotes the CSIS report eight times, more than any other source listed among the document’s 67 total footnotes. On June 1, CSIS released a comparison of its 25 original recommendations with Hathaway’s report, noting that 17 of the 25 were adopted by the White House report.
When questioned Tuesday at the Symantec symposium, former CSIS commission members smiled knowingly and declined to name any of the other individuals currently under consideration for the permanent White House post besides Hathaway.
These panelists, cited in the CSIS report as contributors, included:
- Sameer Bhalotra, a career professional staff member of the U.S. Senate Select Committee on Intelligence who leads the SSCI cyber study team.
- Dan Chenok, senior vice president, Pragmatics and former OMB security policy executive.
- Bruce McConnell, former NSA senior executive, director of $100 million ArcSight and of Sun Microsystems’ federal subsidiary.
- Amit Yoran, CEO, NetWitness Corp., and former director, National Cybersecurity Division, DHS, and US-CERT.
The MIT Sloan CIO Symposium on May 20 in Cambridge, Mass., featured several panels on the top issues affecting CIOs. But the most interesting discussion of the day, for me at least, came after a panel on governance, risk and compliance, when I caught up with two Patni Americas Inc. directors, Amit Sen and John Vaughan, who were also in attendance.
The two management consultants are proponents of expanding the definition and practice of risk management to include business model risk — that is, risk introduced into your company by new or changed capital ventures or business processes. In their view, business process automation has run amok, leaving the business (as well as the IT organization) exposed to risks that it might not be aware of.
“What we need to understand is where we are introducing risks, and [ensure] the risk is understood and planned and not a byproduct of a lack of knowledge or visibility into what actually goes on in the organization,” said Sen in the following podcast, recorded this week. In the podcast, Sen and Vaughan explain what business model risk is, how to measure and understand it, and how to make business model risk a key part of any risk management and IT governance strategy.