To help address this lack of privacy, especially online, the FTC today released a preliminary staff report titled “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.”
“Technological and business ingenuity have spawned a whole new online culture and vocabulary … that consumers have come to expect and enjoy,” FTC Chairman Jon Leibowitz said during a conference call to discuss the report. “The FTC wants to help ensure that the growing, changing, thriving information marketplace is built on a framework that promotes privacy, transparency, business innovation and consumer choice.”
One method the FTC endorses is a “Do Not Track” mechanism that consumers can use to opt out of the collection of information about their Internet activity for the development of targeted advertisements.
FTC representatives said the most practical method would likely involve placing a persistent setting, similar to a cookie, on the consumer’s browser to signal the consumer’s choices about being tracked and receiving targeted ads.
“A ‘do not track’ browser setting would serve as an easy, one-stop shop for consumers to express their choices, rather than on a company-by-company or industry-by-industry basis,” said Leibowitz, adding that a coalition of organizations that includes Microsoft, Google, Mozilla and Apple has experimented with such a setting.
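For the technically inclined, the signal itself can be as simple as an HTTP request header. Below is a minimal sketch, using only Python’s standard library, of how a site might honor the proposed “DNT: 1” header; it illustrates the idea only, and the response bodies are invented, not any vendor’s actual mechanism.

```python
# A minimal sketch of honoring a "Do Not Track" signal, using only
# Python's standard library. It assumes the header-based convention
# ("DNT: 1") that browser makers have experimented with; the response
# bodies are placeholders, not any vendor's actual behavior.
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackingAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A participating browser sends "DNT: 1" with each request.
        do_not_track = self.headers.get("DNT") == "1"
        if do_not_track:
            body = b"Tracking disabled at your request."
        else:
            body = b"Serving interest-based content."
            # ...only here would any behavioral-tracking logic run.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TrackingAwareHandler).serve_forever()
```

The appeal of a header-based approach is exactly what Leibowitz described: the preference is expressed once, in the browser, and every site receives it, rather than requiring a separate opt-out with each company.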
Other recommendations presented by the FTC to enhance consumer privacy include building privacy protections into products and services from the outset (“privacy by design”), simplifying the privacy choices presented to consumers, and making data practices more transparent.
The FTC is seeking public comment on the report now through Jan. 31, and will issue final recommendations next year after working with stakeholders to refine and implement the policy recommendations.
“At this point we are making recommendations for best practices,” Leibowitz said. “We are putting this out for comment — we want feedback, then we will move forward.”
Leibowitz added, however, that at least from his perspective, “a legislative solution will surely be needed if industry does not step up to the plate.”
Speaking today from the FTC’s second roundtable on online privacy at the University of California, Berkeley, David Vladeck, director of the FTC’s Bureau of Consumer Protection, expressed concern that consumers have little awareness of how data is being collected or used online. That concern extends to social media privacy, mobile data, manufacturing and cloud computing security.
Vladeck summarized the lessons from the first FTC privacy roundtable, held last year in Washington, D.C. Consumers are “unaware of whether and how they can exercise control” over online data, he said, including practices in the data broker industry, and the “practice of behavioral advertising may be unfamiliar to consumers.”
The fact that consumers do care about online privacy is driven home in many ways, said Vladeck. He cited the popularity of a popup blocker for the Firefox Web browser and interest in resources for managing social media privacy settings. “The No. 1 most-emailed article from The New York Times was about how consumers can change privacy settings on Facebook,” said Vladeck. “That speaks volumes.”
The FTC privacy roundtable will examine both how technology can enhance consumer privacy and how it can challenge or circumvent it, said Vladeck.
The FTC sees a “troubling technological arms race” between consumer empowerment tools and technologies that enable more data collection, he said, with new countermeasures emerging each time a means of protecting privacy is introduced.
In his remarks, Vladeck broke the FTC’s privacy roundtable into four areas; a full roundtable agenda is available from FTC.gov.
The roundtable is being streamed online. Follow the conversation at #FTCprivacy on Twitter to read commentary in 140 characters or less, or tune in to this list of privacy experts, workshop audience attendees and other commentators.
On Monday, Yahoo launched a new online privacy tool that, in theory, allows users to gain more insight into the data that the media company has gathered about their interests. According to the press release, the tool provides users with the ability to “assert greater control over their online experience,” providing Yahoo’s “educated guesses about their interests” and granular controls for those users to opt out of those categories or out of interest-based advertising entirely.
The “Ad Interest Manager” was announced and released in beta on the same day the Federal Trade Commission held its first roundtable on privacy in Washington, D.C. The privacy workshop agenda (PDF) for the FTC privacy roundtable includes academics, advocates and representatives from media, data mining, software and analytics companies.
This introduction of an online privacy tool for consumers by Yahoo follows the addition of an online privacy dashboard from Google last month and the July release of self-regulatory online privacy principles for the use and collection of behavioral data for Internet advertising.
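Under the hood, industry opt-outs of this kind are generally implemented as opt-out cookies stored in the user’s browser, which is why clearing cookies can silently undo the choice. The sketch below is a hypothetical illustration of that pattern using Python’s standard library; the cookie name and value are invented, not Yahoo’s actual identifiers.

```python
# A hypothetical sketch of the opt-out-cookie pattern such tools
# generally rely on: the opt-out itself lives in a persistent browser
# cookie, which is why clearing cookies can silently undo the choice.
# The cookie name and value are invented, not Yahoo's identifiers.
from http import cookies

OPT_OUT_COOKIE = "interest_ads"  # illustrative name
OPT_OUT_VALUE = "optout"         # illustrative value

def build_opt_out_cookie() -> str:
    """Produce a Set-Cookie header value recording the opt-out choice."""
    jar = cookies.SimpleCookie()
    jar[OPT_OUT_COOKIE] = OPT_OUT_VALUE
    jar[OPT_OUT_COOKIE]["max-age"] = str(5 * 365 * 24 * 3600)  # persist ~5 years
    jar[OPT_OUT_COOKIE]["path"] = "/"
    return jar[OPT_OUT_COOKIE].OutputString()

def user_opted_out(cookie_header: str) -> bool:
    """Check an incoming Cookie header for the opt-out marker."""
    jar = cookies.SimpleCookie(cookie_header)
    return OPT_OUT_COOKIE in jar and jar[OPT_OUT_COOKIE].value == OPT_OUT_VALUE

print(build_opt_out_cookie())                  # Set-Cookie value to send
print(user_opted_out("interest_ads=optout"))   # True
```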
Whether such efforts are enough to preempt regulatory attention from the FTC will be an open question in 2010. As FTC chairman Jon Leibowitz stated earlier in the year, this is “industry’s last chance to get its act together on behavioral targeting.”
Capitalizing on this regulatory focus, the Center for Democracy & Technology (CDT) also began a consumer online privacy campaign last week called “Take Back Your Privacy.”
“All social media should have granular privacy controls,” said Leslie Harris, president and CEO of CDT. An important element of the CDT’s online privacy campaign is the release of an online privacy Complaint Tool that allows people to register online privacy concerns with the FTC and share that action with connections via social media.
Harris, who was at the FTC’s privacy roundtable yesterday, says that next year will be “the first time there will be serious consideration of consumer privacy legislation in many years.”
According to the CDT’s Ari Schwartz, Rep. Richard Boucher has put forward an outline of a consumer privacy bill that will be a framework for action in January.
As Kara Swisher pointed out at CNET, the addition of this online privacy tool by Yahoo coincides with a “bigger backdrop” of “the pending regulatory approval of the massive search and advertising partnership between Yahoo and Microsoft. The two companies announced last week that they had completed the definitive agreement for the deal.”
As Swisher observed, “one of the key issues for regulators, of course, is the privacy implications of combining the search and online ad technologies of the No. 2 and No. 3 players.” Google, Yahoo and the online advertising industry as a whole will be watching carefully to see what FTC compliance and action from Congress will mean for all in the year ahead.
For insight into the way that the regulator sees the relationships here, review the FTC graphic below describing a user’s “personal data ecosystem.”
[Image source: FTC.gov (PDF). For specific industry relationships, review these data flow charts (PDF).]
The most immediate impact of the amended regulation is an additional 60 days to comply. The deadline for implementation is now March 1, 2010.
Individuals and municipalities have been expressly removed from the regulation’s jurisdiction, with a clarification that it “applies to those engaged in commerce.” Guidelines on the requirement for a written information security plan have also been simplified.
A new definition of the term “service provider” was added, and the Office of Consumer Affairs and Business Regulation amended the third-party vendor rules: there is now a two-year grace period for existing contracts, along with requirements for those third parties to be brought into compliance.
Encryption requirements have been clarified. The apparently strict but, practically speaking, vague 128-bit specification from the prior version was replaced by “technology-neutral language.”
Further, a “technical feasibility” standard has been incorporated, acknowledging that methods to securely encrypt data on portable devices may not yet be available. Email encryption now falls under the technical feasibility standard. Additionally, the backup tape requirement has been clarified to apply to encryption on a prospective basis. So you may safely cancel your firm’s plans to encrypt existing backup tapes; encrypting new backup tapes will still be required, as will encrypting any personal data that travels over the public Internet or wireless networks.
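To make the prospective-encryption point concrete, here is a minimal sketch, assuming Python’s third-party cryptography package (pip install cryptography): new backup images are encrypted before they are ever written, while existing tapes are left alone. The file names and key handling are illustrative, not a compliance recipe.

```python
# A minimal sketch, assuming the third-party cryptography package
# (pip install cryptography), of prospective encryption: new backup
# images are encrypted before they are written, while existing tapes
# are left as-is. File names and key handling are illustrative only.
from cryptography.fernet import Fernet

def write_encrypted_backup(plaintext: bytes, out_path: str, key: bytes) -> None:
    """Encrypt a new backup image before it ever reaches tape or disk."""
    token = Fernet(key).encrypt(plaintext)  # authenticated AES encryption
    with open(out_path, "wb") as fh:
        fh.write(token)

if __name__ == "__main__":
    key = Fernet.generate_key()  # store the key separately from the backups
    write_encrypted_backup(b"customer records ...", "backup-new.enc", key)
```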
In another change that I believe will ultimately enhance consumer protection, 201 CMR 17.00 has been brought in line with certain federal regulations. Specifically, the Massachusetts data protection act now cedes authority to the Federal Trade Commission’s (FTC) standards established under the Gramm-Leach-Bliley Act (GLBA). GLBA utilizes a risk management approach to data security.
The patchwork of 44 different state health data protection laws has delayed electronic automation of, and therefore overall security for, health records. Adopting a federal standard, starting with the FTC’s risk-based approach to data protection, avoids this pitfall and may make widespread compliance both more feasible and more likely in the near future.
On one hand, a risk management approach should be familiar to IT professionals: it shifts resources from “check-the-box” controls that may or may not address a particular organization’s specific risks to controls that make more sense in context. On the other hand, given the concrete definition of the personal information in scope, it is difficult to see a case where risk management would not apply whenever such personal data is stored.
“Mandating every component of a program and requiring its adoption, regardless of size and the nature of the business and the amount of information that requires security, makes little sense in terms of consumer protection,” said Bradley MacDougall of Associated Industries of Massachusetts. Risk management and assessment will afford more consumer protection by matching a given business’s actual risks with required security investments.
In an interview with SearchCompliance.com, Undersecretary Barbara Anthony stated that “consumer protections have not been weakened in this amendment. Monitoring [and] reviewing the scope of security measures [are still required] – and encryption is still required if you are going to transmit resident PII over public networks. What we’ve tried to do here is to not impose additional burdens which weren’t involved in the consumer protections.”
With the permission of the OCABR, we are posting the FAQ released by the office in full and unedited below. We will post the rest of Anthony’s comments tomorrow, along with further analysis of the changes referenced below.
What are the differences between this version of 201 CMR 17.00 and the version issued in February of 2009?
There are some important differences in the two versions.
First, the most recent regulation issued in August of 2009 makes clear that the rule adopts a risk-based approach to information security, consistent with both the enabling legislation and applicable federal law, especially the FTC’s Safeguards Rule. A risk-based approach is one that directs a business to establish a written security program that takes into account the particular business’ size, scope of business, amount of resources, nature and quantity of data collected or stored, and the need for security. It differs from an approach that mandates every component of a program and requires its adoption regardless of size and the nature of the business and the amount of information that requires security. This clarification of the risk-based approach is especially important to those small businesses that do not handle or store large amounts of personal information.
Second, a number of specific provisions required to be included in a business’s written information security program have been removed from the regulation and will be used as a form of guidance only.
Third, the encryption requirement has been tailored to be technology neutral and technical feasibility has been applied to all computer security requirements.
Fourth, the third party vendor requirements have been changed to be consistent with Federal law.
To whom does this regulation apply?
The regulation applies to those engaged in commerce. More specifically, the regulation applies to those who collect and retain personal information in connection with the provision of goods and services or for the purposes of employment.
The regulation does not apply, however, to natural persons who are not in commerce.
Does 201 CMR 17.00 apply to municipalities?
No. 201 CMR 17.01 specifically excludes from the definition of “person” any “agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof.” Consequently, the regulation does not apply to municipalities.
Must my information security program be in writing?
Yes, your information security program must be in writing. The scope and complexity of the document will vary depending on your resources, and the type of personal information you are storing or maintaining. But, everyone who owns or licenses personal information must have a written plan detailing the measures adopted to safeguard such information.
What about the computer security requirements of 201 CMR 17.00?
All of the computer security provisions apply to a business if they are technically feasible. The standard of technical feasibility takes reasonableness into account. (See definition of “technically feasible” below.) The computer security provisions in 17.04 should be construed in accordance with the risk-based approach of the regulation.
Does the regulation require encryption of portable devices?
Yes. The regulation requires encryption of portable devices where it is reasonable and technically feasible. The definition of encryption has been amended to make it technology neutral so that as encryption technology evolves and new standards are developed, this regulation will not impede the adoption of such new technologies.
Do all portable devices have to be encrypted?
No. Only those portable devices that contain personal information of customers or employees, and only where technically feasible. The “technical feasibility” language of the regulation is intended to recognize that at this period in the development of encryption technology, there is little, if any, generally accepted encryption technology for most portable devices, such as cell phones, BlackBerries, netbooks, iPhones and similar devices. While it may not be possible to encrypt such portable devices, personal information should not be placed at risk in the use of such devices. There is, however, technology available to encrypt laptops.
Must I encrypt my backup tapes?
You must encrypt backup tapes on a prospective basis. However, if you are going to transport a backup tape from current storage, and it is technically feasible to encrypt (i.e. the tape allows it) then you must do so prior to the transfer. If it is not technically feasible, then you should consider the sensitivity of the information, the amount of personal information and the distance to be traveled and take appropriate steps to secure and safeguard the personal information. For example, if you are transporting a large volume of sensitive personal information, you may want to consider using an armored vehicle with an appropriate number of guards.
What does “technically feasible” mean?
“Technically feasible” means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used.
Must I encrypt my email if it contains personal information?
If it is not technically feasible to do so, then no. However, you should implement best practices by not sending unencrypted personal information in an email. There are alternative methods to communicate personal information other than through email, such as establishing a secure website that requires safeguards such as a username and password to conduct transactions involving personal information.
Are there any steps that I am required to take in selecting a third party to store and maintain personal information that I own or license?
You are responsible for the selection and retention of a third-party service provider who is capable of properly safeguarding personal information. The third party service provider provision in 201 CMR 17.00 is modeled after the third party vendor provision in the FTC’s Safeguards Rule.
I have a small business with ten employees. Besides my employee data, I do not store any other personal information. What are my obligations?
The regulation adopts a risk-based approach to information security. A risk-based approach is one that is designed to be flexible while directing businesses to establish a written security program that takes into account the particular business’s size, scope of business, amount of resources and the need for security. For example, if you only have employee data with a small number of employees, you should lock your files in a storage cabinet and lock the door to that room. You should permit access to only those who require it for official duties. Conversely, if you have both employee and customer data containing personal information, then your security approach would be more stringent. If you have a large volume of customer data containing personal information, then your approach would be even more stringent.
Except for swiping credit cards, I do not retain or store any of the personal information of my customers. What is my obligation with respect to 201 CMR 17.00?
If you use swipe technology only, and you do not have actual custody or control over the personal information, then you would not own or license personal information with respect to that data, as long as you batch out such data in accordance with the Payment Card Industry (PCI) standards. However, if you have employees, see the previous question.
Does 201 CMR 17.00 set a maximum period of time in which I can hold onto/retain documents containing personal information?
No. That is a business decision you must make. However, as a good business practice, you should limit the amount of personal information collected to that reasonably necessary to accomplish the legitimate purpose for which it is collected and limit the time such information is retained to that reasonably necessary to accomplish such purpose. You should also limit access to those persons who are reasonably required to know such information.
Do I have to do an inventory of all my paper and electronic records?
No, you do not have to inventory your records. However, you should perform a risk assessment and identify which of your records contain personal information so that you can handle and protect that information.
How much employee training do I need to do?
There is no basic standard here. You will need to do enough training to ensure that the employees who will have access to personal information know what their obligations are regarding the protection of that information, as set forth in the regulation.
What is a financial account?
A financial account is an account that if access is gained by an unauthorized person to such account, an increase of financial burden, or a misappropriation of monies, credit or other assets could result. Examples of a financial account are: checking account, savings account, mutual fund account, annuity account, any kind of investment account, credit account or debit account.
Does an insurance policy number qualify as a financial account number?
An insurance policy number qualifies as a financial account number if it grants access to a person’s finances, or results in an increase of financial burden, or a misappropriation of monies, credit or other assets.
I am an attorney. Do communications with clients already covered by the attorney-client privilege immunize me from complying with 201 CMR 17.00?
If you own or license personal information, you must comply with 201 CMR 17.00 regardless of privileged or confidential communications. You must take steps outlined in 201 CMR 17.00 to protect the personal information taking into account your size, scope, resources, and need for security.
I already comply with HIPAA. Must I comply with 201 CMR 17.00 as well?
Yes. If you own or license personal information about a resident of the Commonwealth, you must comply with 201 CMR 17.00, even if you already comply with HIPAA.
What is the extent of my “monitoring” obligation?
The level of monitoring necessary to ensure your information security program is providing protection from unauthorized access to, or use of, personal information, and effectively limiting risks will depend largely on the nature of your business, your business practices, and the amount of personal information you own or license. It will also depend on the form in which the information is kept and stored. Obviously, information stored as a paper record will demand different monitoring techniques from those applicable to electronically stored records. In the end, the monitoring that you put in place must be such that it is reasonably likely to reveal unauthorized access or use.
Is everyone’s level of compliance going to be judged by the same standard?
Both the statute and the regulations specify that security programs should take into account the size and scope of your business, the resources that you have available to you, the amount of data you store, and the need for confidentiality. This will be judged on a case by case basis.
This FAQ was posted with the direct permission of OCABR. For more information on the regulation, visit Mass.gov/consumer.
“If you work for an information broker, you definitely should be paying attention to this,” said Miriam Wugmeister, who chairs the global privacy and data security practice at law firm Morrison & Foerster. “But if you’re just a CIO at a national retail chain or at a financial institution, then this really is not that different.”
With this important caveat: The bill, like laws in states such as Massachusetts and Oregon, is moving toward what Wugmeister calls the next evolution in data privacy — a preventative approach with specific requirements for protecting data in the first place.
The proposed federal electronic data privacy bill, known as H.R. 2221, was introduced in April with little fanfare but is generating a bit more buzz in the wake of recent hearings on Capitol Hill.
Last week, representatives of the nation’s biggest brokers of online information — Google, Yahoo — appeared before House subcommittees on communication and consumer protection to answer questions about behavioral targeting, the tracking of users’ online behavior for various kinds of gain. Debate focused on the conflict between the individual’s right to privacy online and the advertising industry’s ability to make money.
Privacy advocates argued that most Internet users don’t understand the extent to which their online behavior is being monitored or how much electronic personal identifying information (PII) is being collected by large data brokers, such as Yahoo and Google. Nor are users aware of their ability to opt out of these data collection systems. Therefore, users need regulations that would require their consent to be tracked — or an opt-in (not opt-out) provision.
Advocates for the advertising industry argued these provisions would upend an industry already seriously weakened by the economic recession.
Another provision of the bill, if passed, would strengthen consumers’ ability to access and correct any personal information collected by businesses.
“In the U.S., unlike in the European Union, we don’t typically have the right to call up Amazon and say, ‘Tell me everything about me,’” Wugmeister said.
For CIOs at businesses that do not collect PII for sale to others, Wugmeister has two pieces of advice.
“If I were a CIO, I would read Massachusetts,” she said. The law is among the nation’s most stringent for data protection and is proactive, requiring a comprehensive written security program and employee training. It also applies to any business, in or out of the state, that collects personal identifying information from a Massachusetts resident.
“The other thing you could read is the federal Safeguards Rule of the FTC,” she said. That rule is forming the consensus used by enforcement authorities, including the drafters of this bill, she said.
As for the increasingly anxious discourse on online behavioral tracking by data brokers, Wugmeister is a bit more mystified. “Those profiles of us for our offline behavior already exist. Every time you walk with your cell phone you are constantly transmitting your location. Your cell phone carrier has a log of every place you’ve been. Every time you use your credit card, there is a record of every place you’ve been and every place you’ve shopped.” In other words, Big Brother is already here.
In the coming months, I’ll be writing a lot more about H.R. 2221 and other IT compliance and security topics in weekly news articles for SearchCompliance.com. Let me know what compliance issues you’re grappling with and what kinds of information would be useful.
But not everyone is laughing. In April 2008, Andrea Smith, age 25, of Trumann, Ark., was convicted of privacy violations under HIPAA, as was Fernando Ferrer Jr., of Naples, Fla., in January 2007. As of today, a total of eight cases have resulted in criminal convictions with jail time for data privacy violations under HIPAA.
The U.S. Department of Health and Human Services (HHS) has served notice (as of Feb. 18) that organizations can also expect substantial fines like the one extracted from CVS. That $2.5 million fine, coupled with others won by OCR or the FTC, such as the case against Providence Health & Services, demonstrates that the risk of penalties is significantly more realistic going forward.
The low probability of criminal conviction and substantial penalties doesn’t, however, capture the full likelihood of other serious compliance consequences. “Stricter internal controls mandated by Sarbanes-Oxley have made it more difficult for improper payments to be concealed,” notes CorpWatch.
Consider the case of Richard Scrushy, founder of HealthSouth. Although acquitted of Sarbanes-Oxley (SOX) charges, he nevertheless sits in a Birmingham, Ala., prison. Scrushy was technically jailed for probation violations related to a vacation on a Miami yacht when he was supposed to be under house arrest in Birmingham, but SOX materially contributed to his imprisonment. Some commentators have pointed to the few convictions under SOX when dismissing the likelihood of consequences. But, as anyone involved with the legal system can attest, the likelihood of conviction and fines barely begins to measure the likelihood of serious problems. Let’s look at some other data:
[Table: HIPAA enforcement results by year. Source: U.S. Department of Health and Human Services.]
Simply receiving notice of an investigation requires firms and individuals to incur the costs of retaining counsel and allocating time, energy and resources to preparation. That’s a nerve-racking process with an unsure outcome. The investigation alone can be a big headache. And while only 10 cases have resulted in major fines or jail time, significantly more cases were prosecuted.
Preparing and presenting a criminal or civil defense is, again, a costly undertaking with an unsure outcome, where even acquittal can leave an organization or an individual facing a huge financial loss in attorney’s fees, drained energy and resources, and the lingering uncertainty that legal action causes.
How about nonconviction convictions? Plea deals can end in a case continued without a finding (CWOF) and probation. Home free, right? That’s what Richard Scrushy thought. The reality is that each step along the legal path increases the likelihood that subsequent or related, seemingly minor developments will result in jail time or fines. Organizations and individuals amass track records, which work against them over time.
SOX and HIPAA are only two of dozens of statutes under which privacy violations can be prosecuted. Here are a few:
Health privacy laws
1974—The National Research Act
1996—Health Insurance Portability and Accountability Act (HIPAA)
Financial privacy laws
1970—Bank Secrecy Act
1998—Federal Trade Commission
1999—Gramm-Leach-Bliley Act (GLB)
2002—Sarbanes-Oxley Act (SOX)
2003—Fair and Accurate Credit Transactions Act
Online privacy laws
1986—Electronic Communications Privacy Act (ECPA), pen registers
1986—Stored Communications Act (SCA)
Communication privacy laws
1978—Foreign Intelligence Surveillance Act (FISA)
1984—Cable Communications Policy Act
1986—Electronic Communications Privacy Act (ECPA)
1994—Digital Telephony Act – Communications Assistance for Law Enforcement Act (CALEA), 18 USC 2510-2522
Education privacy laws
1974—Family Educational Rights and Privacy Act (FERPA)
Information privacy laws
2001—USA Patriot Act, expanded pen registers
2005—Privacy Act, sale of online PII data for marketing
Still skeptical? California alone has over 88 data privacy laws — and it actively investigates and prosecutes violations.
Twenty-three thousand HIPAA investigations over five years, multiplied across 100 such laws, works out to more than 2 million potential investigations. Your chances are looking worse and worse, and the cost of voluntary compliance is looking cheaper and cheaper by comparison.
Last week, the FTC said it will delay enforcement of its Red Flag Rule until Aug. 1, “to give creditors and financial institutions more time to develop and implement written identity theft prevention programs.”
This is the second enforcement delay of a major data protection law: Massachusetts pushed back enforcement of its 201 CMR 17.00 regulation to Jan. 1 from the original date of May 2009, also to give constituents more time to get into compliance.
Security expert and SearchCompliance.com contributor Paul Roberts of The 451 Group sees a pattern developing, which he relayed in an email:
“I think the decision to delay Red Flag Rule enforcement is yet more evidence that the public sector has a lot to learn about formulating and then implementing data privacy regulations. What’s so interesting is how closely the FTC’s Red Flag Rule headache parallels Massachusetts regulators’ headaches trying to implement their ‘toughest in the nation’ data privacy laws.
“The lesson in both cases is that regulators need to put down the sledgehammer when writing these new rules and spend more time refining their scope and soliciting input from the private sector so that they understand the practical impact of new requirements on businesses, nonprofits and individuals. Practically: Some kind of phased-in approach to enforcement would seem to make sense. And, as with the PCI regulations, it might be smarter to have an iterative process to writing these kinds of regulations, rather than trying to fix a complex problem (data theft, data privacy) in one fell swoop. So you might start with small-bore regulations that have teeth, but are focused on clear problems and easy to implement, then expand and refine them over time, as conditions change.”
Seems like smart advice. Perhaps security, compliance and risk managers from corporate America should start calling for a change of strategy from federal and state lawmakers. Roberts is also right that the “public sector has a lot to learn about formulating and then implementing data privacy regulations.” As we have also pointed out, many compliance, security and risk managers are finding themselves out of the loop, creating a major disconnect between the new laws and the efforts many companies are putting forth to get into compliance.