IT Compliance Advisor


March 31, 2016  2:44 PM

ACLU of Mass. director: Accountability required for data privacy law reform

Fran Sales
Consumer data, Continuous Data Protection, Data privacy, Data protection, Privacy rights

Drawing the line between protecting consumers’ right to data privacy and giving the government access to that data to keep the public safe isn’t as simple as looking at legal cases that have dealt with this issue, various data privacy experts explained at this month’s Forum on Data Privacy hosted by the Massachusetts Attorney General’s Office at MIT.

One major obstacle to drawing this line is a lack of transparency. While the ongoing dispute between Apple and the FBI over whether the tech company should build a backdoor into the smartphone of one of the San Bernardino shooters has drawn wide publicity, it is not an isolated case. There have been dozens of cases prior to San Bernardino in which the government has sought to compel private companies to turn over consumer data for public safety reasons, but those incidents happened behind closed doors and outside the public spotlight, said panelist Carol Rose, executive director of the American Civil Liberties Union of Massachusetts.

This secrecy and lack of a public debate is a huge problem when it comes to strengthening existing laws and policies that address consumer data privacy, said Rose. She added that an open, democratic public debate is the only way to settle where the line should be drawn.

Rose warned that a lack of transparency, combined with the legal precedent the government sets by secretly recruiting private companies to create backdoors to consumer data, is a recipe for losing consumer trust, which in turn starts a dangerous domino effect for companies.

“When people don’t know, then they really lose trust, and they don’t become early adopters because they don’t think that the tech companies are on their side,” she said. As a result, consumers could lose out on the benefits of these technologies while becoming more vulnerable to hackers.

The stance technology companies, lawmakers and other parties should be thinking about when discussing data privacy law reform is not the idea of “technology versus liberty” but rather “technology in the service of liberty,” said Rose, who launched the Technology for Liberty Project in 2013 to promote this point of view.

These are the three major areas privacy law reform needs to address, according to Rose:

  • Access and control. This is the notion of giving users the right to access their information and correct it if it’s wrong. “We get those kinds of reports all the time, where the data is wrong and put it in incorrectly, and [users] have no way to access it,” said Rose.
  • The “third-party doctrine.” This is the idea of questioning whether users or the third-party providers that hold their data are the rightful owners of that information, whether that third party is a bank, a doctor, healthcare provider or Internet service provider.
  • Notice. Related to the third-party doctrine, Rose said that lawmakers also need to address, with special rules, consumers’ right to get notified if another party is going to use their data, particularly if that party is the government. This is where U.S. policy varies from other countries. In Europe, for example, data protection laws dictate that consumers technically own their data. “If [an EU] government needs your data, they have to tell you at least, and in some cases get your permission. In the U.S., that does not have to happen,” Rose said.

Most importantly, these policies should distinguish between the government and private sector, said Rose, warning about the dangers of the U.S. government going further down the road of recruiting private companies to do its data collection bidding in secret.

“Invariably, abuse will happen; invariably people will find out about it; [there will be] loss of consumer trust and damage to the high-tech community, and the potential promise of all these innovations could really be lost if we go down that road,” she said.

March 24, 2016  10:41 AM

Cybersecurity questions get the boardroom’s attention

Fran Sales
board, CISO, Compliance, Cyberattacks, cybersecurity, Data breach, Data breach disclosure, Information security, Microsoft, RSA Conference

“Security has transcended from an IT issue to a boardroom issue.” This was how Microsoft corporate vice president and CISO Bret Arsenault opened his panel discussion at last month’s RSA Conference in San Francisco. He made it clear that security is no longer solely the responsibility of the IT department; it’s one that’s shared with business units, users and members of the supply chain.

Yet aside from a few edge cases, the security compromises of the past two years that have hit high-profile retail brands and other companies aren’t much different from those of the past, argued Arsenault. So why has the security conversation changed? He pointed to the convergence of three factors: the scale of compromises, laws that require the disclosure of breaches involving customer data, and the political motivation of malicious actors.

“We absolutely see [these factors] creating a different news cycle … and a different level of interest for the board,” Arsenault said.

These factors, combined with increased regulatory attention from agencies such as the Federal Trade Commission and the Securities and Exchange Commission, as well as increasing class-action lawsuits filed by consumers and shareholders, have caused today’s boards of directors to pay greater attention to cybersecurity questions.

“One of the big questions I see most boards asking — one of the big differentiators in the maturity of the security program — is, ‘Are you holding IT accountable or are you holding the whole company and the supply chain accountable? And how do you go do that?'” said Arsenault.

The board’s duty now includes assessing previously unimagined cybersecurity risks, but balancing this with its other, more pressing responsibilities is no small feat.

“Cybersecurity really isn’t the most important thing in the grand scheme of things — strategic risk, financial risk, operational risk and legal risk — until something goes wrong,” he said.

Arsenault advised boards to focus on the following cybersecurity issues when scoping potential risks:

Security strategy and budget review. It’s not the board’s job to produce a detailed, 40-page budget review, but it does need to make sure it’s aligned with management on the security budget, Arsenault said. A good board will ask one key question during a discussion on cybersecurity with the IT and security teams: Do you have everything you need? “The answer to that would be, ‘Yes, yes I do. Or, I do, but here are the things I see coming,’” he added.

Security leadership. “Most companies now list CISOs as a critical role relevant to the performance of the company,” said Arsenault. Questions the board should ask regarding senior leadership include who security teams report to, and whether the right person is employed for the job.

Incident response plans. Look at whether the company has adequate firewalls and intrusion detection tools in place, as well as how it allocates resources between detection and response.

Ongoing assessment. This doesn’t mean the board needs “a ton of KPIs,” said Arsenault, just enough to show them trends and developments in the security industry.

Internal education. Ask about how frequently employees are educated about cybersecurity risks and the steps they should take if the company is breached. “Enlightened boards will ask about this,” Arsenault said. “Users are in control, so what is the user effectiveness of that control?”


March 8, 2016  4:18 PM

RSA 2016: Adobe, Google and Microsoft prepare for EU GDPR

Fran Sales
Adobe, Chief Compliance Officer, Chief Privacy Officer, Data privacy, Data protection, European Data Protection legislation, Google, Microsoft, regulatory compliance, RSA, RSA Conference, Safe Harbor

When the General Data Protection Regulation — a new EU-wide data protection framework that will replace the bloc’s 28 different sets of national privacy laws — was introduced by the European Union in December 2015, global companies such as Adobe cheered.

“We were hoping for one law that would finally govern all of Europe, instead of differing interpretations,” MeMe Rasmussen, chief privacy officer (CPO) for the software company, said at an RSA 2016 panel session.

The current text of the regulation, however, seems far from a one-stop shop: For starters, the 200-page document is made up of 50 different components. Moreover, data protection authorities (DPAs), or the local authorities of each EU country, have retained interpretive capabilities over each of these components.

“It was written by people who don’t run businesses. [Companies] have to look at it and figure out how we comply. … What they did was leave a lot open to resolution,” Rasmussen said.

GDPR aims to create stronger standards for data protection, but it won’t be finalized until later this year and won’t go into effect until 2018. That hasn’t stopped Adobe, along with Google and Microsoft (which were also represented on the panel), from already starting to consider the risks that could arise once the regulation is in full force.

Not only does the regulation’s sheer breadth mean challenges in deciphering its terms, but it could also create a greater number of compliance obligations that spell significant risk for organizations. Moderator Trevor Hughes, president and CEO of the International Association of Privacy Professionals, cited one sobering statistic: The maximum fining authority given to DPAs and regulators in Europe amounts to 4% of a company’s annual global turnover.
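To put that 4% ceiling in concrete terms, here is a rough, back-of-the-envelope sketch in Python. The turnover figure is invented purely for illustration; the only number taken from the panel is the 4% maximum.

    # Illustrative only: the maximum fine cited at the panel is 4% of a
    # company's annual global turnover. The turnover below is hypothetical.
    def max_gdpr_fine(annual_global_turnover: float) -> float:
        return 0.04 * annual_global_turnover

    # A hypothetical company with $5 billion in annual global turnover
    # would face a theoretical maximum fine of $200 million.
    print(f"${max_gdpr_fine(5_000_000_000):,.0f}")  # -> $200,000,000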

So how are organizations like Adobe, Google and Microsoft looking at GDPR and preparing their responses?

Rasmussen said Adobe is still waiting for the dust to settle — which she doesn’t expect to happen soon. “We kind of ended up with a mixed bag and don’t yet know a lot of what we’re going to have to do. … We’re still waiting for guidance on what certain terms mean.”

Microsoft CPO Brendon Lynch said his company has been developing and investing in its privacy management program for several years, and views GDPR as an “incremental step” instead of a huge shift.

“The reality is, yes, there are more obligations and details to work out, but ultimately it feels like [Microsoft has] taken into account all the new requirements,” he said. “We’ll do a gap analysis against what we currently have. I’m sure there will be some places where we have to do some more.”

Lynch added that while he doesn’t intend to play down GDPR, it doesn’t necessarily change Microsoft’s security posture. It does, however, raise the stakes of failing to comply.

Microsoft also expects greater assurances from GDPR. “How can we get more assurance that all the controls we have in place are effective … that everything is working as they should?” Lynch said.

Keith Enright, legal director of privacy at Google, said that his company is fully aware of GDPR’s diverse requirements that it will have to decipher for users.

“I don’t think we ever really deluded ourselves, given our experiences in Europe, that we would have absolute uniformity of law,” he said.

Enright added that Google and other global tech industry leaders must actively engage with DPAs and regulators in Europe around data privacy.

“[We need to] negotiate and draw out rationality and application as much as we can so that our interests [are] aligned with the DPAs’. We want to protect user privacy to the greatest extent possible, and GDPR gives us the framework,” he said.

One area in which Google will have more work to do under GDPR, said Enright, is developing its privacy programs so that they not only address protecting users’ data privacy but also strengthen the transparency of the company’s compliance efforts.


February 26, 2016  9:53 AM

Privacy Shield faces challenges as regulators, businesses adapt

Fran Sales
Chief Compliance Officer, Data privacy, Data protection, European Data Protection legislation, FTC, Safe Harbor, Surveillance

Currently, there are very few concrete details available to the public regarding Privacy Shield, the newly proposed EU-U.S. agreement that will replace the now-void Safe Harbor. In part one of this blog post, we sifted through the information that is currently available to highlight three elements of Privacy Shield worth noting.

Here, Jacqueline Klosek, senior counsel at Goodwin Procter LLP, lays out the potential challenges for the agreement, including how it could affect businesses and their customers.

Privacy Shield has been presented by the European Commission (EC) as a framework that deviates from Safe Harbor, which the European Court of Justice found had deficient privacy protections. But in reality, the new framework’s structure is very similar to its predecessor’s, at least from what can be gleaned from the details that have been made public, said Klosek (see part one of this blog post, link above).

Despite these similarities, Privacy Shield does have some notable differences that appear to address the inadequacies of Safe Harbor, including the creation of a privacy “ombudsman” in the U.S. who will be in charge of investigating complaints of inappropriate surveillance by U.S. national security agencies. Furthermore, the EC said that for the first time, the U.S. government has provided “written assurances” that it will place limitations on surveillance programs and set up privacy safeguards.

But will these differences be sufficient for Privacy Shield to pass muster with the European authorities?

Right now, the answer to that question is unclear, said Klosek. Like Safe Harbor, she believes the proposal will be challenged in court.

“Ultimately, whether it succeeds will be a question for the European Court of Justice that will depend in large part upon the extent to which the finalized accord successfully addresses the deficiencies that were identified in the Schrems decision,” Klosek said.

Moreover, before U.S. companies can rely on the new framework, the EC is required to create a more detailed “adequacy decision” that can only be approved after EU data protection authorities (DPAs) determine that Privacy Shield does enough to protect EU citizens’ rights — and there is a risk that the DPAs won’t, said Klosek. “[DPAs] could bring enforcement actions against U.S. companies operating in Europe, which would greatly complicate compliance efforts,” she added.

Implications for businesses, consumers

Based on currently available information, there probably won’t be many changes ahead for businesses that relied on Safe Harbor before, said Klosek. However, the logistics detailing how and when Privacy Shield will be implemented have not yet been settled.

“The fact that the Privacy Shield is so similar to the Safe Harbor means it won’t provide long-term guarantees to U.S. businesses until the European Court of Justice confirms its validity under EU law,” she said.

There is good news for EU consumers whose data is being transferred to the U.S.: They may find more accessible, expanded avenues for redress, as well as more robust privacy protections.

As for businesses that don’t comply with Privacy Shield’s requirements? They could find themselves the target of an enforcement action by the FTC, particularly after allegations that the agency did not enforce Safe Harbor firmly enough in the past.

“But how that will play out on the ground remains to be seen,” Klosek said.


February 24, 2016  12:07 PM

Privacy Shield details lacking, but so far varies little from Safe Harbor

Fran Sales
Chief Compliance Officer, Compliance, Data privacy, Data protection, European Data Protection legislation, FTC, Safe Harbor

Two weeks ago, European Commissioner Věra Jourová tweeted that the text for Privacy Shield, a new framework for transatlantic data flows, will be finalized by the end of February. The agreement between the EU and the U.S. will replace the invalidated Safe Harbor framework to provide EU citizens with stronger privacy protections as their personal data is being transferred to the U.S., according to a statement from the European Commission.

Very few details about Privacy Shield have been made public. What we do know is that the agreement aims to fix the inadequate privacy protections of the now-defunct Safe Harbor. Beyond those fixes, Privacy Shield largely keeps Safe Harbor’s structure in place, said Jacqueline Klosek, senior counsel at Goodwin Procter LLP.

“Based upon the information that is presently available, the changes in the Privacy Shield are minor, but move the U.S. incrementally toward greater privacy protection of EU citizens,” she said.

How exactly does Privacy Shield aim to achieve this? From what the European Commission (EC) has revealed so far, there are three noteworthy components of the agreement, according to Klosek:

Strong privacy obligations for U.S. companies that handle personal data. The U.S. Department of Commerce will monitor whether companies are publishing privacy standards, and the Federal Trade Commission (FTC) will enforce these obligations under U.S. law. “Note that although the precise standards remain to be fleshed out, this requirement largely parrots the Safe Harbor program,” said Klosek.

Limits on the surveillance of personal data being transferred to the U.S. The Schrems decision that invalidated Safe Harbor determined that the agreement allowed the indiscriminate collection and surveillance of EU citizens’ data. “The United States has given the EU binding assurances that the access of public authorities for national security purposes will be subject to clear limitations, safeguards and oversight mechanisms,” Jourová said in a statement. According to the EC, these safeguards and limitations will be subject to an annual joint review by the EC and U.S. Commerce Department, although there is little detail about how rigorous this review will be.

However, there are a few holes in this aspect of Privacy Shield that still need to be addressed, Klosek pointed out. First, an investigation of U.S. intelligence legal frameworks shows that the U.S. government was already operating under these types of limitations. “This was the main U.S. government objection to the Schrems decision. … It did not accurately characterize the state of U.S. intelligence programs,” Klosek said. One notable difference between Privacy Shield and Safe Harbor is a requirement to appoint a privacy “ombudsman” in the U.S. who would field EU citizens’ complaints regarding U.S. surveillance. But the responsibilities of this role are also still unclear, said Klosek.

She added that the EU’s stance on requiring these oversight mechanisms for U.S. intelligence programs excludes the fact that EU national governments’ intelligence programs are subject to “far less oversight” than their U.S. counterparts.

Redress mechanisms to protect EU citizens’ right to privacy. EU citizens who believe their data has been misused will be able to file complaints using “accessible and affordable” dispute resolution mechanisms, according to Jourová. European Data Protection Authorities will refer these complaints to the FTC and the U.S. Commerce Department for possible enforcement. Furthermore, the Judicial Redress Act, which has already passed both houses of U.S. Congress, would allow foreign citizens to sue if they believe their privacy rights were compromised by the U.S. government.

Head to part two of this blog post, where we explore the likely challenges Privacy Shield will face from EU authorities, the possible impact of the agreement on businesses and their customers, and more.


February 19, 2016  10:57 AM

Apple, FBI face off in iPhone backdoor debate

Fran Sales
Apple, Apple iOS, backdoors, Compliance, Dodd-Frank, Encryption, FBI, grc, iPhone

This week, Apple chief Tim Cook said in a letter to the company’s customers that it won’t give in to the FBI’s demand to create an iPhone backdoor. Plus, the number of resolved FCPA enforcement cases reaches a 10-year low while the U.S. Justice Department intensifies foreign bribery efforts.

Apple resists FBI order to create iPhone backdoor

In a letter to Apple’s customers released Feb. 17, CEO Tim Cook said that the Federal Bureau of Investigation (FBI) has demanded that the company create a backdoor to its iPhones. A U.S. judge issued an order stating that Apple had to provide “reasonable technical assistance” to the FBI in order to help it unlock data from an iPhone that was used by one of the San Bernardino shooters.

According to Cook, this “assistance” involves creating a new, weaker version of iOS that bypasses multiple important security safeguards. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession,” he said in the letter. Cook added that instead of asking Congress to pass legislation, the FBI is citing the All Writs Act of 1789 to propose an expansion of its authority.

Apple Headquarters in Cupertino, Calif. (via Wikipedia)

The creation of the backdoor would set back several years of security advancements and put Apple’s customers at greater risk of being hacked, Cook said. The U.S. Department of Justice did not waver in its stance, saying in a statement that it’s “unfortunate” that Apple continues to refuse to help obtain data from a terrorist’s phone.

As resolved FCPA cases hit a low, DoJ boosts foreign bribery forces

The DoJ is ramping up its efforts to uncover foreign bribery crimes, beefing up the number of agents, prosecutors and accountants dedicated to increasing the number of wiretaps and informants used to build new bribery cases under the Foreign Corrupt Practices Act (FCPA). This bolstering of forces will be funded using billions of dollars seized from criminals.

The increase in resources comes after U.S. officials resolved just 20 FCPA enforcement actions in 2015 — the lowest number since 2006, according to a report from Miller & Chevalier. The drop can be attributed to the Justice Department pursuing bigger cases, such as the current probe into Wal-Mart Stores Inc., as well as not having sufficient resources to investigate all of the foreign bribery cases brought in over the last 10 years, according to George Khouzami, assistant special agent in charge of the FBI’s New York office.

New Minneapolis Fed chief: Dodd-Frank insufficient

Neel Kashkari, the new president of the Federal Reserve Bank of Minneapolis, said in his first public statement since taking over the position in January that the Dodd-Frank Act of 2010 “did not go far enough” in doing away with the problem of financial institutions that are “too big to fail” and would require government bailout in the event of a crisis.

Kashkari, a Republican who served under the Bush administration as Assistant Secretary of the Treasury for Financial Stability, advises legislators to seriously consider options including breaking up banks into smaller entities and “taxing leverage throughout the financial system” to mitigate systemic risks.

Presidential candidate Bernie Sanders supported Kashkari’s stance, while others defended Dodd-Frank in the wake of Kashkari’s speech. The Dodd-Frank defenders said that the law provided regulators with the power to break up firms if they lacked adequate plans for going through bankruptcy without taxpayer help or if they posed significant danger to the financial system.


February 5, 2016  12:20 PM

Barclays, Credit Suisse to pay $154M for ‘dark pool’ trading violations

Fran Sales
Compliance, Data governance, Data privacy, Data protection, EMV, PCI compliance, Safe Harbor, SEC

The U.S. Securities and Exchange Commission announced last week that global banks Barclays and Credit Suisse would pay a record total of more than $154 million to settle allegations over “dark pool” trading. In other recent GRC news, retailers continue to face EMV chip hurdles months after new payment card standards went into effect; and the U.S. and European Union have reached a data transfer agreement.

Barclays and Credit Suisse settle ‘dark pool’ cases

Last week, Barclays PLC and Credit Suisse Group AG agreed to pay a total of $154.3 million to settle federal and state charges that the global banks misled investors with “dark pools,” or private trading platforms where trades are not visible to other traders until they are executed. The two settlements are the largest fines ever paid in cases involving dark pool trading, according to a statement by the SEC.

Both banks were charged with misinforming their investors about how trading was monitored in these private venues. Barclays didn’t convey enough information to its clients about how it policed its dark pool’s high-frequency trading, while Credit Suisse did not disclose to investors that the bank systematically prioritized routing orders to its dark pool trading platforms over other venues, according to the SEC’s statement.

Barclays PLC world headquarters (via Wikipedia)

“The SEC will continue to shed light on dark pools to better protect investors,” SEC chairperson Mary Jo White said in the statement. New York Attorney General Eric Schneiderman also said at a press conference that his office will continue ongoing investigations into dark pools.

Retailers face EMV chip challenges

On October 1, 2015, new payment standards went into effect requiring retailers to process payment cards embedded with an EMV chip to help reduce the risk of payment data being stolen. The rules also shifted risk from banks to retailers: Merchants who fail to set up chip-enabled terminals can now be held liable for fraudulent transactions.

Greg Buzek, president of retail consultancy IHL Services, wrote in a blog post that only 8.5% of merchants are currently equipped to process the chip cards. Buzek added that the EMV mandate “forces a tax” on retailers because it slows transaction times and does not validate whether the chip card user is the legitimate owner of the card, which “does absolutely nothing for online or mobile fraud,” he wrote.

Furthermore, IHL’s research found that retailers are getting charged for fraudulent transactions due to lost and stolen cards, which, according to the new EMV guidelines, they should not be liable for. This has spurred retailers to create audit trails to protect themselves from these inappropriate charges, making them even more of a target for data breaches and theft, Buzek argued.

U.S. and EU authorities reach data transfer deal

Authorities from the EU and U.S. have reached an agreement to remove data transfer restrictions for European and U.S. companies. The deal will replace Safe Harbor, the former agreement between the U.S. Department of Commerce and the EU that had allowed over 4,000 companies to bypass EU data transfer rules and move EU citizens’ personal data across the Atlantic. This framework was struck down last year because of U.S. surveillance concerns.

The agreement would prevent legal action against companies, according to EU regulators. Sources told Reuters that the agreement would include more robust oversight of companies’ compliance with EU data protection laws, and that U.S. access to EU citizens’ data would be subject to limitations.


January 21, 2016  3:21 PM

FTC report: Big data analytics could prove harmful to consumers

Fran Sales
Big Data, Big Data Analysts, Consumer data, FTC, FTC Act, Information security, Predictive Analytics, regulatory compliance

Big data analytics have proven extremely beneficial to both companies and consumers across a wide range of industries, producing valuable insight in fields like healthcare, education and transportation. There is also, however, the potential for this data to be used in a way that harms consumers, according to a Federal Trade Commission report published earlier this month. As I covered in Searchlight last week, the FTC report had a clear message: The federal agency will not hold back from investigating unethical big data analytics processes and bringing enforcement actions against businesses that employ them.

But which big data processes exactly does the FTC report say are potentially problematic, and which does it consider acceptable? Highlights include the following:

Problematic: Differentiating products based on population subsets. Any big data analytics practice that limits provision of products or services to certain population subsets based on statistical input is considered unfair by the FTC, “especially if any of the data can be a proxy for poor or minority or underserved populations,” said Brenda Sharton, head of law firm Goodwin Procter LLP’s business litigation division. She offered the following industry examples of such practices: withholding access to credit, housing or employment due to background checks or screenings; and offering different rates or prices on insurance or product delivery based on an individual’s address. The FTC considers collecting several types of data problematic, including zip codes, social media usage/membership, and shopping habits. Sharton also warned against differentiating products based on characteristics such as race, gender and marital status.

Problematic: Making false promises to customers about how data is analyzed. Another big data practice companies should be wary of is making promises to consumers about how their data will be used and whether it will be entered into predictive analytics platforms. “If you’re going to be using [the data] for any statistical analysis, and it’s either you or your third-party vendor, you want to make sure you don’t promise the consumer that you ‘won’t do that,’ or that you’re informing them that you will,” said Sharton, who also serves as co-chair of Goodwin Procter’s privacy and cybersecurity group.

Acceptable: Targeted advertising. In most cases, this practice is OK with the FTC, said Sharton. For example, “A company’s advertisement to a particular community for credit offers that are open for all to apply is unlikely to violate [equal opportunity credit laws],” she said.

Problematic: Failing to reasonably secure consumers’ data. The FTC acknowledges that in the era of big data, companies are justified in collecting more data than they need. This means their information security needs to be proportionally robust — leading to another practice the report’s authors say is unacceptable: failure to implement security measures sophisticated enough to secure data “commensurate with the amount and sensitivity of the data at issue, the size and complexity of the company’s operations, and the cost of available security measures.” For example, organizations that maintain sensitive data such as Social Security numbers or customers’ medical data must have stronger security safeguards than those that only maintain consumers’ names, they added.

The FTC recommends that companies look at three factors to determine whether their information security is strong enough in proportion to the types of data they manage:

  • The amount and sensitivity of data
  • The size and complexity of the company’s operations (“What’s right for a massive Fortune 100 company will be different than what’s right for a … small company,” said Sharton.)
  • What security measures are available and how much they cost

The bottom line for companies employing big data analytics? Full regulatory compliance will likely require some legal advice. “They should ensure they have the counsel to determine whether they are complying with things like FCRA, ECOA and other laws,” Sharton said.

What’s your take on the downside of big data analytics? Let us know at editor@searchcompliance.com.


January 7, 2016  1:31 PM

Repeat HIPAA violators face minimal ramifications

Fran Sales
grc, HIPAA, PCI compliance, PCI DSS, regulatory compliance, SSL/TLS, TLS

Despite several HIPAA violations, recent data analysis found U.S. healthcare providers such as CVS and the VA face few punitive actions. Also in recent GRC headlines: Companies have two more years to meet the TLS requirement under PCI DSS, and experts foresee big changes ahead for the FCPA’s corruption enforcement practices.

Few penalties imposed on frequent HIPAA violators

CVS Health, Kaiser, the U.S. Department of Veterans Affairs and Walgreens were among hundreds of U.S. health providers that repeatedly violated the Health Insurance Portability and Accountability Act (HIPAA) between 2011 and 2014, according to an analysis of federal data by ProPublica, a non-profit newsroom. Despite numerous violations, sanctions against these providers have rarely been imposed, ProPublica’s research found.

The provider with the second-highest number of violations, CVS, pledged to improve its privacy safeguards or was reminded of its HIPAA obligations by the Office for Civil Rights (OCR) more than 200 times, according to ProPublica. The VA was the most persistent HIPAA violator, according to the data: Its clinics, hospitals and pharmacies violated HIPAA 220 times, but the OCR never publicly reprimanded or sanctioned the health provider.

Experts say that while some privacy problems are to be expected among large healthcare providers, persistent complaints are a sign of organizational failures. Deven McGraw, deputy director for health information privacy at OCR, told ProPublica that the agency’s top priority is to investigate breaches that affect at least 500 people. She also acknowledged that the OCR can do more about providers who repeatedly violate HIPAA. Although OCR receives thousands of privacy complaints a year, it has issued fewer than 30 financial sanctions for privacy violations since 2009.

Merchants get two extra years to meet key PCI DSS requirement

The Payment Card Industry Security Standards Council (PCI SSC) announced last month that merchants that need to comply with Payment Card Industry Data Security Standard (PCI DSS) version 3.1 now have until June 2018 to migrate away from vulnerable encryption protocols, two years later than the original deadline of June 2016.

Under PCI DSS 3.1, which was released in April 2015, organizations must migrate away from older versions of Transport Layer Security (TLS) — versions 1.0 and earlier — and any version of Secure Sockets Layer (SSL) by this date. Furthermore, effective immediately, these organizations are prohibited from implementing new technology that relies on SSL and early TLS. A large body of research has deemed these protocols cryptographically insecure, putting payment data at higher risk of exposure.
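In practical terms, that migration means configuring servers and payment applications to refuse SSL and early TLS handshakes outright. The snippet below is only a minimal sketch, not PCI SSC guidance; it uses Python’s standard ssl module, and the certificate file names are placeholders.

    import ssl

    # Server-side TLS context that accepts only TLS 1.2 and newer connections.
    # Setting the minimum version rejects SSLv3, TLS 1.0 and TLS 1.1 handshakes.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    context.load_cert_chain("server.crt", "server.key")  # placeholder paths

Most web servers and TLS libraries expose equivalent settings; the point is that the deprecated protocol versions are disabled outright rather than left enabled for backward compatibility.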

One of the main reasons PCI SSC extended the deadline to June 2018 is that it hasn’t seen criminals accessing cardholder data through the protocol flaws, PCI SSC international director Jeremy King told eWeek. PCI SSC is trying to balance risks with operational needs, King added, but he also warned that the date change is not an excuse to “do nothing for two years.” Instead, he suggested merchants migrate away from the flawed protocols as early as possible.

Major changes to FCPA enforcement expected for 2016

The Foreign Corrupt Practices Act (FCPA) unit of the U.S. Justice Department will be receiving more scrutiny and resources this year in light of low enforcement in 2015. The unit plans to add 10 new staff members, and DOJ leaders say they will focus on greater transparency as well as pursuing “high impact” bribery cases, The Wall Street Journal reported.

The Justice Department’s foreign corruption unit settled only two corporate FCPA cases last year, down 10 cases from 2014 and nine from 2013. The DOJ has indicated that it will put a policy shift into motion this year that will create rewards for companies that report violations and cooperate with the government. It will also put greater priority on prosecuting individuals.

Defense lawyers worry that the DOJ’s policy shift of incentivizing companies to cooperate fully may actually limit voluntary disclosures rather than encourage them, according to The Wall Street Journal. However, they also expect to see more compliance cooperation from companies as a result of the DOJ’s hiring of a compliance expert last year.


December 22, 2015  4:56 PM

GDPR: How will the EU data protection law impact U.S. industry?

Fran Sales
Compliance, cybersecurity, Cybersecurity legislation, Data privacy, Data protection, EU directive 95/46, European Data Protection legislation, Safe Harbor

Three years in the making, European Union officials finally agreed on a draft of the General Data Protection Regulation. The EU-wide legal framework sets standards for data collection, sharing and privacy that will replace 28 different sets of national privacy laws. Officials promise stronger personal data protection measures for Europeans and an easier compliance process for businesses. While the new rule package won’t go into effect until 2018, cybersecurity experts are already forecasting greater discussion around data privacy, particularly among U.S. companies, because of GDPR.

The first thing about GDPR that struck security experts at a webinar earlier this month is its complexity. During the event, hosted by incident response provider Resilient Systems, three security and privacy specialists — Bruce Schneier, a leading cryptologist and the CTO of Resilient Systems; Jon Oltsik, senior principal analyst at Enterprise Strategy Group; and Gant Redmon, general counsel for Resilient — discussed their security predictions regarding the 201-page document as we enter the New Year.

It’s not only the length of the legislation that’s intricate — “If you have to say it in 201 pages, it’s going to be complex,” Redmon quipped — but also the fact that EU negotiators are hoping to develop a centralized authority with GDPR. Upon closer inspection, however, that doesn’t actually appear to be the case, Redmon added.

For example, the rule package stipulates that each member country have its own national data protection authority, called a supervisory authority, to which companies and organizations are required to report data breaches.

“It’s going to start to look more like the U.S. There will be different folks and different bents to compliance that you have in each of these member states,” Redmon said.

He added that even though proponents of GDPR were hoping for more objective data protection standards from the legislation, in the future these codes of conduct will likely be outsourced to different industries.

“Both of these [factors] will have a big effect on industry, and compliance is going to be more complex than people were hoping,” Redmon said.

There is also the danger of over-regulation, which could push data collection activities underground, Schneier warned.

“I’m a big fan of the regulation of privacy … but too little regulation and it’s a free for all; too much regulation and we lose visibility into practices,” he said.

Contributing to this potential over-regulation is data sharing legislation passed by the U.S. Congress last week that takes a different stance from the EU: The Cybersecurity Act of 2015 (CISA). The bill, which was attached to a must-pass spending bill, provides liability protection and antitrust exemptions for companies that choose to share cybersecurity information with federal agencies such as the NSA. Detractors say the law allows companies to bypass civil rights and privacy mandates, including warrant requirements to conduct surveillance.

“Privacy protections have been weakened and Europe is now 200 pages and counting in complication. This is going to be hard,” Schneier said. “We want one single Internet — one single set of rules — and we’re further and further away from that.”

Oltsik agreed, saying that CISA not only waters down privacy but also creates a complete disconnect between U.S. legislators and their EU counterparts. This gap must be bridged in order to address the global problem of what he calls the “Balkanization” of privacy.

“We’re talking about data privacy on one hand and backdoors … on the other,” Oltsik said. “It’s as if we’re debating these issues with a complete lack of cooperation with each other.”

