IT Compliance Advisor

February 24, 2016  12:07 PM

Privacy Shield details lacking, but so far varies little from Safe Harbor

Fran Sales
Chief Compliance Officer, Compliance, Data privacy, Data protection, European Data Protection legislation, FTC, Safe Harbor

Two weeks ago, European Commissioner Věra Jourová tweeted that the text for Privacy Shield, a new framework for transatlantic data flows, would be finalized by the end of February. The agreement between the EU and the U.S. will replace the invalidated Safe Harbor framework to provide EU citizens with stronger privacy protections as their personal data is transferred to the U.S., according to a statement from the European Commission.

Very few details about Privacy Shield have been made public. What we do know is that the agreement aims to fix inadequate privacy protections in the now-defunct Safe Harbor. Other than these differences, Privacy Shield largely keeps Safe Harbor’s structure in place, said Jacqueline Klosek, senior counsel at Goodwin Procter LLP.

“Based upon the information that is presently available, the changes in the Privacy Shield are minor, but move the U.S. incrementally toward greater privacy protection of EU citizens,” she said.

How exactly does Privacy Shield aim to achieve this? From what the European Commission (EC) has revealed so far, there are three noteworthy components of the agreement, according to Klosek:

Strong privacy obligations for U.S. companies that handle personal data. The U.S. Department of Commerce will monitor whether companies are publishing privacy standards, and the Federal Trade Commission (FTC) will enforce these obligations under U.S. law. “Note that although the precise standards remain to be fleshed out, this requirement largely parrots the Safe Harbor program,” said Klosek.

Limits on the surveillance of personal data being transferred to the U.S. The Schrems decision that invalidated Safe Harbor determined that the agreement allowed the indiscriminate collection and surveillance of EU citizens’ data. “The United States has given the EU binding assurances that the access of public authorities for national security purposes will be subject to clear limitations, safeguards and oversight mechanisms,” Jourová said in a statement. According to the EC, these safeguards and limitations will be subject to an annual joint review by the EC and U.S. Commerce Department, although there is little detail about how rigorous this review will be.

However, there are a few holes in this aspect of Privacy Shield that still need to be addressed, Klosek pointed out. First, an investigation of U.S. intelligence legal frameworks shows that the U.S. government was already under these types of limitations. “This was the main U.S. government objection to the Schrems decision. … It did not accurately characterize the state of U.S. intelligence programs,” Klosek said. One notable difference between Privacy Shield and Safe Harbor is a requirement to appoint a privacy “ombudsman” in the U.S. that would field EU citizens’ complaints regarding U.S. surveillance. But the responsibilities of this role are also still unclear, said Klosek.

She added that the EU’s stance on requiring these oversight mechanisms for U.S. intelligence programs excludes the fact that EU national governments’ intelligence programs are subject to “far less oversight” than their U.S. counterparts.

Redress mechanisms to protect EU citizens’ right to privacy. EU citizens who believe their data has been misused will be able to file complaints using “accessible and affordable” dispute resolution mechanisms, according to Jourová. European Data Protection Authorities will refer these complaints to the FTC and the U.S. Commerce Department for possible enforcement. Furthermore, the Judicial Redress Act, which has already passed both houses of U.S. Congress, would allow foreign citizens to sue if they believe their privacy rights were compromised by the U.S. government.

Head to part two of this blog post, where we explore the likely challenges Privacy Shield will face from EU authorities, the possible impact of the agreement on businesses and their customers, and more.

February 19, 2016  10:57 AM

Apple, FBI face off in iPhone backdoor debate

Fran Sales
Apple, Apple iOS, backdoors, Compliance, Dodd-Frank, Encryption, FBI, grc, iPhone

This week, Apple chief Tim Cook said in a letter to the company’s customers that it won’t give in to the FBI’s demand to create an iPhone backdoor. Plus, the number of resolved FCPA enforcement cases reaches a 10-year low while the U.S. Justice Department intensifies foreign bribery efforts.

Apple resists FBI order to create iPhone backdoor

In a letter to Apple’s customers released Feb. 17, CEO Tim Cook said that the Federal Bureau of Investigation (FBI) has demanded that the company create a backdoor to its iPhones. A U.S. judge issued an order stating that Apple had to provide “reasonable technical assistance” to the FBI in order to help it unlock data from an iPhone that was used by one of the San Bernardino shooters.

According to Cook, this “assistance” involves creating a new, weaker version of iOS that bypasses multiple important security safeguards. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession,” he said in the letter. Cook added that instead of asking Congress to pass legislation, the FBI is citing the All Writs Act of 1789 to propose an expansion of its authority.

Apple Headquarters in Cupertino, Calif. (via Wikipedia)

The creation of the backdoor would set back several years of security advancements and put Apple’s customers at greater risk of being hacked, Cook said. The U.S. Department of Justice did not waver in its stance, saying in a statement that it’s “unfortunate” that Apple continues to refuse to help obtain data from a terrorist’s phone.

As resolved FCPA cases hit a low, DoJ boosts foreign bribery forces

The DoJ is ramping up its efforts to uncover foreign bribery crimes, beefing up the number of agents, prosecutors and accountants dedicated to increasing the number of wiretaps and informants used to deliver new bribery cases under the Foreign Corrupt Practices Act (FCPA). This bolstering of forces will be funded using billions of dollars seized from criminals.

The increase in resources comes after U.S. officials resolved just 20 FCPA enforcement actions in 2015, the lowest number since 2006, according to a report from Miller & Chevalier. This drop can be attributed to the Justice Department pursuing bigger cases, such as the current probe into Wal-Mart Stores Inc., as well as not having sufficient resources to investigate all of the foreign bribery cases brought in over the last 10 years, according to George Khouzami, assistant special agent in charge of the FBI’s New York office.

New Minneapolis Fed chief: Dodd-Frank insufficient

Neel Kashkari, the new president of the Federal Reserve Bank of Minneapolis, said in his first public statement since taking over the position in January that the Dodd-Frank Act of 2010 “did not go far enough” in doing away with the problem of financial institutions that are “too big to fail” and would require government bailout in the event of a crisis.

Kashkari, a Republican who served under the Bush administration as Assistant Secretary of the Treasury for Financial Stability, advised legislators to seriously consider options including breaking up banks into smaller entities and “taxing leverage throughout the financial system” to mitigate systemic risks.

Presidential candidate Bernie Sanders supported Kashkari’s stance, while others defended Dodd-Frank in the wake of Kashkari’s speech. The Dodd-Frank defenders said that the law provided regulators with the power to break up firms if they lacked adequate plans for going through bankruptcy without taxpayer help or if they posed significant danger to the financial system.

February 5, 2016  12:20 PM

Barclays, Credit Suisse to pay $154M for ‘dark pool’ trading violations

Fran Sales
Compliance, Data governance, Data privacy, Data protection, EMV, PCI compliance, Safe Harbor, SEC

The U.S. Securities and Exchange Commission announced last week that global banks Barclays and Credit Suisse would pay a record total of more than $154 million to settle allegations over “dark pool” trading. In other recent GRC news, retailers continue to face EMV chip hurdles months after new payment card standards went into effect; and the U.S. and European Union have reached a data transfer agreement.

Barclays and Credit Suisse settle ‘dark pool’ cases

Last week, Barclays PLC and Credit Suisse Group AG agreed to pay a total of $154.3 million to settle federal and state charges that the global banks misled investors with “dark pools,” or private trading platforms where exchanges are not visible to other traders until they are executed. The two settlements are the largest fines ever paid in cases involving dark pool trading, according to a statement by the SEC.

Both banks were charged with misinforming their investors about how exchanges were monitored in these private venues. Barclays didn’t convey enough information to its clients about how it policed its dark pool’s high-frequency trading, while Credit Suisse did not disclose to investors that the bank systematically prioritized routing orders to its dark pool trading platforms over other venues, according to the SEC’s statement.

Barclays plc world headquarters (via Wikipedia)

“The SEC will continue to shed light on dark pools to better protect investors,” SEC chairperson Mary Jo White said in the statement. New York Attorney General Eric Schneiderman also said at a press conference that his office will continue ongoing investigations into dark pools.

Retailers face EMV chip challenges

On October 1, 2015, new payment standards went into effect requiring retailers to process payment cards embedded with an EMV chip to help reduce the risk of payment data being stolen. The new rules also shifted risk from banks to retailers: Merchants that fail to set up chip-enabled terminals can be held liable for fraudulent transactions.

Greg Buzek, president of retail consultancy IHL Services, wrote in a blog post that only 8.5% of merchants are currently equipped to process the chip cards. Buzek added that the EMV mandate “forces a tax” on retailers because it slows transaction times and does not validate whether the chip card user is the legitimate owner of the card, which “does absolutely nothing for online or mobile fraud,” he wrote.

Furthermore, IHL’s research found that retailers are getting charged for fraudulent transactions due to lost and stolen cards, which, according to the new EMV guidelines, they should not be liable for. This has spurred retailers to create audit trails to protect themselves from these inappropriate charges, making them even more of a target for data breaches and theft, Buzek argued.

U.S. and E.U. authorities reach data transfer deal

Authorities from the E.U. and U.S. have reached an agreement to remove data transfer restrictions for European and U.S. companies. The deal will replace Safe Harbor, the former agreement between the U.S. Department of Commerce and the EU that had allowed over 4,000 companies to bypass EU data transfer rules and move EU citizens’ personal data across the Atlantic. This framework was struck down last year because of U.S. surveillance concerns.

The agreement would prevent legal action against companies, according to EU regulators. Sources told Reuters that the agreement would include more robust oversight of companies’ compliance with EU data protection laws, and that U.S. access to EU citizens’ data would be subject to limitations.

January 21, 2016  3:21 PM

FTC report: Big data analytics could prove harmful to consumers

Fran Sales
Big Data, Big Data Analysts, Consumer data, FTC, FTC Act, Information security, Predictive Analytics, regulatory compliance

Big data analytics have proven extremely beneficial to both companies and consumers across a wide range of industries, producing valuable insight in fields like healthcare, education and transportation. There is also, however, the potential for this data to be used in a way that harms consumers, according to a Federal Trade Commission report published earlier this month. As I covered in Searchlight last week, the FTC report had a clear message: The federal agency will not hold back from investigating unethical big data analytics processes and bringing enforcement actions against businesses that employ them.

But which big data processes exactly does the FTC report say are potentially problematic, and which does it consider acceptable? Highlights include the following:

Problematic: Differentiating products based on population subsets. Any big data analytics practice that limits provision of products or services to certain population subsets based on statistical input is considered unfair by the FTC, “especially if any of the data can be a proxy for poor or minority or underserved populations,” said Brenda Sharton, head of law firm Goodwin Procter LLP’s business litigation division. She offered the following industry examples of such practices: withholding access to credit, housing or employment due to background checks or screenings; and offering different rates or prices on insurance or product delivery based on an individual’s address. The FTC considers collecting several types of data problematic, including zip codes, social media usage/membership, and shopping habits. Sharton also warned against differentiating products based on characteristics such as race, gender and marital status.

Problematic: Making false promises to customers about how data is analyzed. Another big data practice companies should be wary of is making promises to consumers about how their data will be used and whether it will be entered into predictive analytics platforms. “If you’re going to be using [the data] for any statistical analysis, and it’s either you or your third-party vendor, you want to make sure you don’t promise the consumer that you ‘won’t do that,’ or that you’re informing them that you will,” said Sharton, who also serves as co-chair of Goodwin Procter’s privacy and cybersecurity group.

Acceptable: Targeted advertising. In most cases, this practice is OK with the FTC, said Sharton. For example, “A company’s advertisement to a particular community for credit offers that are open for all to apply is unlikely to violate [equal opportunity credit laws],” she said.

Problematic: Failing to reasonably secure consumers’ data. The FTC acknowledges that in the era of big data, companies may be justified in collecting more data than they immediately need. This means their information security needs to be proportionally robust — leading to another practice the report’s authors say is unacceptable: failure to implement security measures that are sophisticated enough to secure data “commensurate with the amount and sensitivity of the data at issue, the size and complexity of the company’s operations, and the cost of available security measures.” For example, organizations that maintain sensitive data such as Social Security numbers or customers’ medical data must have stronger security safeguards than those that only maintain consumers’ names, they added.

The FTC recommends that companies look at three factors to determine whether their information security is strong enough in proportion to the types of data they manage:

  • The amount and sensitivity of data
  • The size and complexity of the company’s operations (“What’s right for a massive Fortune 100 company will be different than what’s right for a … small company,” said Sharton.)
  • What security measures are available and how much they cost

The bottom line for companies employing big data analytics? Full regulatory compliance will likely require some legal advice. “They should ensure they have the counsel to determine whether they are complying with things like FCRA, ECOA and other laws,” Sharton said.

What’s your take on the downside of big data analytics? Let us know at

January 7, 2016  1:31 PM

Repeat HIPAA violators face minimal ramifications

Fran Sales
grc, HIPAA, PCI compliance, PCI DSS, regulatory compliance, SSL/TLS, TLS

Despite several HIPAA violations, recent data analysis found U.S. healthcare providers such as CVS and the VA face few punitive actions. Also in recent GRC headlines: Companies have two more years to meet the TLS requirement under PCI DSS, and experts foresee big changes ahead for the FCPA’s corruption enforcement practices.

Few penalties imposed on frequent HIPAA violators

CVS Health, Kaiser, the U.S. Department of Veterans Affairs and Walgreens were among hundreds of U.S. health providers that repeatedly violated the Health Insurance Portability and Accountability Act (HIPAA) between 2011 and 2014, according to an analysis of federal data by ProPublica, a non-profit newsroom. Despite numerous violations, sanctions against these providers have rarely been imposed, ProPublica’s research found.

The provider with the second-highest number of violations, CVS, pledged to improve its privacy safeguards or was reminded of its HIPAA obligations by the Office for Civil Rights (OCR) more than 200 times, according to ProPublica. The VA was the most persistent HIPAA violator, according to the data: Its clinics, hospitals and pharmacies violated HIPAA 220 times, but the OCR never publicly reprimanded or sanctioned the health provider.

Experts say that while some privacy problems are to be expected among large healthcare providers, persistent complaints are a sign of organizational failures. Deven McGraw, deputy director for health information privacy at OCR, told ProPublica that the agency’s top priority is to investigate breaches that affect at least 500 people. She also acknowledged that the OCR can do more about providers who repeatedly violate HIPAA. Although OCR receives thousands of privacy complaints a year, it has issued fewer than 30 financial sanctions for privacy violations since 2009.

Merchants get two extra years to meet key PCI DSS requirement

The Payment Card Industry Security Standards Council (PCI SSC) announced last month that merchants that need to be compliant with the Payment Card Industry Data Security Standard (PCI DSS) version 3.1 now have until June 2018 to migrate away from vulnerable encryption protocols, two years later than the original deadline of June 2016.

Under PCI DSS 3.1, which was released in April 2015, organizations must migrate away from older versions of Transport Layer Security (TLS) — versions 1.0 and earlier — and any version of Secure Sockets Layer (SSL) by this date. Furthermore, effective immediately, these organizations are prohibited from implementing new technology that relies on SSL and early TLS. A large body of research has deemed these protocols cryptographically insecure, putting payment data at higher risk of exposure.
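
In practice, migrating away from SSL and early TLS means refusing the old protocol versions at every endpoint that touches cardholder data. The snippet below is a minimal, illustrative sketch, not part of the PCI SSC guidance: it shows how a Python client can set a TLS 1.2 floor so that SSLv3, TLS 1.0 and TLS 1.1 connections are rejected. The host name is a placeholder.

    import socket
    import ssl

    HOST = "example.com"  # placeholder endpoint, not from the article

    # Modern defaults: certificate verification on, SSLv2/SSLv3 already disabled
    context = ssl.create_default_context()
    # Go further and refuse early TLS (1.0 and 1.1) as well
    context.minimum_version = ssl.TLSVersion.TLSv1_2

    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Negotiated protocol:", tls.version())  # e.g. 'TLSv1.2' or 'TLSv1.3'

The same idea applies server-side: configuring web servers and payment terminals to advertise only TLS 1.2 or later is what closes off the flawed protocols.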

One of the main reasons PCI SSC extended the deadline until June 2018 is because it hasn’t been seeing criminals accessing cardholder data through the protocol flaws, PCI SSC international director Jeremy King told eWeek. PCI SSC is trying to balance risks with operational needs, King added, but he also warned that the date change is not an excuse to “do nothing for two years.” Instead, he suggested merchants migrate away from the flawed protocols as early as possible.

Major changes to FCPA enforcement expected for 2016

The Foreign Corrupt Practices Act (FCPA) unit of the U.S. Justice Department will be receiving more scrutiny and resources this year in light of low enforcement in 2015. The unit plans to add 10 new staff members, and DOJ leaders say they will focus on greater transparency as well as pursuing “high impact” bribery cases, The Wall Street Journal reported.

The Justice Department’s foreign corruption unit settled only two corporate FCPA cases last year, down 10 cases from 2014 and nine from 2013. The DOJ has indicated that it will put a policy shift into motion this year that will create rewards for companies that report violations and cooperate with the government. It will also put greater priority on prosecuting individuals.

Defense lawyers worry that the DOJ’s policy shift of incentivizing companies to cooperate fully may actually limit voluntary disclosures rather than encourage them, according to The Wall Street Journal. However, they also expect to see more compliance cooperation from companies as a result of the DOJ’s hiring of a compliance expert last year.

December 22, 2015  4:56 PM

GDPR: How will the EU data protection law impact U.S. industry?

Fran Sales
Compliance, cybersecurity, Cybersecurity legislation, Data privacy, Data protection, EU directive 95/46, European Data Protection legislation, Safe Harbor

Three years in the making, European Union officials finally agreed on a draft of the General Data Protection Regulation. The EU-wide legal framework sets standards for data collection, sharing and privacy that will replace 28 different sets of national privacy laws. Officials promise stronger personal data protection measures for Europeans and an easier compliance process for businesses. While the new rule package won’t go into effect until 2018, cybersecurity experts are already forecasting greater discussion around data privacy, particularly among U.S. companies, because of GDPR.

The first thing about GDPR that struck security experts at a webinar earlier this month is its complexity. During the event, hosted by incident response provider Resilient Systems, three security and privacy specialists — Bruce Schneier, a leading cryptologist and the CTO of Resilient Systems; Jon Oltsik, senior principal analyst at Enterprise Strategy Group; and Gant Redmon, general counsel for Resilient — discussed their security predictions regarding the 201-page document as we enter the New Year.

It’s not only the length of the legislation that’s intricate — “If you have to say it in 201 pages, it’s going to be complex,” Redmon quipped — but also the fact that EU negotiators are hoping to develop a centralized authority with GDPR. Upon closer inspection, however, that doesn’t actually appear to be the case, Redmon added.

For example, the rule package stipulates that each member country have its own national data protection authority, called a supervisory authority, to which companies and organizations are required to report data breaches.

“It’s going to start to look more like the U.S. There will be different folks and different bents to compliance that you have in each of these member states,” Redmon said.

He added that even though proponents of GDPR were hoping for more objective data protection standards from the legislation, in the future these codes of conduct will likely be outsourced to different industries.

“Both of these [factors] will have a big effect on industry, and compliance is going to be more complex than people were hoping,” Redmon said.

There is also the danger of over-regulation, which could push data collection activities underground, Schneier warned.

“I’m a big fan of the regulation of privacy … but too little regulation and it’s a free for all; too much regulation and we lose visibility into practices,” he said.

Contributing to this potential over-regulation is data sharing legislation passed by the U.S. Congress last week that takes a different stance from the EU: The Cybersecurity Act of 2015 (CISA). The bill, which was attached to a must-pass spending bill, provides liability protection and antitrust exemptions for companies that choose to share cybersecurity information with federal agencies such as the NSA. Detractors say the law allows companies to bypass civil rights and privacy mandates, including warrant requirements to conduct surveillance.

“Privacy protections have been weakened and Europe is now 200 pages and counting in complication. This is going to be hard,” Schneier said. “We want one single Internet — one single set of rules — and we’re further and further away from that.”

Oltsik agreed, saying that CISA not only waters down privacy but also creates a complete disconnect between U.S. legislators and their EU counterparts. This gap must be bridged in order to address the global problem of what he calls the “Balkanization” of privacy.

“We’re talking about data privacy on one hand and backdoors … on the other,” Oltsik said. “It’s as if we’re debating these issues with a complete lack of cooperation with each other.”

December 9, 2015  5:15 PM

New York proposes banking rules to block terrorism funding

Fran Sales
Compliance, Dodd-Frank, grc, HIPAA, SEC

The governor of New York has introduced new state banking rules designed to curb money laundering and block terrorism funding. Also in recent GRC news: Most healthcare organizations lack HIPAA-compliant messaging apps; the Fed adopts stricter bailout measures; and a former SEC commissioner says the agency faces a “crisis of confidence” due to its controversial use of internal judges.

New York introduces banking rules to stop terrorism funding

Last week, New York Gov. Andrew Cuomo proposed rules that would require New York State banks to follow stringent measures designed to prevent money laundering and terrorism funding. Under the proposed rules, banks must have a chief compliance officer (CCO) who would certify that systems that detect and prevent such illicit activity are in place. CCOs who file false certifications will face criminal charges, according to the rules, which are being written by the New York State Department of Financial Services (NYSDFS).
Cuomo’s proposal to prevent militant groups from using the New York financial system comes as the U.S. government and international authorities intensify efforts to thwart terrorism funding to groups such as ISIL in the wake of recent attacks in Paris.

The proposed rules also require improving banks’ systems to monitor and filter illicit transactions, as well as boosting software that automatically blocks suspicious transactions, according to NYSDFS.

Survey: 92% of health institutions use non-HIPAA-compliant messaging apps

An October survey found that 92% of healthcare institutions currently use mobile messaging apps that are not compliant with the Health Insurance Portability and Accountability Act (HIPAA).

The survey, conducted by mobile provider Infinite Convergence Solutions, garnered responses from 500 professionals in the finance/banking, healthcare, legal and retail industries about their mobile messaging behaviors. The survey further found that more than half of healthcare organizations (51%) do not have an official mobile messaging platform. Of those that did have one, less than a quarter (24%) employed an internal, corporate-created app.

“Healthcare employees communicate inherently sensitive information, like patient prescriptions, medical information, etc., yet their employers do not have the proper mobile messaging security infrastructure in place to adhere to HIPAA or other regulatory requirements,” Anurag Lal, CEO of Infinite Convergence, said in a statement.

Fed adopts new rule to rein in emergency lending powers

The Federal Reserve Board last Monday approved a rule that curbs the central bank’s ability to bail out firms during a crisis. The rule was required by the Dodd-Frank Act, which mandates that the Fed limit its emergency lending authority to “broad-based programs” instead of specific institutions. It is meant to assuage concerns by legislators that the Fed holds too much power to pump money into the financial system. Critics say that after the 2008 financial crisis, the Fed used its emergency lending powers for the first time since the Great Depression and operated without adequate restrictions.

Under the adopted rule, the Fed will only be able to use its emergency lending power to help a market or industry sector, and not to bail individual firms out of bankruptcy. The rule requires that at least five firms be eligible to participate in each Federal Reserve lending program, and that each organization have enough collateral to protect taxpayers.

Ex-official says SEC faces ‘crisis of confidence’ over internal judges

The Securities and Exchange Commission defends its practice of bringing cases to its own in-house judges instead of federal court, but a former SEC official disagrees.

Former SEC Commissioner Joseph Grundfest testified before a House Financial Services subcommittee that the agency is undergoing “a crisis of confidence over the fairness of its internal administrative procedures.” The subcommittee, which covers capital markets, recently proposed a bill that would allow defendants to choose a trial by a federal judge or jury instead of before one of the SEC’s internal judges.

In his testimony, Grundfest, now a professor at Stanford Law School, urged the SEC to change its internal policies.

Critics have been challenging the SEC’s increasing use of its in-house court to preside over serious cases such as those involving insider trading. They cite the SEC’s possible “home court advantage” because defendants lack protections that are available in a federal court trial, such as pre-trial discovery rights.

November 25, 2015  9:09 AM

Privacy vs. public safety remains central to encryption debate

Fran Sales
cybersecurity, Cybersecurity legislation, Data-security, Encryption, Encryption keys, FBI, NSA, Security

In the wake of the horrific attacks in Paris earlier this month, government and intelligence officials pointed a finger at end-to-end encryption (E2EE) and how it enabled attackers to “go dark” — in other words, become invisible to law enforcement.

This is only the latest development in a years-old encryption debate between intelligence officials and Silicon Valley: Should tech companies give intelligence agencies back-door access to encrypted devices and networks, or hold their ground on strong encryption to protect their customers’ right to privacy? Even before the attacks rekindled the public safety vs. privacy debate, earlier this month a panel of experts from both sides of the argument weighed in on the pros and cons of E2EE at the Advanced Cybersecurity Center’s conference in Boston.

One panelist, FBI General Counsel James Baker, stressed that there is no perfect technical solution to the public safety vs. privacy debate. He added that it’s up to legislators and individual technology companies to decide how far they want to enable government surveillance.

“Under what set of circumstances do the people want that to happen? What do you want us to do, and what risks are you willing to take on all sides of the equation?” Baker asked the audience, which mostly consisted of information security professionals.

He added that Congress remains behind in addressing the problem as well.

“Current legislative thinking is unsatisfactory in balancing all these types of risks,” Baker said. “It’s about creating laws that effectively enable the government to obtain the results of surveillance in a way that’s consistent with our constitutional rights.”

At present, the FBI is offering two solutions for its “going dark” dilemma: Split-key encryption (data can only be decrypted by combining several keys) or encryption via “key escrow” (one key out of many is stored by a government agency).
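
To make the split-key idea concrete in the abstract: such schemes divide a decryption key into shares so that no single party can unlock the data alone. The sketch below is purely illustrative (a two-share XOR split in Python) and is not a description of any actual FBI or vendor proposal; the "escrow" and "provider" labels are assumptions made for the example.

    import os

    def split_key(key):
        """Split a key into two shares; both are needed to rebuild it."""
        share_a = os.urandom(len(key))                        # random share
        share_b = bytes(k ^ a for k, a in zip(key, share_a))  # key XOR share_a
        return share_a, share_b

    def combine_shares(share_a, share_b):
        """XOR the shares back together to recover the original key."""
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    key = os.urandom(32)                           # e.g. a 256-bit data-encryption key
    escrow_share, provider_share = split_key(key)  # hypothetical share holders
    assert combine_shares(escrow_share, provider_share) == key

Neither share reveals anything about the key on its own; the policy question in the debate is who holds each share and under what legal process they may be combined.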

Eric Wenger, director of cybersecurity, privacy and global affairs at Cisco, argued that these solutions are insufficient for tech companies. He said that they gave the public a mixed message: “We want you to use really strong encryption, but we just want a way to break into it,” he said.

Wenger also questioned whether the FBI and other law enforcement entities needed access to encrypted data for every investigation. With kidnapping cases, for instance, unencrypted geolocation data from a suspect’s device could prove to be the most important piece of information to move the investigation forward.

“We need to tease these problems apart and get to the things that are really the most meddlesome for law enforcement,” Wenger said.

Fellow panelist Susan Landau, a security policy professor at WPI, agreed, saying the encryption conversation is complex and needs to be considered on a case-by-case basis. This would require changing how local and federal law enforcement conduct investigations, however, and would likely come with considerable costs, she said.

“It’s complicated. It takes time. But I find myself in the position of actually supporting extra funding and saying, ‘Look, we’re talking about securing everybody and making investigations more expensive — or the reverse,'” she said.

But reservations around split-key encryption, key-escrow encryption and other proposals that facilitate government surveillance don’t mean that technology companies don’t care about public safety, Wenger insisted.

“I want [the FBI’s Baker] to be able to get what he’s entitled to get, but the problem is at what cost?” he said.

Head to SearchCompliance to read more about the panelists’ take on end-to-end encryption.

November 18, 2015  3:40 PM

Beyond BYOD: How new tech is driving digital information governance

Ben Cole
Information governance, Internet of Things, iot

(This blog post was written by Diane K. Carlisle, executive director of content at ARMA International.)

Day by day, effective information governance (IG) is made more urgent and more complicated by disruptive technologies and new business models that are rocketing throughout organizations. Many companies are still in the early stages of solving such digital-age challenges as big data and the bring-your-own-device (BYOD) model, but advanced technologies continue to emerge. The Internet of Things (IoT), for instance, is a phenomenon that offers profound business opportunities while carrying great risk.

In a 2015 white paper titled “Internet of Things: Privacy and Security in a Connected World,” the FTC defined IoT as “the ability of everyday objects to connect to the Internet and to send and receive data.”

Examples include cameras that permit users to post pictures online with one click; automated systems that let users turn lights on and off remotely; and sensors in storage systems that can detect RFID data in order to manage inventory more efficiently. The IoT umbrella also includes such wearable devices as bracelets that track and share your workout data and heart implants that monitor and transmit health information.

The business benefits — and detriments — of IoT

Many companies hope the IoT will enhance productivity and generate new business models and revenue sources. Typically, IoT devices collect data and pass it along to other devices. Some smart televisions, for example, can detect whether anyone in the room is actually watching the screen and transmit that information to other smart devices. Companies then use the data to negotiate ad rates, or to target products or other programs for that household.

Organizations are eager to pursue IoT because they stand to benefit dramatically from using the information they collect through it. For example, smart meters can help utility companies reduce the costs of manual meter reading and can monitor and predict resource usage at peak times. This information helps them ensure adequate supplies to meet customer demand. It might also help them justify rate hike requests to public utility commissions. The technology helps customers understand their power usage so they can make beneficial changes.

But there are downsides as well.

Such data generation and aggregation present big challenges in the governance of information because the IoT intensifies the volumes of data, the variety of sources and how it is dispersed. IoT devices spread immeasurable volumes of data to other connected devices, some of which may be external to an organization’s infrastructure and therefore beyond its sphere of information security. Thus, organizations that turn to the IoT for business advantage must be prepared for the associated information risks — especially when it comes to privacy and security.

Chief among the IG concerns and potential risks is the organization’s duty to protect customer privacy. Customer identification and banking information is linked to that smart meter, and the customers may be skeptical of the organization’s ability to protect that information from unauthorized use. And of course, managing and protecting such large volumes of information are challenging for most companies.

Providing adequate security for the information throughout its capture, transmittal, and storage requires financial resources that management may not have anticipated. Information is vulnerable at any of these points. And as the sensitivity of the data and functionality increases, consumers’ concerns about privacy protection may increase as well. Consumers may be more sensitive about banking information transmitted through Apple Pay than they are about their consumption of electricity. Take it a step further and think of the “smart home.” The security measures must be extremely effective to prevent unauthorized access.

These are but a few examples. In short, the IoT world is changing rapidly and it’s easy to foresee an overwhelming volume of data entering the corporate environment.

Planning for IoT initiatives

With that said, organizations must address the full scope of IG considerations as they implement IoT applications. In the following list are steps they can take to plan for IoT initiatives:

  • Convene an IoT implementation team: It should include representatives from IT, RIM (records and information management), legal and the relevant business units. If your organization has an IG steering committee, this would be a natural fit. Otherwise, convene a collaborative team to understand the benefits and risks and to make decisions on the implementation, considering all the factors involved.
  • Conduct a risk assessment: Depending on the specific application, the organization may be taking on a greater degree of information-related risk.
  • Make a plan: An IoT initiative must be taken seriously. Build the plan around specific business goals and strategies. Establish benchmarks and metrics to evaluate the success or failure of the initiative.
  • Integrate regulatory and compliance requirements. These requirements will continue to apply, regardless of how information is captured.
  • Assess the impact of IoT on the retention/disposition policy and schedule. You will be greatly lengthening the retention period for some types of information. Make sure you can delete the information when the retention period has expired and that the necessary retention schedule modifications are made.
  • Ensure that IT has the capacity to deal with the additional volumes of information. The growth in data that requires storage can quickly overwhelm an organization. Such volumes can hinder business efficiencies, make e-discovery more costly and jeopardize the defensibility of legal holds.

While the IoT trend can be overwhelming, a sound IG program can give your organization a head start on addressing the challenges that come along with the opportunities.

The building blocks for a sound IG program are the Generally Accepted Recordkeeping Principles® developed by ARMA International, a thought-leader on IG. These Principles — Accountability, Transparency, Integrity, Protection, Compliance, Availability, Retention, Disposition — work together to foster a collaborative approach that ensures information is treated as an asset, protected in compliance with all regulations, and disposed of according to a legally defensible retention plan.

Accompanying the Principles is the IG Maturity Model, which defines characteristics of various levels of recordkeeping programs. It’s an assessment tool you can use to evaluate your IG program against the Principles. It helps you identify the gaps between your current situation and your desirable level of maturity for each principle. More information on both the Principles and the Maturity Model is available on the ARMA website.

The collaborative IG approach and conformity to the Principles will ensure maximum information security and a sound, holistic IG program that will prepare your organization for virtually any information-related challenge.

Diane K. Carlisle, IGP, CRM, is executive director of content at ARMA International, a not-for-profit professional association and authority on governing information as a strategic asset.

November 12, 2015  8:51 AM

Fed Chair says regulatory compliance problems persist at large banks

Fran Sales
Audit and compliance, CFO, Compliance, Dodd-Frank, Finance, Financial firms, Financial industry, grc, GRC strategy, regulatory compliance, Risk management

In recent regulatory compliance news, the Federal Reserve Chairwoman testified before a House panel that very large U.S. banks still experience “substantial” GRC management failures, and recent research casts doubt on the effectiveness of new compensation “clawback” rules proposed under the Dodd-Frank Act.

Fed Chair: Big firms still face regulatory compliance issues

The leader of the Federal Reserve has rebuked very large U.S. banks for persistent regulatory compliance and risk management breakdowns, but also suggested legislation to lighten the regulatory burden on midsized firms.

Last week, during three hours of testimony in front of the House Financial Services Committee, Fed Chairwoman Janet Yellen said that although the largest financial firms the Fed regulates have improved governance, internal controls and risk management since the 2008 financial crisis, they still face “substantial compliance and risk management issues.” Yellen said the Fed is prepared to require very large firms to make considerable changes to their businesses if these banks’ “living wills” — or plans that detail how they would dismantle operations during bankruptcy — don’t pass muster with the Fed.

Yellen also said the Fed is open to tweaking its regulatory regime to help regional banks with more than $50 billion in assets. Under the Dodd-Frank Act, these banks are subject to more stringent rules than those with fewer assets. However, she pushed back against a bill proposed by the House that would dictate criteria for which of these firms would face tougher rules. Instead, Yellen requested that the Fed have the flexibility to modify the rules.

Study: Execs tend to refuse restatements if their pay is incentive-based

New research shows that compensation clawback rules proposed under the Dodd-Frank Act might not be as effective as proponents anticipate in influencing companies to fix faulty financial statements. Under the new rules, which will likely be adopted later this year, issuing these restatements will initiate the “clawback,” or return, of financial executives’ inappropriate bonuses.

The research, published by Accounting Review this month, concluded that senior executives (mainly CFOs, controllers and treasurers) are less likely to agree to fix faulty financial statements when most of their compensation is incentive-based. Jonathan S. Pyzoha, an assistant professor of accountancy at Miami University, conducted a study to determine whether executives from 112 public financial companies negotiate more firmly with auditors to fully avoid a restatement if their incentive-based pay is at stake.

Attorneys that work with companies considering restatements believe that clawbacks are not among their principal concerns, according to MarketWatch. However, Pyzoha’s research shows that this is not the case for executives with the bulk of their pay being incentive-based. He found that these executives were less amenable to fixing financial statements if the restatement was proposed by a “low quality” auditor — with quality based on the auditor’s time and experience in the field. However, he also found that executives were more open to restatements if the proposal came from a high-quality auditor.

Pyzoha advised companies to have their audit committee’s financial experts play a greater role in the restatement process to counterbalance these executives’ influence.
