In a letter to the FTC, the Electronic Privacy Information Center (EPIC) is pushing for an investigation into recent changes to Google Search. EPIC said the inclusion of personal data in Google Search results, such as photos and contact details gathered from Google Plus, raises “concerns related to both competition and the implementation of the commission’s consent order.”
Under the settlement reached with the FTC in April, Google was required to implement a comprehensive privacy program and submit to regular, independent privacy audits for the next 20 years.
“Google allows users to opt out of receiving search results that include personal data, but users cannot opt out of having their information found by their Google+ contacts by Google Search,” EPIC Executive Director Marc Rotenberg wrote in the letter to the FTC. “In contrast, Google allows content owners to remove pages from Google’s public search results.”
The EPIC letter also contends Google’s changes create potential antitrust violations because the company prioritizes its own content when returning search results.
In Google’s official blog earlier this week, Google fellow Amit Singhal wrote a lengthy post outlining the benefits of Google Search Plus Your World. Singhal touted what he called the new feature’s “unprecedented” security, transparency and control. The company has also posted accolades from analysts and consumers praising the feature.
The FTC has yet to comment publicly on EPIC’s letter and its call for another investigation into Google’s online consumer privacy practices. But it’s worth noting that the last two times EPIC made similar complaints against high-profile Internet companies, the results were privacy-related FTC settlements with both Google and Facebook.
On Monday, Yahoo launched a new online privacy tool that, in theory, gives users more insight into the data the media company has gathered about their interests. According to the press release, the tool lets users “assert greater control over their online experience” by showing Yahoo’s “educated guesses about their interests” and offering granular controls to opt out of individual categories or out of interest-based advertising entirely.
The “Ad Interest Manager” was announced and released in beta on the same day the Federal Trade Commission held its first roundtable on privacy in Washington, D.C. The workshop agenda (PDF) includes academics, advocates and representatives from media, data mining, software and analytics companies.
Yahoo’s introduction of a consumer privacy tool follows Google’s addition of an online privacy dashboard last month and the July release of self-regulatory principles for the collection and use of behavioral data in Internet advertising.
Whether such efforts are enough to preempt FTC attention will be an open question in 2010. As FTC Chairman Jon Leibowitz stated earlier in the year, this is “industry’s last chance to get its act together on behavioral targeting.”
Capitalizing on this regulatory focus, the Center for Democracy & Technology (CDT) also began a consumer online privacy campaign last week called “Take Back Your Privacy.”
“All social media should have granular privacy controls,” said Leslie Harris, president and CEO of the CDT. An important element of the CDT’s online privacy campaign is the release of an online privacy Complaint Tool that allows people to register privacy concerns with the FTC and share that action with their connections on social media.
Harris, who was at the FTC’s privacy roundtable yesterday, says that next year will be “the first time there will be serious consideration of consumer privacy legislation in many years.”
According to the CDT’s Ari Schwartz, Rep. Richard Boucher has put forward an outline of a consumer privacy bill that will be a framework for action in January.
As Kara Swisher pointed out at CNET, the addition of this online privacy tool by Yahoo coincides with a “bigger backdrop” of “the pending regulatory approval of the massive search and advertising partnership between Yahoo and Microsoft. The two companies announced last week that they had completed the definitive agreement for the deal.”
As Swisher observed, “one of the key issues for regulators, of course, is the privacy implications of combining the search and online ad technologies of the No. 2 and No. 3 players.” Google, Yahoo and the online advertising industry as a whole will be watching carefully to see what FTC compliance and action from Congress will mean for all in the year ahead.
For insight into the way that the regulator sees the relationships here, review the FTC graphic below describing a user’s “personal data ecosystem.”
[Image source: FTC.gov (PDF). For specific industry relationships, review these data flow charts (PDF).]
“We found that the role of CIOs in the federal government is very much focused on data centers, networking and technology, not on how we can transform the function of the public sector itself,” Kundra said. He explained that he wants to “leverage tech to fundamentally change the way the public sector operates.” Now, as the federal government works to account for the $787 billion in spending from the American Recovery and Reinvestment Act of 2009 and publishes more data from its agencies, Kundra said, “we’re shifting away from democratizing data to thinking about how public policy can be powered by that information.”
Cloud computing, SOA and agile development
In tracing the path of technology from the agrarian to the industrial to the current information revolution, Kundra noted the transformative effect of both cell phones and social networking platforms like Facebook, YouTube and Twitter. “We’re seeing the impact that Twitter has on the geopolitical climate of the world,” he said. “Information is far more liquid than it has been in the history of civilization.” The disruptive effects of the online revolution in user-generated content are steadily filtering into government. The “Darwinian pressures” exerted upon real estate, consumer products and the automotive industry haven’t hit government yet, Kundra observed. “It’s easy to go online and compare consumer products, but it’s very difficult, if not impossible, to get information to make intelligent decisions.” In launching the Apps for Democracy contest, in fact, Kundra found a way to introduce an element of competition and innovation into a government IT ecosystem that was underserved in both areas.
Kundra has been a proponent of cloud computing for years, going back to his position as the CTO of the District of Columbia, where he signed a contract with Google for business services. Today, he emphasized the need for security, interoperability and data portability in federal government use of cloud computing. “As we make the shift towards cloud computing, security threats need to be addressed. Solutions cannot be bolted on afterwards. Data portability is central, so that as we move from Vendor A to Vendor B we architect this with interoperability and standards so that we don’t spend billions later.”
Questioned on whether service-oriented architecture is still an emphasis in a federal cloud computing paradigm, Kundra said SOA “absolutely” still matters. “Look at the Social Security Administration and what it’s done with SOA and local government,” he said. “They can build lightweight applications to interact with databases elsewhere.” That embrace of modern development practices extends beyond SOA or upgrading programmers’ skills from COBOL. “How do we move towards an agile procurement or agile development methodology?” asked Kundra.
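To make that pattern concrete, here is a rough sketch of what a “lightweight application” talking to a remote service might look like. It is purely illustrative, not the SSA’s actual interface; the endpoint, parameter and field names are hypothetical.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical SOA-style lookup: a thin client calls a remote service
# over HTTP instead of connecting to the backing database directly.
SERVICE_URL = "https://services.example.gov/benefits/v1/status"  # hypothetical

def lookup_status(case_id: str) -> dict:
    """Fetch one case record from the remote service and return parsed JSON."""
    url = f"{SERVICE_URL}?case_id={urllib.parse.quote(case_id)}"
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(lookup_status("1234-5678"))
```

The client needs no database drivers or credentials of its own, which is what keeps such applications “lightweight”: all of the data access logic lives behind the service.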
In some areas, the government is moving to make systems more interoperable. Kundra pointed to what’s happening between the IRS and the Department of Education on student aid. “Before, if you wanted to apply and get aid, you had to fill out a FAFSA,” said Kundra. “That form is more complex than a 1040.” Starting in January, there will be a brand-new online way to fill out the Free Application for Federal Student Aid, according to Kundra, which will eliminate 70 questions and 20 Web screens. “Students will be able to get IRS data and autopopulate it in the form for student aid.”
Government 2.0 and data-driven policy
As he grows into the U.S. CIO role, Kundra has continued to contrast where government IT spending and management has been with where he’d like it to go. IT systems were “not invested where they should be, which is at the intersection of the American people and government,” he said. As he put it, it’s a “simple change in default setting” from secretive, opaque and closed to transparent, open and participatory.
The old mode involved the management of $70 billion of federal IT investments through a “closed, opaque, checklist-driven process,” Kundra said. Now USAspending.gov, the federal IT dashboard, tracks spending. The website has received more than 56 million hits since launch, according to Kundra. In the old way of thinking, there was a “presumption that the government has a monopoly on the best ideas,” said Kundra. Now, Data.gov provides machine-readable data for developers to mash up. Historically, there’s been a “complex, time-consuming, paper-based acquisition process,” said Kundra. Now, there’s Apps.gov.
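To give a sense of what “machine-readable data for developers to mash up” means in practice, here is a minimal sketch that downloads a CSV dataset and tallies the most common values in one column. The dataset URL and column name are placeholders, not a specific Data.gov file.

```python
import csv
import io
import urllib.request
from collections import Counter

DATASET_URL = "https://www.data.gov/path/to/some-dataset.csv"  # placeholder URL

def top_values(url: str, column: str, n: int = 5):
    """Download a CSV dataset and count the most common values in one column."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
        counts = Counter(row[column] for row in reader if row.get(column))
    return counts.most_common(n)

if __name__ == "__main__":
    # "agency" is a hypothetical column name used for illustration.
    print(top_values(DATASET_URL, "agency"))
```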
Cybersecurity and FISMA reform
Kundra sees the same transition toward more flexible systems in cybersecurity. “We’re moving from a manual, reporting-based, compliance-focused approach to a real-time measurement of actual cybersecurity,” said Kundra, referring to the new “Cyberscope” system for online reporting of cybersecurity threats that launched in October. “You cannot address real-time threats with a solution that’s focused on reporting requirements on a quarterly basis.”
“The OpenID and .gov project’s goal is to make government more transparent to citizens,” said Don Thibeau, executive director of the OpenID Foundation at the OASIS Identity Management 2009 conference, referring the audience to IDManagement.gov.
There are now more than 1 billion OpenID-enabled accounts, according to Thibeau, and more than 40,000 websites support the framework, with backing from technology companies including Google, Yahoo, Facebook, AOL, MySpace, Novell and Sun Microsystems.
The OpenID identity management pilot at the National Institutes of Health (NIH) will be limited to conference registration, wiki authorization and library access, which require only Level of Assurance (LOA) 1 authentication.
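For context on what is under the hood of such a pilot, here is a simplified sketch of the OpenID 2.0 authentication request a relying party sends for a low-stakes, LOA 1 login. A real implementation would first discover the provider’s endpoint via Yadis/XRDS and later verify the signed response; both steps are omitted here, and all URLs are hypothetical.

```python
from urllib.parse import urlencode

# Hypothetical endpoints; real values come from Yadis/XRDS discovery.
OP_ENDPOINT = "https://openid.example.com/auth"
RETURN_TO = "https://wiki.example.gov/openid/return"

def build_auth_redirect(claimed_id: str) -> str:
    """Build the URL the user's browser is redirected to for authentication."""
    params = {
        "openid.ns": "http://specs.openid.net/auth/2.0",
        "openid.mode": "checkid_setup",
        "openid.claimed_id": claimed_id,
        "openid.identity": claimed_id,
        "openid.return_to": RETURN_TO,
        "openid.realm": "https://wiki.example.gov/",
    }
    return f"{OP_ENDPOINT}?{urlencode(params)}"

print(build_auth_redirect("https://openid.example.com/user/alice"))
```

The provider authenticates the user and redirects back to `openid.return_to` with a signed response, so the relying party, the wiki in this sketch, never sees the user’s password.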
Debbie Bucci, the integration services center program lead at the Center for Information Technology at NIH, talked about the success of existing identity management frameworks for authentication at the institute.
Bucci is cautious about implementing OpenID but sees utility in federated identity, given the success of InCommon, an identity framework at NIH. She expressed support for the “idea that you could take the same username and password and spread it around the business units.”
According to Bucci, NIH’s systems have more than 35,000 users and 250 service-level agreements, and they handle over 1 million transactions every day, 83% of which are external. Current user participation in InCommon is 21%, focused on higher education and research. The NIH’s electronic research administration supports more than 9,500 institutions and agencies, according to Bucci; by contrast, InCommon includes 165 member institutions. More information about these identity management programs can be found at Federatedidentity.nih.gov.
According to Peter Alterman, senior advisor for strategic initiatives at NIH, the institute is continuing to work toward implementation of the Electronic Signatures in Global & National Commerce Act, also known as E-SIGN.
According to Thibeau, the core design principle for the trust framework is “openness,” meaning it will be open to all identity providers, qualified auditors, provider certification and evolution. He says that both the OpenID Foundation and the Information Card Foundation are working with Harvard University’s Berkman Center and the Center for Democracy & Technology (CDT) to further expand the open trust framework.
That latter relationship may be important, as the CDT’s Schwartz said that “at Level 3, we have a lot of concerns. If you don’t have limitations there, there will be a drive to ask for as much information as you can get.” Many high-priority citizen-to-government transactions are classified as LOA 3 or higher, including IRS tax filing, Social Security and Medicare. Given that limitation, there may be roadblocks to clear before government agencies that must comply with the Privacy Act implement this federated identity management framework.
Questioned about time frames and implementation metrics, Thibeau said in an email interview to “remember the effort under way is a pilot; a very deliberate beta test of new technology protocols and new integration and interoperability tasks. We don’t know when we will finish, but we do know we will make mistakes and wrestle with usability and security issues.

“Given all the players involved, it’s hard to say what will be completed and when. The most valuable new piece is how many people and organizations are coalescing around a practical and far-reaching solution set for the challenges of identity from a user perspective. This goes beyond the tired truisms that often characterize privacy versus security debates. There is a real hunger for real solutions in identity authentication. Whether you frame it as open government, open source or open identity, there are powerful political, public and commercial drivers at work involving identity on the Web. The legal and policy discussions around open identity trust frameworks are a leading-edge indication that practical solutions are in play and pragmatic organizations in both the private and public sectors are involved.”
Thibeau was clear about the stage the pilot is currently in. “We are at the beginning of a shakedown cruise on two tracks,” he said, referring to both the open source identity technologies and the open trust framework itself. “Both are parts of the GSA ICAM schema, and both are on the agenda of the OpenID Foundation and Information Card Foundation (IDF and ICF) boards to consider. The boards still have to review and make decisions around certification requirements, operations and strategy. As we begin technical testing of government pilots, we are also finalizing the certification of a trust framework process that is a critical element in government adoption and is seen by some industry leaders as applicable to high-value commercial applications.”
Thibeau went on to explain that “the U.S. government is still finalizing requirements for credible, independent and industry standards-based identity certification.” The process holds interest beyond the borders of the U.S. as well, according to Thibeau. “Many international governments as well as U.S. state and local governments are studying the U.S. ICAM test of its ‘schema’ of technology protocols combined with industry self-certification models. Identity provider certification of Open Trust Framework models has gained momentum after recent meetings with the Center for Democracy & Technology and feedback from various government agencies, including the GSA ICAM leadership, NIST, NIH and the national security staff in the White House.”
John Bradley, the chief security officer at ooTao Inc., serves on the OASIS XRI, XDI and ORMS Technical Committees and fielded questions about the details of the OpenID pilot at NIH. For more information, Bradley’s blog includes many useful links on the OpenID in government project.
Building more intelligence and efficiency into the network, however, has relevance to more than energy policy. As a working group of information security professionals determined over the course of the summer, there are significant smart grid privacy concerns to consider.
These considerations can be neatly summarized in the following excerpt from the NIST report: “The major benefit provided by the Smart Grid, i.e. the ability to get richer data to and from customer meters and other electric devices, is also its Achilles’ heel from a privacy viewpoint. Privacy advocates have raised serious concerns about the type and amount of billing and usage information flowing through the various entities of the Smart Grid … that could provide a detailed time-line of activities occurring inside the home.”
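A toy example shows how little analysis that kind of timeline inference requires. The interval readings and idle threshold below are invented for illustration.

```python
# Toy illustration of the privacy concern: even crude thresholding of
# 15-minute interval meter readings (all values hypothetical) can suggest
# when a household is awake and active.
readings = [
    ("06:00", 0.05), ("06:15", 0.05), ("06:30", 0.42),  # kettle or shower?
    ("06:45", 0.51), ("07:00", 0.12), ("09:00", 0.04),  # house empty?
    ("18:30", 0.61), ("19:00", 0.58),                   # cooking dinner?
]

BASELINE_KWH = 0.10  # assumed idle load per interval (hypothetical)

for timestamp, kwh in readings:
    status = "activity" if kwh > BASELINE_KWH else "idle"
    print(f"{timestamp}: {kwh:.2f} kWh -> {status}")
```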
As privacy expert Rebecca Herold explains on her blog, smart grid privacy needs to be considered as utilities move to a next-generation infrastructure. Those implications were concisely listed by Herold as follows:
Sarah Cortes, a contributor for SearchCompliance.com, was the project manager for the Privacy Sub-group of NIST’s Cyber Security Coordination Task Group.
Key points in the current release of the smart grid privacy document include the following issues, according to Cortes:
The body of the privacy group’s work may be found in this draft: NISTIR 7628 Smart Grid Cyber Security Strategy and Requirements (PDF).
Social networking and distributed collaboration sped up report writing for infosec team
One aspect of the report’s generation is worth recognizing: the role that the various collaborative technologies and social networking platforms played in gathering, synthesizing and producing the final deliverable for NIST. As Cortes explained in an email, preparing the current release of the Smart Grid privacy document included the following considerations:
According to Christophe Veltsos, a Midwest-based information security professional who participated in the NIST CSCTG, the team used the suite of collaborative technologies common to many enterprises in late 2009.
“Gal Shpantzer and I used Google Docs to do live edits, both of us working at the same time,” said Veltsos. “We used either a live phone line or GChat to help facilitate the conversation.” The team members, including Herold, also used email, free conference-calling websites and tweets to send quick bursts of info/updates to each other.
Cortes also said NIST involved Twitter users from the start.
UPDATE: Christophe Veltsos wrote to correct the record on the central role that DC-based information security consultant Gal Shpantzer played in organizing the CSCTG. Veltsos points out that “while Sarah was the project manager, Gal was the catalyst and is considered by NIST to be the team leader of the privacy group.”
“When forming the group, NIST staff turned to the industry professionals they most respected across the U.S.: members of Twitter’s online information technology privacy, compliance and security community,” she explained. “One by one, Gal recruited respected members of the IT professional community, met with prospective members in person at times, and sought out suggestions for additional members. All prospective members could quickly and easily be thoroughly checked out as far as qualifications, accomplishments, and references, all informally through common Twitter features. The breadth and depth of advisory group members was substantial compared to similar panels formed with more traditional methods taking far longer.”
According to Cortes, “Twitter has become the medium of choice for networking IT professionals for a few reasons, among them:
If you have thoughts and comments about either smart grid privacy or the utility of social networking for collaboration between compliance and security professionals, please leave them in the comments. Or, if you like, @reply on Twitter. You’ll find SearchCompliance.com there under @ITcompliance, as well as this author as @digiphile.
When you listen to the podcast, you’ll learn the following:
Herold is an information privacy, security and compliance consultant, and a frequent contributor to SearchCompliance.com. You can read her blog at Realtime-ITCompliance.com and follow her on Twitter at @PrivacyProf.
Herold’s recent work at SearchCompliance.com includes:
Major questions for enterprise adoption remain, however, particularly around security and compliance. When I attended “An Evening in the Clouds” at last year’s Enterprise 2.0 Conference, where Amazon.com, Google and Salesforce.com offered complete cloud computing environments, a California state CIO cited these paired challenges as major barriers to governmental adoption. The advantages of tapping the computing power and storage capacity offered in the cloud have simply not outweighed the risks of exposing sensitive data by moving it outside encrypted internal networks.
The issue of compliance and cloud computing is of critical importance for organizations large and small, public or private, nonprofit or for-profit. As Lamont Wood notes in his excellent article for Computerworld on cloud compliance, “the user organization is responsible for figuring out who is doing what to its data and requiring assurances about the data staying in compliance.”
As Wood writes, there are specific issues raised by the usual suspects: SAS 70, Payment Card Industry Data Security Standards (PCI DSS) and the Health Insurance Portability and Accountability Act. For instance, Chris Day, senior VP at Terremark Worldwide Inc., noted in the article:
“PCI responsibilities of a cloud provider include firewalls, intrusion detection, disaster recovery, physical controls and appropriate segmentation of staff duties. Servers handling PCI data should be in a separate room with solid walls and a monitored door, rather than being placed in the main floor of the data center with the other servers.”
Moving personally identifiable information (PII) that an organization has assumed responsibility for protecting to a third party also requires a degree of trust that many governmental entities have been understandably hesitant to extend. Or, in some cases, will never extend. As Scott McClellan, VP and chief technologist of scalable computing at HP, told Stacey Higginbotham during an interview for GigaOm: “Solving security is a trust issue that can be surmounted, but the legal issues around location cannot be. There are also items, such as corporate data for financial results during a quiet period, that aren’t going to leave the enterprise walls.” CIOs, CTOs and CCOs tasked with SOX compliance will likely find that reality familiar.
In the wake of comments by Vivek Kundra, the new U.S. CIO, federal adoption of cloud computing and Software as a Service (SaaS) is likely about to increase. Consider what David Linthicum writes at the Intelligent Enterprise: “It’s clear that new White House appointee Vivek Kundra is part of a ‘new generation of CIOs’ that consider cloud computing as a viable architectural option.” Just read what Kundra told The Wall Street Journal last year:
“I’m all about the cloud computing notion. … I look at my lifestyle, and I want access to information wherever I am. … I am killing projects that don’t investigate Software as a Service first.”
The nation’s first CIO is going to face some stiff criticism in both private and public enterprises. I’m not worried about Kundra’s ability to weather such critiques — he handled the federal sting on his D.C. offices with grace, after all — but there are valid questions about whether certain data sets and institutions should ever be integrated into a cloud computing model.
Perhaps prompted by Kundra’s evangelism, Google’s lobbying or pervasive cloud security concerns, the FTC met this past week to discuss securing personal data in the global economy. The discussion was timely. (Although, as of today, there was no webcast posted for the session.)
As reported by Computerworld UK, Google is getting slammed for the security of its cloud services. According to Jeremy Kirk, “the Electronic Privacy Information Centre is calling on the U.S. Federal Trade Commission to investigate whether Google is making deceptive claims over the security of data stored in cloud-computing services such as Gmail and Google Docs.”
That’s precisely the issue that Chuck Goolsbee focused on when he shared his considerable skepticism about whether cloud computing providers can meet regulatory compliance requirements like PCI DSS in “Don’t buy cloud computing hype: Business model will evaporate.” If there isn’t an effective regulatory standard, framework and guidance from the appropriate regulatory bodies on security, compliance issues will derail enterprise adoption.
FTC guidance on cloud compliance or official recognition by the PCI Security Standards Council may be forthcoming later this year. Given the visibility, adoption and interest in cloud computing in the enterprise, cloud compliance under PCI or something like it seems likely. Michael Dahn addressed the issue late last year in “Cloud computing security and PCI,” relating the controversy to earlier discussions around compliance and virtualization. He reiterates a common mantra I’ve heard this month: “Regulatory compliance and PCI are NOT technology issues, but risk management issues.”
One potential way that enterprises can adopt cloud computing and still remain compliant may be to use multiple clouds: an external cloud and an internal or private cloud. In fact, at least one Gartner analyst thinks private cloud networks are the future of corporate IT. As Higginbotham reported in her excellent article on the cloud and the enterprise at GigaOm:
“HP and companies such as Elastra, Sun Microsystems and IBM are pitching highly virtualized and automated environments that mimic the agility of the public clouds. However, all of the people at enterprise-oriented companies I’ve spoken with believe that their customers should start turning over some computing tasks to external clouds, be they infrastructure providers like those offered by Amazon, Rackspace’s new CloudServers business, Sun’s planned cloud or platforms such as Microsoft’s Azure.
Many believe enterprise customers will source their computing to multiple clouds, both to avoid vendor lock-in and because some clouds will be optimized for certain types of computing tasks. That’s why tools to manage multiple clouds will be important. RightScale, Aptana, Sun and others are all trying to help manage multiple clouds.
And once an enterprise sends out data to external clouds, they will need to find ways to manage, secure and actually deliver this data from inside the corporation to a cloud. Some providers, like Rackspace and Voxel, are banking on customers using their hosting and cloud products as a way to keep the data inside the same company (and maybe data center). [Werner Vogels, CTO of Amazon Web Services] says Amazon uses VPNs with enterprise clients, while startups such as Aspera are creating private highways to deliver data between clouds.”
In a throwback to a classic IT conundrum, this paradigm of multiple clouds and multiple cloud providers is complicated by interoperability issues. As Rich Miller writes at Data Center Knowledge, cloud interoperability is a hot topic at the moment, a result of the proposed multiple-cloud model and the diversity of infrastructure and standards presented by Google, Amazon, Microsoft and other cloud computing providers. As Craig Balding of Cloud Security told Wall Street & Technology, “Amazon and Google are walled gardens. You can’t take an app from Google and bring it over to Amazon because they’re architected differently.”
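As one illustration of how the multi-cloud management tools mentioned above paper over those architectural differences, the open source Apache Libcloud project exposes different providers through a single driver interface. The sketch below assumes a current Libcloud release and valid credentials (placeholders shown).

```python
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

# Placeholder credentials; substitute real API keys before running.
drivers = [
    get_driver(Provider.EC2)("ACCESS_KEY_ID", "SECRET_KEY"),
    get_driver(Provider.RACKSPACE)("username", "api_key", region="ord"),
]

# The same list_nodes() call works against both clouds, which is the point:
# management code does not have to be rewritten per provider.
for driver in drivers:
    for node in driver.list_nodes():
        print(driver.name, node.name, node.state)
```

Of course, such a library only abstracts the lowest common denominator of each provider’s API; provider-specific features still require provider-specific code, which is where the walled-garden problem Balding describes reasserts itself.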
In the cited article, “Adoption of Cloud Computing Hindered by Security and Operations Concerns,” Penny Crosman reported that significant compliance issues rest on the ability of cloud computing providers to meet audit requirements. As she writes, “the biggest hurdle for cloud computing on Wall Street right now — especially among large firms — is a lack of visibility into cloud providers’ operations and security.” Balding suggested that nondisclosure agreements may provide a means for cloud computing providers to share more detailed information about where data is being handled, how, by whom and how it is being protected — all crucial under many compliance regulations. Amazon Web Services released a cloud security white paper to try to address the visibility issue, but it leaves key details unanswered, such as access to log files and access for digital forensic investigations related to e-discovery.
One project to keep an eye on is Eucalyptus, an open source system that enables organizations to create private clouds that are interoperable with multiple cloud computing platforms. As Simon Wardley recently blogged at O’Reilly Radar, “Karmic Koalas love Eucalyptus.” The cryptic title refers to Mark Shuttleworth’s recent announcement that the next release of Ubuntu, 9.10, is code-named Karmic Koala. The project won’t address the thorny issues of compliance requirements, but given the ambition of Canonical (the company that sponsors and supports Ubuntu) to “provide our users with the ability to build their own clouds whilst promoting standards for the cloud computing space,” it may drive the industry toward the adoption of PCI or other frameworks.
For more commentary and perspectives on compliance in the cloud, review the following articles and blog posts: