October 30, 2009 4:01 PM
Posted by: Linda Tucci
Tags: file sharing
The Washington Post broke a story last night that should prick up the ears of information security and compliance officers. The names of more than 30 lawmakers under scrutiny by the highly secretive House ethics committee for possible ethics violations were leaked when a “low-level” staffer working from home put them on a peer-to-peer file sharing network.
The security breach brought swift action: the staffer was fired, Congressional leaders were embarrassed, and statements came flying from all parties involved. The ethics committee does not make the names public (of their colleagues, no less!) until an official investigation is announced, for the obvious reason that these secret probes could unfairly damage a lawmaker’s reputation.
The leak does not appear to be politically motivated in any obvious way. The source who tipped off the reporters is not connected to the congressional investigations, according to the story. That makes this security breach all the more alarming.
The incident should add a big jolt to the Committee on Oversight and Government Reform hearings under way on inadvertent file sharing over P2P networks. And serve as another reminder to CIOs to revisit their P2P policies. As we reported in a story in August on the P2P hearings, research shows that 73% of companies take some kind of stance on P2P, but only 18% ban it outright. Companies tend to view P2P file sharing as more of a bandwidth issue than a security risk. Think again, and check out the story for peer-to-peer file-sharing tips.
October 27, 2009 7:43 PM
Posted by: Scot Petersen
Tags: California Data Security and Privacy Law, data breach, Massachusetts Data Security and Privacy Law, SB 20
In case you missed it, California Gov. Arnold Schwarzenegger vetoed Senate Bill 20, which would have added a few more requirements to the state’s existing data breach notification law.
Sponsored by state Sen. Joe Simitian, the additions to the landmark data breach law would have required holders of personal information to reveal the type of information that was lost and details of the actual breach incident, in addition to notifying data owners of the event.
In his veto letter, Schwarzenegger called the bill “unnecessary … because there is no evidence that there is a problem with the information provided to consumers.”
In an interview with SearchCompliance.com in September, Sen. Simitian said that final negotiations had eliminated any opposition to SB 20, and said the purpose of the bill was to provide consumers with more information. “My argument was, you want to let the state know, so we can get some sense of the scope of the problem,” he said. “And also so consumers have some sense. If I communicate to you that you are one of three files that were compromised, then you are probably a little more anxious and a little more likely to take some steps to protect yourself than if you were one of 500,000.”
Reacting to the veto, Simitian said in a statement: “I’m surprised as well as disappointed by the governor’s veto. This was a common sense step to help consumers. No one likes to get the news that personal information about them has been stolen. But when it happens, people are entitled to get the information they need to decide what to do next. This bill would have made one of California’s key consumer protections even better.”
What happens next is not clear. Simitian said in the interview that if SB 20 were passed he would not foresee any additional changes, arguing that the “light touch” of the existing law was enough to keep data holders responsible and proactive, rather than mandating encryption and other technologies as Massachusetts and Nevada have done.
October 23, 2009 1:52 PM
Posted by: GuyPardon
Tags: carbon compliance, data center, Greenhouse gas, Smart Grid, United States Congress
On Monday, the White House announced a “bottom up” effort to “green government,” launching a new initiative that invites federal employees to contribute ideas for energy efficiency. The GreenGov Challenge follows up on an Executive Order that President Barack Obama signed on Oct. 5, directing federal agencies to appoint a sustainability officer and set emissions reduction targets for 2010.
Watch: Video of President Obama signing the Executive Order
In other words, so-called “carbon compliance” is now officially on the horizon for the IT staff at federal agencies. If Congress decides to move forward with regulation of greenhouse gas emissions, CIOs at businesses in the private sector will also be faced with meeting new requirements.
Asking more than 1.8 million civilian employees and armed service members for their ideas on saving energy is bound to yield a good idea or three. Larger questions around the implementation, measurement and enforcement of carbon emissions targets will be thornier and may not lend themselves to crowdsourcing.
As I wrote in today’s story, the role of sustainability software in carbon compliance is likely to be substantial. Another issue to be aware of is nascent competition in the market for electric metering in the smart grid. Google PowerMeter might run right up against the entrenched leader in smart metering software, a certain business software company located in Germany: SAP. As reported last year by SearchSAP.com, SAP is positioned for utility transformation as the smart grid develops. To be fair, Google is positioned at the consumer and small business level, while SAP is the definition of an enterprise software provider.
Given the pressure for homeowners, businesses and data center operators to become more sustainable in the years ahead, however, there’s likely to be room in the carbon compliance software market for both companies for some time to come.
October 8, 2009 9:18 PM
Posted by: GuyPardon
Tags: Identity management, National Institutes of Health, OpenID Foundation, United States
As I reported last month, the U.S. federal government will try using OpenID as a federated identity framework for .gov authentication.
“The OpenID and .gov project’s goal is to make government more transparent to citizens,” said Don Thibeau, executive director of the OpenID Foundation at the OASIS Identity Management 2009 conference, referring the audience to IDManagement.gov.
There are now more than 1 billion OpenID-enabled accounts, according to Thibeau, with more than 40,000 websites supporting the framework, including technology companies Google, Yahoo, Facebook, AOL, MySpace, Novell and Sun Microsystems.
The OpenID identity management pilot at the National Institutes of Health (NIH) will be limited to conference registration, wiki authorization and library access, which require only Level of Assurance (LOA) 1 authentication.
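For readers unfamiliar with how such a login works under the hood, the redirect a relying party constructs can be sketched as follows. This is a minimal illustration of an OpenID 2.0 checkid_setup request; the endpoint and return URLs are invented placeholders, not NIH’s actual configuration.

```python
from urllib.parse import urlencode

# Hypothetical endpoints -- placeholders, not NIH's real configuration.
OP_ENDPOINT = "https://openid.example-provider.gov/auth"
RETURN_TO = "https://conference.example.nih.gov/openid/return"

def build_checkid_request(claimed_id: str) -> str:
    """Build the OpenID 2.0 checkid_setup redirect URL that a relying
    party sends the user's browser to, per the OpenID Authentication
    2.0 specification."""
    params = {
        "openid.ns": "http://specs.openid.net/auth/2.0",
        "openid.mode": "checkid_setup",
        "openid.claimed_id": claimed_id,
        "openid.identity": claimed_id,
        "openid.return_to": RETURN_TO,
        "openid.realm": "https://conference.example.nih.gov/",
    }
    return OP_ENDPOINT + "?" + urlencode(params)

url = build_checkid_request("https://example-user.example-provider.gov/")
```

The provider authenticates the user and redirects back to the return_to address with signed fields that the relying party must then verify; that verification step is omitted in this sketch.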
Debbie Bucci, the integration services center program lead at the Center for Information Technology at NIH, talked about the success of existing identity management frameworks for authentication at the institute.
Bucci is cautious about implementing OpenID but sees utility in federated identity, given the success of InCommon, an identity framework at NIH. She expressed support for the “idea that you could take the same username and password and spread it around the business units.”
According to Bucci, NIH’s systems have more than 35,000 users, 250 service-level agreements and handle over 1 million transactions every day, 83% of which are external. Current user participation for InCommon is 21%, focused on higher education and research. The NIH’s electronic research administration supports more than 9,500 institutions and agencies, according to Bucci. By contrast, InCommon includes 165. More information about these identity management programs can be found at Federatedidentity.nih.gov.
According to Peter Alterman, senior advisor for strategic initiatives at NIH, the institute is continuing to work toward implementation of the Electronic Signatures in Global & National Commerce Act, also known as E-SIGN.
According to Thibeau, the core design principle for the trust framework is “openness,” meaning it will be open to all identity providers, qualified auditors, provider certification and evolution. He says that both the OpenID Foundation and the Information Card Foundation are working to collaborate with Harvard University’s Berkman Center and the Center for Democracy and Technology (CDT) to further expand the open trust framework.
That latter relationship may be important, as the CDT’s Ari Schwartz said that “at Level 3, we have a lot of concerns. If you don’t have limitations there, there will be a drive to ask for as much information as you can get.” Many high-priority citizen-to-government transactions are classified as LOA 3 or higher, including IRS tax filing, Social Security and Medicare. Given that limitation, there may be some roadblocks to address before government agencies that must address compliance under the Privacy Act implement this federated identity management framework.
Questioned about time frames and implementation metrics, Thibeau said in an email interview to “remember the effort under way is a pilot; a very deliberate beta test of new technology protocols, new integration and interoperability tasks. We don’t know when we will finish, but we do know we will make mistakes and wrestle with usability and security issues.
“Given all the players involved, it’s hard to say what will be completed and when. The most valuable new piece is how many people and many organizations are coalescing around a practical and far-reaching solution set for the challenges of identity from a user perspective. This goes beyond the tired truisms that often characterize privacy versus security debates. There is a real hunger for real solutions in identity authentication. Whether you frame it as open government, open source or open identity, there are powerful political, public and commercial drivers at work involving identity on the Web. The legal and policy discussions around open identity trust frameworks are a leading-edge indication that practical solutions are in play and pragmatic (private and public sector) organizations are involved.”
Thibeau was clear about the stage that the pilot is currently in. “We are at the beginning of a shakedown cruise on two tracks,” he said, referring to both the open source identity technologies and the open trust framework itself. “Both are parts of the GSA ICAM schema and both are on the agenda of the OpenID Foundation and Identity (IDF and ICF) boards to consider. They still have a review of and decision making around certification requirements, operations and strategy. As we begin technical testing of government pilots, we are also finalizing the certification of a trust framework process that is a critical element in government adoption and seen by some industry leaders as applicable for high value commercial applications.”
Thibeau went on to explain that “the U.S. government is still finalizing requirements for credible, independent and industry standards-based identity certification.” The process holds interest beyond the borders of the U.S. as well, according to Thibeau. “Many international governments as well as U.S. state and local governments are studying the U.S. ICAM test of its ‘schema’ of technology protocols combined with industry self certification models. Identity provider certification of Open Trust Framework models has gained momentum after recent meetings with the Center for Democracy and Technology and feedback from various government agencies, including the GSA ICAM leadership, NIST, NIH and the national security staff in the White House.”
John Bradley, the chief security officer at ooTao Inc., serves on the OASIS XRI, XDI and ORMS Technical Committees and fielded questions about the details of the OpenID pilot at NIH. For more information, Bradley’s blog includes many useful links on the OpenID in government project.
October 7, 2009 3:54 PM
Posted by: GuyPardon
Tags: Identity management, Public key infrastructure, Software as a service
Last week at the OASIS Identity Management Conference, Gregg “Skip” Bailey, director of technology integration for the federal practice at Deloitte, suggested that agencies looking to leverage the power and scale of cloud computing should use the Identity, Credential and Access Management Subcommittee’s (ICAM) framework.
Bailey says embracing that framework may solve some of the federated public key infrastructure (PKI) management challenges involved in securing personally identifiable information (PII). Bailey said that a useful resource, a cloud standards wiki for proposed and ratified cloud computing standards, is available at the (aptly entitled) Cloud-Standards.org.
Bailey, a former CIO at the Bureau of Alcohol, Tobacco and Firearms, defined the role of a CIO simply in this context to conference attendees: “reduce the cost of commodity technologies and increase innovation in applying those technologies to mission goals.”
“Private clouds are the predominant focus in large enterprises,” said Bailey. “Single-purpose SaaS offerings are most widely adopted.” In his assessment, cloud computing “probably provides the ability to apply to both areas,” and “enterprise flexibility and time to value are significant drivers.”
October 2, 2009 7:21 PM
Posted by: GuyPardon
Tags: Google Docs, identity theft, National Institute of Standards and Technology, Personally identifiable information, Smart Grid, smart grid privacy
Last month, the National Institute of Standards and Technology (NIST) outlined a framework for building more intelligence and interoperability into the electrical system of the United States. Such a system is generally known as the “smart grid.” Commerce Secretary Gary Locke released a plan for smart grid interoperability that’s meant to lead to a “secure, more efficient and environmentally friendly” system. A draft of the report from NIST is available for download as a PDF: “NIST Framework and Roadmap for Smart Grid Interoperability Standards Release 1.0.”
Building more intelligence and efficiency into the network, however, has relevance to more than energy policy. As a working group of information security professionals determined over the course of the summer, there are significant smart grid privacy concerns to consider.
These considerations can be neatly summarized in the following excerpt from the NIST report: “The major benefit provided by the Smart Grid, i.e. the ability to get richer data to and from customer meters and other electric devices, is also its Achilles’ heel from a privacy viewpoint. Privacy advocates have raised serious concerns about the type and amount of billing and usage information flowing through the various entities of the Smart Grid … that could provide a detailed time-line of activities occurring inside the home.”
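To make that concern concrete, here is a toy illustration, using invented readings rather than real meter data, of how even half-hourly interval data sketches a household’s daily routine:

```python
# Toy half-hourly meter readings in kWh -- invented values, not real data.
readings = {
    "02:00": 0.2, "02:30": 0.2,   # overnight baseline: fridge, standby loads
    "07:00": 1.8, "07:30": 2.1,   # morning spike: kettle, shower, toaster
    "12:00": 0.2, "12:30": 0.2,   # midday baseline: nobody home?
    "18:30": 3.4, "19:00": 2.9,   # evening spike: cooking, TV, laundry
}

BASELINE = 0.3  # kWh; readings above this suggest active occupants

occupied = [t for t, kwh in readings.items() if kwh > BASELINE]
print("Likely at home during:", occupied)
# From usage alone, an observer can infer wake-up time, an empty house
# at midday and the evening routine -- exactly the "detailed time-line
# of activities" the NIST excerpt warns about.
```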
As privacy expert Rebecca Herold explains on her blog, smart grid privacy needs to be considered as utilities move to a next-generation infrastructure. Those implications were concisely listed by Herold as follows:
- Identity theft.
- Determining personal behavior patterns.
- Determining specific appliances used.
- Performing real-time surveillance.
- Revealing activities through residual data.
- Targeted home invasions.
- Providing accidental invasions.
- Activity censorship.
- Decisions and actions based upon inaccurate data.
- Revealing activities when used with data from other utilities.
Sarah Cortes, a contributor for SearchCompliance.com, was the project manager for the Privacy Sub-group of the NIST’s Cyber Security Coordination Task Group.
Key points in the current release of the smart grid privacy document include the following issues, according to Cortes:
- Enforcement of state privacy-related laws is often delegated to agencies other than public utility commissions.
- State utility commissions currently lack formal privacy policies or standards related to the smart grid.
- The lack of consistent and comprehensive privacy policies throughout the entities that will be involved with the smart grid creates a privacy risk.
- Comprehensive and consistent definitions of personally identifiable information do not typically exist.
The body of the privacy group’s work may be found in this draft: NISTIR 7628 Smart Grid Cyber Security Strategy and Requirements (PDF).
Social networking and distributed collaboration sped up report writing for infosec team
One aspect of the report’s generation is worth recognizing: the role that the various collaborative technologies and social networking platforms played in gathering, synthesizing and producing the final deliverable for NIST. As Cortes explained in an email, preparing the current release of the Smart Grid privacy document included the following considerations:
- Ensuring adequate input from each of the 50 state NARUC energy commissions and other sources in a very short time frame.
- Aligning recommendations with the plethora of existing laws.
- Documenting concrete privacy risks.
- Separating privacy risks from security and other risks.
According to Christophe Veltsos, a Midwest-based information security professional who participated in the NIST CSCTG, the team used the suite of collaborative technologies common to many enterprises in late 2009.
“Gal Shpantzer and I used Google Docs to do live edits, both of us working at the same time,” said Veltsos. “We used either a live phone line or GChat to help facilitate the conversation.” The team members, including Herold, also used email, free conference-calling websites and tweets to send quick bursts of info/updates to each other.
Cortes also said NIST involved Twitter users from the start.
UPDATE: Christophe Veltsos wrote to correct the record on the central role that DC-based information security consultant Gal Shpantzer played in organizing the CSCTG. Veltsos points out that “while Sarah was the project manager, Gal was the catalyst and is considered by NIST to be the team leader of the privacy group.”
“When forming the group, NIST staff turned to the industry professionals they most respected across the U.S.: members of Twitter’s online information technology privacy, compliance and security community,” she explained. “One by one, Gal recruited respected members of the IT professional community, met with prospective members in person at times, and sought out suggestions for additional members. All prospective members could quickly and easily be thoroughly checked out as far as qualifications, accomplishments, and references, all informally through common Twitter features. The breadth and depth of advisory group members was substantial compared to similar panels formed with more traditional methods taking far longer.”
According to Cortes, “Twitter has become the medium of choice for networking IT professionals for a few reasons, among them:
- If you’re in IT and you’re not comfortable with Twitter, you are lacking a basic technical skill.
- Twitter enables members of the IT community to check out each other’s static Web pages and credentials, but then get to know members of their own industry over time through their communications streams. How professional and informative is this person, over a period of time? How respected are they by other well-respected professionals, apparent through the interlocking web of followers? How many others respect this person, apparent from absolute numbers of followers, quality of followers, and mentions by others?
- Twitter communication allows personality to come through and thus enables people to feel comfortable with each other much more quickly than other mediums.
- It allows for a combination of private and public messages, allowing swift reaction to breaking industry developments.
- It allows professionals to get a quick response to a technical question.
- It enables professionals to know at a glance whether they are up to date on developments on our field or out to lunch, a constant problem in this field. What are other respected IT professionals talking about each day? What are they not talking about?”
If you have thoughts and comments about either smart grid privacy or the utility of social networking for collaboration between compliance and security professionals, please leave them in the comments. Or, if you like, @reply on Twitter. You’ll find SearchCompliance.com there under @ITcompliance, as well as this author as @digiphile.
September 28, 2009 9:23 PM
Posted by: GuyPardon
Tags: Health care, Health Insurance Portability and Accountability Act, Information security, IT compliance, Payment card industry, PCI DSS, Wired Equivalent Privacy
The diversity of stakeholders involved in IT compliance is reflected in the many compliance resources that are published each month across the TechTarget network of IT media. For instance, this month’s Storage Decisions Conference explored how storage managers must explain retention, email archiving and compliance.
At SearchOracle.com, there’s news about how Oracle updated Agile PLM for food and beverage compliance, allowing manufacturers to better analyze ingredients for safety.
At SearchFinancialSecurity.com, a new story explores full disk encryption, which is fast becoming a priority for laptop security in midmarket companies given increasing fears of data breaches. The article explains how to choose full disk encryption for laptop security, compliance.
Earlier this year, SearchNetworking.com ran “New PCI compliance rules ban WEP, tighten wireless LAN security.”
PCI DSS compliance
Since security and compliance are bound closely together, it should come as no surprise that SearchSecurity.com features new compliance resources regularly. That’s particularly true when it comes to PCI compliance.
Last week, site editor Rob Westervelt wrote “PCI virtualization SIG closer to proposing changes to standard.” Westervelt writes that the PCI Virtualization Special Interest Group, which has been studying virtualization for the payment card industry (PCI), is close to issuing guidance on ways to maintain PCI DSS compliance when using virtualization.
For more on PCI, editorial director Kelley Damore’s feature about what PCI compliance really means in September’s issue of Information Security magazine has a plethora of useful links.
Elsewhere on SearchSecurity.com, Eric Holmquist offered guidance on strategies for using technology to enable automated compliance.
Given that schools are back in session, IT admins entrusted with securing the records of students may find security expert David Mortman’s explanation for how to prepare for a FERPA audit useful.
Mortman also provides useful advice on a PCI DSS requirement for monitoring and testing security; PCI DSS compliance: ensuring data integrity; and understanding PCI DSS compliance requirements for log management.
And “across the pond,” SearchSecurity.co.UK wrote about new products that aim to streamline compliance efforts.
SearchSecurity.com also publishes compliance resources that serve the fast-moving healthcare field, including stories like “FTC extends breach notification to Web-based health repositories” and “HIPAA compliance manual: Training, audit and requirement checklist.”
Again, Mortman provides expert advice in this area, including guidelines to create a HIPAA-compliant data center, HHS HIPAA guidance on encryption requirements and data destruction and information on writing a patient identifier policy to prevent common HIPAA violations.
We’ve been covering healthcare at SearchCompliance.com as well, along with our sister site, SearchCIO.com, where senior writer Linda Tucci recently wrote that health care security and HIPAA compliance are on deck for CIOs.
We published “HITECH changes the game, but HIT standards still on way” this morning, in fact, following on our FAQ on the HITECH Act’s impact on IT operations and a tip about when a data breach under HITECH is really ‘discovered.’
Here’s hoping you find these compliance resources useful in your own efforts. If you have other websites you regularly visit to find compliance resources to help you meet regulatory mandates, please let us know in the comments.
September 21, 2009 2:29 PM
Posted by: GuyPardon
Tags: Federal Communications Commission, Internet access, Julius Genachowski, Net neutrality
In a speech delivered to a packed briefing room at The Brookings Institution in Washington, D.C., Federal Communications Commission Chairman Julius Genachowski proposed two new principles for Net neutrality: nondiscrimination and transparency.
“The Internet is an extraordinary platform for innovation, job creation, investment and opportunity. It has unleashed the potential of entrepreneurs and enabled the launch and growth of small businesses across America,” said Genachowski. “It is vital that we safeguard the free and open Internet.”
The speech, “Preserving a Free and Open Internet: A Platform for Innovation, Opportunity, and Prosperity,” outlined the reasons for supporting a more aggressive approach to regulating broadband providers. Video of the speech is available on YouTube: http://www.youtube.com/v/dF7rbOj-HPA
“Network operators cannot prevent users from accessing the lawful Internet content, applications and services of their choice, nor can they prohibit users from attaching nonharmful devices to the network,” said Genachowski, reaffirming the four open Internet principles outlined by his predecessor, Michael Powell.
In today’s speech, Genachowski added two new principles for consideration. According to the prepared release provided by the FCC:
“The first would prevent Internet access providers from discriminating against particular Internet content or applications, while allowing for reasonable network management.
“The second principle would ensure that Internet access providers are transparent about the network management practices they implement.”
The chairman said that he will now seek to begin the process of codifying the commission’s existing four open Internet principles, along with the two additional principles, through a “Notice of Proposed Rulemaking” at an upcoming October meeting.
The FCC is now soliciting further input and feedback on the proposed rules and application of these rules, including how to determine “whether network management practices are reasonable, what information broadband providers should disclose about their network management practices and how the rules apply to differing platforms, including mobile Internet access services.”
“I look forward to working with my Commission colleagues on this important initiative,” Genachowski said. “Commissioners Copps, McDowell, Clyburn and Baker each bring a unique and important perspective to the complex issues at stake, and I look forward to getting their input and insight when we kick off the rulemaking process next month.”
As part of Genachowski’s commitment to openness and transparency, the FCC launched a new website, OpenInternet.gov, to encourage public participation in the process.
September 17, 2009 9:02 PM
Posted by: GuyPardon
By this point in 2009, most Internet users know about social networking platforms like Facebook. Business users have seen many attempts to bring social networking and other Web 2.0 features into the enterprise, all of which are generally classified under the enterprise 2.0 label for social software.
This past weekend, I saw a preview of 3121, a social networking platform for Congressional staffers that can safely be termed “government 2.0,” applying the same technologies to link up the staffers roaming the halls of the House and Senate. For those unfamiliar, 3121 is the extension for the Capitol Hill switchboard.
As Andrew Nusca blogged yesterday, “Social media, digital directory collide on Capitol Hill with ‘3121’”. The launch of 3121, however, will now bring the same issues to government that exist in applying social software to businesses, given the compliance concerns that dog enterprise 2.0 collaboration platforms.
Chris Contakes, the chief technology officer (CTO) at National Journal Group, publisher of the National Journal, addressed a number of these concerns in an email interview yesterday, as 3121 launched. His interview follows below.
What is the technological backbone for 3121? Is it just Jive, or is there more baked in?
The technological backbone is a combination of commercial software (Jive) and custom software developed by National Journal Group staff. Jive Software has developed software for numerous industries, including the intelligence community, where their solution was implemented for the building of A-Space.
What regulations or laws did you consider in building a commercial product for this particular audience?
Because 3121 is part of NationalJournal.com, we considered all the same laws and regulations we do for all of our editorial offerings that service the House, Senate and beyond. Additionally, we worked closely with the appropriate parties in the House and the Senate to address any potential product, security and ethics concerns in developing 3121.
How do they pertain to the terms of service you will pose to users?
What tracking technologies are part of 3121?
Tracking tools are limited to Google Analytics for Web analytics. This information will be used to better understand how the community is using the application.
Will data from 3121 ever be offered or sold to third parties?
How is privacy managed? Is encryption used? If so, where? Can administrators see all updates into accounts? Is the system configured to be audited or easily dumped for e-discovery?
Privacy is managed via many of the built-in privacy mechanisms that Jive offers. Encryption is used and all transactions and data are protected by SSL. Community managers have access to updates but have no reason or need to monitor. Access to this data is on a strictly need-to-know/need-to-see basis for system or community troubleshooting purposes only.
What safeguards are there for the input of personally identifiable information?
Accounts are tied to valid Hill email addresses only. For example, a forgotten password can only be retrieved by the holder of a valid Hill email address. Also, all data is encrypted over SSL.
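As a rough sketch of that kind of email-gated reset, the domain check and token issuance might look like the following. The domain allow-list, function names and token format here are hypothetical illustrations, not 3121’s actual implementation.

```python
import hashlib
import hmac
import secrets

# Hypothetical allow-list -- the actual set of valid Hill domains is
# not specified in the interview.
HILL_DOMAINS = ("house.gov", "senate.gov")

SERVER_KEY = secrets.token_bytes(32)  # kept server-side only

def is_hill_address(email: str) -> bool:
    """Accept only addresses on an allowed Hill domain or subdomain."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain in HILL_DOMAINS or any(
        domain.endswith("." + d) for d in HILL_DOMAINS)

def issue_reset_token(email: str) -> str:
    """Mint a reset token only for a valid Hill address. The token is
    an HMAC over the address and a fresh nonce, so the server can
    verify it later without storing the token itself."""
    if not is_hill_address(email):
        raise ValueError("password resets are limited to Hill addresses")
    nonce = secrets.token_hex(16)
    mac = hmac.new(SERVER_KEY, f"{email}:{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{nonce}.{mac}"
```

A production system would also bind the token to an expiry time and deliver it only to the mailbox on record, which is the property that keeps resets in the hands of the account holder.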
Should legislation be discussed in the context of this network (a likelihood, given the people who will be using it), how will that fit in with the overall themes of openness and transparency set out by the new administration and its CIO and CTO?
3121 is about giving members of Congress and their staff the tools they need to be efficient and effective in their jobs, borrowing from the best practices of cloud collaboration, another theme important to the new administration and its CIO and CTO.
I appreciate the time that 3121’s CTO spent answering my questions. Whether the provisions that have been made in designing the product are sufficient to prevent security issues or compliance headaches on Capitol Hill remains to be seen. The briefing I attended indicated that feeds from external Web services can be customized on Web pages. Given the acknowledged security risks of social networking platforms, there could be potential for malware or social engineering attacks, especially if shortened URLs enter the system. As Contakes pointed out, however, Jive has been responsible for a number of government enterprise 2.0 platforms, including A-Space, so these issues won’t be novel.
The larger issue here may rise above simple compliance or security concerns, however, to the existence of the platform itself. Staffers will have to decide whether the benefits of collaboration in this context make more sense than the development of internal wikis, blogs or other alternatives, none of which are currently in much use around Congressional offices. There’s also a more philosophical question to consider: Should a commercial provider of news be the owner of a Congressional social network? The National Journal will have an opportunity to gain insight into both the legislative process and the information-sharing habits of staffers, although Contakes asserts that behavioral data will never be shared with a third party. The success of the platform may hinge on the National Journal sticking to its own policy — and on its administrators avoiding the urge to read the posts or messages designated “secret” by staffers.
On the Hill, after all, information represents power. That may turn out to be truer than ever as Congressional hallways become virtual.