November 17, 2009 10:55 PM
Posted by: GuyPardon
Tags: Critical infrastructure, Federal Information Security Management Act of 2002, Government agency, Ponemon Institute, United States, United States Department of Health and Human Services
A new study of top government IT executives conducted by the Ponemon Institute identified outsourcing, cyberterrorism and an increasingly mobile workforce as significant threats to data, government systems and the nation’s critical infrastructure.
IT executives from the Departments of Defense, Justice, Homeland Security and Health and Human Services represented the largest proportion of respondents to the study, which was sponsored by CA Inc.
The study found that 63 percent of respondents perceived the increasingly mobile workforce “as contributing significantly to endpoint security risks as a result of insecure mobile data-bearing devices that are susceptible to malware infections as well as insecure wireless connectivity.”
Perhaps reflecting the current zeitgeist around the “Government 2.0” movement and compliance concerns around enterprise 2.0 tools, the study showed that 79% of respondents see increased use of collaboration tools as a significant risk to data protection.
Specifically, the use of social computing platforms is increasing the storage of unstructured data that could contain sensitive information in a repository that is not effectively secured. Fifty-two percent of respondents identified the use of Web 2.0 applications as a vector for increased risk for sensitive data loss, including social networking, social messaging and wikis.
Unstructured data and outsourcing were viewed by respondents as the top two root causes of increased cybersecurity risk to sensitive and confidential information. This concern is reflected at the Department of Homeland Security, where application security has been referenced as both a supply chain risk and a cyberterrorism threat.
As reported by the study, 38% of respondents were unsure whether there had been cybercrime on their networks in the past year. Perhaps more significant is that only 2% to 5% knew it had happened, and even that figure may not reflect the true total.
“I do feel the numbers are underreported,” said David Hansen, CA’s corporate vice president and general manager of the company’s security management unit. “In the past, cybercrime incidents have tended to be brushed under the carpet. More pressure on disclosure has forced some changes to happen and is helpful for awareness.”
Data breaches, by way of contrast, must be published or reported, and 34% of respondents said that their agency had experienced two to five data breaches in the past year. Overall, 75% of respondents said that their agency had experienced a data breach in the last year. Respondents overwhelmingly chose wireless networks as the primary threat vector, followed by endpoints and networks.
Finally, 48% of respondents said their organization isn’t taking appropriate steps to comply with the Federal Information Security Management Act (FISMA), and 55% said they don’t have adequate security technologies to protect information assets and critical infrastructure.
“When I talk to government agencies, they look at FISMA compliance as a necessary evil,” said Hansen. “I think they might have to either redefine it to address new threats and create a lower common denominator or push for accountability.”
The question now, as bills like the ICE Act or the Cybersecurity Act work their way through Congress, is whether FISMA reform will adequately address the vulnerabilities that government IT executives are worried about.
“The problem is that, in many cases, government doesn’t have a lot of control of a lot of critical infrastructure, like manufacturing, power plants or private networks,” said Hansen. “Part of cybersecurity is about critical infrastructure and things that are not covered by FISMA. Most of those systems have no viruses or malware protection. That hasn’t been an issue because those systems weren’t connected to the Internet. Now, systems are being connected and are creating massive exposures that just weren’t there before.”
The Ponemon Institute’s “Cybersecurity Mega Trends” study is available for download from CA.com as a PDF.
November 9, 2009 10:10 PM
Posted by: GuyPardon
Tags: cybersecurity threats, ICE Act, Melissa Hathaway, United States Central Command, United States Department of Defense, White House
Yesterday, CBS News’ 60 Minutes devoted its opening story to cybersecurity threats to critical infrastructure in the United States, including the power grid, financial systems and military information systems. Threatpost, the information security blog associated with Kaspersky Labs, has embedded the 60 Minutes segment on cyberterrorism.
In an interview with correspondent Steve Kroft, cybersecurity expert Jim Lewis calls a federal data breach in 2007 “our electronic Pearl Harbor.” In the transcript of the segment, available at CBSNews.com, Lewis said: “Some unknown foreign power, and honestly, we don’t know who it is, broke into the Department of Defense, to the Department of State, the Department of Commerce, probably the Department of Energy, probably NASA. They broke into all of the high-tech agencies, all of the military agencies, and downloaded terabytes of information.”
Lewis also spoke about the penetration of U.S. military networks, specifically the United States Central Command (CENTCOM). Lewis believes the data breach was accomplished by foreign spies leaving corrupted thumb drives in locations where U.S. military personnel would be likely to pick them up. When a drive was inserted into a CENTCOM computer, a malicious application on the drive opened a back door for hackers to access the system. According to Lewis, the Pentagon has now banned thumb drives. (David Mortman offered advice last year about whether enterprises should also ban USB drives.)
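The attack Lewis describes relied on removable media automatically launching code when plugged in. As a rough illustration of one mitigation short of an outright ban, the Python sketch below checks a newly mounted volume for an autorun.inf that names an executable to launch; the function, file layout and policy here are illustrative assumptions, not part of any product or program mentioned in this story.

```python
# Hedged sketch: inspect a mounted volume for an autorun.inf that would
# launch an executable (the Windows-era autorun mechanism such attacks
# abused). Mount points and policy are illustrative only.
import configparser
from pathlib import Path

def autorun_target(mount_point):
    """Return the executable an autorun.inf on the volume names, or None."""
    inf = Path(mount_point) / "autorun.inf"
    if not inf.exists():
        return None
    cfg = configparser.ConfigParser()
    cfg.read(inf)
    if cfg.has_section("autorun"):
        # "open" and "shellexecute" are the keys Windows honored for autorun
        for key in ("open", "shellexecute"):
            if cfg.has_option("autorun", key):
                return cfg.get("autorun", key)
    return None
```

A hypothetical endpoint agent could refuse to mount, or quarantine, any volume for which `autorun_target()` returns a value instead of `None`.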
60 Minutes has also posted several short video interviews online that offer more time with Lewis, including “Hacking the ATMs,” “Hacking the DOD” and “The Holy Grail,” where Lewis talks about the security of the financial system. In “Online Jihad,” Shawn Henry, assistant director of the FBI’s Cyber Division, discusses potential cybersecurity threats from Islamic fundamentalism.
The report from 60 Minutes coincides with our own coverage. Growing cybersecurity threats to critical infrastructure and the electric grid have put a new focus on NERC regulations, as well as FISMA, warned NERC’s chief security officer, Michael Assante. Melissa Hathaway, former acting senior director for cyberspace for the National Security and Homeland Security councils, also spoke of the need for better public-private cooperation at the same cybersecurity panel in Washington that Assante spoke at last month. And Lewis says that new rules for cyberwar are being defined as the risks grow.
IT security pros and analysts alike know that intrusions, breaches and a growing cybersecurity threat aren’t anything new. Dave Lewis, a veteran security practitioner and blogger, commented that “the overwhelming FUD was troublesome.” Dan Kennedy, CISO at the Praetorian Group, wished that “the FBI would knock off the cloak-and-dagger routine when they’re asked a follow up question.”
Regardless of where you stand on the 60 Minutes report, one fact remains clear: The White House still hasn’t appointed a cybersecurity coordinator.
As Marc Ambinder observed at TheAtlantic.com, “last night’s 60 Minutes feature on cybersecurity may add a sense of political urgency to the debate” about a cybersecurity coordinator.
Shane Harris, writing about the broadcast of the segment, also put the 60 Minutes report in perspective: “Although the piece didn’t make much news, it was news to most Americans. Full disclosure, I know the producer, Graham Messick, and while I don’t have any special insights into how he approached the subject, I think it’s fair to say that his work will change the cyber security debate in some fundamental ways.”
Harris wonders if the report could have an effect on legislation and subsequent regulatory compliance, like FISMA reform associated with further iterations of the ICE Act. “There are a number of bills pending in Congress that threaten to set requirements on companies to disclose the holes in their networks,” he wrote. “Those bills just got a major push last night. All in all, while 60 Minutes didn’t exactly blow the lid off anything last night, they have elevated the attention of this issue to new heights. That alters the political dynamics significantly.”
UPDATE: Wired Magazine has reported that the blackouts in Brazil in 2007 were “actually the result of a utility company’s negligent maintenance of high-voltage insulators on two transmission lines,” not computer hackers. 60 Minutes relied upon “unnamed sources” in claiming that the two-day outage described by Kroft in the Atlantic coastal state of Espirito Santo “was triggered by hackers targeting a utility company’s control systems.”
Now, Wired reports the following:
The utility company involved, Furnas Centrais Elétricas, told Threat Level on Monday that it “has no knowledge of hackers acting in Furnas’ power transmission system.”
Brazilian government officials disputed the report over the weekend, and Raphael Mandarino Jr., director of the Homeland Security Information and Communication Directorate, told the newspaper Folha de S. Paulo that he’s investigated the claims and found no evidence of hacker attacks, adding that Brazil’s electric control systems are not directly connected to the internet.
November 6, 2009 10:10 PM
Posted by: GuyPardon
Tags: American Recovery and Reinvestment Act of 2009, Federal government of the United States, U.S. CIO Vivek Kundra
U.S. CIO Vivek Kundra, appearing Friday as the keynote speaker at the University of Maryland’s CIO Forum, touched on a number of topics affecting both public- and private-sector CIOs. Some of his comments follow:
“We found that the role of CIOs in the federal government is very much focused on data centers, networking and technology, not on how we can transform the function of the public sector itself.” He explained that he wants to “leverage tech to fundamentally change the way the public sector operates.” Now, as the federal government works to account for every dollar of the $787 billion in spending from the American Recovery and Reinvestment Act of 2009 and publishes more data from its agencies, Kundra said, “we’re shifting away from democratizing data to thinking about how public policy can be powered by that information.”
Cloud computing, SOA and agile development
In tracing the path of technology from agrarian to industrial to the current information revolution, Kundra noted the transformative effect of both cell phones and social networking platforms like Facebook, YouTube and Twitter. “We’re seeing the impact that Twitter has on the geopolitical climate of the world,” he said. “Information is far more liquid than it has been in the history of civilization.” The disruptive effects of the online revolution in user-generated content are steadily filtering into government. The “Darwinian pressures” exerted upon real estate, consumer products and the automotive industry haven’t hit government yet, Kundra observed. “It’s easy to go online and compare consumer products, but it’s very difficult, if not impossible, to get information to make intelligent decisions.” In launching the Apps for Democracy contest, in fact, Kundra found a way to introduce an element of competition and innovation into a government IT ecosystem that was underserved in both areas.
Kundra has been a proponent of cloud computing for years, going back to his position as the CTO of the District of Columbia, where he signed a contract with Google for business services. Today, he emphasized the need for security, interoperability and data portability in federal government use of cloud computing. “As we make the shift towards cloud computing, security threats need to be addressed. Solutions cannot be bolted on afterwards. Data portability is central, so that as we move from Vendor A to Vendor B we architect this with interoperability and standards so that we don’t spend billions later.”
Questioned on whether service-oriented architecture still is an emphasis in a federal cloud computing paradigm, Kundra said SOA “absolutely” still matters. “Look at the Social Security Administration and what it’s done with SOA and local government,” he said. “They can build lightweight applications to interact with databases elsewhere.” That embrace of modern development practices extends beyond just SOA or upgrading programmers’ skills from COBOL. “How do we move towards an agile procurement or agile development methodology?” asked Kundra.
In some areas, the government is moving to make systems more interoperable. Kundra pointed to what’s happening between the IRS and the Department of Education in student aid. “Before, if you wanted to apply and get aid, you had to fill out a FAFSA,” said Kundra. “That form is more complex than a 1040.” Starting in January, there will be a brand new online way to fill out the Free Application for Federal Student Aid, according to Kundra, which will eliminate 70 questions and 20 Web screens. “Students will be able to get IRS data and autopopulate it in the form for student aid.”
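The prefill Kundra describes is, at bottom, a mapping of upstream record fields onto form questions. The Python sketch below illustrates the pattern; the field names and record values are invented for illustration and do not reflect any real IRS or FAFSA schema.

```python
# Hypothetical sketch of the prefill pattern: answer form questions from
# an upstream record where a mapping exists, and ask only what remains.
IRS_RECORD = {
    "agi": 24000,              # adjusted gross income (invented value)
    "filing_status": "single",
    "tax_paid": 1800,
}

# Form questions, keyed by which upstream field (if any) can answer them.
FAFSA_QUESTIONS = {
    "student_agi": "agi",
    "student_filing_status": "filing_status",
    "student_tax_paid": "tax_paid",
    "school_code": None,       # no upstream source; the student must answer
}

def prefill(questions, record):
    """Return (answers, remaining): prefilled answers and questions left to ask."""
    answers = {q: record[src] for q, src in questions.items()
               if src is not None and src in record}
    remaining = [q for q in questions if q not in answers]
    return answers, remaining

answers, remaining = prefill(FAFSA_QUESTIONS, IRS_RECORD)
print(answers)    # three questions answered from the upstream record
print(remaining)  # only the question with no upstream source survives
```

Eliminating "70 questions and 20 Web screens" amounts to growing the mapped portion of the question set so `remaining` shrinks.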
Government 2.0 and data-driven policy
As he grows into the U.S. CIO role, Kundra has continued to contrast where government IT spending and management has been with where he’d like it to go. IT systems were “not invested where they should be, which is at the intersection of the American people and government,” he said. As he put it, it’s a “simple change in default setting” from secretive, opaque and closed to transparent, open and participatory.
The old mode involved the management of $70 billion of federal IT investments through a “closed, opaque, checklist-driven process,” Kundra said. Now USAspending.gov, the federal IT dashboard, tracks spending. The website has received more than 56 million hits since launch, according to Kundra. In the old way of thinking, there was a “presumption that the government has a monopoly on the best ideas,” said Kundra. Now, Data.gov provides machine-readable data for developers to mash up. Historically, there’s been a “complex, time-consuming, paper-based acquisition process,” said Kundra. Now, there’s Apps.gov.
Cybersecurity and FISMA reform
Kundra sees the same transition toward more flexible systems in cybersecurity. “We’re moving from a manual, reporting-based, compliance-focused approach to a real-time measurement of actual cybersecurity,” said Kundra, referring to the new “Cyberscope” system for online reporting of cybersecurity threats that launched in October. “You cannot address real-time threats with a solution that’s focused on reporting requirements on a quarterly basis.”
November 2, 2009 9:30 PM
Posted by: GuyPardon
Tags: cybersecurity threats, Federal Emergency Management Agency, identity theft, Melissa Hathaway, National security, United States, White House
Melissa Hathaway, former acting senior director for cyberspace for the National Security and Homeland Security councils, spoke of the need for better public-private cooperation at a cybersecurity panel in Washington last week.
Hathaway was part of a panel at the International Spy Museum in Washington, D.C., held to draw attention to the growing dangers online as National Cybersecurity Month drew to a close.
“Thank god for Akamai, who redirected a lot of the bandwidth and kept the Department of Transportation and NYSE up and running,” she said, referring to the DDoS attacks on the U.S. government earlier this year. Hathaway highlighted the importance of moving forward on enacting the 25 recommendations included in the cybersecurity report she delivered to the White House.
Her remarks followed the same theme as the speech on cybersecurity threats she delivered to the ArcSight Conference earlier this month.
Hathaway was proud of the attention that the Obama administration has paid to the issue, observing that when President Obama spoke, it was “the first time the leader of any country spoke about cyberspace or cybersecurity for any length of time.” Obama’s speech on cybersecurity is available on YouTube.
Hathaway noted that cybersecurity threats are a personal issue for the president, referring to attacks against his BlackBerry and, for his campaign staff, to data breaches and lost policy documents.
“Many people don’t realize their computer is already infected by a botnet,” she said, emphasizing the importance of raising awareness of the risks. “How many people realize that when they buy a thumb drive that it comes with extra executables for marketing purposes to send data home?”
Hathaway called endemic data breaches in the business world “one of the biggest secrets that no one is talking about publicly” and drew attention to a rising tide of electronic fraud worldwide. “In Bulgaria,” she said, “one of our colleagues said you can’t withdraw cash at an ATM unless you have your cellphone and it geolocates you.” How many people now have to put ZIP codes in for gas? “That’s because POS terminals have been hijacked.”
Cybersecurity threats extend beyond fraud, identity theft and data breaches. “There is generally a lack of agreement about what is a crime in cyberspace, much less what is an act of war,” Hathaway said. “In the event of a digital disaster, who is going to restore the infrastructure?” Also key: Who will pay? “It’s not going to be the government,” she said, at least not under current Federal Emergency Management Agency frameworks. “There’s no equivalent of a national disaster in cyberspace yet.”
November 2, 2009 9:26 PM
Posted by: GuyPardon
Tags: Center for Strategic and International Studies, International Spy Museum, National security, United States
James Lewis, director and senior fellow of the Technology and Public Policy Program at the Center for Strategic and International Studies, soberly assessed the risks to national security that lie ahead in cyberspace. “It’s primarily an espionage problem,” he said. “This is the easiest way to be a spy that has ever been invented … there’s zero chance of being caught and prosecuted if you’re smart about it.”
Lewis made that observation speaking on a panel at the International Spy Museum in Washington, D.C., held to draw attention to the growing dangers online as National Cybersecurity Month drew to a close.
Citing cyberattacks on Estonia, Lewis, the project director for the Commission on Cybersecurity for the 44th Presidency, said he anticipated more advanced attacks in future cyberwars, whether by militaries or, further out, by non-state entities. “All advanced militaries now include cyberattack capabilities,” he said. As he put it, “you can send missiles, commando teams — or you can send hackers. And hackers are much cheaper.”
Lewis believes that those “attacks are not what we have to worry about,” however – it’s “those that disrupt critical infrastructure” that keep him up at night. “The challenge is that the Internet was built for scientists,” he said, which meant that it was built to assume trust. The U.S. has “built an exceptionally insecure environment that our military and economy now depend on.” As a result, Lewis said, “the U.S. is more vulnerable than any other country” because it has put the Internet to the best use for its economy, politics, research and military.
A central challenge in this new operational environment is that “the old Cold War notion of deterrence doesn’t work,” Lewis said. “We’ve put a lot of effort into the offensive side, but it hasn’t helped us on the cybersecurity side.” Reducing the nation’s exposure to cybersecurity risks is also challenging because of the traditional approaches to solving problems on a national scale in the U.S. “Do we wait for the market, or wait for something that has a larger role for government?” asked Lewis. It’s difficult to discuss, he said, because “our ideology is to talk about a market solution, but we’re facing competitors who aren’t bound by that.”
There are also legal boundaries that must be considered in the context of new threat vectors and technologies. “The laws that we have to protect civil liberties and privacy were written 20 to 30 years ago,” said Lewis. “In the old days, you couldn’t look at traffic without understanding the content.”
Now, as he observed, the question is “How do you involve DHS? Or NSA? Some of this leads back to the FISA debate. To really defend cyberspace, you need better situational awareness. To know what we need to know for cybersecurity, you need to look at all the traffic coming into the U.S.” When Lewis asked, however, how many in the audience supported such a move by DHS, few hands went up, reflecting how contentious such electronic filtering remains.
October 30, 2009 4:01 PM
Posted by: Linda Tucci
Tags: file sharing
The Washington Post broke a story last night that should prick up the ears of information security and compliance officers. The names of more than 30 lawmakers under scrutiny by the highly secretive House ethics committee for possible ethics violations were leaked when a “low-level” staffer working from home put them on a peer-to-peer file sharing network.
The security breach brought swift action. The staffer was fired, and a lot of Congressional leaders were embarrassed. Statements came flying from all parties involved. The ethics committee does not make the names public (of their colleagues, no less!) until an official investigation is announced, for the obvious reason that these secret probes could unfairly damage a lawmaker’s reputation.
The leak does not appear to be politically motivated in any obvious way. The source who tipped off the reporters is not connected to the congressional investigations, according to the story. Which makes this security breach all the more scary.
The incident should add a big jolt to the Committee on Oversight and Government Reform hearings under way on inadvertent file sharing over P2P networks. And serve as another reminder to CIOs to revisit their P2P policies. As we reported in a story in August on the P2P hearings, research shows that 73% of companies take some kind of stance on P2P, but only 18% ban it outright. Companies tend to view P2P file sharing as more of a bandwidth issue than a security risk. Think again, and check out the story for peer-to-peer file-sharing tips.
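As a toy illustration of what "taking a stance on P2P" can mean in practice, the Python sketch below flags outbound connections to the default ports of common file-sharing protocols. The port numbers are those protocols' well-known defaults, but real clients often randomize ports, so treat this as a monitoring heuristic for illustration, not an actual control.

```python
# Illustrative sketch: classify outbound connections by the default ports
# of common P2P protocols. A real policy needs deep packet inspection,
# since modern clients randomize ports.
P2P_PORTS = {
    "BitTorrent": set(range(6881, 6890)),  # classic 6881-6889 range
    "Gnutella": {6346, 6347},
    "eDonkey": {4662},
}

def classify(dest_port):
    """Return the protocol a destination port suggests, or None."""
    for proto, ports in P2P_PORTS.items():
        if dest_port in ports:
            return proto
    return None

# (src_ip, src_port), (dst_ip, dst_port) pairs, invented for illustration
connections = [(("10.0.0.5", 51234), ("203.0.113.9", 6881)),
               (("10.0.0.7", 51300), ("198.51.100.2", 443))]

flagged = [(src, dst, classify(dst[1])) for src, dst in connections
           if classify(dst[1]) is not None]
print(flagged)  # only the connection to port 6881 is flagged
```

The 18% of companies that ban P2P outright would pair a check like this with blocking; the rest might only log and review.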
October 27, 2009 7:43 PM
Posted by: Scot Petersen
Tags: California Data Security and Privacy Law, data breach, Massachusetts Data Security and Privacy Law, SB 20
In case you missed it, California Gov. Arnold Schwarzenegger vetoed Senate Bill 20, which would have added a few more requirements to the state’s existing data breach notification law.
Sponsored by state Sen. Joe Simitian, the additions to the landmark data breach law would have required holders of personal information to reveal the type of information that was lost and details of the actual breach incident, in addition to notifying data owners of the event.
In his veto letter, Schwarzenegger called the bill “unnecessary … because there is no evidence that there is a problem with the information provided to consumers.”
In an interview with SearchCompliance.com in September, Sen. Simitian said that final negotiations had eliminated any opposition to SB 20, and said the purpose of the bill was to provide consumers with more information. “My argument was, you want to let the state know, so we can get some sense of the scope of the problem,” he said. “And also so consumers have some sense. If I communicate to you that you are one of three files that were compromised, then you are probably a little more anxious and a little more likely to take some steps to protect yourself than if you were one of 500,000.”
Reacting to the veto, Simitian said in a statement: “I’m surprised as well as disappointed by the governor’s veto. This was a common sense step to help consumers. No one likes to get the news that personal information about them has been stolen. But when it happens, people are entitled to get the information they need to decide what to do next. This bill would have made one of California’s key consumer protections even better.”
What happens next is not clear. Simitian said in the interview that if SB 20 passed, he did not foresee any additional changes, arguing that the “light touch” of the existing law was enough to keep data holders responsible and proactive, rather than mandating encryption and other technologies as Massachusetts and Nevada have done.
October 23, 2009 1:52 PM
Posted by: GuyPardon
Tags: carbon compliance, data center, Greenhouse gas, Smart Grid, United States Congress
On Monday, the White House announced a “bottom up” effort to “green government,” launching a new initiative for federal employees to contribute ideas for energy efficiency. The GreenGov Challenge follows up on an Executive Order that President Barack Obama signed on Oct. 5 that directed federal agencies to appoint a sustainability officer and set emissions reduction targets for 2010.
Watch: Video of President Obama signing the Executive Order
In other words, so-called “carbon compliance” is now officially on the horizon for the IT staff at federal agencies. If Congress decides to move forward with regulation of greenhouse gas emissions, CIOs at businesses in the private sector will also be faced with meeting new requirements.
Asking more than 1.8 million civilian employees and armed service members for their ideas on saving energy is bound to yield a good idea or three. Larger questions around the implementation, measurement and enforcement of carbon emissions targets will be thornier and may not lend themselves to crowdsourcing.
As I wrote in today’s story, the role of sustainability software in carbon compliance is likely to be substantial. Another issue to be aware of is nascent competition in the market for electric metering in the smart grid. Google PowerMeter might run right up against the entrenched leader in smart metering software, a certain business software company located in Germany: SAP. As reported last year by SearchSAP.com, SAP is positioned for utility transformation as the smart grid develops. To be fair, Google is positioned at the consumer and small business level, while SAP is the definition of an enterprise software provider.
Given the pressure for homeowners, businesses and data center operators to become more sustainable in the years ahead, however, there’s likely to be room in the carbon compliance software market for both companies for some time to come.
October 8, 2009 9:18 PM
Posted by: GuyPardon
Tags: Identity management, National Institutes of Health, OpenID Foundation, United States
As I reported last month, the U.S. federal government will try using OpenID as a federated identity framework for .gov authentication.
“The OpenID and .gov project’s goal is to make government more transparent to citizens,” said Don Thibeau, executive director of the OpenID Foundation at the OASIS Identity Management 2009 conference, referring the audience to IDManagement.gov.
There are now more than 1 billion OpenID-enabled accounts, according to Thibeau, with more than 40,000 websites supporting the framework, including technology companies Google, Yahoo, Facebook, AOL, MySpace, Novell and Sun Microsystems.
The OpenID identity management pilot at the National Institutes of Health (NIH) will be limited to conference registration, wiki authorization and library access, which require only Level of Assurance (LOA) 1 authentication.
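For readers unfamiliar with the protocol behind the pilot, the Python sketch below shows roughly how an OpenID 2.0 relying party begins an LOA 1 login: it builds the checkid_setup URL to which the user's browser is redirected. The endpoint and return URLs are placeholders; a real relying party discovers the provider endpoint via Yadis/XRDS discovery and verifies the signed response that later comes back.

```python
# Minimal sketch of the start of an OpenID 2.0 authentication flow.
# Endpoint, return_to and realm URLs are invented placeholders.
from urllib.parse import urlencode

OPENID_NS = "http://specs.openid.net/auth/2.0"
IDENTIFIER_SELECT = "http://specs.openid.net/auth/2.0/identifier_select"

def auth_redirect_url(op_endpoint, return_to, realm):
    """Build the checkid_setup URL the relying party redirects the browser to."""
    params = {
        "openid.ns": OPENID_NS,
        "openid.mode": "checkid_setup",
        "openid.claimed_id": IDENTIFIER_SELECT,  # let the provider pick the ID
        "openid.identity": IDENTIFIER_SELECT,
        "openid.return_to": return_to,           # where the provider sends the user back
        "openid.realm": realm,                   # the site the user is asked to trust
    }
    return op_endpoint + "?" + urlencode(params)

url = auth_redirect_url("https://op.example.com/auth",
                        "https://rp.example.gov/openid/return",
                        "https://rp.example.gov/")
print(url)
```

The LOA levels discussed in this story constrain what such a flow may be used for: a plain redirect like this satisfies only low-assurance uses such as wiki access, not tax filing or benefits transactions.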
Debbie Bucci, the integration services center program lead at the Center for Information Technology at NIH, talked about the success of existing identity management frameworks for authentication at the institute.
Bucci is cautious about implementing OpenID but sees utility in federated identity, given the success of InCommon, an identity framework at NIH. She expressed support for the “idea that you could take the same username and password and spread it around the business units.”
According to Bucci, NIH’s systems have more than 35,000 users and 250 service-level agreements, and handle over 1 million transactions every day, 83% of which are external. Current user participation for InCommon is 21%, focused on higher education and research. The NIH’s electronic research administration supports more than 9,500 institutions and agencies, according to Bucci; by contrast, InCommon includes 165. More information about these identity management programs can be found at Federatedidentity.nih.gov.
According to Peter Alterman, senior advisor for strategic initiatives at NIH, the institute is continuing to work toward implementation of the Electronic Signatures in Global & National Commerce Act, also known as E-SIGN.
According to Thibeau, the core design principle for the trust framework is “openness,” meaning it will be open to all identity providers, qualified auditors, provider certification and evolution. He says that both the OpenID Foundation and the Information Card Foundation are working to collaborate with Harvard University’s Berkman Center and the Center for Democracy and Technology (CDT) to further expand the open trust framework.
That latter relationship may be important, as the CDT’s Schwartz said that “at Level 3, we have a lot of concerns. If you don’t have limitations there, there will be a drive to ask for as much information as you can get.” Many high-priority citizen-to-government transactions are classified as LOA 3 or higher, including IRS tax filing, Social Security and Medicare. Given that limitation, there may be some roadblocks to clear before government agencies that must comply with the Privacy Act implement this federated identity management framework.
Questioned about time frames and implementation metrics, Thibeau said in an email interview to “remember the effort under way is a pilot; a very deliberate beta test of new technology protocols, new integration and interoperability task. We don’t know when we will finish, but we do know we will make mistakes and wrestle with usability and security issues.
“Given all the players involved, it’s hard to say what will be completed and when. The most valuable new piece is how many people and many organizations are coalescing around a practical and far-reaching solution set for the challenges of identity from a user perspective. This goes beyond the tired truisms that often characterize privacy versus security debates. There is a real hunger for real solutions in identity authentication. Whether you frame it as open government, open source or open identity, there are powerful political, public and commercial drivers at work involving identity on the Web. The legal and policy discussions around open identity trust frameworks are a leading-edge indication that practical solutions are in play and pragmatic (private and public sectors) organizations are involved.”
Thibeau was clear about the stage that the pilot is currently in. “We are at the beginning of a shakedown cruise on two tracks,” he said, referring to both the open source identity technologies and the open trust framework itself. “Both are parts of the GSA ICAM schema and both are on the agenda of the OpenID Foundation and Identity (IDF and ICF) boards to consider. They still have a review of and decision making around certification requirements, operations and strategy. As we begin technical testing of government pilots, we are also finalizing the certification of a trust framework process that is a critical element in government adoption and seen by some industry leaders as applicable for high value commercial applications.”
Thibeau went on to explain that “the U.S. government is still finalizing requirements for credible, independent and industry standards-based identity certification.” The process holds interest beyond the borders of the U.S. as well, according to Thibeau. “Many international governments as well as U.S. state and local governments are studying the U.S. ICAM test of its ‘schema’ of technology protocols combined with industry self certification models. Identity provider certification of Open Trust Framework models have gained momentum after recent meetings with the Center for Democracy in Technology and feedback from various government agencies, including the GSA ICAM leadership, NIST, NIH and the national security staff in the White House.”
John Bradley, the chief security officer at ooTao Inc., serves on the OASIS XRI, XDI and ORMS Technical Committees and fielded questions about the details of the OpenID pilot at NIH. For more information, Bradley’s blog includes many useful links on the OpenID-in-government project.