This morning, the White House made it official: Howard Schmidt will be the nation’s next cybersecurity coordinator. The longtime industry veteran will be returning to the executive branch, where he worked previously as vice chairman of the President’s Commission on Critical Infrastructure Protection. Schmidt will report to deputy national security advisor (NSA) John Brennan.
Schmidt was formerly chief information security officer (CISO) at eBay and chief security officer at Microsoft and has worked with federal and local law enforcement and the Defense Department. As Ellen Nakashima reported in The Washington Post, the new cybersecurity coordinator also served as special adviser for cyberspace security from 2001 to 2003, where he shepherded the National Strategy to Secure Cyberspace, a plan that Nakashima writes “was largely ignored.” Schmidt was also the president and CEO of the Information Systems Security Association, an international nonprofit organization that focuses on risks and research in the cyberworld. The question now will be whether a man hailed as a good communicator can also ensure better cybersecurity across industry and government.
“Howard is a good match for this task,” said Vint Cerf, Google’s chief Internet evangelist, as quoted by The Atlantic Monthly’s Marc Ambinder. “I’ve been impressed by his consensus-building style. He’s thoughtful, knowledgeable and he knows Washington.”
Cerf, as quoted in the New York Times article on the cybersecurity coordinator, said: “I’ve come away with a strong sense that Vivek Kundra, chief information officer, and Aneesh Chopra, the chief technology officer, and participants at the N.S.C. are aligned on this effort.”
Filling the position at the National Security Council was overdue, given the time that has elapsed since Melissa Hathaway delivered a cybersecurity report calling for a White House official to coordinate the nation’s cybersecurity efforts. As SearchSecurity.com Editorial Director Mike Mimoso reported, “Obama announced on May 29 he intended to personally select a cybersecurity coordinator who would coordinate cybersecurity policies across government agencies.”
In May, Threatpost Editor Dennis Fisher recorded a podcast with Schmidt. In the podcast, the incoming cybersecurity coordinator talks about the role, cybercrime and how to fix federal cybersecurity.
CSO Online Senior Editor Bill Brenner enjoyed excellent timing yesterday when he published an email interview with Schmidt. Schmidt made a number of predictions for 2010, including that he believed that cloud computing will be a security enabler. Schmidt wrote that “2010 will be the tipping point as to much wider adoption in all sectors. The overall net effect will give us a better chance to develop more security in the cloud using better vulnerability management/reduction, strong authentication, robust encryption and closer attention to legal jurisdictions.”
The timing of the White House appointment of a cyber coordinator is, as Ambinder wrote, something of an early Christmas gift, though perhaps not for Schmidt himself. As Ambinder observed, “It’ll be a thankless job: given the near-certainty that the government will experience some massive data breach or a major cyber terrorism attack, Schmidt will be both the point person — and the person seen as responsible, even though he lacks the statutory authority to prevent these catastrophes.”
In the security industry, reactions to the appointment have been generally positive. Like Ambinder, Dave Lewis, a Canada-based IT security practitioner and editor at Liquidmatrix Security Digest, also sees a tough challenge ahead for Schmidt. “I think that this is an extremely unenviable position for him to take,” he said. “There are numerous turf wars, and he will be at risk of becoming collateral damage in the crossfire. I would like to see him succeed. There needs to be a central point of control for IT security.”
George Moraetes, an information security and enterprise architect, related a similar sentiment: “I really don’t know if congratulations or even condolences are in order.”
Moraetes supports the appointment of Schmidt, stating he “is the best advocate and most experienced individual to take on this incredibly difficult job that basically has no teeth or jurisdiction to preside over federal agencies. He is the only person capable of this job, having solid federal government and corporate experience at top levels, and knows the ropes.”
Patricia Titus, former CISO for the Transportation Security Administration and now CISO for Unisys Federal Systems, is similarly supportive. “He comes with exactly the type of credentials to rally the right people at the needed levels. His private- and public-sector background lends itself well to knowing who needs to sit at the table. There hasn’t been that level of IT credentials and security experience in a similar position before.”
Titus sees the position of the cybersecurity coordinator directly under the deputy NSA as “critical to the success of the position. The fact that John has publicly stated that Howard will have regular access to the president shows that cybersecurity is a national priority.” Schmidt will be charged with assessing and mitigating a complex mix of threats and authorities. “I think that all of us in cybersecurity look carefully at the difference between compliance and verifiable security. Are we spending too much time writing documents, versus real-time monitoring of security controls? Howard’s role may be to address that from a policy standpoint, with regard to securing critical infrastructure, government websites and agencies.”
“I’m cautiously pessimistic about anyone in that job, but I think Howard has a better shot than most,” said David Mortman, CSO-in-residence at Mason, Ohio-based security consultancy Echelon One. “Howard is a known quantity and knows how to play the game. Gives him a huge advantage, since it’s like he’s simultaneously an insider and an outsider. Hopefully the best of both worlds.”
Dan Kennedy, CISO of the Praetorian Security Group, also wrote in to share his take on the appointment of the new cybersecurity coordinator: “I am familiar with Howard, having watched him speak numerous times, being introduced to him a few times, having sat at a dinner round table across from him, and having been an ISSA member for years who reads his introductions every month. I think Howard Schmidt is both a smart guy and one who understands the issues of information security. I don’t always agree with what he has to say, but if you are quoted as much as Howard is that will happen. He doesn’t say completely crazy things, as a few senior security executives do now and then, and has a conservative approach to IS concerns. Howard is a competent choice, and clearly better than many alternatives having worked in the private sector and having been involved very closely and nearly exclusively in the infosec industry. This is much better than, say, a competent technologist, a lawyer who understands technology at a high level, or related choices taking on their first big information security job with this position.”
“That said, he is a safe choice, one who has had an opportunity already in what was a very similar position under the Bush administration. I, like many folks, wanted to be excited by the choice of cybersecurity czar, to see someone I thought would really shake things up. A safe choice doesn’t do that. I voted for Obama to make competent but also pushing the envelope decisions. I hoped for an appointment that would inject some discomfort into an established information security hierarchy in need of a change agent. Howard may be that; perhaps he wasn’t given enough of a chance or shackled by a lack of organizational power the last time around.”
“Don’t get me wrong: this appointment is a positive. There’s a more empowered position (especially now that the nonsense on reporting line is resolved) and a competent person in it helps information security. It was a long time coming. Howard is not afraid to speak uncomfortable truth to power, one of the hallmarks of a great CISO. I congratulate him and look to this appointment with optimism.”
FTC compliance now means new rules for social media marketing. By next year, FTC compliance could also mean ensuring that online advertising doesn’t violate tougher consumer privacy regulations — or even new laws.
On Monday, Yahoo launched a new online privacy tool that, in theory, allows users to gain more insight into the data that the media company has gathered about their interests. According to the press release, the tool provides users with the ability to “assert greater control over their online experience,” providing Yahoo’s “educated guesses about their interests” and granular controls for those users to opt out of those categories or out of interest-based advertising entirely.
The “Ad Interest Manager” was announced and released in beta on the same day the Federal Trade Commission held its first roundtable on privacy in Washington, D.C. The privacy workshop agenda (PDF) for the FTC privacy roundtable includes academics, advocates and representatives from media, data mining, software and analytics companies.
This introduction of an online privacy tool for consumers by Yahoo follows the addition of an online privacy dashboard from Google last month and the July release of self-regulatory online privacy principles for the use and collection of behavioral data for Internet advertising.
Whether such efforts are enough to preemptively address attention from the FTC will be an open question in 2010. As FTC chairman Jon Leibowitz stated earlier in the year, this is “industry’s last chance to get its act together on behavioral targeting.”
Capitalizing on this regulatory focus, the Center for Democracy & Technology (CDT) also began a consumer online privacy campaign last week called “Take Back Your Privacy.”
“All social media should have granular privacy controls,” said Leslie Harris, president and CEO of CDT. An important element of the CDT’s online privacy campaign is the release of an online privacy Complaint Tool that allows people to register online privacy concerns with the FTC and share that action with their connections on social media.
Harris, who was at the FTC’s privacy roundtable yesterday, says that next year will be “the first time there will be serious consideration of consumer privacy legislation in many years.”
According to the CDT’s Ari Schwartz, Rep. Richard Boucher has put forward an outline of a consumer privacy bill that will be a framework for action in January.
As Kara Swisher pointed out at CNET, the addition of this online privacy tool by Yahoo coincides with a “bigger backdrop” of “the pending regulatory approval of the massive search and advertising partnership between Yahoo and Microsoft. The two companies announced last week that they had completed the definitive agreement for the deal.”
As Swisher observed, “one of the key issues for regulators, of course, is the privacy implications of combining the search and online ad technologies of the No. 2 and No. 3 players.” Google, Yahoo and the online advertising industry as a whole will be watching carefully to see what FTC compliance and action from Congress will mean for all in the year ahead.
For insight into the way that the regulator sees the relationships here, review the FTC graphic below describing a user’s “personal data ecosystem.”
Compliance requirements have long since pushed organizations to adopt better log management. Like IT staff at public companies, where the Sarbanes-Oxley Act and log management go together like peanut butter and jelly, IT execs working in government have to retain, manage and store logs to comply with FISMA, the Federal Information Security Management Act.
Last month I moderated a panel on enterprise log management at the CA IT Government Expo in Washington, D.C. The two panelists were Jon Kim, security architect of GTSI Corp., and Joe Ford, CTO and vice president of professional services at Patriot Technologies.
Both men offered useful best practices and practical insight into the challenge of managing logs within the enterprise. When it came to distinguishing between log management and security event management (SEM), both men observed that the utility of SEM lay in monitoring for specific access issues. Identifying changes in access or usage patterns that could indicate an issue is a key component and capability of security information and event management systems. In this context, the importance of keeping raw log data around for forensic analysis was clear.
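To make the distinction concrete, here is a minimal sketch of the kind of access-pattern check an SEM performs, written in Python against a hypothetical syslog-style auth log. The log format, field layout and threshold here are assumptions for illustration, not any particular product’s behavior:

```python
import re
from collections import Counter

# Hypothetical syslog-style auth records (an assumption for illustration,
# not a specific vendor's log format).
LOG_LINES = [
    "Dec 14 09:01:02 host1 sshd: Failed password for alice from 10.0.0.5",
    "Dec 14 09:01:04 host1 sshd: Failed password for alice from 10.0.0.5",
    "Dec 14 09:01:06 host1 sshd: Failed password for alice from 10.0.0.5",
    "Dec 14 09:02:11 host1 sshd: Accepted password for bob from 10.0.0.9",
    "Dec 14 09:03:30 host1 sshd: Failed password for carol from 10.0.0.7",
]

FAILED = re.compile(r"Failed password for (\w+)")

def flag_unusual_access(lines, threshold=3):
    """Count failed logins per user and flag anyone at or above threshold."""
    failures = Counter()
    for line in lines:
        match = FAILED.search(line)
        if match:
            failures[match.group(1)] += 1
    return sorted(user for user, count in failures.items() if count >= threshold)

print(flag_unusual_access(LOG_LINES))  # → ['alice']
```

The raw lines are kept alongside the derived counts, which is exactly why the panelists stressed retaining unprocessed log data: once an account is flagged, the original records are the forensic evidence.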
Department of Defense (DoD) directives on log management are directly related to achieving better “situational awareness,” a much-used concept in the cybersecurity world. The DoD Information Technology Security Certification and Accreditation Process, or DITSCAP, includes groups of activities and tasks that must be performed over the lifecycle of any existing or upgraded DoD system that collects, stores, transmits or processes either unclassified or classified information.
Better automation and monitoring of logs has the potential to reveal issues across critical infrastructure as new threats emerge. Recently released Consensus Audit Guidelines specifically refer to the importance of tying identity to activity. Both panelists saw considerable maturation in this area of log management. The next challenge will be building better authentication into enterprise systems that interoperate more effectively with log management software. Security vendors are actively pursuing this goal.
The final question that the panel considered was the lack of clear guidance from the government on how long logs should be retained. Both men acknowledged the issue, referring to the National Institute of Standards and Technology (NIST) SP 800-53 document, which describes several controls related to log management, including the generation, review, protection and retention of audit records.
Both panelists suggested that best practices should be based on the risk profile for each organization and the potential relevance of the information to audits. In the back and forth between the panel and the audience, it was also clear that considerable differences exist within the defense community on how much log data to retain and how far back it should go.
Few network, security or compliance managers would dispute that log management formats and standards are in a state of flux. In fact, enterprise log management may be the least standardized area of IT. There are efforts under way to agree on common standards for logs, however. Earlier this year, the Open Group Security Forum updated its log standard, Distributed Audit Services, or XDAS. The Security Forum has also announced work on a new compliance standard, automated compliance expert markup language, or ACEML.
Standards aren’t the only challenge for enterprise log management. The sheer volume of log files generated across networks is a huge issue. As NIST offered in its SP 800-92 guidance on security and log management:
A fundamental problem with log management that occurs in many organizations is effectively balancing a limited quantity of log management resources with a continuous supply of log data. Log generation and storage can be complicated by several factors, including a high number of log sources; inconsistent log content, formats, and timestamps among sources; and increasingly large volumes of log data.
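One of the factors NIST names, inconsistent timestamps among sources, can be illustrated with a small normalization sketch. The two input formats below are assumptions chosen for illustration; a production pipeline would need to cover every format its sources actually emit:

```python
from datetime import datetime, timezone

# Two hypothetical source timestamp formats (assumptions for illustration):
# an Apache-style local time with offset, and an ISO 8601 time with offset.
FORMATS = [
    "%d/%b/%Y:%H:%M:%S %z",   # e.g. 21/Dec/2009:14:03:55 -0500
    "%Y-%m-%dT%H:%M:%S%z",    # e.g. 2009-12-21T19:03:55+0000
]

def normalize_timestamp(raw):
    """Parse a timestamp in any known source format; return it in UTC ISO form."""
    for fmt in FORMATS:
        try:
            parsed = datetime.strptime(raw, fmt)
            return parsed.astimezone(timezone.utc).isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp format: {raw!r}")

print(normalize_timestamp("21/Dec/2009:14:03:55 -0500"))
print(normalize_timestamp("2009-12-21T19:03:55+0000"))
# Both lines normalize to the same UTC instant: 2009-12-21T19:03:55+00:00
```

Only once events from different sources share a common clock can they be correlated, which is why timestamp normalization typically sits at the front of a log management pipeline.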
Best practices — and regulatory mandates — mean logs should be generated, archived and monitored regularly for insight into employee activity and to detect and prevent system outages and security breaches. Without effective enterprise log management and analysis, more organizations will be found noncompliant and remain at greater risk.
As IT professionals log back in after the Thanksgiving holiday break, meeting regulatory compliance mandates continues to occupy significant amounts of both time and budget. The top regulatory compliance trends that affected IT this year have added more areas in which to manage risk and new challenges for reporting, all in the context of increased enforcement. IT governance, risk management and compliance software, or “GRC,” has been pitched by many vendors in suites that give IT compliance professionals better tools to manage processes, resources and reporting.
As senior writer Linda Tucci recently reported, IT is increasingly turning to enterprise risk management as uncertainty in the macroeconomic climate continues. Even as some enterprises have held off on further investments in GRC software, she observed, “the more budgets tightened, the more imperative it became that both IT and the business target their biggest exposures and eliminate redundant controls and audits.” For instance, in some areas, like carbon compliance, specialized GRC software has the potential to help turn carbon footprint management into cost savings.
Given continued interest in the potential of GRC software, we published a new governance, risk and compliance FAQ yesterday. If you know of neutral, useful governance, risk and compliance resources online that should be added to the FAQ, please let us know in the comments or by sending an email to email@example.com. As we add more resources to SearchCompliance.com, you’ll be able to find them at our IT governance, risk and compliance topic page. Also, make sure to check in throughout the week here on the IT Knowledge Exchange, which features two GRC blogs: “Regulatory Compliance, Governance and Security,” by Charles Denyer, and “IT Governance, Risk, and Compliance,” by Robert E. Davis.
As 2009 comes to a close, we know that the ways you are finding IT compliance resources, news and fellow compliance professionals are changing as the online environment evolves. We know your inbox isn’t the only place to find you. Last week, we created two new communities on Facebook and LinkedIn.
If you’ve been staying up to date with the activities of friends, family members and colleagues on Facebook, now you can keep up with IT compliance there, too.
Just become a fan of SearchCompliance.com on Facebook.
If you focus your social networking activity on connecting with colleagues, please consider joining our IT compliance group on LinkedIn.
And, as many of you may be aware, SearchCompliance.com has been active on Twitter since our launch in January. You can follow us there at @ITcompliance.
Regardless of the platform, you’ll find updates and discussions among fellow compliance and security professionals about our most recent news, tips, interactive content, e-books and more.
How did the first U.S. “cyber czar” describe his time as the nation’s assistant secretary for Cybersecurity and Communications (CS&C)? Quoting Mark Twain, Greg Garcia observed that “a man who carries a cat by the tail learns something he can learn in no other way.”
It was “like a paintball fight in an Escher painting” at the Department of Homeland Security (DHS), Garcia described, “with great affection.”
Jokes aside, Garcia, who spoke at the CA IT Government Expo this week in Washington, was clear in describing what it was like in the crucible of the DHS making cybersecurity policy. “Our adversaries right now are better organized and better motivated than we are,” he said. “We, as a nation, are at an inflection point in this national cybersecurity challenge. We have a foundation for organizational structure in the private sector. We need to build a trust framework. If you don’t have an affirmation of trust, even with the same team, you’re not going to be able to get to an effective real-time response.”
Garcia, who served as assistant secretary for CS&C from 2006 to 2008, broke down the components of the Comprehensive National Cybersecurity Initiative (CNCI) that President Bush signed in January 2008. The CNCI consists of 12 elements aimed at improving cybersecurity on federal networks. “We were seeing terabytes of data flowing out of .gov networks,” said Garcia.
CNCI components include intrusion detection and prevention, research and development into so-called “leap ahead” technologies and better situational awareness, coordinated through the National Cybersecurity Center.
Garcia advocated for better counterintelligence for cybersecurity, for “classified network security” (perhaps a reference to the Einstein monitoring tool), and for improved cybereducation and training.
Echoing the NERC CSO’s remarks last month, Garcia has had to think through how deterrence strategy changes in cyberwar, especially when other nation-states are in the electric grid or government networks. “At what point does a cyberattack become an act of war?” he asked. “How do you make it more dangerous for our adversaries to attack us? A lot of it has to do with attribution.”
Garcia affirmed the need for a Federal Information Security Management Act (FISMA) for ISPs, but said that “it needs to be market-driven, at least for now, until we can determine if there’s market failure. Every infrastructure sector has different business models and risk models.” Garcia provided what may be a controversial example: an initiative where major investment banks came together and “designed their own FISMA, if you will,” with auditors to assess financial network security.
When it came to the utility of FISMA in assessing cybersecurity readiness, however, Garcia had few kind words. “FISMA has not been successful, primarily because it has been a box-checking exercise,” he said. “It is not evaluating security. That’s a very hard thing to do, because you have different threat models and vulnerability environments.”
U.S. Rep. Edolphus Towns (D-N.Y.) this week introduced H.R. 4098, “The Secure Federal File Sharing Act,” which would require the Office of Management and Budget to issue guidance to prohibit the personal use of peer-to-peer file-sharing software by government employees.
Towns, who sits on the House Oversight Committee, might have been motivated to prevent another Congressional data breach. As Senior News Writer Linda Tucci reported last month, P2P file sharing exposed secret Congressional investigations at the House Ethics committee. As Tucci observed:
The source who tipped off the reporters is not connected to the congressional investigations … Which makes this security breach all the more scary. The incident should add a big jolt to the Committee on Oversight and Government Reform hearings under way on inadvertent file sharing over P2P file sharing networks.
Tucci was right on the money here. The Secure Federal File Sharing Act was referred to the House Committee on Oversight and Government Reform. Should it be enacted, the director of the Office of Management and Budget, “after consultation with the Federal Chief Information Officers Council,” will have to issue guidance within 90 days, intended to:
- “prohibit the download, installation, or use by Government employees and contractors of open-network peer-to-peer file sharing software on all Federal computers, computer systems, and networks, including those operated by contractors on the Government’s behalf, unless such software is approved in accordance with procedures under subsection (b);” and
- “address the download, installation, or use by Government employees and contractors of such software on home or personal computers as it relates to telework and remotely accessing Federal computers, computer systems, and networks, including those operated by contractors on the Government’s behalf.”
The introduction of the Secure Federal File Sharing Act comes at a time of heightened concerns about cybersecurity threats. As Tucci also reported in August, a congressional hearing on inadvertent P2P file sharing showed how much risk is involved:
Classified or sensitive files recently found on P2P file sharing networks included: the Secret Service safe house location for the first lady, the Social Security numbers of every master sergeant in the Army, medical records of some 24,000 patients of a Texas hospital and the entire Outlook calendar of an individual who handles all the merger and acquisition activity at a well-known, publicly traded company, with attachments detailing every proposed deal.
A listing of every nuclear facility in the U.S. turned up on four sites in France. Last week also showed that illicit music downloading can have serious legal consequences: a Boston University graduate student was ordered to pay $675,000 in damages for illegally downloading songs and sharing them online.
Tucci was also right about the cautionary tale involved here: CIOs and compliance officers should all revisit their policies on the use of P2P file sharing software. As she reported in August, research from Forrester shows that “73% of companies take some kind of stance on P2P, but only 18% ban it outright. Companies tend to view P2P file sharing as more of a bandwidth issue than a security risk.”
Given the steady leakage of personally identifiable information, proprietary data or other sensitive content into these networks over the past few years, security concerns may mean peer-to-peer file sharing days have come to an end both on and off of federal IT infrastructure.
A new study of top government IT executives conducted by the Ponemon Institute identified outsourcing, cyberterrorism and an increasingly mobile workforce as significant threats to data, government systems and the nation’s critical infrastructure.
IT executives from the Departments of Defense, Justice, Homeland Security and Health and Human Services represented the largest proportion of respondents to the study, which was sponsored by CA Inc.
The study found that 63% of respondents perceived the increasingly mobile workforce “as contributing significantly to endpoint security risks as a result of insecure mobile data-bearing devices that are susceptible to malware infections as well as insecure wireless connectivity.”
Perhaps reflecting the current zeitgeist around the “Government 2.0” movement and compliance concerns around enterprise 2.0 tools, the study showed that 79% of respondents see increased use of collaboration tools as a significant risk to data protection.
Specifically, the use of social computing platforms is increasing the storage of unstructured data that could contain sensitive information in a repository that is not effectively secured. Fifty-two percent of respondents identified the use of Web 2.0 applications as a vector for increased risk for sensitive data loss, including social networking, social messaging and wikis.
Unstructured data and outsourcing were viewed by respondents as the top two root causes of increased cybersecurity risks to sensitive and confidential information. This concern is reflected at the Department of Homeland Security, where application security has been referenced as both a supply chain risk and a cyberterrorism threat.
As reported by the study, 38% of respondents were unsure whether there had been cybercrime on their network in the past year. What’s perhaps more significant is the 2% to 5% of respondents who knew it had happened. And even that may not reflect the true total.
“I do feel the numbers are underreported,” said David Hansen, CA’s corporate vice president and general manager of the company’s security management unit. “In the past, cybercrime incidents have tended to be brushed under the carpet. More pressure on disclosure has forced some changes to happen and is helpful for awareness.”
Data breaches, by way of contrast, must be published or reported, and 34% of respondents said that their agency had experienced two to five data breaches in the past year. Overall, 75% of respondents said that their agency had experienced a data breach in the last year. Respondents overwhelmingly chose wireless networks as the primary threat vector, followed by endpoints and networks.
Finally, 48% of respondents said their organization isn’t taking appropriate steps to comply with the Federal Information Security Management Act (FISMA), and 55% said they don’t have adequate security technologies to protect information assets and critical infrastructure.
“When I talk to government agencies, they look at FISMA compliance as a necessary evil,” said Hansen. “I think they might have to either redefine it to address new threats and create a lower common denominator or push for accountability.”
The question now, as bills like the ICE Act or the Cybersecurity Act work their way through Congress, is whether FISMA reform will adequately address the vulnerabilities that government IT executives are worried about.
“The problem is that, in many cases, government doesn’t have a lot of control of a lot of critical infrastructure, like manufacturing, power plants or private networks,” said Hansen. “Part of cybersecurity is about critical infrastructure and things that are not covered by FISMA. Most of those systems have no viruses or malware protection. That hasn’t been an issue because those systems weren’t connected to the Internet. Now, systems are being connected and are creating massive exposures that just weren’t there before.”
The Ponemon Institute’s “Cybersecurity Mega Trends” study is available for download from CA.com as a PDF.
Yesterday, CBS News’ 60 Minutes devoted its opening story to cybersecurity threats to critical infrastructure in the United States, including the power grid, financial systems and military information systems. Threatpost, the information security blog associated with Kaspersky Labs, has embedded the 60 Minutes segment on cyberterrorism.
In an interview with correspondent Steve Kroft, cybersecurity expert Jim Lewis calls a federal data breach in 2007 “our electronic Pearl Harbor.” In the transcript of the segment, available at CBSNews.com, Lewis said, “Some unknown foreign power, and honestly, we don’t know who it is, broke into the Department of Defense, to the Department of State, the Department of Commerce, probably the Department of Energy, probably NASA. They broke into all of the high-tech agencies, all of the military agencies, and downloaded terabytes of information.”
Lewis also spoke about the penetration of U.S. military networks, specifically the United States Central Command (CENTCOM). Lewis believes the data breach was accomplished by foreign spies leaving corrupted thumb drives in locations where U.S. military personnel would be likely to pick them up. When a drive was inserted into a CENTCOM computer, a malicious application on the drive opened a back door for hackers to access the system. According to Lewis, the Pentagon has now banned thumb drives. (David Mortman offered advice last year about whether enterprises should also ban USB drives.)
60 Minutes has also posted several short video interviews online that offer more time with Lewis, including “Hacking the ATMs,” “Hacking the DOD” and “The Holy Grail,” where Lewis talks about the security of the financial system. In “Online Jihad,” Shawn Henry, assistant director of the FBI’s Cyber Division, discusses potential cybersecurity threats from Islamic fundamentalism.
The report from 60 Minutes coincides with our own coverage. Growing cybersecurity threats to critical infrastructure and the electric grid have put a new focus on NERC regulations, as well as FISMA, warned NERC’s chief security officer, Michael Assante. Melissa Hathaway, former acting senior director for cyberspace for the National Security and Homeland Security councils, spoke at the same Washington cybersecurity panel as Assante last month about the need for better public-private cooperation. And Lewis says that new rules for cyberwar are being defined as the risks grow.
IT security pros and analysts alike know that intrusions, breaches and a growing cybersecurity threat aren’t anything new. Dave Lewis, a veteran security practitioner and blogger, commented that “the overwhelming FUD was troublesome.” Dan Kennedy, CISO at the Praetorian Group, wished that “the FBI would knock off the cloak-and-dagger routine when they’re asked a follow-up question.”
Regardless of where you stand on the 60 Minutes report, one fact remains clear: The White House still hasn’t appointed a cybersecurity coordinator.
As Marc Ambinder observed at TheAtlantic.com, “last night’s 60 Minutes feature on cybersecurity may add a sense of political urgency to the debate” about a cybersecurity coordinator.
Shane Harris, also writing about the broadcast of the cybersecurity segment, put the 60 Minutes report in perspective. “Although the piece didn’t make much news, it was news to most Americans. Full disclosure, I know the producer, Graham Messick, and while I don’t have any special insights into how he approached the subject, I think it’s fair to say that his work will change the cyber security debate in some fundamental ways.”
Harris wonders if the report could have an effect on legislation and subsequent regulatory compliance, like FISMA reform associated with further iterations of the ICE Act. “There are a number of bills pending in Congress that threaten to set requirements on companies to disclose the holes in their networks,” he wrote. “Those bills just got a major push last night. All in all, while 60 Minutes didn’t exactly blow the lid off anything last night, they have elevated the attention of this issue to new heights. That alters the political dynamics significantly.”
UPDATE: Wired Magazine has reported that the blackouts in Brazil in 2007 were “actually the result of a utility company’s negligent maintenance of high-voltage insulators on two transmission lines,” not computer hackers. 60 Minutes relied upon “unnamed sources” in claiming that the two-day outage described by Kroft in the Atlantic state of Espirito Santo “was triggered by hackers targeting a utility company’s control systems.”
Now, Wired reports the following:
The utility company involved, Furnas Centrais Elétricas, told Threat Level on Monday that it “has no knowledge of hackers acting in Furnas’ power transmission system.”
Brazilian government officials disputed the report over the weekend, and Raphael Mandarino Jr., director of the Homeland Security Information and Communication Directorate, told the newspaper Folha de S. Paulo that he’s investigated the claims and found no evidence of hacker attacks, adding that Brazil’s electric control systems are not directly connected to the internet.
U.S. CIO Vivek Kundra, appearing Friday as the keynote speaker at the University of Maryland’s CIO Forum, touched on a number of topics affecting both public- and private-sector CIOs. Some of his comments follow:
“We found that the role of CIOs in the federal government is very much focused on data centers, networking and technology, not on how we can transform the function of the public sector itself.” He explained that he wants to “leverage tech to fundamentally change the way the public sector operates.” Now, as the federal government works to account for every dollar of the $787 billion in spending from the American Recovery and Reinvestment Act of 2009 and publishes more data from its agencies, Kundra said, “we’re shifting away from democratizing data to thinking about how public policy can be powered by that information.”
Cloud computing, SOA and agile development
In tracing the path of technology from agrarian to industrial to the current information revolution, Kundra noted the transformative effect of both cell phones and social networking platforms like Facebook, YouTube and Twitter. “We’re seeing the impact that Twitter has on the geopolitical climate of the world,” he said. “Information is far more liquid than it has been in the history of civilization.” The disruptive effects of the online revolution in user-generated content are steadily filtering into government. The “Darwinian pressures” exerted upon real estate, consumer products and the automotive industry haven’t hit government yet, Kundra observed. “It’s easy to go online and compare consumer products, but it’s very difficult, if not impossible, to get information to make intelligent decisions.” In launching the contest Apps for Democracy, in fact, Kundra found a way to introduce an element of competition and innovation into a government IT ecosystem that was underserved in both areas.
Kundra has been a proponent of cloud computing for years, going back to his position as the CTO of the District of Columbia, where he signed a contract with Google for business services. Today, he emphasized the need for security, interoperability and data portability in federal government use of cloud computing. “As we make the shift towards cloud computing, security threats need to be addressed. Solutions cannot be bolted on afterwards. Data portability is central, so that as we move from Vendor A to Vendor B we architect this with interoperability and standards so that we don’t spend billions later.”
Questioned on whether service-oriented architecture is still an emphasis in a federal cloud computing paradigm, Kundra said SOA “absolutely” still matters. “Look at the Social Security Administration and what it’s done with SOA and local government,” he said. “They can build lightweight applications to interact with databases elsewhere.” That embrace of modern development practices extends beyond just SOA or upgrading programmers’ skills from COBOL. “How do we move towards an agile procurement or agile development methodology?” asked Kundra.
In some areas, the government is moving to make systems more interoperable. Kundra pointed to what’s happening between the IRS and Department of Education in student aid. “Before, if you wanted to apply and get aid, you had to fill out a FAFSA,” said Kundra. “That form is more complex than a 1040.” Starting in January, there will be a brand new online way to fill out a Free Application for Federal Student Aid, according to Kundra, which will eliminate 70 questions and 20 Web screens. “Students will be able to get IRS data and autopopulate it in the form for student aid.”
Government 2.0 and data-driven policy
As he grows into the U.S. CIO role, Kundra has continued to sharpen his account of where government IT spending and management have been and where he’d like them to go. IT systems were “not invested where they should be, which is at the intersection of the American people and government,” he said. As he put it, it’s a “simple change in default setting” from secretive, opaque and closed to transparent, open and participatory.
The old mode involved the management of $70 billion of federal IT investments through a “closed, opaque, checklist-driven process,” Kundra said. Now USAspending.gov, the federal IT dashboard, tracks spending. The website has received more than 56 million hits since launch, according to Kundra. In the old way of thinking, there was a “presumption that the government has a monopoly on the best ideas,” said Kundra. Now, Data.gov provides machine-readable data for developers to mash up. Historically, there’s been a “complex, time-consuming, paper-based acquisition process,” said Kundra. Now, there’s Apps.gov.
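“Machine-readable” is the operative phrase in Kundra’s Data.gov pitch: publishing structured data lets outside developers combine government datasets without screen-scraping. As a minimal sketch, assuming a catalog entry shaped like a JSON listing (the dataset titles and field names below are illustrative, not the live Data.gov schema):

```python
import json

# Illustrative sample of a machine-readable catalog, loosely modeled
# on the kind of JSON listings a Data.gov-style site might publish.
# The entries and field names are assumptions for this sketch.
catalog_json = """
[
  {"title": "Agency Spending Summary", "agency": "GSA", "format": "CSV"},
  {"title": "Toxics Release Inventory", "agency": "EPA", "format": "XML"},
  {"title": "Airline On-Time Performance", "agency": "DOT", "format": "CSV"}
]
"""

def datasets_by_format(raw, fmt):
    """Return the titles of catalog entries published in a given format."""
    return [d["title"] for d in json.loads(raw) if d["format"] == fmt]

print(datasets_by_format(catalog_json, "CSV"))
```

A mashup would then fetch each matching dataset and join it with local data, which is exactly the developer workflow the machine-readable catalog is meant to enable.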
Cybersecurity and FISMA reform
Kundra sees the same transition toward more flexible systems in cybersecurity. “We’re moving from a manual, reporting-based, compliance-focused approach to a real-time measurement of actual cybersecurity,” said Kundra, referring to the new “Cyberscope” system for online reporting of cybersecurity threats that launched in October. “You cannot address real-time threats with a solution that’s focused on reporting requirements on a quarterly basis.”