This afternoon, the Department of Defense (DoD) announced the release of its new policy for the secure use of social media. The policy officially recognizes that “Internet capabilities are integral to operations across the Department of Defense” and formalizes the use of collaborative technology by service members from within the Non-classified Internet Protocol Router Network (NIPRNET). The NIPRNET is used to exchange sensitive but unclassified information among “internal” users and to provide users access to the open Internet, as opposed to the SIPRNET, which is used for more secure communications.
“This directive recognizes the importance of balancing appropriate security measures while maximizing the capabilities afforded by 21st Century Internet tools,” said Deputy Secretary of Defense William J. Lynn III in an official release posted at Defense.gov.
Deputy CIO David Wennergren spoke to SearchCompliance.com about the new policy for the secure use of social media late Friday afternoon. (Skip ahead to read the interview in full).
Secure access from the NIPRNET
The NIPRNET will now be configured to allow service members to use social media services. The new policy “will allow access to SM [social media] sites but balance it with the need to be secure,” tweeted Price Floyd, Principal Deputy Assistant Secretary of Defense for Public Affairs. He went on to tweet more details.
“Here’s the meat of it: the NIPRNET shall be configured to provide access to internet-based capabilities across all DoD components.”
“Commanders at all levels and Heads of DoD Components shall continue to defend against malicious activity affecting DoD networks (e.g., distributed denial of service attacks, intrusions), take immediate and commensurate actions, as required, to safeguard missions (e.g., temporarily limiting access to the Internet to preserve operations security or to address bandwidth constraints), and deny access to sites with prohibited content (e.g., pornography, gambling, hate-crime related activities).
All use of Internet-based capabilities shall comply with paragraph 2-301 of chapter 2 of the Joint Ethics Regulation (reference (b)) and the guidelines set forth in attachment 2.”
The DoD’s social media directory shows the use of social media platforms by over a hundred different accounts, all of which will be governed by the new policy.
The full policy for the secure use of social media from the NIPRNET is embedded below:
[kml_flashembed movie="http://d1.scribdassets.com/ScribdViewer.swf?document_id=27525669&access_key=key-7e9v31jockuzwxvljwg" width="600" height="450" wmode="transparent" /]
“External official presences shall clearly identify that DOD provides their content,” tweeted Floyd. “External official presences shall receive approval from the responsible OSD or DoD Component Head. Official presences shall be registered on the external official presences list.”
The new DoD policy for the secure use of social media is clear on avoiding the addition of personally identifiable information (PII). “Official use shall ensure that the info posted is relevant and accurate, includes no PII, includes a disclaimer,” tweeted Floyd. “ASD (NII)/DOD CIO shall provide implementation guidance, education, training, and awareness activities. USD Intel shall develop and maintain threat estimates on current and emerging Internet-based capabilities. USD Intel shall integrate proper SM use into OPSEC training.”
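The “no PII” requirement is the kind of check that can be partially automated before content is posted. The sketch below is purely illustrative (the patterns and the draft message are invented, and real PII screening needs far more than regular expressions, such as detection of names, addresses and contextual identifiers):

```python
import re

# Naive, illustrative patterns only -- not an exhaustive PII screen.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def find_pii(text: str) -> dict:
    """Return a map of PII category -> matches found in a draft post."""
    hits = {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}
    return {name: matches for name, matches in hits.items() if matches}

# Hypothetical draft of an official post, scanned before publication.
draft = "Contact SSgt Doe at jdoe@example.mil or 555-867-5309."
print(find_pii(draft))
```

A check like this would sit in the review step before an official presence publishes content; anything flagged goes back to the author.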
“This policy is a recognition that information sharing is so important to us,” said Wennergren. “It’s about being able to connect to the right people at the right time. We’ve done a lot of work on bringing in collaboration tools inside of the Department. Now, Web 2.0 tools allow us to extend that. You need look no further than Haiti, where non-governmental organizations were able to connect and share information in a crisis.”
Raising awareness of security
Wennergren observed that there’s a big imperative around both information sharing and security. “The purpose of this memo was to help people think about both,” he said. “When you think about security, you tend to think about individual access. And when you think about sharing on its own, you don’t think about security. What we had was inconsistent application of both. Some services blocked access entirely, others opened up. This memo was issued to enforce consistency around the use of technologies that are really powerful in helping people get their jobs done better. When Second Life happened, people looked at avatars and said that was a game. Now, universities are teaching classes there. Virtual worlds are clearly not a game. There’s a reason the Chairman of the Joint Chiefs of Staff uses Twitter. And there’s a reason the Navy set up an account on Facebook.”
Flexibility in applying policy
Wennergren was thoughtful about how blurred the lines between the workplace and home have become, along with the need to allow commanders flexibility in applying the policy for the secure use of social media within operational constraints. “You have to think about having good information hygiene,” he said, “which means being thoughtful about what you share in public or in private. Commanders at all levels need to take the steps they need to guarantee uptime and security. If the networks are under attack, by all means take the steps you can. If you’re in a limited bandwidth environment, then you need to consider that context.”
Web 2.0 is the office phone of the 21st Century
Wennergren also drew a historical parallel between Web 2.0 technologies and past communication technologies. “If this were a decade ago, we would have been having this conversation about the use of the Internet. It’s an interesting thing to think about whether you have a responsible workforce. People have phones. If people make a lot of long distance calls, there are ways to deal with that. If an employee is not doing their work because they’re spending all their time on the Internet, you can address that. We don’t unplug all the phones because someone might be using it inappropriately. The regulations are clear about use of government equipment. Existing rules will continue to apply.”
He was also concerned about being an attractive employer to the generation of hyper-connected millennials who have entered the workforce over the past decade. “If we want to be an employer of choice, we better give people access to tools that let them release their creativity,” said Wennergren. “There’s huge power in using these kinds of tools but you have to use them responsibly.”
Ensuring secure access to social media
The DoD’s deputy CIO was also focused on keeping the NIPRNET secure as military personnel access Twitter, Facebook or other social media platforms. “This is where we talk about a holistic approach,” said Wennergren. “People need to have access to the tools they need to get their jobs done. That means doing basic blocking and tackling, including firewalls, antivirus and monitoring of traffic going in and out of networks.”
That sort of tracking is why DISA invested in network mapping and leak detection in January. “By monitoring and keeping track of what’s going on, including anomalies, we’re better served than by blocking access to websites. The rest of the government is also working on the Trusted Computing Initiative,” said Wennergren, “which means putting public-facing content in a DMZ.”
Virtualization and the cloud will be involved
The shift that Wennergren described is significant, in terms of a way of thinking about enabling the workforce. “If what you really want to do is secure information sharing, you need to think about technology that allows you to do trusted computing from untrusted computers,” he said. “It’s sort of like the national park model — take nothing with you, leave nothing behind. The future is people accessing secure networks through iPhones, Android devices or other mobile clients. The next set of tools might include access through library kiosks.”
Wennergren sees desktop virtualization and server virtualization playing a role in the secure use of social media in the future. “Imagine the prize of having a trusted desktop in the cloud,” he said. “If apps and data are up in the cloud and you understand the perimeter, you can raise the boundaries of security overall. Whether it’s a bootable CD or other means, virtualization can allow you to protect your environment from the PC you’re booting into.”
Risk management, not risk avoidance
Regardless of the technologies employed, Wennergren said “the prize” is security. “It’s like the clichéd phrase about defense in depth. People need to be thoughtful about the devices they use and what those devices can do.”
Like Howard Schmidt, the U.S. cybersecurity coordinator, Wennergren is applying risk management to cybersecurity threats. “Moving from risk avoidance to risk management helps you to be more thoughtful,” he said. “Consider a BlackBerry. It’s inconsequential to send a trivial email using a fixed access signal. If it’s a more serious email, you should be using a Common Access Card and a PKI credential to transmit. Some people’s location data will be turned off. For others, that won’t matter as much.”
Wennergren’s focus is on delivering IT capabilities more quickly while still remaining secure. “Where it used to take a long time to deploy a new platform, now you can do much more,” he said. “If you expose your data, mash things up, use Google Earth, you can move much more rapidly. Whether it’s tracking IED locations, maritime situational awareness or aggregating Blue Team movements, Web 2.0 technologies are giving us a huge advantage. We can share documents quickly and increase operational awareness through these technologies. Shame on you if you’re not taking advantage of them.”
A video of Wennergren from earlier this month is embedded below, in which he offers more perspectives on secure information sharing.
[kml_flashembed movie="http://www.youtube.com/v/rSlcW17SKG0" width="425" height="350" wmode="transparent" /]
The Defense Department’s Social Media Hub will host more “educational materials” on the policy in the future.
UPDATE: Last year, David Meerman Scott interviewed Roxie Merritt, Director of New Media Operations at the Office of the Secretary of Defense for Public Affairs in the U.S. Department of Defense. In the video below, they talk about blogger relations, the role of Facebook and Twitter, and other ways to communicate. [Hat tip to Scott’s post about the DoD’s official policy on social media.]
[kml_flashembed movie="http://www.youtube.com/v/GpVPNbrrQYs" width="425" height="350" wmode="transparent" /]
The Automated Audit, Assertion, Assessment, and Assurance API (A6) working group is newly organized under the brand of CloudAudit. The stated goal of CloudAudit is to “provide a common interface that allows cloud providers to automate the Audit, Assertion, Assessment, and Assurance (A6) of their environments and allow authorized consumers of their services to do likewise via an open, extensible and secure API.”
In this podcast, SearchCompliance.com associate editor Alexander B. Howard interviews Christofer Hoff, director of cloud and virtualization solutions at Cisco Systems and one of CloudAudit’s organizers. Prior to his work at Cisco, he was chief security architect of Unisys Corp.’s systems and technology division. Hoff continues to participate in the Cloud Security Alliance. You can find Hoff’s blog at Rationalsurvivability.com/blog and follow him on Twitter as @Beaker.
Hoff says that forming A6 came out of the need for enterprise security professionals to have better tools for confirming security and cloud computing compliance at providers of these services.
When you listen to this podcast, you’ll learn:
• What CloudAudit is.
• What problems A6 could solve for CISOs and CIOs faced with cloud computing compliance challenges.
• How CloudAudit would map to compliance, regulatory, service-level, configuration, security and assurance frameworks, or third-party trust brokers.
For more information, visit CloudAudit.org, the relevant Google Group or the CloudAudit code base at Google Code. Hoff has also collected recent press coverage and other information about A6 at his blog.
Is there a groundswell in business for using continuous controls monitoring (CCM) to beef up corporate governance, risk and compliance (GRC) programs? Analysts and vendors have certainly launched a full-out campaign for its relevance.
AMR Research Inc. in Boston named CCM one of the top GRC software investments companies will make in 2010, right behind compliance management software and business process management (BPM) products.
Gartner Inc. analyst French Caldwell called out CCM for business and financial applications as a top trend when I spoke to him in December for our preview of the major GRC issues in 2010.
Of course, IT has long used controls automation for configuring servers, conducting audits, maintaining security and so on. CCM has also been used to comply with the Sarbanes-Oxley Act’s requirements for segregation of duties. But CCM is increasingly being applied to business performance issues — for example, to eliminate duplicate payments in real time rather than on a quarterly basis, or to ensure that invoices are paid on schedule but not in advance, to preserve working capital.
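The duplicate-payments case gives a feel for what “continuous” means here: the control fires as each transaction arrives rather than in a quarterly batch. A toy Python sketch (the invoice records and the matching key are invented for illustration; commercial CCM tools tap the ERP system’s transaction stream instead of an in-memory list):

```python
# Hypothetical invoice feed; a real CCM product would hook into an
# ERP transaction stream (SAP, Oracle, etc.).
invoices = [
    {"vendor": "Acme", "invoice_no": "A-100", "amount": 1200.00},
    {"vendor": "Acme", "invoice_no": "A-101", "amount": 450.00},
    {"vendor": "Acme", "invoice_no": "A-100", "amount": 1200.00},  # duplicate
]

def flag_duplicates(stream):
    """Yield any invoice whose (vendor, invoice_no, amount) was already seen.

    Checking each record as it arrives is what makes the control
    'continuous' -- contrast with a quarterly batch audit.
    """
    seen = set()
    for inv in stream:
        key = (inv["vendor"], inv["invoice_no"], inv["amount"])
        if key in seen:
            yield inv
        seen.add(key)

print(list(flag_duplicates(invoices)))
```

In practice a flagged record would open a case for an accounts-payable reviewer rather than block the payment outright.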
“Controls automation is moving up the stack. It’s making sure the business rules are being followed,” Caldwell said, adding that the big enterprise resource planning vendors such as SAP AG and Oracle Corp. are doing just that.
So are the point solution vendors. John Becker, CEO of Approva Corp., a CCM software provider, makes a case for his software in a white paper coming out in a March issue of Compliance Week. Unlike other GRC technologies, Becker argues, CCM delivers “tangible, hard-dollar savings,” and in his white paper he offers up some choice examples, presumably from Approva customers:
- Lower procurement costs: A telecom company reduced expenses by $2 million by flagging purchases that did not take advantage of available discounts, and preventing unnecessary purchases that circumvented corporate policies.
- Improved order accuracy and on-time shipments: A manufacturer of construction materials reduced the number of sales orders that were delayed and required manual rework by 60%, by identifying incomplete and inaccurate information when the sales order was created and flagging open sales orders that were not shipped within 20 days of their original commitment.
- Reduced accounting errors: A manufacturer in the midwestern United States reduced the number of financial reporting anomalies requiring manual follow-up and investigation by more than 50%, and significantly increased confidence in the accuracy of its financial reports.
- Lower audit and compliance costs: The internal audit organization of a $1 billion software company reduced the time its external auditor spent testing its controls by 80% for each key control that they automated.
- Reduced risk of fraud: A home improvement retailer reduced the risk of employee theft by monitoring the distribution of free samples to identify suspicious orders, excessive shipments and samples with alternate ship-to addresses.
So, is this all hype, or, as the lobbyists like to say, a “confluence of interest” for analysts and vendors? I’m curious if readers are seeing an uptick in continuous controls systems for GRC at their companies.
And I need to ask a really dumb question to boot: Would moving to the XBRL electronic data format for financial and other reporting accomplish the same transparency?
In my interviews for last week’s piece on the new ISO 31000 risk-management standard, risk expert Brian Barnier pointed out that one of the standard’s salient features is its concept of risk. ISO 31000 defines risk as the “effect of uncertainty on objectives,” acknowledging both the positive opportunities and negative consequences associated with risk.
I asked Brian if he could expound on this idea. I reached him at his home in Connecticut where a morning snowstorm was proving more ferocious than forecast. Schools that had opened were sending out word they were closing early. There were the sudden, predictable runs on milk and staples at local convenience stores. A good scenario, in other words, for our discussion.
One way to think about risk, Barnier said, is as variance from what is expected. Having too much milk is bad for a convenience store; too little milk is also bad, especially on a snowy day. Dealing successfully with risk depends on how prepared you are for the change.
“That word is very important in risk discussions,” Barnier said. “Some people think of preparedness as locking everything down. If you are coming out of the SOX [Sarbanes-Oxley] environment, you want to lock everything down, so your numbers are correct.” A big pharma company will want to lock everything down so it’s not slapped with a major recall of, say, its most popular painkiller.
“But for everybody else, risk is a lot more about being prepared for that snowy day — having the right tires on your car, driving defensively, having an emergency kit if your car goes off the road,” Barnier said. The convenience store with plenty of milk on hand is able to make hay on a snowy day.
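Barnier’s milk example can be put in numbers. Here is a toy calculation, with entirely invented costs and probabilities, showing why under-preparing can hurt more than over-preparing when the downside of a miss is larger:

```python
# Toy numbers, purely illustrative: a store stocks q gallons of milk.
# Unsold milk spoils (cost $1/gal); unmet demand forfeits margin ($3/gal).
SPOIL_COST, LOST_MARGIN = 1.0, 3.0

def expected_cost(q, demand_scenarios):
    """Average over-/under-stock cost across (demand, probability) pairs."""
    total = 0.0
    for demand, prob in demand_scenarios:
        over = max(q - demand, 0)    # too much milk: spoilage
        under = max(demand - q, 0)   # too little: missed sales
        total += prob * (over * SPOIL_COST + under * LOST_MARGIN)
    return total

# An ordinary day vs. a snow-day run on milk, each equally likely.
scenarios = [(100, 0.5), (180, 0.5)]
for q in (100, 140, 180):
    print(q, expected_cost(q, scenarios))
```

With these assumed costs, stocking for the snowy day is the cheapest choice: the variance from expected demand is the risk, and preparedness for the upside scenario is what pays off.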
Companies must be agile to take advantage of risk. Management guru Tom Peters, Barnier pointed out, was talking about opportunity risk 20 years ago in Thriving on Chaos.
For IT departments, being prepared for risk opportunities calls for risk management at three levels, Barnier said:
- The investment portfolio: Are you investing in capabilities that will help you cope better with business change, whether that’s an acquisition or move into a new geography?
- The program and project-management layer: In addition to controlling budgets and meeting deadlines, are you prepared to take advantage of an upside opportunity — a pricing change or being able to step in when a competitor falters?
- Operations and service delivery: How can you take advantage more efficiently of opportunities that come your way?
How does your IT organization prepare for risk? Does preparing for the upside factor into your risk management? Or is it all about the lockdown?
Phil Cox, a contributor to SearchCloudComputing.com, recently shared some advice that will be helpful to those faced with understanding the challenges of cloud compliance.
In his tip, he focuses on the five major questions that every organization should ask before it moves into public cloud computing services. As Cox writes, “virtually every regulation requires organizations to adequately protect their physical and informational assets. To do this, there is an implied or assumed ability to control and prove:
- What information is stored on a system?
- Where is the information stored?
- Who can access the system?
- What can they access?
- Is the access appropriate?
All these questions imply some level of ownership of the assets in question, and that is where cloud compliance issues become apparent. In a public cloud environment, you are able to answer the first of those questions with certainty; the other four, however, end up posing a compliance problem.”
Read the rest of the cloud computing tip for Cox’s advice, and make sure to address compliance requirements in cloud computing contracts.
When it comes to technology predictions, there are a few certainties: Apple will grab the world’s attention with a new product, the iPad; Google will find a way to innovate with Web apps, including Google Voice mobile; and IT security budgets will remain strong this year, despite tough macroeconomic conditions.
David Mortman, a contributing analyst at Securosis LLC and SearchSecurity.com contributor, applied his lens to a less-covered area: regulatory compliance and security. He came away with a conclusion that won’t be shocking to many observers: more regulations, new technology.
As Mortman points out, “There are three different federal identity-theft protection bills working their way through Congress.” Certain provisions of the HITECH Act will go into effect Feb. 17, including data breach notifications and extensions to HIPAA.
The fly in the regulatory-compliance alphabet soup, however, is likely to be cloud computing. As Mortman points out, “none of the existing regulatory requirements specifically address cloud computing, and few (HIPAA/HITECH and the FTC’s Red Flags Rule excepted) address outsourcing well.” Scale aside, cloud computing compliance still worries IT managers.
For more on the year ahead in compliance, review our compliance trends:
- Important regulatory compliance trends that will affect IT in 2010
- XBRL, PCI and social media to change compliance in 2010. The top regulatory compliance trends for 2010 include XBRL, PCI DSS, disaster recovery, vendor security management, carbon compliance and social networking risks.
Technology can both enhance the lives of consumers and create significant privacy issues, said David Vladeck, head of the Bureau of Consumer Protection at the Federal Trade Commission (FTC).
Speaking from the FTC’s second roundtable on online privacy at the University of California, Berkeley, today, Vladeck expressed concern that consumers have little awareness of how data is being collected or used online. That concern extends to social media privacy, mobile data, manufacturing and cloud computing security.
Vladeck summarized the lessons from the first FTC privacy roundtable, held last year in Washington, D.C. Consumers are “unaware of whether and how they can exercise control” over online data, he said, including practices in the data broker industry, and the “practice of behavioral advertising may be unfamiliar to consumers.”
The fact that consumers do care about online privacy is driven home in many ways, said Vladeck. He cited the popularity of a popup blocker for the Firefox Web browser and interest in resources for managing social media privacy settings. “The No. 1 most-emailed article from The New York Times was about how consumers can change privacy settings on Facebook,” said Vladeck. “That speaks volumes.”
The FTC privacy roundtable will examine both how technology enhances consumer privacy and how it can challenge or circumvent it, said Vladeck.
The FTC sees a “troubling technological arms race” between consumer empowerment tools and technologies that enable more data collection, he said, with countermeasures developed each time a means to protect privacy is developed.
In his remarks, Vladeck broke the FTC’s privacy roundtable into four areas:
- Social networking privacy: Social media is the “online equivalent to the water cooler,” revolutionizing how people interact. “It’s a boon to consumers, enabling us to reconnect and cement relationships. On the other hand, others can scrutinize the minutia of our lives.”
- Cloud computing security: “Cloud computing offers significant consumer benefit. At the same time, storing data on remote computers raises serious privacy and security concerns.” The issue with cloud computing security and privacy, as he observed, lies in the ease with which data may be shared, which increases the risk that data may be used in unanticipated ways.
- Mobile privacy: “Mobile devices have brought tremendous opportunities,” but also new privacy concerns. “How is location-based information being collected and used?” He also wondered how companies would be able to gain informed consent on devices with small screens. The FTC’s scrutiny confirms that GPS devices and geolocation data create privacy and security risks.
- Manufacturing: Vladeck indicated that the FTC will also be looking at how businesses are building privacy into services or devices at the outset. Ideally, he said, “privacy protections will be baked into products from the beginning.”
A full privacy roundtable agenda is available from FTC.gov.
The roundtable is being streamed online. Follow the conversation at #FTCprivacy on Twitter to read commentary in 140 characters or less or tune in to this list of privacy experts, workshop audience attendees and other commentators.
As Burton Group’s Mike Gotta blogged yesterday, the Financial Industry Regulatory Authority (FINRA) has issued new specific guidance to securities firms and brokers on the use of social media.
The regulatory authority’s updated guidance addresses the changes in usage, as workers spend more time on social networking sites in a business context.
As cited by the guidance, a recent report by the Pew Internet & American Life Project stated that 46% of American adults who use the Internet logged onto a social networking site in 2009. Now FINRA has addressed how rules governing communications apply to social media platforms that have been created by a firm or its registered representatives. For insight into one firm’s approach, check out last year’s story, “Brokerage invests in social media archiving for FINRA compliance” (http://searchcompliance.techtarget.com/news/article/0,289142,sid195_gci1376108,00.html).
“Social networking sites and blogs raise new regulatory challenges, particularly in the areas of supervision, advertising and books and records requirements,” said FINRA Chairman and CEO Rick Ketchum in a press release. “Our goal in issuing this notice is to ensure that firms and brokers use social networking sites in an appropriate manner.”
One of the recommendations in the new guidance for FINRA compliance is that covered firms create, distribute and adhere to an online privacy and social media policy. Another key requirement is that records of communications related to the broker or dealer’s business made through social media sites must be archived.
These new FINRA compliance rules, however, are technology-neutral in terms of how such archiving must be achieved. FINRA indicates that it is aware of different methods for social media archiving under development, including systems that interface with a firm’s network or the use of external systems by a registered representative working off-site.
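Because the guidance is technology-neutral, any scheme that produces complete, tamper-evident records of business communications could qualify. One illustrative shape (entirely hypothetical, not drawn from the guidance or any vendor’s product) is a hash-chained archive record:

```python
import hashlib
import json
import time

def archive_record(prev_hash, channel, author, body, ts=None):
    """Build an archive record whose hash chains to the previous one,
    so later tampering with any archived message is detectable."""
    record = {
        "ts": ts if ts is not None else time.time(),
        "channel": channel,   # e.g. "twitter", "facebook", "blog"
        "author": author,
        "body": body,
        "prev": prev_hash,
    }
    # Canonical serialization (sorted keys) makes the digest deterministic.
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record, digest

rec1, h1 = archive_record("0" * 64, "twitter", "@broker", "Market update...", ts=0)
rec2, h2 = archive_record(h1, "twitter", "@broker", "Correction...", ts=1)
print(rec2["prev"] == h1)   # each record commits to the one before it
```

An auditor can later re-hash the chain from the first record forward; any altered or deleted message breaks the chain at that point.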
FINRA guidance for social media now includes a best practice that “firms should consider prohibiting all interactive electronic communications that recommend a specific investment product and any link to such a recommendation unless a registered principal has previously approved the content.”
The full updated guidance on social media for FINRA compliance is linked below:
[kml_flashembed movie="http://d1.scribdassets.com/ScribdViewer.swf?document_id=25869046&access_key=key-1ns5cbu4ozfcb0qb3k91" width="600" height="450" wmode="transparent" /]
In this podcast, former cybersecurity director Melissa Hathaway talks about emerging cybersecurity threats, reforms to FISMA compliance and corporate cyberespionage. Hathaway is a senior adviser at Harvard Kennedy School of Government’s Belfer Center for Science and International Affairs.
- How could the potential FISMA compliance reforms — so-called “FISMA 2” — affect the quality of cybersecurity readiness in U.S. government agencies and contractors? Does FISMA compliance need reform?
- Other elements of legislation would introduce certification for IT security professionals. Is that a positive outcome, if it happens? Why or why not?
- The U.S. House passed a national data breach notification bill before the holiday break. If it passes the Senate, there will be a national standard. What do you think of the prospect? Is such a breach notification bill needed to supplement HHS and FTC data breach regulations?
- One critical area in cybersecurity lies in the many data breaches of corporate intellectual property. How does that unfortunate trend relate to compliance? Will a federal data breach notification law help to at least expose the scope of the issue?
- There’s considerable concern in the defense community about electronic espionage. How can those entrusted with maintaining cybersecurity balance privacy issues, civil rights and the need to protect or defend critical infrastructure? What does privacy mean in the context of cyberwar?
The Defense Information Systems Agency (DISA) has entered into a multiyear enterprise contract to use Lumeta Inc.’s IPsonar for network mapping and leak detection for the Department of Defense (DoD) global networks.
TKC Global, a systems integrator, will deploy the system.
Why is IPsonar considered necessary?
The short answer is, you can’t defend what you don’t know. We consider leak detection and mapping as key requirements to fully understand DoD’s networks and our external connections. This capability directly supports one of the actions in DISA’s recently signed Campaign Plan, where we want to conduct cross-domain searches for leaks between networks. IPsonar will provide a good start towards that requirement.
What networks will it be used on?
IPsonar will be used on SIPRNet [Secret Internet Protocol Router Network] and NIPRNet [Nonsecure Internet Protocol Router Network].
How well has it worked on the SIPRNet?
The “good” news is that we’ve had limited success with this tool on SIPRNet. I view it as good news because the problems we have getting a network mapping tool to work are directly tied to the security controls we’ve implemented to limit the ability of an adversary to maneuver on our networks. The vendor has made some changes to make it easier to work through some of these issues, plus we are now working on a revised CONOPS [Concept of Operations] that will put the tool in the hands of those best able to make the network changes needed for the tool to be fully effective.
Is the software used for one-time or periodic network mapping? Or does it run continuously?
I would like to see this run continuously, at least the portion of the tool that supports leak detection. We are working now with JTF GNO [Joint Task Force-Global Network Operations] and the services to finalize the CONOPS.
Once the network or networks are mapped, then what does DISA do?
DISA’s role here is as the acquisition and support agency for an enterprise information assurance capability that will be operated by the COCOMS [DoD’s combatant commands], services and agencies. We are responsible for lifecycle support of the capability.
Is DISA planning other steps to increase network security?
Absolutely. We have a large information assurance program that includes a number of initiatives to reduce the attack surface, improve information sharing and provide the global situational awareness needed to assure mission success in the face of cyberattack.
How will IPsonar relate to the transition from IPv4 to IPv6?
We will always have a requirement to understand our network topology and identify leaks. Today, IPsonar can detect, query and capture info from IPv6 assets. The IPsonar solution sits on an IPv4 stack, but the vendor has identified IPv6 compliance in its roadmap and is on track to deliver it. We will work with the vendor and IPv6 test efforts in DoD to make sure this and all of our IA [information assurance] capabilities remain effective as we transition to IPv6.
How will this deployment relate to complying with the Trusted Internet Connections Initiative?
We have strong policy and procedures to support the Trusted Internet Connection Initiative. The leak-detection capability of IPsonar provides the technology to help identify any unapproved Internet connections.
How will this implementation allow DISA and the DoD to meet FISMA compliance standards?
This will support the FISMA requirement for “asset awareness” by providing a mapping capability.
Why choose IPsonar vs. other network mapping software?
Our most critical requirement was leak detection. When we considered that, along with the mapping requirements, we found IPsonar to be the best solution.
How will IPsonar integrate with existing network, storage and endpoint security software at DISA to ensure better cybersecurity?
We have a number of cybersecurity solutions providing valuable data for our network defenders, but integration is largely manual. One of the top priorities for us in FY10 is to address this issue. We have two efforts ongoing: one focused on configuration management and vulnerability management requirements leveraging the SCAP data standards, with the other focused on attack detection, diagnosis and response. Both of these efforts will integrate IPsonar to help put data from other sources into context.