IT Compliance Advisor


March 26, 2010  7:46 PM

Experts provide a scary peek into cloud computing security risks

Linda Tucci

The cloudiness of cloud computing security is already getting to be an old joke — certainly, an overused headline. But it was no joke this week listening to the head of IT security at Boston College, the CISO of Brown University, a prominent Boston intellectual property lawyer and the CEO of a cloud-based application security provider talk about the lack of transparency in cloud computing.

“Security in the Cloud” was the title for the session at the Olin Innovation Lab, an annual event at the Olin College of Engineering in Needham, Mass. As the four panelists made clear, however, there is precious little of that in most cloud computing arrangements.

For attorney Ieuan Mahony, a partner at Holland & Knight LLP, the opacity or transparency of a cloud computing engagement is determined by the contract. “But the problem now is that these contracts are fairly impenetrable,” he said. Indeed, the ability of the potential consumer of cloud computing services to actually do the due diligence required to write a contract that provides transparency is a “huge problem,” agreed David Escalante, the director of computer policy and security at Boston College, because the market is so immature.

In fact, when Boston College’s legal and contract people have been asked to look into cloud computing, they tend to “pull out the old paradigms” used for the typical earthbound vendor, for whom service-level agreements, software escrow and spelling out where the data is stored are standard operating procedure. Pinning down cloud providers on such cloud computing security issues, or on how employees are vetted, or on legal protections if a provider goes belly up, is very tough.

“All those provisions of a customer contract, these cloud computing providers can’t do and still utilize their economies of scale,” Escalante said.

(CEO Matt Moynahan of the cloud-based application security provider Veracode, for example, said his company offers service-level objectives rather than SLAs, in part because his customers’ demands are morphing so quickly. The Veracode promise to return an application within 72 hours is harder to guarantee when customers are now sending 100 million lines of code to vet, compared with earlier days, when customers were sending 50,000 lines of code.)

The solution will be to move to a set of agreed-upon cloud computing security standards, and the legal and technology communities are inching toward them. Meantime, Escalante has leaned on the due diligence being done by Shared Assessments, a member organization formed in the banking industry to evaluate the security controls of service providers.

Dave Sherry, CISO at Brown University, where student email is now on Google Apps, said he consulted the nonprofit Cloud Security Alliance, which puts out free guidance and recently issued its Top Threats to Cloud Computing report, for his “many, many rounds” of contract negotiations with Google. Sherry had one big advantage going into negotiations, namely, Google’s strong desire to add a name-brand Ivy League university to its roster of academic customers. So the 800-pound gorilla proved flexible on some counts, he said, including allowing Brown to look at its hiring practices. Still, when it came to a formal risk assessment, Sherry added, Google “was going to give us precious little.”

On the other hand, Brown gets four years of Google Apps for free, and it’s hard to argue with free, Sherry said.

We’ll be writing more about the conundrums CIOs face in deciding whether and exactly how to utilize the cloud in an upcoming ezine for our sister site, SearchCIO.com. Meantime, I’d like to hear from you about how your organizations are weighing the risks and benefits of cloud computing.

March 26, 2010  3:50 PM

SEC provides additional guidance on XBRL compliance

Guy Pardon

In a public seminar held earlier this week on eXtensible Business Reporting Language (XBRL) compliance, the Securities and Exchange Commission (SEC) offered further guidance on how it expects companies to file their interactive financial records.

Mark Green, senior special counsel at the Division of Corporation Finance, described the issues around noncompliance this way: “As long as someone has an outstanding posting or submission of interactive data, that entity is ineligible for short-form posting.” Green added that as soon as interactive filings are submitted or posted as XBRL, a company will once again be eligible for short-form posting.

Green said that short-term hardship exemptions of six business days are available by application, adding that companies are not required to involve third parties in the preparation or assurance of their interactive data.

More information on the seminar can be found at XBRL.SEC.gov. An archive of the XBRL Public Education Seminar webcast is available (.WMV).

The requirement to use XBRL for business filings goes back to 2008, when the SEC proposed the use of the format. The mandate initially applied only to the nation’s 500 largest companies in 2009. Companies in the next group must post XBRL filings by June 15, Green said. Tagging of mutual funds will start in 2011.

In each case, companies are required to send XBRL filings to the SEC and post them to their corporate websites, said Joel Levine, assistant director of the Office of Interactive Disclosure. Levine said that technical and taxonomy issues prevent the SEC from accepting early submissions for IFRS. “There is no requirement that interactive data appear in identical format to the traditional format of financial statements,” said Levine.
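To make “interactive data” concrete: an XBRL filing tags each reported figure with a machine-readable element, so software can pull facts straight out of the document rather than scraping a PDF. Here is a minimal Python sketch of reading such a filing; the element names and figures are invented for illustration, not the actual US-GAAP taxonomy the SEC program uses.

```python
# Toy sketch of reading facts out of a minimal XBRL-style instance document.
# Element names, values and namespaces are illustrative, not the real
# US-GAAP taxonomy.
import xml.etree.ElementTree as ET

INSTANCE = """<?xml version="1.0"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2009-01-31">
  <us-gaap:Revenues contextRef="FY2009" unitRef="USD" decimals="-6">1230000000</us-gaap:Revenues>
  <us-gaap:NetIncomeLoss contextRef="FY2009" unitRef="USD" decimals="-6">187000000</us-gaap:NetIncomeLoss>
</xbrl>"""

root = ET.fromstring(INSTANCE)
GAAP = "{http://fasb.org/us-gaap/2009-01-31}"

for fact in root:
    if fact.tag.startswith(GAAP):
        name = fact.tag[len(GAAP):]            # strip the namespace prefix
        value = int(fact.text)
        print(f"{name}: ${value:,} ({fact.get('contextRef')})")
```

Real filings carry thousands of tagged facts against the official taxonomy, which is what makes automated comparison across companies possible.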

As senior news writer Linda Tucci reported in January, XBRL financial reporting has been a hard sell. The SEC is encouraging companies subject to XBRL compliance to contact the Office of Interactive Disclosure with questions about compliance. XBRL reporting may not just be for the SEC anymore, but businesses have been slow to adopt it.

Slides from the SEC’s seminar on XBRL compliance are embedded below:

[Embedded slides: http://d1.scribdassets.com/ScribdViewer.swf?document_id=28986246&access_key=key-2htkx7j1dnbwi3ppdk9i]


March 11, 2010  9:15 PM

Web application security matters, even without a compliance mandate

Guy Pardon

Jeremiah Grossman, the CTO of WhiteHat Security, has been at the forefront of documenting Web application security threats.

As he points out on his blog, however, state and federal regulations are lagging in addressing Web application security, even though enterprises are increasingly being targeted online. The Massachusetts data protection law, for example, mandates many security controls, but nothing in the regulation specifically addresses Web application security.

That doesn’t mean that an enterprise might not be held accountable for a data breach that results from a Web application exploit. In the presentation below, which Grossman shared at the RSA Conference, he offers his top 10 Web application security hacks — and some ideas on how to address them.
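Regulation or not, many of the attack classes on lists like Grossman’s are cheap to prevent at the code level. As one hedged illustration (my example, not Grossman’s), here is the standard Python defense against SQL injection: parameterized queries.

```python
# Illustrative only: parameterized queries keep user input out of the SQL
# grammar entirely, defeating classic injection attempts.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Vulnerable pattern (don't do this): string formatting splices the attack
# into the query text itself.
# conn.execute(f"SELECT role FROM users WHERE name = '{user_input}'")

# Safe pattern: the driver binds the value as data, not as SQL.
rows = conn.execute("SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] -- the injection string matches no user
```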


March 8, 2010  7:33 PM

Can anybody find a way to put a value on a risk management program?

Linda Tucci

If your company is finding it difficult to weigh the costs vs. benefits of a formal risk management program, Standard & Poor’s (S&P) feels your pain.

I caught up with Steven Dreyer, managing director at the New York-based credit rating agency, for an update on S&P’s 2008 announcement that it intended to factor enterprise risk management (ERM) measures into its credit ratings of nonfinancial companies. Nearly two years later, the effort is not where Dreyer hoped it would be. (Learn more in “What’s a risk management strategy worth to your S&P credit rating?”)

While just about everybody would agree — post-financial meltdown — that a balance sheet is insufficient for gauging a company’s risk exposure, Dreyer told me that the agency is really struggling with assigning a value to the more qualitative aspects of a risk management program (company culture, staff roles and where those roles fit in the organization chart, risk policies and metrics).

To ascertain management’s credibility, S&P has to compare what it’s been told about the company’s enterprise risk management program with how the company actually handles anticipated and unanticipated risks — and that will take time, he said. As you’ll read in my story this week, S&P can draw on decades of data it has collected on companies to help set benchmarks. But that will take time, too.

A startup crowdsourcing project called Riskfree.org contends that the business models of the ratings agencies are too narrow to provide sound guidance for investors. Riskfree.org argues that what investors need are the tools to create their own risk models — lots of them, including S&P’s — which can then be aggregated and compared over time to see which models hold up best.

Whether investors should continue to trust any model the credit rating agencies come up with, given their failure to predict the worst financial crisis since the Great Depression, is probably a topic for another story. But somebody has to find a way to correlate the cost of an enterprise risk management program with its benefits. For another reminder, if you need one, of just how much damage this recession has wreaked, consider this: Half the companies that S&P rates today fall into its bottom two categories, CCC and D, or, in plain terms, close to or in default.


February 26, 2010  6:37 PM

DoD releases new policy for the secure use of social media on NIPRNET

Guy Pardon

This afternoon, the Department of Defense (DoD) announced the release of its new policy for the secure use of social media. The policy officially recognizes that “Internet capabilities are integral to operations across the Department of Defense” and formalizes the use of collaborative technology by service members from within the Non-classified Internet Protocol Router Network (NIPRNET). The NIPRNET is used to exchange sensitive but unclassified information between “internal” users as well as providing users access to the open Internet, as opposed to the SIPRNET, which is used for more secure communications.

“This directive recognizes the importance of balancing appropriate security measures while maximizing the capabilities afforded by 21st Century Internet tools,” said Deputy Secretary of Defense William J. Lynn III in an official release posted at Defense.gov.

Deputy CIO David Wennergren spoke to SearchCompliance.com about the new policy for the secure use of social media late Friday afternoon. (Skip ahead to read the interview in full).

Secure access from the NIPRNET

The NIPRNET will now be configured to allow service members to use social media sites. The new policy “will allow access to SM [social media] sites but balance it with the need to be secure,” tweeted Price Floyd, Principal Deputy Assistant Secretary of Defense for Public Affairs. He went on to tweet more details.

“Here’s the meat of it: the NIPRNET shall be configured to provide access to internet-based capabilities across all DoD components.”

“Commanders at all levels and Heads of DoD Components shall continue to defend against malicious activity affecting DoD networks (e.g., distributed denial of service attacks, intrusions) and take immediate and commensurate actions, as required, to safeguard missions (e.g., temporarily limiting access to the internet), and shall continue to deny access to prohibited content and to prohibit users from engaging in prohibited activity via social media sites (pornography, gambling, hate-crime related activities).

All use of Internet-based capabilities shall comply with paragraph 2-301 of chapter 2 of the Joint Ethics Regulation (reference (b)) and the guidelines set forth in attachment 2.”

The DoD’s social media directory shows the use of social media platforms by over a hundred different accounts, all of which will be governed by the new policy.

The full policy for the secure use of social media from the NIPRNET is embedded below:

[Embedded document: http://d1.scribdassets.com/ScribdViewer.swf?document_id=27525669&access_key=key-7e9v31jockuzwxvljwg]

“External official presences shall clearly identify that DOD provides their content,” tweeted Floyd. “External official presences shall receive approval from the responsible OSD or DoD Component Head. Official presences shall be registered on the external official presences list.”

The new DoD policy for the secure use of social media is clear on avoiding the addition of personally identifiable information (PII). “Official use shall ensure that the info posted is relevant and accurate, includes no PII, includes a disclaimer,” tweeted Floyd. “ASD (NII)/DOD CIO shall provide implementation guidance, education, training, and awareness activities. USD Intel shall develop and maintain threat estimates on current and emerging Internet-based capabilities. USD Intel shall integrate proper SM use into OPSEC training.”
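The DoD’s actual screening tooling, if any, is not public, but the kind of check the policy implies is straightforward to sketch. Here is a hypothetical pre-post PII scan in Python; the patterns are illustrative and deliberately incomplete.

```python
# Hypothetical sketch of the pre-post screening the policy implies: scan
# outbound text for obvious PII patterns before it reaches a public site.
# These patterns are illustrative and far from exhaustive.
import re

PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def find_pii(text: str) -> list[str]:
    """Return the labels of any PII patterns found in the text."""
    return [label for label, pat in PII_PATTERNS.items() if pat.search(text)]

post = "Contact SSG Doe at 555-867-5309 or john.doe@example.mil"
print(find_pii(post))  # ['email', 'phone']
```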

UPDATE: SearchCompliance.com reached Dave Wennergren, deputy CIO at the Office of the Secretary of Defense, at his office late on Friday afternoon. Following are the highlights of the interview.

“This policy is a recognition that information sharing is so important to us,” said Wennergren. “It’s about being able to connect to the right people at the right time. We’ve done a lot of work on bringing in collaboration tools inside of the Department. Now, Web 2.0 tools allow us to extend that. You need look no further than Haiti, where non-governmental organizations were able to connect and share information in a crisis.”

Raising awareness of security

Wennergren observed that there’s a big imperative around both information sharing and security. “The purpose of this memo was to help people think about both,” he said. “When you think about security, you tend to think about individual access. And when you think about sharing on its own, you don’t think about security. What we had was inconsistent application of both. Some services blocked access entirely, others opened up. This memo was issued to enforce consistency around the use of technologies that are really powerful in helping people get their jobs done better. When Second Life happened, people looked at avatars and said that was a game. Now, universities are teaching classes there. Virtual worlds are clearly not a game. There’s a reason the Chairman of the Joint Chiefs of Staff uses Twitter. And there’s a reason the Navy set up an account on Facebook.”

Flexibility in applying policy

Wennergren was thoughtful about how blurred the lines between the workplace and home have become, along with the need to allow commanders flexibility in applying the policy for the secure use of social media within operational constraints. “You have to think about having good information hygiene,” he said, “which means being thoughtful about what you share in public or in private. Commanders at all levels need to take the steps they need to guarantee uptime and security. If the networks are under attack, by all means take the steps you can. If you’re in a limited bandwidth environment, then you need to consider that context.”

Web 2.0 is the office phone of the 21st Century

Wennergren also drew a historical parallel between Web 2.0 technologies and past communication technologies. “If this were a decade ago, we would have been having this conversation about the use of the Internet. It’s an interesting thing to think about whether you have a responsible workforce. People have phones. If people make a lot of long distance calls, there are ways to deal with that. If an employee is not doing their work because they’re spending all their time on the Internet, you can address that. We don’t unplug all the phones because someone might be using it inappropriately. The regulations are clear about use of government equipment. Existing rules will continue to apply.”

He was also concerned about being an attractive employer to the generation of hyper-connected millennials that have entered the workforce over the past decade. “If we want to be an employer of choice, we better give people access to tools that let them release their creativity,” said Wennergren. “There’s huge power in using these kinds of tools but you have to use them responsibly.”

Ensuring secure access to social media

The DoD’s deputy CIO was also focused on keeping the NIPRNET secure as military personnel access Twitter, Facebook and other social media platforms. “This is where we talk about a holistic approach,” said Wennergren. “People need to have access to the tools they need to get their jobs done. That means doing basic blocking and tackling, including firewalls, antivirus and monitoring of traffic going in and out of networks.”

That sort of tracking is why DISA invested in network mapping and leak detection in January. “By monitoring and keeping track of what’s going on, including anomalies, we’re better served than by blocking access to websites. The rest of the government is also working on the Trusted Computing Initiative,” said Wennergren, “which means putting public-facing content in a DMZ.”

Virtualization and the cloud will be involved

The shift that Wennergren described is significant in terms of how the department thinks about enabling the workforce. “If what you really want to do is secure information sharing, you need to think about technology that allows you to do trusted computing from untrusted computers,” he said. “It’s sort of like the national park model — take nothing with you, leave nothing behind. The future is people accessing secure networks through iPhones, Android devices or other mobile clients. The next set of tools might include access through library kiosks.”

Wennergren sees desktop virtualization and server virtualization playing a role in the secure use of social media in the future. “Imagine the prize of having a trusted desktop in the cloud,” he said. “If apps and data are up in the cloud and you understand the perimeter, you can raise the boundaries of security overall. Whether it’s a bootable CD or other means, virtualization can allow you to protect your environment from the PC you’re booting into.”

Risk management, not risk avoidance

Regardless of the technologies employed, Wennergren said “the prize” is security. “It’s like the clichéd phrase about defense in depth. People need to be thoughtful about the devices they use and what those devices can do.”

Like Howard Schmidt, the U.S. cybersecurity coordinator, Wennergren is applying risk management to cybersecurity threats. “Moving from risk avoidance to risk management helps you to be more thoughtful,” he said. “Consider a BlackBerry. It’s inconsequential to send a trivial email using a fixed access signal. If it’s a more serious email, you should be using a Common Access Card and a PKI credential to transmit. Some people’s location data will be turned off. For others, that won’t matter as much.”

Wennergren’s focus is on delivering IT capabilities more quickly while remaining secure. “Where it used to take a long time to deploy a new platform, now you can do much more,” he said. “If you expose your data, mash things up, use Google Earth, you can move much more rapidly. Whether it’s tracking IED locations, maritime situational awareness or aggregating Blue Team movements, Web 2.0 technologies are giving us a huge advantage. We can share documents quickly and increase operational awareness through these technologies. Shame on you if you’re not taking advantage of them.”

A video of Wennergren from earlier this month is embedded below, in which he offers more perspectives on secure information sharing.

[kml_flashembed movie="http://www.youtube.com/v/rSlcW17SKG0" width="425" height="350" wmode="transparent" /]

The Defense Department’s Social Media Hub will host more “educational materials” on the policy in the future.

UPDATE: Last year, David Meerman Scott interviewed Roxie Merritt, director of new media operations at the Office of the Secretary of Defense for Public Affairs. In the video below, they talk about blogger relations, the role of Facebook and Twitter, and other ways to communicate. [Hat tip to Scott’s post about the DoD’s official policy on social media.]

[kml_flashembed movie="http://www.youtube.com/v/GpVPNbrrQYs" width="425" height="350" wmode="transparent" /]


February 24, 2010  10:10 PM

CloudAudit.org to offer tools for verifying cloud computing compliance

Guy Pardon

The Automated Audit, Assertion, Assessment, and Assurance API (A6) working group has been newly organized under the CloudAudit brand. The stated goal of CloudAudit is to “provide a common interface that allows cloud providers to automate the Audit, Assertion, Assessment, and Assurance (A6) of their environments and allow authorized consumers of their services to do likewise via an open, extensible and secure API.”

In this podcast, SearchCompliance.com associate editor Alexander B. Howard interviews Christofer Hoff, director of cloud and virtualization solutions at Cisco Systems and one of CloudAudit’s organizers. Prior to his work at Cisco, Hoff was chief security architect of Unisys Corp.’s systems and technology division. He continues to participate in the Cloud Security Alliance. You can find Hoff’s blog at Rationalsurvivability.com/blog and follow him on Twitter as @Beaker.

Hoff says A6 grew out of enterprise security professionals’ need for better tools to verify security and compliance at cloud service providers.

When you listen to this podcast, you’ll learn:

• What CloudAudit is.
• What problems A6 could solve for CISOs and CIOs faced with cloud computing compliance challenges.
• How CloudAudit would map to compliance, regulatory, service-level, configuration, security and assurance frameworks, or third-party trust brokers.

For more information, visit CloudAudit.org, the relevant Google Group or the CloudAudit code base at Google Code. Hoff has also collected recent press coverage and other information about A6 at his blog.
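To give a flavor of what an A6-style interface might look like from the consumer side, here is a hedged Python sketch. The provider host, the .well-known/cloudaudit path and the manifest filename are all hypothetical; the working group was still defining the real namespace layout at the time.

```python
# Consumer-side sketch of an A6-style compliance query. The provider host,
# the .well-known/cloudaudit path and the manifest filename are hypothetical,
# not the published spec.
import urllib.request

PROVIDER = "https://cloud.example.com"                       # hypothetical provider
FRAMEWORK = "org.cloudaudit.control-frameworks/pci-dss/1.2"  # hypothetical namespace

def fetch_assertion(control: str) -> bytes:
    """GET the provider's published assertion for one named control."""
    url = f"{PROVIDER}/.well-known/cloudaudit/{FRAMEWORK}/{control}/manifest.xml"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

# e.g., PCI DSS requirement 10: track and monitor all access to cardholder data
print(fetch_assertion("requirement-10").decode()[:500])
```

The design point is that an auditor’s script, not a human with a questionnaire, does the first pass over a provider’s compliance posture.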


February 19, 2010  8:00 PM

Is continuous controls monitoring at the top of your GRC agenda?

Linda Tucci

Is there a groundswell in business for using continuous controls monitoring (CCM) to beef up corporate governance, risk and compliance (GRC) programs? Analysts and vendors have certainly launched a full-out campaign for its relevance.

AMR Research Inc. in Boston named CCM one of the top GRC software investments companies will make in 2010, right behind compliance management software and business process management (BPM) products.

Gartner Inc. analyst French Caldwell called out CCM for business and financial applications as a top trend when I spoke to him in December for our preview of the major GRC issues in 2010.

Of course, IT has long used controls automation for configuring servers, conducting audits, maintaining security and so on. CCM has also been used to comply with the Sarbanes-Oxley Act’s requirements for segregating duties. But CCM is increasingly being applied to business performance issues — for example, to eliminate duplicate payments in real time rather than on a quarterly basis, or to ensure that invoices are paid on schedule but not in advance, to preserve working capital.
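To make that concrete, here is a minimal sketch (with an invented record layout) of the kind of rule a CCM engine runs: a duplicate-payment check that fires as each payment arrives rather than in a quarterly batch.

```python
# Minimal sketch of a continuous-controls rule: flag likely duplicate
# payments as they arrive. The record layout is invented for illustration.
from collections import defaultdict

seen = defaultdict(list)  # (vendor, invoice_no, amount) -> earlier payment ids

def check_payment(payment: dict) -> list:
    """Return ids of earlier payments that look like duplicates of this one."""
    key = (payment["vendor"], payment["invoice_no"], payment["amount"])
    dupes = list(seen[key])
    seen[key].append(payment["id"])
    return dupes

payments = [
    {"id": 1, "vendor": "Acme", "invoice_no": "INV-9",  "amount": 1200.00},
    {"id": 2, "vendor": "Acme", "invoice_no": "INV-10", "amount": 450.00},
    {"id": 3, "vendor": "Acme", "invoice_no": "INV-9",  "amount": 1200.00},
]
for p in payments:
    if (dupes := check_payment(p)):
        print(f"payment {p['id']} may duplicate {dupes}")  # payment 3 may duplicate [1]
```

The point of “continuous” is in the loop: the third payment is flagged the moment it enters the system, not at quarter’s end.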

“Controls automation is moving up the stack. It’s making sure the business rules are being followed,” Caldwell said, adding that the big enterprise resource planning vendors such as SAP AG and Oracle Corp. are doing just that.

So are the point solution vendors. John Becker, CEO of Approva Corp., a CCM software provider, makes a case for his software in a white paper coming out in a March issue of Compliance Week. Unlike other GRC technologies, Becker argues, CCM delivers “tangible, hard-dollar savings,” and in his white paper he offers up some choice examples, presumably from Approva customers:

  • Lower procurement costs: A telecom company reduced expenses by $2 million by flagging purchases that did not take advantage of available discounts, and preventing unnecessary purchases that circumvented corporate policies.
  • Improved order accuracy and on-time shipments: A manufacturer of construction materials reduced the number of sales orders that were delayed and required manual rework by 60% by identifying incomplete and inaccurate information when the sales order was created, and flagging open sales orders that were not shipped within 20 days of their original commitment.
  • Reduced accounting errors: A manufacturer in the midwestern United States reduced the number of financial reporting anomalies requiring manual follow-up and investigation by more than 50%, and significantly increased confidence in the accuracy of its financial reports.
  • Lower audit and compliance costs: The internal audit organization of a $1 billion software company reduced the time its external auditor spent testing its controls by 80% for each key control that they automated.
  • Reduced risk of fraud: A home improvement retailer reduced the risk of employee theft by monitoring the distribution of free samples to identify suspicious orders, excessive shipments and samples with alternate ship-to addresses.

So, is this all hype, or, as the lobbyists like to say, a “confluence of interest” for analysts and vendors? I’m curious whether readers are seeing an uptick in continuous controls systems for GRC at their companies.

And I need to ask a really dumb question to boot: Would moving to the XBRL electronic data format for financial and other reporting accomplish the same transparency?


February 1, 2010  8:30 PM

ISO 31000, Tom Peters and preparing for the upside of risk

Linda Tucci

In my interviews for last week’s piece on the new ISO 31000 risk-management standard, risk expert Brian Barnier pointed out that one of the standard’s salient features is its concept of risk. ISO 31000 defines risk as the “effect of uncertainty on objectives,” acknowledging both the positive opportunities and negative consequences associated with risk.

I asked Brian if he could expound on this idea. I reached him at his home in Connecticut where a morning snowstorm was proving more ferocious than forecast. Schools that had opened were sending out word they were closing early. There were the sudden, predictable runs on milk and staples at local convenience stores. A good scenario, in other words, for our discussion.

One way to think about risk, Barnier said, is as variance from what is expected. Having too much milk is bad for a convenience store; too little milk is also bad, especially on a snowy day. Dealing successfully with risk depends on how prepared you are for the change.
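Barnier’s milk example can be made concrete with toy numbers: a store that stocks exactly the expected demand still pays on both tails, which is what variance from the expected costs. A small Python sketch, all figures invented:

```python
# Toy illustration of risk as variance from what is expected. A store that
# stocks exactly the expected demand still pays on both tails: lost sales on
# a snow day, spoiled milk on a quiet one. All numbers are invented.
demand_scenarios = [(0.6, 40), (0.3, 60), (0.1, 100)]  # (probability, gallons); snow day on the tail

stocked = 52  # expected demand: 0.6*40 + 0.3*60 + 0.1*100
margin_per_sale, cost_per_unsold = 1.50, 0.80

expected_profit = 0.0
for prob, demand in demand_scenarios:
    sold = min(demand, stocked)
    unsold = stocked - sold
    expected_profit += prob * (sold * margin_per_sale - unsold * cost_per_unsold)

print(f"expected profit stocking {stocked} gallons: ${expected_profit:.2f}")
```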

“That word is very important in risk discussions,” Barnier said. “Some people think of preparedness as locking everything down. If you are coming out of the SOX [Sarbanes-Oxley] environment, you want to lock everything down, so your numbers are correct.” A big pharma company will want to lock everything down so it’s not slapped with a major recall of, say, its most popular painkiller.

“But for everybody else, risk is a lot more about being prepared for that snowy day — having the right tires on your car, driving defensively, having an emergency kit if your car goes off the road,” Barnier said. The convenience store with plenty of milk on hand is able to make hay on a snowy day.

Companies must be agile to take advantage of risk. Management guru Tom Peters, Barnier pointed out, was talking about opportunity risk 20 years ago in Thriving on Chaos.

For IT departments, being prepared for risk opportunities calls for risk management at three levels, Barnier said:

  • The investment portfolio: Are you investing in capabilities that will help you cope better with business change, whether that’s an acquisition or move into a new geography?
  • The program and project-management layer: In addition to controlling budgets and meeting deadlines, are you prepared to take advantage of an upside opportunity — a pricing change or being able to step in when a competitor falters?
  • Operations and service delivery: How can you take advantage more efficiently of opportunities that come your way?

How does your IT organization prepare for risk? Does preparing for the upside factor into your risk management? Or is it all about the lockdown?


February 1, 2010  5:38 PM

Clear advice on cloud computing security and compliance

Guy Pardon

Phil Cox, a contributor to SearchCloudComputing.com, recently shared some advice that will be helpful to anyone grappling with the challenges of cloud compliance.

In his tip, he focuses on the five major questions that every organization should ask before it moves into public cloud computing services. As Cox writes, “virtually every regulation requires organizations to adequately protect their physical and informational assets. To do this, there is an implied or assumed ability to control and prove:

  • What information is stored on a system?
  • Where is the information stored?
  • Who can access the system?
  • What can they access?
  • Is the access appropriate?

All these questions imply some level of ownership of the assets in question, and that is where cloud compliance issues become apparent. In a public cloud environment, you are able to answer the first of those questions with certainty; the other four, however, end up posing a compliance problem.”

Read the rest of the cloud computing tip for Cox’s advice, and make sure to address compliance requirements in cloud computing contracts.


February 1, 2010  3:08 PM

Regulatory compliance predictions for 2010: More regulations, new tech

Guy Pardon

When it comes to technology predictions, there are a few certainties: Apple will grab the world’s attention with a new product, the iPad; Google will find a way to innovate with Web apps, including Google Voice mobile; and IT security budgets will remain strong this year, despite tough macroeconomic conditions.

David Mortman, a contributing analyst at Securosis LLC and SearchSecurity.com contributor, applied his lens to a less-covered area: regulatory compliance and security. He came away with a conclusion that won’t be shocking to many observers: more regulations, new technology.

As Mortman points out, “There are three different federal identity-theft protection bills working their way through Congress.” Certain provisions of the HITECH Act will go into effect Feb. 17, including data breach notifications and extensions to HIPAA.

The fly in the regulatory-compliance alphabet soup, however, is likely to be cloud computing. As Mortman points out, “none of the existing regulatory requirements specifically address cloud computing, and few (HIPAA/HITECH and the FTC’s Red Flags Rule excepted) address outsourcing well.” Scale aside, cloud computing compliance still worries IT managers.

For more on the year ahead in compliance, review our compliance trends coverage.

