IT Compliance Advisor


April 27, 2010  3:58 PM

Principal-agent risk needs to be in your risk management model

Linda Tucci

The courts will eventually determine whether the profiteers at Goldman Sachs who spun toxic securities into gold were extremely skilled players in the legal gambling dens of Wall Street, or whether they rigged the house (mortgages). Meantime, even companies that are not being sued for fraud by the U.S. Securities and Exchange Commission might want to bone up on principal-agent risk — and waste no time in making it part of their risk management models.

I got a primer on principal-agent risk a couple of months ago from risk expert Ali Samad-Khan, the CEO of Stamford Risk Analytics and a keynote speaker at the Risk Management Association’s GCOR IV operational risk seminar in Boston. Principal-agent risk speaks to the reality that a company’s employees and managers — the agents — are sometimes in a position to act against the best interests of the organization and its stakeholders — the principals.

For example, if an employee’s bonus is based on the amount of money he makes, as opposed to the amount of money to be made on a risk-adjusted basis, Samad-Khan explained, that employee can end up making a huge amount of money by taking a huge amount of risk. When thinking about principal-agent risk in terms of payout metrics, he advised his audience of risk managers to ask two questions: Who is the intended beneficiary, and who is the intended loss sufferer? In a criminal event, where the perpetrator is intent on benefiting himself, there is a clear intended loser. It’s a zero-sum game. So, how is that situation different from principal-agent risk? When an employee of a firm takes excessive risk, the intent is usually not to harm the firm, he said. The intent is not to harm anyone, but to benefit the firm and himself.

“But if you look at the distribution of all potential outcomes of this excessive risk-taking, the expected value of all the outcomes is negative. What does this mean? It means that you know full well that what you are doing is not in the best interest of the firm on a risk-adjusted basis. It means this transaction you are engaging in destroys value, and yet you go ahead and do it because it benefits you,” Samad-Khan said.
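Samad-Khan’s point is ultimately arithmetic, and a toy example makes it concrete. Here is a minimal sketch in Python — every figure invented for illustration, not drawn from his talk — of a trade whose expected value is negative for the firm yet positive for a trader paid a bonus on gains alone:

```python
# Toy payoff distribution for an excessive-risk trade. All figures are
# invented for illustration; none come from Samad-Khan's presentation.

outcomes = [
    # (probability, profit or loss to the firm, in $ millions)
    (0.90, 10),    # the trade pays off
    (0.10, -150),  # the tail event: the trade blows up
]

firm_ev = sum(p * v for p, v in outcomes)

# The trader earns a bonus on gains but shares none of the losses:
# "heads I win, tails somebody else loses."
bonus_rate = 0.10
trader_ev = sum(p * max(v, 0) * bonus_rate for p, v in outcomes)

print(f"Expected value to the firm:   {firm_ev:+.1f} $M")    # -6.0
print(f"Expected value to the trader: {trader_ev:+.2f} $M")  # +0.90
```

On a risk-adjusted basis the trade destroys value, yet the payout metric still rewards the agent for doing it — which is exactly the misalignment Samad-Khan describes.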

“In an environment where compensation is frequently on a ‘heads-I-win-tails-somebody-else-loses’ basis, you wind up with an environment where there is a lot of principal-agent risk,” Samad-Khan said. Principal-agent risk — currently not even part of the standard risk taxonomy — must be factored into risk management models.

Making piles of money on a risk-adjusted basis is fine, provided there is no intent to do harm. Making piles of money by taking excessive risk that could harm the principals and the shareholders is not okay.

Of course, when the agents are the principals, and the shareholder being harmed is the American taxpayer, that’s a matter for the courts.

April 16, 2010  7:58 PM

Follow the money in GRC management platforms

Linda Tucci

I’ll start with the possibly infuriating hypothesis: There’s money to be made from governance, risk and compliance (GRC) software — by vendors, of course, but also by enterprise IT shops. And it is probably not in the standalone GRC management platforms that focus on documentation, but in platforms that focus on automated controls and continuous monitoring of risk.

SAP and Oracle have sniffed out where the money is in GRC, and maybe IBM. (Thomson Reuters, the global information company that acquired Paisley in 2008, and Wolters Kluwer, which bought Axentis last year, are also betting on a GRC jackpot, but it’s a different pot from the one pursued by the ERP players.)

To back up: The marketplace for GRC management platforms is premised on the belief that the problem with enterprise governance, risk and compliance is that the three areas are typically handled in separate parts of the enterprise and shouldn’t be. Coordinating the many compliance obligations that an enterprise faces — to reduce redundancy, improve efficiency, etc. — is hard. Coming up with an effective governance structure for the legions of corporate employees who have some hand in mitigating risk and meeting compliance requirements is also hard. GRC management platforms promise to identify, coordinate and document all the IT, operational and financial functions involved in GRC.

Compliance is good. Documentation is good. Fixing is even better. Back to the money part.

ERP vendors Oracle and SAP are upping their stakes in the GRC market. Oracle, which has offered GRC software for a number of years, announced in December its Oracle Enterprise GRC Manager (the acquired Stellent product re-architected to run on Fusion middleware) and its latest release of Oracle Enterprise GRC Controls, offering — if Oracle does say so itself — “a unique, closed-loop approach to regulatory compliance, risk management and controls automation.” SAP’s GRC focus has always been on automated controls rather than documentation; the company claims extensive manual reporting is unnecessary if problems are corrected as they arise. When this kind of continuous controls monitoring is applied to enterprise performance for the purpose of detecting and preventing violations of business rules — duplicate payments, late payments or out-of-warranty claims, for example — companies are not only staying in compliance, they are saving and making money. If, as some analysts believe, this is where the GRC market is heading, SAP and Oracle have a leg up — at least among their customers.
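To make the continuous-monitoring pitch concrete, here is a minimal sketch — record fields and business rules invented for illustration, not taken from any vendor’s product — of a monitor that flags duplicate and late payments as transactions arrive:

```python
from datetime import date

# Minimal continuous controls monitoring sketch: apply business rules to
# each payment as it arrives. Fields and rules are invented examples.

seen = set()

def check_payment(pmt, today):
    """Return the business-rule violations triggered by one payment."""
    violations = []
    # Rule 1: same vendor and invoice number already paid -> duplicate.
    key = (pmt["vendor_id"], pmt["invoice_no"])
    if key in seen:
        violations.append("duplicate payment")
    seen.add(key)
    # Rule 2: paid after the due date -> late payment.
    if today > pmt["due_date"]:
        violations.append("late payment")
    return violations

payment = {"vendor_id": "V042", "invoice_no": "INV-1001",
           "amount": 1250.00, "due_date": date(2010, 4, 30)}
print(check_payment(payment, today=date(2010, 4, 27)))  # []
print(check_payment(payment, today=date(2010, 5, 3)))   # ['duplicate payment', 'late payment']
```

The point of a “closed loop” is that a hit like the second one triggers a correction immediately, rather than surfacing in a quarterly report.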

That’s the hot air. There’s a plan to bring the argument down to earth.

Starting soon, SearchCompliance.com will devote a section to new products and product developments in GRC, not just from the names mentioned here but also from the flotilla of vendors (about 40) that address the many aspects of this $30 billion spend. (Suggestions for a catchy name for our product news section, preferably with a double entendre, are welcome.) As we follow the products, we hope it will become clearer to you (and us) where GRC technology is headed and how these products can not only help keep your company out of jail but also contribute to its bottom line.

Meantime, rebuttals to the follow-the-money hypothesis are welcome. As well as any inside tips on which GRC vendors will end up owning the space. (I already got one from someone who’s followed Oracle for two decades. To paraphrase: Once they’ve made up their mind to go after this space, they’ll crush everyone in their path.)


March 31, 2010  3:52 PM

Coalition pushes ECPA update for online privacy in cloud computing age

Guy Pardon

A powerful collection of organizations has formed a new coalition to push for an update to the Electronic Communications Privacy Act (ECPA). Members of the coalition include Google, Microsoft, AT&T, AOL, Intel, the ACLU and the Electronic Frontier Foundation. The guidance from the coalition would enshrine principles for “digital due process,” online privacy and data protection in the age of cloud computing within an updated ECPA.

“The traditional standard for the government to search your home or office and read your mail or seize your personal papers is a judicial warrant,” said Jim Dempsey, vice president for public policy at the Center for Democracy & Technology, who has led the coalition effort. “The law needs to be clear that the same standard applies to email and documents stored with a service provider, while at the same time be flexible enough to meet law enforcement needs.”

Updates to ECPA may also be relevant to the public sector, as CIOs answer the call to arms issued by U.S. CIO Vivek Kundra for cloud computing services. Cloud computing compliance figures to be a growing concern as enterprises turn to rapidly maturing Google Apps.

The coalition has set up a website, DigitalDueProcess.org, containing its proposals for updating ECPA in the face of new cloud computing security and online privacy challenges. Google Public Policy released a video, embedded below, describing the concept of “digital due process”:
[Embedded video: http://www.youtube.com/v/AYYjr3XNaGs]

At issue is the reality that for many consumers and enterprises, sensitive data in email and other electronic communications no longer resides solely on a hard drive at home or in the office. “The statute was passed in 1986 and doesn’t reflect how people use online services today,” said Microsoft spokesman Mike Hennessey. “The U.S. Constitution protects data at home and on your computer at a very high standard. We don’t believe that that should be turned on its head.”

As more and more people embrace the benefits of cloud computing, there are challenges in terms of compliance, as well as friction in terms of law enforcement access. Congress has heard testimony on location-based services and online privacy as usage of mobile social networks has exploded.

“The majority of court decisions have found that the government needs to get a warrant if it needs to track citizens in real time,” said Kevin Bankston, a senior staff attorney at the Electronic Frontier Foundation.

The coalition is pushing for a set of simplified standards that defines legal protection for data in the cloud, and that also addresses the increasingly blurred distinctions among data provided through GPS, cell sites and network triangulation.

“The reality is that technology has advanced over the last 20 years, so that better technology and intrusive technology has become available to both government and consumers,” said Catherine Sloan, vice president of government relations at the Computer & Communications Industry Association. “When you’re talking about the government, folks don’t have a choice of whether to deal with the government or not. They can’t just change a provider. That’s why the statute, in terms of government, needs to be updated.”

The coalition has issued principles for updating the ECPA that would define the rules for government access to email and other files stored online. These include requirements that:

  • “A governmental entity may require an entity covered by the ECPA (a provider of wire or electronic communication service or a provider of remote computing service) to disclose communications that are not readily accessible to the public only with a search warrant issued based on a showing of probable cause, regardless of the age of the communications, the means or status of their storage or the provider’s access to or use of the communications in its normal business operations.”
  • “A governmental entity may access, or may require a covered entity to provide location information regarding a mobile communications device only with a warrant issued based on a showing of probable cause.”
  • “A governmental entity may access, or may require a covered entity to provide, prospectively or in real time, dialed number information, email to and from information or other data currently covered by the authority for pen registers and trap and trace devices only after judicial review and a court finding.”
  • “Where the Stored Communications Act authorizes a subpoena to acquire information, a governmental entity may use such subpoenas only for information related to a specified account(s) or individual(s).”

The complete list of coalition members includes: ACLU, American Library Association, Americans for Tax Reform, AOL, Association of Research Libraries, AT&T, Center for Democracy & Technology, Citizens Against Government Waste, Competitive Enterprise Institute, Computer and Communications Industry Association, eBay, Electronic Frontier Foundation, Google, Information Technology and Innovation Foundation, Integra Telecom, Intel, Loopt, Microsoft, NetCoalition, The Progress & Freedom Foundation and Salesforce.com. The coalition will continue to add new members.


March 26, 2010  7:46 PM

Experts provide a scary peek into cloud computing security risks

Linda Tucci

The cloudiness of cloud computing security is already getting to be an old joke — certainly, an overused headline. But it was no joke this week listening to the head of IT security at Boston College, the CISO of Brown University, a prominent Boston intellectual property lawyer and the CEO of a cloud-based application security provider talk about the lack of transparency in cloud computing.

“Security in the Cloud” was the title of the session at the Olin Innovation Lab, an annual event at the Olin College of Engineering in Needham, Mass. As the four panelists made clear, however, there is precious little transparency in most cloud computing arrangements.

For attorney Ieuan Mahony, a partner at Holland & Knight LLP, the opacity or transparency of the cloud computing engagement is determined by the contract. “But the problem now is that these contracts are fairly impenetrable,” he said. Indeed, the ability of the potential consumer of cloud computing services to actually do the due diligence required to write a contract that provides transparency is a “huge problem,” agreed David Escalante, the director of computer policy and security at Boston College, because the market is so immature.

In fact, when Boston College’s legal and contract people have been asked to look into cloud computing, they tend to “pull out the old paradigms” used for the typical earthbound vendor, for whom service-level agreements, software escrow and spelling out where the data is stored are standard operating procedure. Pinning down cloud providers on such cloud computing security issues, or on how employees are vetted, or on legal protections if that provider goes belly up, is very tough.

“All those provisions of a customer contract, these cloud computing providers can’t do and still utilize their economies of scale,” Escalante said.

(CEO Matt Moynahan, of the cloud-based application security provider Veracode, for example, said his company offers service-level objectives rather than SLAs, in part because his customers’ demands are morphing so quickly. The Veracode promise to return an application within 72 hours is harder to guarantee when customers are now sending 100 million lines of code to vet, compared with earlier days, when customers were sending 50,000 lines of code.)

The solution will be to move to a set of agreed-upon cloud computing security standards, and the legal and technology communities are inching toward them. Meantime, Escalante has leaned on the due diligence done by Shared Assessments, a member organization in the banking industry that was formed to evaluate the security controls of service providers.

Dave Sherry, CISO at Brown University, where student email is now on Google Apps, said he consulted the nonprofit Cloud Security Alliance, which puts out free guidance and recently issued its Top Threats to Cloud Computing report, for his “many, many rounds” of contract negotiations with Google. Sherry had one big advantage going into negotiations, namely, Google’s strong desire to add a name-brand Ivy League university to its roster of academic customers. So the 800-pound gorilla proved flexible on some counts, he said, including allowing Brown to look at its hiring practices. Still, when it came to a formal risk assessment, Sherry added, Google “was going to give us precious little.”

On the other hand, Brown gets four years of Google Apps for free, and it’s hard to argue with free, Sherry said.

We’ll be writing more about the conundrums CIOs face in deciding whether and exactly how to utilize the cloud in an upcoming ezine for our sister site, SearchCIO.com. Meantime, I’d like to hear from you about how your organizations are weighing the risks and benefits of cloud computing.


March 26, 2010  3:50 PM

SEC provides additional guidance on XBRL compliance

Guy Pardon

In a public seminar held earlier this week on eXtensible Business Reporting Language (XBRL) compliance, the Securities and Exchange Commission (SEC) offered further guidance on how it expects companies to file their interactive financial records.

Mark Green, senior special counsel at the SEC’s Division of Corporation Finance, described the issues around noncompliance this way: “As long as someone has an outstanding posting or submission of interactive data, that entity is ineligible for short-form posting.” Green added that as soon as interactive filings are submitted or posted as XBRL, a company will once again be eligible for short-form posting.

Green said that short-term hardship exemptions of six business days are available by application, adding that third-party involvement is not required for assurance or for preparation of the filings.

More information on the seminar can be found at XBRL.SEC.gov. An archive of the XBRL Public Education Seminar webcast is available (.WMV).

The instruction to use XBRL for business filings goes back to 2008, when the SEC proposed the use of the format. The SEC’s mandate to use XBRL initially applied only to the nation’s 50 largest companies in 2009. Companies in the next group must post XBRL filings by June 15, Green said. Tagging of mutual funds will start in 2011.

In each case, companies are required to send XBRL filings to the SEC and post them to their corporate websites, said Joel Levine, assistant director of the Office for Interactive Disclosure. Levine said that technical and taxonomy issues prevent the SEC from accepting early submissions for IFRS. “There is no requirement that interactive data appear in identical format to the traditional format of financial statements,” said Levine.
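For readers who have not seen interactive data up close, the sketch below assembles a single stripped-down XBRL fact — one revenue figure tagged with a concept, a reporting context and a unit. The element names are simplified from the US-GAAP taxonomy, and the output is an illustration, not a filing-ready document:

```python
# One XBRL fact: a revenue figure tagged with a concept, a reporting
# context and a unit. Simplified for illustration; a real instance
# document carries schema references, entity identifiers and more.

concept, context, unit, value = "us-gaap:Revenues", "FY2009", "U-USD", "1250000000"

instance = f"""<xbrl xmlns="http://www.xbrl.org/2003/instance">
  <context id="{context}">
    <period><startDate>2009-01-01</startDate><endDate>2009-12-31</endDate></period>
  </context>
  <unit id="{unit}"><measure>iso4217:USD</measure></unit>
  <{concept} contextRef="{context}" unitRef="{unit}" decimals="-6">{value}</{concept}>
</xbrl>"""

print(instance)  # the number a human reads in the income statement, now machine-readable
```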

As senior news writer Linda Tucci reported in January, XBRL financial reporting has been a hard sell. The SEC is encouraging companies subject to XBRL compliance to contact the Office of Interactive Disclosure with questions about compliance. XBRL reporting may not just be for the SEC anymore, but businesses have been slow to adopt it.

Slides from the SEC’s seminar on XBRL compliance are embedded below:

[Embedded slides: http://d1.scribdassets.com/ScribdViewer.swf?document_id=28986246&access_key=key-2htkx7j1dnbwi3ppdk9i]


March 11, 2010  9:15 PM

Web application security matters, even without a compliance mandate

Guy Pardon

Jeremiah Grossman, the CTO of WhiteHat Security, has been at the forefront of documenting Web application security dangers.

As he points out on his blog, however, state and federal regulations lag behind in addressing Web application security, even though many enterprises are increasingly being targeted online. The Massachusetts data protection law, for example, addresses many security controls, but nothing in the regulation specifically addresses Web application security.

That doesn’t mean that an enterprise might not be held accountable for a data breach that results from a Web application exploit. In the presentation below, which Grossman shared at the RSA Conference, he offers his top 10 Web application security hacks — and some ideas on how to address them.
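SQL injection is a fixture of top-10 lists like Grossman’s. The sketch below — a throwaway in-memory database with a hypothetical schema — shows the vulnerable string-concatenation pattern next to the parameterized fix:

```python
import sqlite3

# SQL injection and its fix. Table and field names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # attacker-supplied "password"

# Vulnerable: concatenation lets the input rewrite the query logic.
query = ("SELECT * FROM users WHERE name='alice' AND password='"
         + user_input + "'")
print(conn.execute(query).fetchall())  # alice's row, no password needed

# Safe: a parameterized query treats the input strictly as data.
safe = "SELECT * FROM users WHERE name=? AND password=?"
print(conn.execute(safe, ("alice", user_input)).fetchall())  # []
```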


March 8, 2010  7:33 PM

Can anybody find a way to put a value on a risk management program?

Linda Tucci

If your company is finding it difficult to weigh the costs vs. benefits of a formal risk management program, Standard & Poor’s (S&P) feels your pain.

I caught up with Steven Dryer, managing director at the New York-based credit rating agency, for an update on S&P’s 2008 announcement that it intended to factor enterprise risk management (ERM) measures into its credit ratings of nonfinancial companies. Nearly two years later, the effort is not where Dryer hoped it would be. (Learn more in “What’s a risk management strategy worth to your S&P credit rating?”)

While just about everybody would agree — post-financial meltdown — that a balance sheet is insufficient for gauging a company’s risk exposure, Dryer told me that the agency is really struggling with assigning a value to the more qualitative aspects of a risk management program (company culture, staff roles and where those roles fit in the organization chart, risk policies and metrics).

To ascertain management’s credibility, S&P has to compare what it’s been told about the company’s enterprise risk management program with how the company actually handles anticipated and unanticipated risks — and that will take time, he said. As you’ll read in my story this week, S&P can draw on decades of data it has collected on companies to help set benchmarks. But that will take time, too.

A startup crowdsourcing project called Riskfree.org contends that the business models of the ratings agencies are too narrow to provide sound guidance for investors. Riskfree.org argues that what investors need are the tools to create their own risk models — lots of them, including S&P’s — which can then be aggregated and compared over time to see which models hold up best.

Whether investors should continue to trust any model the credit rating agencies come up with anymore, given their failure to predict the worst financial crisis since the Great Depression, is probably a topic for another story. But somebody has to find a way to correlate the cost of an enterprise risk management program with the benefits. For another reminder, if you need one, of just how much damage this recession has wreaked, consider this: Half the companies that S&P rates today fall into its bottom two categories, CCC and D, or, in plain terms, close to or in default.
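Part of what makes the valuation so hard is that an ERM program’s benefit is a shift in a loss distribution, not a line item. A toy calculation — every probability and dollar figure invented — shows the shape of the comparison S&P would need years of benchmark data to make credibly:

```python
# Toy comparison of expected annual losses with and without an ERM
# program. All probabilities and dollar figures are invented; the point
# is only that the program's value lives in a shifted loss distribution.

def expected_loss(dist):
    return sum(p * loss for p, loss in dist)

# (probability, annual loss in $ millions)
without_erm = [(0.70, 1), (0.25, 10), (0.05, 200)]
with_erm    = [(0.80, 1), (0.18, 8),  (0.02, 120)]

program_cost = 3.0  # $M per year, also invented

benefit = expected_loss(without_erm) - expected_loss(with_erm)
print(f"Expected-loss reduction: {benefit:.1f} $M")                 # 8.6
print(f"Net value of program:    {benefit - program_cost:.1f} $M")  # 5.6
```

The qualitative pieces — culture, roles, policies — are exactly the inputs that move those probabilities, which is why they are so hard to price.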


February 26, 2010  6:37 PM

DoD releases new policy for the secure use of social media on NIPRNET

Guy Pardon

This afternoon, the Department of Defense (DoD) announced the release of its new policy for the secure use of social media. The policy officially recognizes that “Internet capabilities are integral to operations across the Department of Defense” and formalizes the use of collaborative technology by service members from within the Non-classified Internet Protocol Router Network (NIPRNET). The NIPRNET is used to exchange sensitive but unclassified information between “internal” users and to provide users with access to the open Internet, as opposed to the SIPRNET, which is used for more secure communications.

“This directive recognizes the importance of balancing appropriate security measures while maximizing the capabilities afforded by 21st Century Internet tools,” said Deputy Secretary of Defense William J. Lynn III in an official release posted at Defense.gov.

Deputy CIO David Wennergren spoke to SearchCompliance.com about the new policy for the secure use of social media late Friday afternoon. (Skip ahead to read the interview in full).

Secure access from the NIPRNET

The NIPRNET will now be configured to allow service members to use social media sites. The new policy “will allow access to SM [social media] sites but balance it with the need to be secure,” tweeted Price Floyd, Principal Deputy Assistant Secretary of Defense for Public Affairs. He went on to tweet out more details.

“Here’s the meat of it: the NIPRNET shall be configured to provide access to internet-based capabilities across all DoD components.”

“Commanders at all levels and Heads of DoD Components shall continue to defend against malicious activity affecting DoD networks (e.g., distributed denial of service attacks, intrusions), take immediate and commensurate actions, as required, to safeguard missions (e.g., temporarily limiting access to the Internet), and deny prohibited activity via social media sites (pornography, gambling, hate-crime related activities).

All use of Internet-based capabilities shall comply with paragraph 2-301 of chapter 2 of the Joint Ethics Regulation (reference (b)) and the guidelines set forth in attachment 2.”

The DoD’s social media directory shows the use of social media platforms by over a hundred different accounts, all of which will be governed by the new policy.
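Stripped to its logic, the directive amounts to an allow-by-default posture with category-based denial and temporary, commander-ordered restrictions. Here is a hypothetical sketch of that decision — the categories and flags are invented, not taken from the policy text:

```python
# Hypothetical sketch of the access logic the directive describes:
# allow Internet-based capabilities by default, deny prohibited content
# categories, and let commanders restrict access temporarily for
# operational reasons. Categories and flags are invented.

PROHIBITED = {"pornography", "gambling", "hate-crime"}

def access_decision(site_category, under_attack=False, low_bandwidth=False):
    if site_category in PROHIBITED:
        return "deny"
    if under_attack or low_bandwidth:
        # Commanders may temporarily limit access to safeguard missions.
        return "deny (temporary restriction)"
    return "allow"

print(access_decision("social-media"))                     # allow
print(access_decision("gambling"))                         # deny
print(access_decision("social-media", under_attack=True))  # deny (temporary restriction)
```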

The full policy for the secure use of social media from the NIPRNET is embedded below:

[kml_flashembed movie=”http://d1.scribdassets.com/ScribdViewer.swf?document_id=27525669&access_key=key-7e9v31jockuzwxvljwg” width=”600″ height=”450″ wmode=”transparent” /]

“External official presences shall clearly identify that DOD provides their content,” tweeted Floyd. “External official presences shall receive approval from the responsible OSD or DoD Component Head. Official presences shall be registered on the external official presences list.”

The new DoD policy for the secure use of social media is clear on avoiding the addition of personally identifiable information (PII). “Official use shall ensure that the info posted is relevant and accurate, includes no PII, includes a disclaimer,” tweeted Floyd. “ASD (NII)/DOD CIO shall provide implementation guidance, education, training, and awareness activities. USD Intel shall develop and maintain threat estimates on current and emerging Internet-based capabilities. USD Intel shall integrate proper SM use into OPSEC training.”

UPDATE: SearchCompliance.com reached Dave Wennergren, deputy CIO at the Office of the Secretary of Defense, at his office late on Friday afternoon. Following are the highlights of the interview.

“This policy is a recognition that information sharing is so important to us,” said Wennergren. “It’s about being able to connect to the right people at the right time. We’ve done a lot of work on bringing in collaboration tools inside of the Department. Now, Web 2.0 tools allow us to extend that. You need look no further than Haiti, where non-governmental organizations were able to connect and share information in a crisis.”

Raising awareness of security

Wennergren observed that there’s a big imperative around both information sharing and security. “The purpose of this memo was to help people think about both,” he said. “When you think about security, you tend to think about individual access. And when you think about sharing on its own, you don’t think about security. What we had was inconsistent application of both. Some services blocked access entirely, others opened up. This memo was issued to enforce consistency around the use of technologies that are really powerful in helping people get their jobs done better. When Second Life happened, people looked at avatars and said that was a game. Now, universities are teaching classes there. Virtual worlds are clearly not a game. There’s a reason the Chairman of the Joint Chiefs of Staff uses Twitter. And there’s a reason the Navy set up an account on Facebook.”

Flexibility in applying policy

Wennergren was thoughtful about how blurred the lines between the workplace and home have become, along with the need to allow commanders flexibility in applying the policy for the secure use of social media within operational constraints. “You have to think about having good information hygiene,” he said, “which means being thoughtful about what you share in public or in private. Commanders at all levels need to take the steps they need to guarantee uptime and security. If the networks are under attack, by all means take the steps you can. If you’re in a limited bandwidth environment, then you need to consider that context.”

Web 2.0 is the office phone of the 21st Century

Wennergren also drew a historical parallel between Web 2.0 technologies and past communication technologies. “If this were a decade ago, we would have been having this conversation about the use of the Internet. It’s an interesting thing to think about whether you have a responsible workforce. People have phones. If people make a lot of long distance calls, there are ways to deal with that. If an employee is not doing their work because they’re spending all their time on the Internet, you can address that. We don’t unplug all the phones because someone might be using it inappropriately. The regulations are clear about use of government equipment. Existing rules will continue to apply.”

He was also concerned about being an attractive employer to the generation of hyper-connected millennials that have entered the workforce over the past decade. “If we want to be an employer of choice, we better give people access to tools that let them release their creativity,” said Wennergren. “There’s huge power in using these kinds of tools but you have to use them responsibly.”

Ensuring secure access to social media

The DoD’s deputy CIO was also focused on keeping the NIPRNET secure as military personnel access Twitter, Facebook and other social media platforms. “This is where we talk about a holistic approach,” said Wennergren. “People need to have access to the tools they need to get their jobs done. That means doing basic blocking and tackling, including firewalls, antivirus and monitoring of traffic going in and out of networks.”

That sort of tracking is why DISA invested in network mapping and leak detection in January. “By monitoring and keeping track of what’s going on, including anomalies, we’re better served than by blocking access to websites. The rest of the government is also working on the Trusted Computing Initiative,” said Wennergren, “which means putting public-facing content in a DMZ.”
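At its simplest, the monitoring Wennergren describes comes down to watching for departures from a traffic baseline. A minimal sketch — figures and threshold invented — that flags anomalous egress volume:

```python
import statistics

# Minimal traffic-anomaly sketch: flag hours whose egress volume sits
# far from the historical baseline. Figures and threshold are invented.

baseline_mb = [120, 135, 118, 140, 125, 130, 122, 138]  # past hourly egress
mean = statistics.mean(baseline_mb)
stdev = statistics.stdev(baseline_mb)

def is_anomalous(observed_mb, threshold=3.0):
    z = (observed_mb - mean) / stdev  # standard deviations from normal
    return abs(z) > threshold

print(is_anomalous(128))  # False: within normal variation
print(is_anomalous(900))  # True: possible leak, worth a look
```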

Virtualization and the cloud will be involved

The shift that Wennergren described is significant, in terms of a way of thinking about enabling the workforce. “If what you really want to do is secure information sharing, you need to think about technology that allows you to do trusted computing from untrusted computers,” he said. “It’s sort of like the national park model — take nothing with you, leave nothing behind.  The future is people accessing secure networks through iPhones, Android devices or other mobile clients. The next set of tools might include access through library kiosks.”

Wennergren sees desktop virtualization and server virtualization playing a role in the secure use of social media in the future. “Imagine the prize of having a trusted desktop in the cloud,” he said. “If apps and data are up in the cloud and you understand the perimeter, you can raise the boundaries of security overall. Whether it’s a bootable CD or other means, virtualization can allow you to protect your environment from the PC you’re booting into.”

Risk management, not risk avoidance

Regardless of the technologies employed, Wennergren said “the prize” is security. “It’s like the clichéd phrase about defense in depth. People need to be thoughtful about the devices they use and what those devices can do.”

Like Howard Schmidt, the U.S. cybercoordinator, Wennergren is applying risk management to cybersecurity threats. “Moving from risk avoidance to risk management helps you to be more thoughtful,” he said. “Consider a BlackBerry. It’s inconsequential to send a trivial email using a fixed access signal. If it’s a more serious email, you should be using a Common Access Card and a PKI credential to transmit.  Some people’s location data will be turned off. For others, that won’t matter as much.”

Wennergren’s focus is on delivering IT capabilities more quickly while remaining secure. “Where it used to take a long time to deploy a new platform, now you can do much more,” he said. “If you expose your data, mash things up, use Google Earth, you can move much more rapidly. Whether it’s tracking IED locations, maritime situational awareness or aggregating Blue Team movements, Web 2.0 technologies are giving us a huge advantage. We can share documents quickly and increase operational awareness through these technologies. Shame on you if you’re not taking advantage of them.”

A video of Wennergren from earlier this month is embedded below, in which he offers more perspectives on secure information sharing.

[Embedded video: http://www.youtube.com/v/rSlcW17SKG0]

The Defense Department’s Social Media Hub will host more “educational materials” on the policy in the future.

UPDATE: Last year, David Meerman Scott interviewed Roxie Merritt, director of new media operations at the Office of the Secretary of Defense for Public Affairs. In the video below, they talk about blogger relations, the role of Facebook and Twitter, and other ways to communicate. [Hat tip to Scott’s post about the DoD’s official policy on social media.]

[Embedded video: http://www.youtube.com/v/GpVPNbrrQYs]


February 24, 2010  10:10 PM

CloudAudit.org to offer tools for verifying cloud computing compliance

Guy Pardon

The Automated Audit, Assertion, Assessment, and Assurance API (A6) working group is newly organized under the brand of CloudAudit. The stated goal of CloudAudit is to “provide a common interface that allows cloud providers to automate the Audit, Assertion, Assessment, and Assurance (A6) of their environments and allow authorized consumers of their services to do likewise via an open, extensible and secure API.”

In this podcast, SearchCompliance.com associate editor Alexander B. Howard interviews Christofer Hoff, director of cloud and virtualization solutions at Cisco Systems and one of CloudAudit’s organizers. Prior to his work at Cisco, Hoff was chief security architect of Unisys Corp.’s systems and technology division. He continues to participate in the Cloud Security Alliance. You can find Hoff’s blog at Rationalsurvivability.com/blog and follow him on Twitter as @Beaker.

Hoff says that A6 grew out of the need for enterprise security professionals to have better tools for confirming security and cloud computing compliance at cloud service providers.
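To picture what an A6-style interface might look like to a consumer of cloud services, here is a hypothetical client sketch. The host and URL layout are invented for illustration; the actual interface drafts live at CloudAudit.org:

```python
import urllib.request

# Hypothetical A6-style client: fetch a provider's audit assertions for
# one compliance framework over HTTPS. Host and URL layout are invented;
# see CloudAudit.org for the working group's actual interface drafts.

PROVIDER = "https://cloud.example.com"

def fetch_assertions(namespace):
    """Retrieve the assertion manifest for one compliance namespace."""
    url = f"{PROVIDER}/.well-known/cloudaudit/{namespace}/manifest.xml"
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# e.g., ask the provider for its PCI DSS assertions:
# manifest = fetch_assertions("compliance/pci")
```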

When you listen to this podcast, you’ll learn:

• What CloudAudit is.
• What problems A6 could solve for CISOs and CIOs faced with cloud computing compliance challenges.
• How CloudAudit would map to compliance, regulatory, service-level, configuration, security and assurance frameworks, or third-party trust brokers.

For more information, visit CloudAudit.org, the relevant Google Group or the CloudAudit code base at Google Code. Hoff has also collected recent press coverage and other information about A6 at his blog.


February 19, 2010  8:00 PM

Is continuous controls monitoring at the top of your GRC agenda?

Linda Tucci

Is there a groundswell in business for using continuous controls monitoring (CCM) to beef up corporate governance, risk and compliance programs? Analysts and vendors have certainly launched a full-out campaign for its relevance.

AMR Research Inc. in Boston named CCM one of the top GRC software investments companies will make in 2010, right behind compliance management software and business process management (BPM) products.

Gartner Inc. analyst French Caldwell called out CCM for business and financial applications as a top trend when I spoke to him in December for our preview of the major GRC issues in 2010.

Of course, IT has long used controls automation for configuring servers, conducting audits, maintaining security and so on. CCM has also been used to comply with the Sarbanes-Oxley Act’s requirements for segregating duties. But CCM is increasingly being used for business performance issues — for example, to eliminate duplicate payments in real time rather than on a quarterly basis, or to ensure that invoices are paid on schedule but not in advance, to preserve working capital.
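The segregation-of-duties check is easy to picture in miniature. A sketch — role names and conflict pairs invented — that flags users holding both ends of a sensitive process:

```python
# Miniature segregation-of-duties check, the classic Sarbanes-Oxley use
# of controls automation. Role names and conflict pairs are invented.

CONFLICTS = {
    frozenset({"create_vendor", "approve_payment"}),
    frozenset({"post_journal_entry", "approve_journal_entry"}),
}

def sod_violations(user_roles):
    """Return every conflicting role pair this user holds."""
    held = set(user_roles)
    return [tuple(sorted(c)) for c in CONFLICTS if c <= held]

print(sod_violations({"create_vendor", "approve_payment", "run_reports"}))
# [('approve_payment', 'create_vendor')] -> one person controls both ends
print(sod_violations({"run_reports"}))  # []
```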

“Controls automation is moving up the stack. It’s making sure the business rules are being followed,” Caldwell said, adding that the big enterprise resource planning vendors such as SAP AG and Oracle Corp. are doing just that.

So are the point solution vendors. John Becker, CEO of Approva Corp., a CCM software provider, makes a case for his software in a white paper coming out in a March issue of Compliance Week. Unlike other GRC technologies, Becker argues, CCM delivers “tangible, hard-dollar savings,” and in his white paper he offers up some choice examples, presumably from Approva customers:

  • Lower procurement costs: A telecom company reduced expenses by $2 million by flagging purchases that did not take advantage of available discounts, and preventing unnecessary purchases that circumvented corporate policies.
  • Improved order accuracy and on-time shipments: A manufacturer of construction materials reduced the number of sales orders that were delayed and required manual rework by 60% by identifying incomplete and inaccurate information when the sales order was created, and flagging open sales orders that were not shipped within 20 days of their original commitment.
  • Reduced accounting errors: A manufacturer in the midwestern United States reduced the number of financial reporting anomalies requiring manual follow-up and investigation by more than 50%, and significantly increased confidence in the accuracy of its financial reports.
  • Lower audit and compliance costs: The internal audit organization of a $1 billion software company reduced the time its external auditor spent testing its controls by 80% for each key control that they automated.
  • Reduced risk of fraud: A home improvement retailer reduced the risk of employee theft by monitoring the distribution of free samples to identify suspicious orders, excessive shipments and samples with alternate ship-to addresses.

So, is this all hype or, as the lobbyists like to say, a “conflict” — make that “confluence” — of interest for analysts and vendors? I’m curious whether readers are seeing an uptick in continuous controls systems for GRC at their companies.

And I need to ask a really dumb question to boot: Would moving to the XBRL electronic data format for financial and other reporting accomplish the same transparency?

