The CSA Security, Trust & Assurance Registry (STAR) program enables cloud service providers to submit self-assessment reports documenting their compliance with best practices published by the alliance. According to the CSA, the searchable registry will allow potential cloud customers to review providers' security practices and determine the level of compliance offered — or better yet, learn from the best how to secure their own cloud initiatives.
Some may find this disconcerting, worrying that such transparency will expose providers to attacks and breaches. However, transparency also leads to better understanding and stronger security by exposing possible flaws and weaknesses.
STAR offers a “major leap forward in industry transparency, encouraging cloud service providers to make security capabilities a market differentiator,” according to a CSA release. CSA STAR will be available in the fourth quarter. Cloud providers can submit two different types of reports — the Consensus Assessments Initiative Questionnaire and the Cloud Controls Matrix.
Find out more at www.cloudsecurityalliance.org/star/.
Today, social networking and cloud technologies are all about sharing information — much to the chagrin of those responsible for keeping intellectual property safe and secure. For those seeking to share information freely using cloud technologies, private clouds have become the option of choice. However, private clouds can be anything but private, especially if they use the Internet to connect sites.
In effect, this means that private clouds will always have some form of connectivity to the outside world. Of course, a properly configured private cloud will incorporate several virtual and logical barriers designed to prevent unauthorized access to the content within (that's the theory, at least).
Nevertheless, those managing and attempting to secure private clouds have to ask themselves a few questions, including: How can I be sure my cloud is protected from intrusion? Is my firewall, VPN or other security technology effective? How can I remediate any security problems?
The answers to those questions dictate how to proceed with a security strategy that effectively protects data within private clouds. For many, the answer comes in the form of layered protection. By combining a stateful packet inspection firewall, encrypted access, secure logins and extensive auditing, compliance managers should be able to protect private clouds effectively. Yet some will find that may not be enough.
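The layered approach can be sketched in miniature: every access decision passes through a network check, then a credential check, and every outcome — granted or denied — lands in an audit trail. The subnets, user store and policy values below are illustrative stand-ins, not a real product API; a production deployment would use an actual firewall, TLS and a hardened identity system in place of each layer.

```python
# Illustrative sketch of layered access control for a private-cloud
# resource: network allowlist (stand-in for the firewall), credential
# check (stand-in for secure login), and an audit trail. Demo only --
# a real system would not store plain SHA-256 password hashes.
import hashlib
import hmac
from datetime import datetime, timezone

ALLOWED_NETS = {"10.0.1.", "10.0.2."}  # assumed internal subnets
USERS = {"alice": hashlib.sha256(b"s3cret").hexdigest()}  # demo store
AUDIT_LOG = []

def request_access(src_ip: str, user: str, password: str) -> bool:
    granted = (
        any(src_ip.startswith(net) for net in ALLOWED_NETS)  # layer 1: network
        and user in USERS
        and hmac.compare_digest(                             # layer 2: login
            USERS[user],
            hashlib.sha256(password.encode()).hexdigest(),
        )
    )
    AUDIT_LOG.append({                                       # layer 3: audit
        "time": datetime.now(timezone.utc).isoformat(),
        "ip": src_ip,
        "user": user,
        "granted": granted,
    })
    return granted
```

The point of the layering is that a failure at any one layer denies access, while the audit layer records every attempt regardless of outcome — which is what makes later forensics and compliance reporting possible.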
Luckily for IT managers, the security market is evolving, bringing new technologies to market that help prevent, detect or remediate security issues. Of course, the best approach is to avoid a breach altogether — a task that may be impossible but is nevertheless a worthwhile goal.
Companies such as Palo Alto Networks are re-engineering firewall technology to be more effective, offering new products that are a better fit for cloud environments. Naturally, Cisco, Juniper, Check Point and many others are also hardening their security products to better protect IT assets, all of which will make it easier to secure private clouds.
Nevertheless, cloud security still needs to be validated and maintained, and those tasks usually require auditing, forensics, continual testing and effective monitoring. These tasks usually fall to compliance officers and security administrators. Luckily, the tools in these arenas are evolving as well.
For example, networking forensics vendor NIKSUN launched a forensics platform that promises to give IT managers full insight into network activity. Ideally, administrators could use NIKSUN’s forensics utilities to diagnose breaches, gather evidence and plug holes.
Keeping private clouds private demands that IT managers take a fresh look at how security is enforced across a network and how interaction between networks is monitored. That requires monitoring and analysis that go beyond validating firewall and user account settings. The key is to catch anomalies as they occur — or, better still, to take a proactive approach to protection.
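One simple way to catch anomalies as they occur is to compare each new measurement — say, requests per minute crossing the network boundary — against a rolling statistical baseline. The window size and threshold below are illustrative tuning knobs, not recommendations, and real monitoring products use far more sophisticated models; the sketch just shows the principle.

```python
# Minimal sketch of anomaly detection over a rolling baseline:
# flag any sample that deviates from the recent mean by more than
# `threshold` standard deviations. Window and threshold values
# are illustrative and would need tuning for real traffic.
from collections import deque
from statistics import mean, stdev

class AnomalyDetector:
    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.samples = deque(maxlen=window)  # rolling baseline
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Record a sample; return True if it deviates from the baseline."""
        anomalous = False
        if len(self.samples) >= 5:  # need a minimal baseline first
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.samples.append(value)
        return anomalous
```

A detector like this fires the moment traffic departs from its recent norm, rather than waiting for a log review — the difference between catching an intrusion in progress and reading about it afterward.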
Frank Ohlhorst is an award-winning technology journalist, professional speaker and IT business consultant with more than 25 years of experience in the technology arena. He has written for several leading technology and business publications, and was also executive technology editor at eWEEK and director at CRN Test Center.
Major questions for enterprise adoption remain, however, particularly around the issues of security and compliance. When I attended “An Evening in the Clouds” at last year’s Enterprise 2.0 Conference, where Amazon.com, Google and Salesforce.com offered complete cloud computing environments, these paired challenges were cited by a California state CIO as major barriers to governmental adoption. The advantages of tapping the computing power and storage capacity offered in the cloud have simply not outweighed the risk of exposing sensitive data by moving it outside encrypted internal networks.
The issue of compliance and cloud computing is of critical importance for organizations large and small, public or private, nonprofit or for-profit. As Lamont Wood notes in his excellent article for Computerworld on cloud compliance, “the user organization is responsible for figuring out who is doing what to its data and requiring assurances about the data staying in compliance.”
As Wood writes, there are specific issues raised by the usual suspects: SAS 70, Payment Card Industry Data Security Standards (PCI DSS) and the Health Insurance Portability and Accountability Act. For instance, Chris Day, senior VP at Terremark Worldwide Inc., noted in the article:
“PCI responsibilities of a cloud provider include firewalls, intrusion detection, disaster recovery, physical controls and appropriate segmentation of staff duties. Servers handling PCI data should be in a separate room with solid walls and a monitored door, rather than being placed in the main floor of the data center with the other servers.”
Moving personally identifiable information (PII) that an organization has assumed responsibility for protecting to a third party also requires a degree of trust that many governmental entities have understandably hesitated to extend. Or, in some cases, will never extend. As Scott McClellan, VP and chief technologist of scalable computing at HP, said to Stacy Higgenbotham during an interview for GigaOm: “Solving security is a trust issue that can be surmounted, but the legal issues around location cannot be. There are also items, such as corporate data for financial results during a quiet period, that aren’t going to leave the enterprise walls.” CIOs, CTOs and CCOs tasked with SOX compliance will likely find that reality familiar.
In the wake of comments by Vivek Kundra, the new U.S. CIO, federal adoption of cloud computing and Software as a Service (SaaS) is likely about to increase. Consider what David Linthicum writes at the Intelligent Enterprise: “It’s clear that new White House appointee Vivek Kundra is part of a ‘new generation of CIOs’ that consider cloud computing as a viable architectural option.” Just read what Kundra told The Wall Street Journal last year:
“I’m all about the cloud computing notion. … I look at my lifestyle, and I want access to information wherever I am. … I am killing projects that don’t investigate Software as a Service first.”
The nation’s first CIO is going to face some stiff criticism in both private and public enterprises. I’m not worried about Kundra’s ability to weather such critiques — he handled the federal sting on his D.C. offices with grace, after all — but there are valid questions about whether certain data sets and institutions should ever be integrated into a cloud computing model.
Perhaps prompted by Kundra’s evangelism, Google’s lobbying or pervasive cloud security concerns, the FTC met this past week to discuss securing personal data in the global economy. The discussion was timely. (Although, as of this writing, no webcast had been posted for the session.)
As reported by Computerworld UK, Google is getting slammed for the security of its cloud services. According to Jeremy Kirk, “the Electronic Privacy Information Centre is calling on the U.S. Federal Trade Commission to investigate whether Google is making deceptive claims over the security of data stored in cloud-computing services such as Gmail and Google Docs.”
That’s precisely the issue that Chuck Goolsbee focused on when he shared his considerable skepticism about whether cloud computing providers can meet regulatory compliance requirements like PCI DSS in “Don’t buy cloud computing hype: Business model will evaporate.” If there isn’t an effective regulatory standard, framework and guidance from the appropriate regulatory bodies on security, compliance issues will derail enterprise adoption.
FTC guidance on cloud compliance or official recognition by the PCI Security Standards Council may be forthcoming later this year. Given the visibility, adoption and interest in cloud computing in the enterprise, cloud compliance under PCI or something like it seems likely. Michael Dahn addressed the issue late last year in “Cloud computing security and PCI,” relating the controversy to earlier discussions around compliance and virtualization. He reiterates a common mantra I’ve heard this month: “Regulatory compliance and PCI are NOT technology issues, but risk management issues.”
One potential way that enterprises can adopt cloud computing and still remain compliant may be to use multiple clouds: an external cloud and an internal cloud or private cloud. In fact, at least one Gartner analyst thinks private cloud networks are the future of corporate IT. As Higgenbotham reported in her excellent article on the cloud and the enterprise at GigaOm:
“HP and companies such as Elastra, Sun Microsystems and IBM are pitching highly virtualized and automated environments that mimic the agility of the public clouds. However, all of the people at enterprise-oriented companies I’ve spoken with believe that their customers should start turning over some computing tasks to external clouds, be they infrastructure providers like those offered by Amazon, Rackspace’s new CloudServers business, Sun’s planned cloud or platforms such as Microsoft’s Azure.
Many believe enterprise customers will source their computing to multiple clouds, both to avoid vendor lock-in and because some clouds will be optimized for certain types of computing tasks. That’s why tools to manage multiple clouds will be important. RightScale, Aptana, Sun and others are all trying to help manage multiple clouds.
And once an enterprise sends out data to external clouds, they will need to find ways to manage, secure and actually deliver this data from inside the corporation to a cloud. Some providers, like Rackspace and Voxel, are banking on customers using their hosting and cloud products as a way to keep the data inside the same company (and maybe data center). [Werner Vogels, CTO of Amazon Web Services] says Amazon uses VPNs with enterprise clients, while startups such as Aspera are creating private highways to deliver data between clouds.”
In a throwback to a classic IT conundrum, this paradigm of multiple clouds and multiple cloud providers is complicated by interoperability issues. As Rich Miller writes at Data Center Knowledge, cloud interoperability is a hot topic at the moment as a result of the proposed multiple-cloud model and the diversity of infrastructure and standards presented by Google, Amazon, Microsoft and other cloud computing providers. As Craig Balding of Cloud Security told Wall Street & Technology, “Amazon and Google are walled gardens. You can’t take an app from Google and bring it over to Amazon because they’re architected differently.”
In the cited article, “Adoption of Cloud Computing Hindered by Security and Operations Concerns,” Penny Crossmon reported that significant compliance issues rest on the ability of cloud computing providers to meet audit requirements. As she writes, “the biggest hurdle for cloud computing on Wall Street right now — especially among large firms — is a lack of visibility into cloud providers’ operations and security.” Balding suggested that nondisclosure agreements may provide a means for cloud computing providers to share more detailed information about where data is being handled, how, by whom and how it is being protected — all crucial under many compliance regulations. Amazon Web Services has released a cloud security white paper to try to address the visibility issue, but it leaves key questions, such as access to log files or access for digital forensic investigations related to e-discovery, unanswered.
One project to keep an eye on is Eucalyptus, an open source system that enables organizations to create private clouds that will be interoperable with multiple cloud computing platforms. As Simon Wardley recently blogged on O’Reilly Radar, “Karmic Koalas love Eucalyptus.” This cryptic title refers to the fact that Mark Shuttleworth recently announced that the next release of Ubuntu, 9.10, is code-named Karmic Koala. The project won’t address the thorny issues of compliance requirements but, given the ambition of Canonical (the company that sponsors and supports Ubuntu) to “provide our users with the ability to build their own clouds whilst promoting standards for the cloud computing space,” it may drive the industry towards the adoption of PCI or other frameworks.
For more commentary and perspectives on compliance in the cloud, review the following articles and blog posts: