Here’s some sound advice for merchants or service providers who are wondering if they are in compliance with the PCI DSS requirements.
Before implementing PCI, first figure out how your operation is categorized under Visa’s compliance validation guidelines: as a merchant or as a service provider. Then determine which level of compliance you are required to meet for that category. These levels range from filling out self-assessment questionnaires up to submitting to an annual on-site review by a Qualified Security Assessor (QSA).
Such questions are important now in the aftermath of data breaches at Heartland Payment Systems Inc. and RBS WorldPay in recent months. In a strange turn, Heartland officials have gone on the offensive in response to Visa’s statement that it had removed Heartland and RBS from its list of PCI-compliant vendors. The removal prompted some competitors to use the incidents to steal customers away, alleged Heartland CEO Robert Carr, who issued a statement threatening legal action if the misinformation campaign continues.
Visa then clarified its statement regarding the removal, saying that despite the delisting, Heartland was still able to process transactions, which may have caused even more confusion. Evan Schuman has a good take on the situation in “Heartland Taking Names And Kicking POS, With Visa’s Help.”
Gartner has come to the rescue somewhat, issuing a statement earlier this week with recommendations for merchants using Heartland or RBS WorldPay:
* Merchants and other card-accepting enterprises using Heartland or RBS WorldPay services: Take no action, because the processors will likely be recertified soon.
* Visa and other card brands: Clarify PCI DSS enforcement policy from this point on and publicly disseminate enforcement policies and ongoing clarifications and refinements to these policies. Strengthen U.S. payment system security by instituting measures (for example, end-to-end card data encryption and stronger cardholder authentication) that go beyond PCI DSS requirements.
* All parties that handle cardholder data: Focus on maintaining continuous cardholder data security, rather than on achieving PCI-compliant status.
So you got the word: The compliance auditors are coming in. It’s like that big squash or tennis match. You’re feeling pretty good, and you think you’re ready. After all, you’re an IT professional: conscientious, hard-working and knowledgeable. But do you know what standard the auditors will be auditing you against? Like your opponent on the squash or tennis court, is it:
How did you do? The correct answer, as those of you know who have the scars to prove it, is f, “none of the above.” That’s right, not even COBIT. And “F” is what you may be about to get until you know how compliance auditors operate.
They’re actually auditing you against you and your company’s own standards and policies. Yup, that’s it. No, they’re not auditing you “against” a COBIT checklist. They’re looking at your own policies and standards and comparing your actual operation to what is stated in those policies.
So, Step 1: Get ahold of those policies and standards.
Step 2: Reality check. Do they represent TODAY’s state of your IT operation? Or are they aspirational? Do they say, for example, “Terminate access rights for all users within 24 hours of employment termination?” Is that really happening, 365 days a year? How about over weekends? Do your security staffers ever have delays getting lists of terminated employees from HR? Do they ever have a gap in coverage due to an unexpected absence? How often do you run a reconciliation report of terminated employees from the last 12 months vs. active usernames? Does HR have the ability to run regular reports of transferred employees, whose access needs to be handled as if they were terminated?
All operations, no matter how large or professional, can have gaps of greater than 24 hours between terminations and access cutoff. And if your operation is NOT among the largest, with a significant access control staff, chances are good you’ve got terminated employees with access going 48 hours to one week or longer before it’s taken care of. Here’s a secret: Everyone does. The auditors know it, even if you don’t.
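Getting ahead of the auditors here is easy to automate. Below is a minimal sketch, assuming two hypothetical CSV exports that you would substitute with your own: a termination list from HR and an active-account list from your directory, each with a `username` column. The sample data at the bottom is made up purely for illustration.

```python
import csv

def load_usernames(path, column="username"):
    """Read one column of usernames from a CSV export into a set."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f)}

def find_orphaned_accounts(terminated_csv, active_csv):
    """Accounts still active although HR lists the employee as terminated."""
    return sorted(load_usernames(terminated_csv) & load_usernames(active_csv))

# Hypothetical sample exports; in practice these come from HR and your directory.
with open("terminated.csv", "w") as f:
    f.write("username\nalice\nbob\n")
with open("active_accounts.csv", "w") as f:
    f.write("username\nbob\ncarol\n")

for username in find_orphaned_accounts("terminated.csv", "active_accounts.csv"):
    print("Terminated employee still has an active account:", username)
```

Run something like this monthly, and the 12-month reconciliation the auditors will ask about becomes a report you already have on file.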
I’ll cover Step 3 in a future post. In the meantime, let me know in the comments if you have any questions so far.
The hype around cloud computing may have subsided, but the issues around adapting and adopting the underlying model are as hot as ever. The enterprise IT headline of the week in The Wall Street Journal was, after all, that IBM is in talks to buy Sun Microsystems. Why? GigaOm thinks IBM wants to bolster its cloud computing arsenal.
Major questions for enterprise adoption remain, however, particularly around the issues of security and compliance. When I attended “An Evening in the Clouds” at last year’s Enterprise 2.0 Conference, where Amazon.com, Google and Salesforce.com offered complete cloud computing environments, these paired challenges were cited by a California state CIO as major barriers to governmental adoption. The advantages of the computing power and storage capacity offered in the cloud have simply not outweighed the risks of exposing sensitive data by moving it outside encrypted internal networks.
The issue of compliance and cloud computing is of critical importance for organizations large and small, public or private, nonprofit or for-profit. As Lamont Wood notes in his excellent article for Computerworld on cloud compliance, “the user organization is responsible for figuring out who is doing what to its data and requiring assurances about the data staying in compliance.”
As Wood writes, there are specific issues raised by the usual suspects: SAS 70, Payment Card Industry Data Security Standards (PCI DSS) and the Health Insurance Portability and Accountability Act. For instance, Chris Day, senior VP at Terremark Worldwide Inc., noted in the article:
“PCI responsibilities of a cloud provider include firewalls, intrusion detection, disaster recovery, physical controls and appropriate segmentation of staff duties. Servers handling PCI data should be in a separate room with solid walls and a monitored door, rather than being placed in the main floor of the data center with the other servers.”
Moving personally identifiable information (PII) that an organization has assumed responsibility for protecting to a third party also requires a degree of trust that many governmental entities have been understandably hesitant to extend. Or, in some cases, will never extend. As Scott McClellan, VP and chief technologist of scalable computing at HP, said to Stacey Higginbotham during an interview for GigaOm: “Solving security is a trust issue that can be surmounted, but the legal issues around location cannot be. There are also items, such as corporate data for financial results during a quiet period, that aren’t going to leave the enterprise walls.” CIOs, CTOs and CCOs tasked with SOX compliance will likely find that reality familiar.
In the wake of comments by Vivek Kundra, the new U.S. CIO, federal adoption of cloud computing and Software as a Service (SaaS) is likely about to increase. Consider what David Linthicum writes at the Intelligent Enterprise: “It’s clear that new White House appointee Vivek Kundra is part of a ‘new generation of CIOs’ that consider cloud computing as a viable architectural option.” Just read what Kundra told The Wall Street Journal last year:
“I’m all about the cloud computing notion. … I look at my lifestyle, and I want access to information wherever I am. … I am killing projects that don’t investigate Software as a Service first.”
The nation’s first CIO is going to face some stiff criticism in both private and public enterprises. I’m not worried about Kundra’s ability to weather such critiques — he handled the federal sting on his D.C. offices with grace, after all — but there are valid questions about whether certain data sets and institutions should ever be integrated into a cloud computing model.
Perhaps prompted by Kundra’s evangelism, Google’s lobbying or pervasive cloud security concerns, the FTC met this past week to discuss securing personal data in the global economy. The discussion was timely. (Although, as of today, there was no webcast posted for the session.)
As reported by Computerworld UK, Google is getting slammed for the security of its cloud services. According to Jeremy Kirk, “the Electronic Privacy Information Centre is calling on the U.S. Federal Trade Commission to investigate whether Google is making deceptive claims over the security of data stored in cloud-computing services such as Gmail and Google Docs.”
That’s precisely the issue that Chuck Goolsbee focused on when he shared his considerable skepticism about whether cloud computing providers can meet regulatory compliance requirements like PCI DSS in “Don’t buy cloud computing hype: Business model will evaporate.” If there isn’t an effective regulatory standard, framework and guidance from the appropriate regulatory bodies on security, compliance issues will derail enterprise adoption.
FTC guidance on cloud compliance or official recognition by the PCI Security Standards Council may be forthcoming later this year. Given the visibility, adoption and interest in cloud computing in the enterprise, cloud compliance under PCI or something like it seems likely. Michael Dahn addressed the issue late last year in “Cloud computing security and PCI,” relating the controversy to earlier discussions around compliance and virtualization. He reiterates a common mantra I’ve heard this month: “Regulatory compliance and PCI are NOT technology issues, but risk management issues.”
One potential way that enterprises can adopt cloud computing and still remain compliant may be to use multiple clouds: an external cloud and an internal cloud or private cloud. In fact, at least one Gartner analyst thinks private cloud networks are the future of corporate IT. As Higginbotham reported in her excellent article on the cloud and the enterprise at GigaOm:
“HP and companies such as Elastra, Sun Microsystems and IBM are pitching highly virtualized and automated environments that mimic the agility of the public clouds. However, all of the people at enterprise-oriented companies I’ve spoken with believe that their customers should start turning over some computing tasks to external clouds, be they infrastructure providers like those offered by Amazon, Rackspace’s new CloudServers business, Sun’s planned cloud or platforms such as Microsoft’s Azure.
Many believe enterprise customers will source their computing to multiple clouds, both to avoid vendor lock-in and because some clouds will be optimized for certain types of computing tasks. That’s why tools to manage multiple clouds will be important. RightScale, Aptana, Sun and others are all trying to help manage multiple clouds.
And once an enterprise sends out data to external clouds, they will need to find ways to manage, secure and actually deliver this data from inside the corporation to a cloud. Some providers, like Rackspace and Voxel, are banking on customers using their hosting and cloud products as a way to keep the data inside the same company (and maybe data center). [Werner Vogels, CTO of Amazon Web Services] says Amazon uses VPNs with enterprise clients, while startups such as Aspera are creating private highways to deliver data between clouds.”
In a throwback to a classic IT conundrum, this paradigm of multiple clouds and multiple cloud providers is complicated by interoperability issues. As Rich Miller writes at Data Center Knowledge, cloud interoperability is a hot topic at the moment, given the proposed multiple-cloud model and the diversity of infrastructure and standards presented by Google, Amazon, Microsoft and other cloud computing providers. As Craig Balding of Cloud Security told Wall Street & Technology, “Amazon and Google are walled gardens. You can’t take an app from Google and bring it over to Amazon because they’re architected differently.”
In the cited article, “Adoption of Cloud Computing Hindered by Security and Operations Concerns,” Penny Crosman reported that significant compliance issues rest on the ability of cloud computing providers to meet audit requirements. As she writes, “the biggest hurdle for cloud computing on Wall Street right now — especially among large firms — is a lack of visibility into cloud providers’ operations and security.” Balding suggested that nondisclosure agreements may provide a means for cloud computing providers to share more detailed information about where data is being handled, how, by whom and how it is being protected — all crucial under many compliance regulations. Amazon Web Services released a cloud security white paper to try to address the visibility issue, but it left key details, such as access to log files or access for digital forensic investigations related to e-discovery, unanswered.
One project to keep an eye on is Eucalyptus, an open source system that enables organizations to create private clouds that will be interoperable with multiple cloud computing platforms. As Simon Wardley recently blogged on O’Reilly Radar, “Karmic Koalas love Eucalyptus.” This cryptic title refers to the fact that Mark Shuttleworth recently announced that the next release of Ubuntu, 9.10, is code-named Karmic Koala. The project won’t address the thorny issues of compliance requirements but, given the ambition of Canonical (the company that sponsors and supports Ubuntu) to “provide our users with the ability to build their own clouds whilst promoting standards for the cloud computing space,” it may drive the industry toward the adoption of PCI or other frameworks.
For more commentary and perspectives on compliance in the cloud, review the following articles and blog posts:
- CA’s Rob Zanella on Cloud Computing and Compliance
- Fact or FUD: Compliance and Security in the Clouds
- What Does PCI Compliance in the Cloud Really Mean?
- Cloud compliance: How to manage SaaS risk
- FTC questions cloud-computing security
- PCI Compliance in the Cloud: Get it in writing!
[One of our readers, compliance officer Ramon de Bruijn, wrote to the editors of SearchCompliance.com at firstname.lastname@example.org last month looking for some advice. Specifically, he asked “What is the best way to implement a risk assessment in an IT department that aligns COBIT controls with risks?” In her first post for IT Compliance Advisor, Sarah Cortes, PMP, CISA, provides an answer to his question. -Ed.]
Implementing a risk assessment that will align the COBIT control framework with risks is a valuable undertaking and a smart way to approach the challenge. If approached with a working knowledge of COBIT, it should take no longer than any other risk assessment approach.
In the long run, it will likely shorten the overall cycle:
Risk assessment -> Recommendation -> Solution implementation -> Audit
This is because COBIT can provide a thorough checklist of potential risk areas that might otherwise be missed, requiring multiple passes, or effort wasted implementing solutions to lower-priority risks while higher-priority ones go ignored.
One thing to keep in mind is that COBIT controls are not just “in an IT department.” They include controls for business interruption and other business problems that have traditionally fallen to IT to deal with, rightly or wrongly.
The first step is to obtain a copy of COBIT controls, which you can do from ISACA.org or other sources on the Web.
The second step is to provide education, if necessary. Make sure key individuals in your organization have heard of COBIT and understand that it is an internationally accepted standard. No need to worry that anyone will know it better than you: Even auditors and CISA professionals can achieve only a moderate level of memorization of all aspects of COBIT. COBIT changes all the time, and technology moves beyond it in some areas. In general, COBIT is too far-reaching for even the most seasoned IT professional to avoid re-reading and referring to it frequently when working with it.
After obtaining a copy and getting buy-in, the third step is to put it away. You need to ask yourself and others where the known risks to IT and business lie. This bottom-up approach is critical to avoiding “over-COBITING,” a common affliction.
Once you have carefully listened to IT professionals and others with respect to control weaknesses and the risks that actually “keep them up at night,” you are ready to pull out your COBIT framework again. Review a fuller set of risks with those same individuals. See if that uncovers risks they may have missed the first time. This checkpoint is one benefit of COBIT.
Finally, you should document your risk assessment and note areas listed in COBIT that individuals in your organization did not consider worthy of note. Each COBIT area should be covered. If the risk included in COBIT is not prioritized in the risk assessment, a specific reason should be noted, along with the individual who decided to assume or dismiss that risk. This will come in handy later, trust me.
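That final documentation step can be as simple as a structured risk register that refuses to let a dismissed risk go unexplained. Here is a minimal sketch in Python; the COBIT process names happen to be real ones from COBIT 4.1, but the entries, reasons and owners are hypothetical, made up purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    cobit_area: str    # a COBIT process, e.g. "DS4 Ensure Continuous Service"
    prioritized: bool  # did the assessment rank this risk for action?
    reason: str        # required when the risk is assumed or dismissed
    decided_by: str    # who accepted responsibility for that decision

def validate(register):
    """Every non-prioritized COBIT area needs a documented reason and owner."""
    problems = []
    for entry in register:
        if not entry.prioritized and not (entry.reason and entry.decided_by):
            problems.append(entry.cobit_area)
    return problems

# Illustrative entries only; the decisions and people are hypothetical.
register = [
    RiskEntry("DS4 Ensure Continuous Service", True, "", "CIO"),
    RiskEntry("DS5 Ensure Systems Security", False,
              "Covered by existing SOC monitoring", "Security manager"),
    RiskEntry("AI6 Manage Changes", False, "", ""),  # missing justification
]
print(validate(register))  # flags the undocumented dismissal
```

A spreadsheet works just as well; the point is that every COBIT area appears, and every “we’ll accept that risk” decision has a reason and a name attached. That record is what will come in handy later.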
If you follow these steps, you will be further ahead than 99% of professionals and IT departments in your shoes. Good luck, and happy documentation!
Sarah Cortes is a senior technology manager with extensive experience in all aspects of delivering information technology systems and services to Fortune 500 firms in the financial services industry, as well as biotechnology, media and higher education. She has managed numerous major Code Red business and system interruptions, including the 9/11 failover of trading, accounting and other critical business systems during Marsh McLennan’s WTC data center collapse. You can learn more about her work at InmanTechnologyIT.
As business owners are preparing for the new Massachusetts data protection law, also known as 201 CMR 17: Standards for The Protection of Personal Information of Residents of the Commonwealth, due next year, a potential quagmire is building.
Speaking at the TechTarget Compliance Decisions Summit March 12, Laurence Anker, engagement manager, technology risk management for Jefferson Wells International, said the coming influx of state privacy laws will create “a mess.”
Only about half of the states have laws governing personally identifiable information, but several more, including Massachusetts, are crafting tough laws that will put new burdens on businesses, especially SMBs, and businesses outside of the state that employ Massachusetts residents.
These laws will cover areas such as secure storage of data, encryption of data and access controls, as well as require businesses to create written, comprehensive security and privacy policies for personal data.
Such tasks are formidable but not impossible. Multiply the Massachusetts law by 50, though, and it’s easy to see how difficult it will become for some businesses to make sure they are in compliance with every state’s privacy law.
Or, Anker said, the day could come when state privacy regulators join an organization similar to the National Association of Insurance Commissioners, one that would seek to normalize the state privacy laws and help the states enforce them.
As I wrote yesterday, the Compliance Decisions Summit got off to a great start when Eric Holmquist and Richard Mackey considered the future of compliance in their talks before a crowded hall of auditors, compliance officers, CIOs and information security professionals.
The second half of the day featured Holmquist again, this time exploring a risk-based approach to information security governance, and Laurence Anker, speaking about managing the cost and complexity of compliance through governance.
We posted the following updates to Twitter on our ITCompliance account over the course of the afternoon. The #CSD09 you see below is a hashtag we chose to track tweets related to today’s seminar. For a full explanation of what a hashtag is and how it works, please consult last week’s digest of compliance headlines from Twitter.
All four seminars from Compliance Decisions will be available soon from SearchSecurity.com and SearchCompliance.com, along with an exclusive interview with Mackey exploring the ramifications of virtualization for compliance management.
A Risk-Based Approach to Information Security Governance
Lunch over, video recorded w/Mackey on #virtualization & #compliance. Next: Holmquist on a risk-based approach to infosec governance. #CSD09
“Information security must be approached as a business issue, not an IT issue. Then we can consider risk mgmt practices.” -Holmquist | #CSD09
RT @scotpe Adding: “chief security officer does not belong in IT.” Where does s/he belong? [ <– Good question. Any answers? ]
Holmquist recommends forming a #security council. Give it authority, include senior execs, make cross-disciplinary, safe & visible. #CSD09
Key insight for creating a culture of cooperation vs. risk: “Make it safe to fail” -Holmquist | Don’t underestimate “gut feelings” #CSD09
“Insiders are exponentially more of a threat than outsiders. The ability to respond quickly & effectively is critical” -Holmquist | #CSD09
“You can approach assessing risk in 4 ways: IT systems, electronic data, physical files & third parties. Focus on accountability.” #CSD09
“Risk is quantified in 4 broad categories: What’s at risk? What would be the impact? What could be the source? What can we mitigate?” #CSD09
Paused for another message from another sponsor of #CSD09 & a networking break. Door prize drawing up next for a Flip, iPod & a GPS unit.
Managing the Cost and Complexity of Compliance through Governance
Insurance for IT risk? Anker notes standard policies may not address IT exposures like a data breach or reputational damage. #CSD09
“An organization’s info & other intangible assets account for 80%+ of its market value.” -IT Governance Institute (ITGI) | #CSD09
Conclusions from Compliance Decisions
You’ll be reading, hearing and seeing more of Holmquist, Anker and Mackey on SearchCompliance.com. All three will be contributing experts in upcoming articles, podcasts or videos.
Writers from both SearchSecurity.com and SearchCompliance.com will continue reporting on the Massachusetts data protection law and its ramifications for IT professionals and businesses nationwide. Clearly, many questions remain about the regulatory impact of the law on IT operations.
As Robert Westervelt reported, the deadline for the Massachusetts data protection and encryption law was extended to Jan. 1.
“We understand the impact of the current business environment and feel this is an appropriate time frame for companies to implement the necessary protections,” Daniel C. Crane, the Undersecretary of the Office of Consumer Affairs and Business Regulation, said in a statement.
Westervelt noted a key change in the updated version of the regulation: “The extension includes a revision to the rules relaxing a requirement holding third parties accountable to the security rules. Under the original law, companies had to attest that a third-party provider was compliant with the regulations.”
As noted to the audience during the question-and-answer session with Anker, SearchCompliance.com recorded a podcast last month with Gerry Young and David Murray of the Massachusetts Office of Consumer Affairs and Business Regulation. The CIO and general counsel, respectively, discuss the details of the new data protection rules:
The provision of third-party compliance as proven by a “WISP” came up during the course of the interview, if not under that name. Regardless of the documentation requirements, small businesses and enterprises alike considering outsourcing data protection and encryption compliance will need to make sure that service providers, VARs and consultants certify and appropriately explain where and how their work brings an organization into compliance with the Massachusetts statute.
The Compliance Decisions Summit taking place in Newton, Mass., got off to a great start this morning. Eric Holmquist and Richard Mackey both provided deep, engaging presentations on “future-proofing” an organization against compliance challenges and managing third-party risk.
Over the course of the morning, we posted to Twitter on our ITCompliance account more than 40 times, in lieu of a single blog post. As we noted to @cmneedles, #CSD09 is the hashtag we’ve chosen to track tweets related to today’s seminar. For a full explanation of what a hashtag is and how it works, please consult last week’s digest of compliance headlines from Twitter.
Damore notes the breadth of compliance challenges: health, financial & proprietary data must all be secured with auditable processes.
Future-Proof Your Compliance Session
Eric Holmquist is up, explaining how to future-proof a compliance program vs. new regulations, including mitigating risk & GRC best practices.
“Every version of regulatory guidance around risk management boils down to three things: awareness, accountability & actionability.” #CSD09
Risk management boiled down to a continuum: Inherent Risk -> Controls -> Residual Risks | Compliance doesn’t just rest in controls. | #CSD09
“The 4 most important words for improving a compliance program: What could go wrong?” -Eric Holmquist | #CSD09
Key elements of an effective compliance program: subject matter expert, compliance committee (real or virtual), control library | #CSD09
More key elements of an effective compliance program: documentation, risk-aware culture, incident response team, wrap-around analysis #CSD09
Eric Holmquist is reflecting on the details of how Advanta implemented an effective compliance program. Gap analysis & visibility key #CSD09
“No regulation is only relevant to IT. There is a business component to every single one.” -Eric Holmquist | #CSD09
“We set the bar at a risk management & governance level. Regulatory guidance, frameworks & standards are a test.” -Eric Holmquist | #CSD09
Good question from the audience on email retention: What’s too much, too little? Establishing which emails = official documents is key. #CSD09
Sponsored Session from Symantec
Managing Third-Party Risk
Mackey talking about impact of regulatory project requirements on service providers. If they handle regulated info, compliance is key #CSD09
“The first step in understanding risk is understanding the information shared.” -Richard Mackey | Data mapping & tools help. | #CSD09
“FFIEC, PCI & GLB all require due diligence in assessing provider controls. Depth should correspond to risk.” -Richard Mackey | #CSD09
“When evaluating service providers for compliance, establish rules for evaluations. View them as a partnership.” -Richard Mackey | #CSD09
“Most regulations require YOU to be the regulator of service providers.” PCI, HIPAA & GLB all require co.’s to ensure compliance. #CSD09
“Standards-based assessments, like ISO 27002, are useful tools. Consumers of the reports, however, must understand what results mean” #CSD09
Excellent seminar on third-party risk management for meeting compliance by Richard Mackey. Video will be available later this month. #CSD09
We’ll be posting more to Twitter this afternoon when Holmquist presents again, this time on a “Risk-Based Approach to Information Security Governance,” and Laurence Anker talks about “Managing the Cost and Complexity of Compliance through Governance.”
As those of you who have followed the launch of SearchCompliance.com know, we’ve been using our @ITCompliance account on Twitter to share news, find our audience, get the freshest compliance news and pass on information about what’s happening on our site. Like Marshall Kirkpatrick and Richard MacManus at ReadWriteWeb, I see considerable applications for journalism there. (Here’s how they use Twitter for journalism.)
I was reminded recently, however, that many CIOs and compliance professionals are not on the microblogging platform yet.
It makes sense to share compliance-related news and resources that we’ve found on Twitter with you all in the form of a weekly digest. If you haven’t followed us on Twitter, here’s what you’ve missed:
NOTE: “RT” means “retweet” and “PRT” means that the retweeted content has been modified. If you need a quick primer on Twitter, try the post I wrote for WhatIs.com last year, “What is Twitter? Is this distributed microblogging platform ready for the enterprise?”
I will post digests more frequently if the volume of microblog posts (or “tweets”) merits it. You’ll certainly be able to follow our coverage of the Compliance Decisions Summit next week here and on Twitter. I’ll have a video camera and digital voice recorder, so expect to hear and see more from CIOs, security and compliance professionals.
We’re always looking for a way to feature our audience. If you’d like to write a case study of a difficult compliance-related business decision, technology implementation or user education opportunity, please write to email@example.com and let us know.
UPDATE: Rebecca Herold suggested via her Twitter account (@PrivacyProf) that I explain what the pound signs (#) above are and what their significance is to those unfamiliar with Twitter. (You may remember her as the compliance expert whose work on Windows compliance was the subject of a previous post.)
Here’s how hashtags.org puts it:
Hashtags are a community-driven convention for adding additional context and metadata to your tweets. They’re like tags on Flickr, only added inline to your post. You create a hashtag simply by prefixing a word with a hash symbol: #hashtag.
You can learn more about them at the Twitter Fan Wiki page for hashtag. Here’s the history:
Hashtags were developed as a means to create “groupings” on Twitter, without having to change the basic service. The hash symbol is a convention borrowed primarily from IRC channels, and later from Jaiku’s channels.
hashtags.org provides real-time tracking of Twitter hashtags. Opt-in by following @hashtags to have your hashtags tracked. Similarly, Twemes offers real-time tracking without the necessity of following a specific Twitter account. Also, with their purchase of Summize, Twitter itself now offers some support of hashtags at their search engine: http://search.twitter.com
How does that extend to compliance? Simple. Just go to http://search.twitter.com and enter compliance. You’ll see a real-time reflection of the news, commentary and resources being exchanged on Twitter. You can subscribe to the compliance hashtag using RSS. If you prefer email alerts, you can also use TweetBeep to get an hourly update of whenever someone uses compliance in a tweet.
Unstructured documents are the gorilla in the room in every conversation about compliance, or for that matter, data classification in general. In most companies, there’s a lot of sensitive information in spreadsheets, Word docs and the like, scattered about on client machines and network drives. Is it retained, protected and auditable? Heck, can you even make a list of all these docs? Probably not.
In a conversation with me, Brian Babineau, a senior analyst at Enterprise Strategy Group who talks to a lot of IT folks as well as vendors, mentioned that he’s hearing a lot of IT shops are turning to Microsoft’s SharePoint as an easy unstructured document repository. I find this both believable and puzzling: believable because it’s close to free for many large Windows customers, and puzzling because, as a user myself, I’ve found its access controls to be rather blunt. Of course, that’s all a simple matter of programming, ultimately, but then, if SharePoint is the quick and dirty winner, who’s going to do that programming?
I’m interested in hearing from all of you about whether SharePoint has a place in your compliance plans, whether you’ve rejected it in favor of something else, or what else you are considering. In fact, are you paying much attention to unstructured docs at all at any level, except file server backup?
Leave your comments below. And check out my conversation with Brian Babineau on the infrastructure for compliance — it’ll be up March 10. He will also be doing a more thorough webcast soon, going into detail on how to weave a compliance infrastructure out of the best parts you should already be using for other reasons.
“A day at the beach can turn into a hurricane fast.”
That’s the tagline Sarah Cortes chose for Inman TechnologyIT, her Cambridge, Massachusetts-based consultancy. What’s the context? Disaster recovery, security and preparation for IT compliance audits. I met Cortes at a meeting of the New England Tech Professionals LinkedIn group last night in Waltham, Massachusetts. She provided an overview of IT policies, standards and technical directives to a group of seasoned IT professionals before leading a discussion of how these frameworks relate to actual preparation.
I posted the following updates to @ITCompliance on Twitter while she spoke and engaged the audience.
- Cortes presenting on a true “alphabet soup” of standards/orgs: ISO/IEC 27000, ITIL, NIST, PMBOK, TOGAF, CMMI for dev, SEI’s CMM & COBIT.
- Important note from Cortes: Many of the “standards” (like COBIT) are frameworks. Adopting them gives auditors a reference point.
- Excellent discussion here by IT pros of the difference between stating ISO/COBIT compliance & genuine quality in IT policy & processes.
- Discussion turning to ISACA technical directives & more granular IT processes & recommendations. Key reference: http://isaca.org
- Wrapping up; Cortes of Inman Tech moderated a useful discussion of compliance standards & audit concerns. http://twitpic.com/1prtm
Aside from the opportunity to meet a dozen enterprise IT professionals, the core of the SearchCompliance.com audience, I took away a number of insights that the tweets above highlight.
First, the number of standards and frameworks relevant to compliance is staggering. Compliance officers and CIOs have long since become well aware of the issue. When Cortes talked about ISO/IEC 27000, her tongue-in-cheek comment was that 27000 referred to the number of standards it comprises.
Secondly, in Cortes’ eyes there’s a distinction between being compliant with a given framework, like COBIT or ITIL, and running a quality IT department that is prepared for a disaster and has consistently protected critical financial, health and intellectual property data. Demonstrated adherence to these frameworks, especially in documentation of internal processes and policies, will help when the compliance auditors come calling.
The latter part of the presentation ran through dozens of recommendations for given IT policies offered from the Information Systems Audit and Control Association (ISACA). As Cortes noted, the frameworks for security don’t offer specific advice for a given area. ISACA directives do. As I noted in the tweet, more information is available at http://isaca.org.
The final part of the night featured a wide-ranging discussion about life on the “front lines” of the IT department by engineers and administrators who had to mitigate data breaches, prepare for compliance audits and develop procedures to ensure compliance across multiple computing environments. Clearly, these tasks aren’t easy. If you’d like to tell us your story, please write to firstname.lastname@example.org.
Thanks again to Cortes for allowing us to publish her presentation and to Dennis Comeau for the invitation to the meeting.