Posted by: GuyPardon
compliance, GLBA, PCI DSS, Security
This is a guest post by John Rostern, Jefferson Wells’ Eastern Region Practice Leader for Technology Risk Management.
IT organizations spend billions annually on compliance-related projects. That includes hardware, software, external consultants and (sometimes) uncounted internal human resources. The underlying question, though, is whether compliance improves the overall level of controls and security within a company.
Several factors lead me to believe that may not be the case. At the top of the list is the point-in-time nature of compliance. Take, for example, the Payment Card Industry Data Security Standard (PCI DSS). The various self-assessment questionnaires associated with PCI DSS attest only to the described state of compliance at that point in time. Similarly, the required quarterly scans provide a point-in-time view of specific vulnerabilities that may or may not exist.
While PCI DSS does provide for safe harbor in the event of a breach (if the reports can be subsequently validated), this does nothing to actually improve security. The same may be said for compliance with other regulations such as the Gramm-Leach-Bliley Act (GLBA) (15 USC, Subchapter I, Sec. 6801-6809).
The underlying issue is that compliance with regulatory standards such as PCI DSS and GLBA can lead to a “check the box” approach. In an era of concurrent constraints on budgets and increases in oversight, the temptation to find the quickest and least expensive way to check that box can be compelling. The checkbox approach can also hide the true state of IT controls and security in the organization. Having a report in your hand that “proves” your compliance provides little comfort in the face of an actual data breach or other security incident.
Another potential dead end is to become focused on point solutions and tools. The notion of the "silver bullet" tool that will solve multiple problems has persisted through generations of information systems and technology professionals. The promise of many tools goes unrealized due to incomplete or flawed implementations or, in many cases, a lack of processes to support their use. At the same time, the effectiveness of even the best tools can be undermined by ineffective, poorly designed or missing processes and procedures. For example, the best user provisioning systems still require a regular review of entitlements based on job function. Without this basic "blocking and tackling," users may accumulate excessive privileges or remain active on a system long after they have left an organization.
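The entitlement review described above can be sketched in a few lines. This is a minimal illustration with hypothetical role baselines and user records; a real review would pull entitlements from an identity store and separation status from an HR system.

```python
# Hypothetical role-to-entitlement baseline; in practice this would be
# maintained by business owners, not hardcoded.
ROLE_BASELINE = {
    "teller": {"read_accounts"},
    "analyst": {"read_accounts", "run_reports"},
}

# Hypothetical user records combining IAM entitlements with HR status.
users = [
    {"name": "alice", "role": "teller",
     "entitlements": {"read_accounts"}, "active_employee": True},
    {"name": "bob", "role": "analyst",
     "entitlements": {"read_accounts", "run_reports", "admin"},
     "active_employee": True},
    {"name": "carol", "role": "teller",
     "entitlements": {"read_accounts"}, "active_employee": False},
]

def review_entitlements(users, baseline):
    """Flag departed employees with active accounts and users whose
    entitlements exceed the baseline for their job function."""
    findings = []
    for u in users:
        if not u["active_employee"]:
            findings.append((u["name"], "account active after separation"))
            continue
        excess = u["entitlements"] - baseline.get(u["role"], set())
        if excess:
            findings.append((u["name"],
                             f"excess entitlements: {sorted(excess)}"))
    return findings

for name, issue in review_entitlements(users, ROLE_BASELINE):
    print(f"{name}: {issue}")
```

The point is not the code itself but the process around it: without a recurring review cycle and an owner for the baseline, even a sophisticated provisioning tool will let findings like these accumulate unnoticed.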
An effective IT controls environment encompassing the core tenets of information security — confidentiality, integrity and availability of information — must be viewed as a process. This paradigm must also include the elements of people and technology that combine with process to complete the picture. This provides the basis for a holistic approach to the design, implementation and operation of strong IT controls, rather than treating compliance itself as the desired outcome. Such an approach provides the foundation for a strong controls environment, which in turn helps the organization achieve and maintain compliance.
John Rostern is Jefferson Wells’ Eastern Region Practice Leader for Technology Risk Management. He has more than 27 years of diverse experience in information systems management, architecture, application development, technology, audit and information security.