When you think about application or software security, you usually think about the bad guys outside your company trying to get in. But just as often, if not more often, the danger comes from within, from employees accessing personal data.
The issue of protecting data comes up when testing applications. Testers need production-like data to ensure applications work correctly, but you don’t want to give them live data. To help with that, companies are employing data masking technologies.
DataGuise is one company that provides a data masking tool. This week the company announced the industry’s first masking in place (MIP) solution for multi-database environments, the DataGuise dgSolution suite. Company officials say the suite solves two of the biggest concerns for building non-production test environments: time-to-deployment and production data leakage.
The suite includes dgDiscover, which helps locate sensitive data across various databases, and dgMasker, which masks the data in non-production environments.
dgMasker comes with 15 masking options out of the box, including options for Social Security numbers, credit card numbers, and addresses. And because it runs across multiple databases, a value masked in one database is masked the same way in the others, so you get consistent test data.
Erik Jarlstrom, vice president of customer advocacy at DataGuise, said they tried to make it a high-performance suite. “We really tried to make it as fast as possible so you aren’t releasing unmasked data to development,” he said.
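The cross-database consistency described above is usually achieved with deterministic masking: the same input always produces the same masked output. This is not DataGuise's actual algorithm, just a minimal sketch of the general technique in Python; the keyed-hash approach and all names here are illustrative assumptions:

```python
import hmac
import hashlib

# Hypothetical sketch of deterministic ("consistent") masking. The same
# input always yields the same masked value, so test data stays consistent
# across databases. This is NOT DataGuise's algorithm, just an illustration.

SECRET_KEY = b"masking-key"  # in practice, a securely stored secret


def mask_ssn(ssn: str) -> str:
    """Map an SSN to a fake-but-stable SSN-shaped value."""
    digest = hmac.new(SECRET_KEY, ssn.encode(), hashlib.sha256).hexdigest()
    # Fold the first 9 hex characters down to 9 decimal digits.
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"


# The same input masks identically wherever it appears:
assert mask_ssn("123-45-6789") == mask_ssn("123-45-6789")
```

Because the mapping depends only on the secret key and the input, running the same masking job against several databases yields matching test data everywhere, which is the property the suite advertises.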
Say the word “agile” and people immediately have a reaction. Those in favor of it see it as an efficient way to create software that users actually want. Those against it see it as “cowboy” or rogue — developers doing whatever they want.
If you work at a company where people like the processes they have for developing software and push back against new ideas, needless to say, it can be difficult to implement agile development practices. But it can be done if you make subtle changes and don’t even mention the word agile until you have to.
David Christiansen explained in his recent webcast “How to introduce agile in a waterfall environment” how he used guerrilla-style tactics to introduce agile practices on his projects. Little by little he changed things until it was obvious that they were doing agile development, and only then did he admit he was using those techniques. But at that point, he could show management that agile development worked. He could show them proven success.
David said that usually it’s IT that pushes back when talking about agile — managers, testers, and sometimes developers. It isn’t the users or stakeholders. They don’t care what you do as long as you give them software that works the way they want it to, he said.
You need to be careful when following David’s secret strategy, as you don’t want to be fired for disobeying your boss. If you ask if you can do agile development and you’re told no, you probably shouldn’t go ahead and do it. But if you can show them first how agile practices work, then you’re likely to get more support. As David said, “Sometimes it’s better to ask for forgiveness later than to ask for permission first.”
As the year starts to wind down, we at SearchSoftwareQuality.com are looking back at what took place during 2008. One thing that we’re focusing on is the tools and solutions that were released. In an effort to help our readers understand what tools are available to help them, we are creating a guide to tools released in 2008 to be published in January.
In order for us to do that, we need your help identifying tools that were released. The tool categories we’re focusing on:
- Software testing
- Test management
- Code quality
- Application security
- Software requirements
- Agile development
- Project management
- Application lifecycle management
- Application performance monitoring & management
Please send us information about tools released between Jan. 1, 2008, and Oct. 31, 2008, that you’d like us to consider for the guide. The tools must be new products or significant upgrades. And you must include the following information:
- Product name and version/model number
- Company name
- URL for the product
- Product or company logo
- Date product was released
- Tool category (see above)
- Product description
- If it’s an upgrade, features that were added
- What makes it innovative?
- Details about how it performs
- Details about its ease of use and manageability
Send your product submissions to Editor@SearchSoftwareQuality.com by Friday, Dec. 12.
Virtualization, long considered a cost-savings technology, is getting “greened up.” Increasingly, industry groups and experts are talking about the ecological benefits of virtualization. By reducing a data center’s power and cooling requirements, a company can reduce its carbon footprint, they say. Companies may want to consider server consolidation, application virtualization, and virtual application test labs.
Hoping to shine more light on the “green” benefits of virtualization, the Computer Measurement Group has added a track about the subject to its upcoming annual conference (Dec. 7-12). Session titles include the following:
- Capacity modeling and planning in virtual environments
- Green data center: A case study
- Modeling/sizing techniques for different virtualization strategies
Other tracks at the conference include application load and stress testing and software performance engineering.
For more information about the conference, visit the CMG ’08 website.
There is an interesting debate about the pros and cons of unit testing taking place on TheServerSide.com. Although adoption of test-driven development has grown, unit testing is still not a priority, according to a recent poll conducted by Methods & Tools: 17% of respondents said unit testing is not performed, and 40% said it is informal. Meanwhile, many people say unit testing is critical for improving the quality of software.
Why isn’t unit testing a priority? Developers give the following reasons:
- They don’t know about it
- Good unit tests are hard to write
- It’s a waste of time and productivity
- Writing the tests would take too long (especially if they’re doing frequent iterations)
- Regression testing is more effective
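For readers weighing those objections, it may help to see how small a unit test can be. This is a generic sketch using Python's built-in unittest module; the function under test and its behavior are hypothetical, not taken from any product or poll discussed here:

```python
import unittest


def apply_discount(price: float, percent: float) -> float:
    """Hypothetical production function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class ApplyDiscountTest(unittest.TestCase):
    # Each test documents one expected behavior of the function.

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(49.99, 0), 49.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run with: python -m unittest <this_file>
```

Tests like these are why proponents argue unit testing pays for itself: each one doubles as an executable specification of how the code is supposed to behave.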
What do software testers, QA engineers, project managers, and IT managers think about unit testing? Do you like it when the developers on your team do unit testing — does it improve the quality of the software being created? Should they do it only if they’re going to do it well — not to simply say they did it?
I’d like to know what you think.
In response to economic issues and as a way to encourage more companies to test applications for security, Ounce Labs has reduced the cost of its Static Application Security Testing suite.
The shift in the pricing and licensing models will lower costs and complexity for Ounce 6 customers, the company said.
In a prepared statement Ounce CEO Gary Jackson said:
We intend to accelerate enterprise adoption and make source code analysis more accessible for every company concerned with application security, from the smallest shops to the largest enterprises. The new pricing schedule will speed time to value and ensure that every organization can afford, deploy, and capitalize on source code scanning to protect their critical data.
The fee for the defined organization allows unlimited users or seats and unlimited product installations. Third-party contractors working for the organization also have access to the products at no additional fee.
A “Site” is a single legal organization whose users are all within a 3-mile radius. An organization with fewer than 200 employees and operating revenue of $50 million or less is a Level A Site; this may be an independent company or the department of a larger company.
A “Business Unit” is a geographically dispersed, single legal organization. An organization with operating revenue of $500 million or less is a Level C Business Unit.
Two more stories about e-voting machines were reported this week. The first is about a report from Princeton University that says an e-voting machine in New Jersey can be hacked in seven minutes.
In its report, the university says it is possible to hack the Sequoia AVC Advantage 9.00H DRE (direct-recording electronic) voting machine by loading fraudulent firmware.
Sequoia has responded to the Princeton study with a report of its own, rebutting many of the claims in the Princeton report.
Princeton’s report, based on an examination conducted during the summer as part of a New Jersey lawsuit, was cleared for release just a couple of weeks ago. Both the report and the examination of the e-voting systems came late because of how long that lawsuit, filed in 2004 against the state over its use of DRE machines, has taken to progress.
In 2004 a group of public-interest plaintiffs sued the State of New Jersey over the State’s use of DRE voting machines. The plaintiffs argued that the use of DRE voting machines is illegal and unconstitutional.
The case was dismissed in January 2005 by a trial court, but then appealed. While the appeal was pending, the state legislature passed — and the governor signed — a bill requiring that no later than January 1, 2008, any voting system in New Jersey must produce a voter-verified paper ballot.
In 2006 the Appellate Court reinstated the lawsuit and instructed the trial judge to monitor the progress of State election officials in meeting the legislature’s deadline. In 2008 the executive branch twice requested delays to the deadline and the legislature obliged.
Based on concern that the state would not meet the deadline, the lawsuit was allowed to continue and the judge ordered that the state provide to the plaintiffs’ expert witnesses the voting machines complete with their source code. The witnesses, who are authors of the Princeton report, examined the voting machines and their source code during July and August 2008 and delivered their report to the court on Sept. 2. A court order permitted them to make their findings available to the public 30 days later.
So, the state of New Jersey had four years to improve its e-voting systems and prevent a lawsuit, yet it did not. And now voters in that state once again are using machines that can be tampered with and don’t produce paper ballots — and once again face the possibility that their votes may not count.
E-voting problems in Finland
The other story being reported is that usability problems in Finland’s pilot e-voting system caused 2% of the votes attempted to be lost.
With that system, voters were required to insert a smart card to identify themselves, type their selected candidate number, press “OK”, check the candidate details on the screen, and then press “OK” again. Some voters did not press “OK” the second time and instead removed their smart card prematurely, so their ballots were never cast.
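The failure mode is easy to see if you model the voting session as a simple state machine. The sketch below is purely an illustration of the flow as reported, not the actual Finnish system’s code; the event names are invented:

```python
# Illustrative sketch (not the actual Finnish system): a tiny state machine
# showing why removing the smart card before the second "OK" loses the vote.

CAST = "cast"
DISCARDED = "discarded"


def run_session(events):
    """Replay a voter's actions; the ballot counts only if the full
    sequence completes before the card is removed."""
    expected = ["insert_card", "type_candidate", "ok", "verify", "ok"]
    progress = 0
    for event in events:
        if event == "remove_card":
            # Card removed: any in-progress ballot is silently lost.
            return CAST if progress == len(expected) else DISCARDED
        if progress < len(expected) and event == expected[progress]:
            progress += 1
    return CAST if progress == len(expected) else DISCARDED


# Completed sequence: ballot is cast.
assert run_session(["insert_card", "type_candidate", "ok", "verify", "ok",
                    "remove_card"]) == CAST
# Card removed before the second "OK": ballot silently discarded.
assert run_session(["insert_card", "type_candidate", "ok", "verify",
                    "remove_card"]) == DISCARDED
```

A usability-minded design would refuse to end the session, or warn loudly, while a ballot is still in progress rather than discard it silently.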
Since I wrote about the Florida voting experience, it has been brought to my attention that the state of Florida commissioned an independent expert review of the remote voting software being used in Okaloosa County. A team from Florida State University’s (FSU) Security and Assurance in Information Technology (SAIT) Laboratory reviewed the Pnyx.core ODBP 1.0 remote voting software developed by Scytl.
The software is for use in the Okaloosa Distance Balloting Pilot (ODBP), which will test remote e-voting for about 1,000 overseas voters whose permanent residence is in Okaloosa County. It replaces other absentee voting mechanisms for participating overseas voters.
Under this pilot, voters will enter their votes electronically, those votes will be transmitted over the Internet, and the votes will be tabulated electronically.
The state of Florida, which certified this system at the end of September, always certifies its voting technology and processes, and in the past an independent review was done of the systems then sold under the Diebold name. What makes this review stand out is the vendor’s willingness to cooperate and provide a full build environment for the source code.
“Scytl provided VMWare virtual machine images containing a full build environment, scripts to drive the build process, and step-by-step documentation describing how to initiate the build process,” according to the team’s report.
Doing that saved the team “significant” time and made it possible to apply static analysis tools to the software. The team used reports from two static analysis tools:
- Fortify SCA, donated by Fortify and used by the team
- Klocwork Insight, used by the Florida Division of Elections
Additionally, the team participated with the vendor in an online question-and-answer exchange that “proved invaluable to the study.”
The team’s final report was mixed: it noted some good things but also found some bad ones. In general, though, the system passed review and was certified by the state.
But the important thing to take from this was the process and the cooperation of the vendor. This is hopefully the start of how things are done for the 2012 election.
“There are very few developers engaging with vendors such as [Klocwork] or state-sponsored programs to make their code usable in four years’ time or eight years’ time,” said Gwyn Fisher, Klocwork’s CTO.
Brendan Harrison, director of marketing at Klocwork, said it’s hoped that this review is used as a model going forward.
“The e-voting marketplace burst on the scene, and what we see happening is that the e-voting vendors are going to have to change how they develop software and work more cooperatively with the authorities,” he said.
The e-voting market needs to transition to one that is regulated in order to enforce good standards, a high-quality process, and a secure development lifecycle.
On SearchSoftwareQuality.com we’re asking readers if they think voting machines (their usability, bugs, or security concerns) will cause problems for next month’s U.S. elections. What do you think? Take the poll and let us know.
Understanding what your stakeholders want in an application can be challenging, to say the least. You need to know what questions to ask and get your stakeholders to explain their needs and wants. It requires not just eliciting requirements but also validating that what you’re ready to send to the developers is in fact what your stakeholders want. Communication with the stakeholders and with the developers is essential.
That’s just one headache business analysts often have. Others that I’ve heard people talk about:
- Transitioning from legacy requirements or documents (Word, Excel files) to use cases
- Transitioning from per-project requirements to per-system requirements
- Being asked to specify many of the user interface details as requirements (Too much UI detail in the requirements constrains the designers and takes away from the functional requirements)
- Managing requirements across the enterprise
- Managing requirements for reusable components (How to achieve effective reuse across the enterprise)
Additionally, more people are asking about how to manage and define software requirements in agile environments. How do you handle changing software requirements?
Do any of those issues cause headaches for you? Are there other things related to software requirements that create problems for you or you need more information about? Tell me about your pains — be as specific as you want.
Think of me as your doctor: Tell me where it hurts, and I’ll try to help you get rid of the pain. Only I won’t charge you for an office visit.