As the year starts to wind down, we at SearchSoftwareQuality.com are looking back at what took place during 2008. One thing that we’re focusing on is the tools and solutions that were released. In an effort to help our readers understand what tools are available to help them, we are creating a guide to tools released in 2008 to be published in January.
In order for us to do that, we need your help identifying tools that were released. The tool categories we’re focusing on:
- Software testing
- Test management
- Code quality
- Application security
- Software requirements
- Agile development
- Project management
- Application lifecycle management
- Application performance monitoring & management
Please send us information about tools released between Jan. 1, 2008, and Oct. 31, 2008, that you’d like us to consider for the guide. The tools must be new products or significant upgrades. And you must include the following information:
- Product name and version/model number
- Company name
- URL for the product
- Product or company logo
- Date product was released
- Tool category (see above)
- Product description
- If it’s an upgrade, features that were added
- What makes it innovative?
- Details about how it performs
- Details about its ease of use and manageability
Send your product submissions to Editor@SearchSoftwareQuality.com by Friday, Dec. 12.
Virtualization, long considered a cost-savings technology, is getting “greened up.” Increasingly, groups and industry experts are talking about the ecological benefits of virtualization. By reducing a data center’s power and cooling requirements, they say, a company can reduce its carbon footprint. Companies may want to consider server consolidation, application virtualization, and virtual application test labs.
Hoping to shine more light on the “green” benefits of virtualization, the Computer Measurement Group has added a track about the subject to its upcoming annual conference (Dec. 7-12). Session titles include the following:
- Capacity modeling and planning in virtual environments
- Green data center: A case study
- Modeling/sizing techniques for different virtualization strategies
Other tracks at the conference include application load and stress testing and software performance engineering.
For more information about the conference, visit the CMG ’08 website.
There is an interesting debate about the pros and cons of unit testing taking place on TheServerSide.com. Despite the growth of test-driven development, unit testing is not a priority, according to a recent poll conducted by Methods & Tools. Poll results show that 17% of respondents said unit testing is not performed, and 40% said it is informal. Meanwhile, many people say unit testing is critical for improving the quality of software.
Why isn’t unit testing a priority? Developers give the following reasons:
- They don’t know about it
- Good unit tests are hard to write
- It’s a waste of time and productivity
- Writing the tests would take too long (especially if they’re doing frequent iterations)
- Regression testing is more effective
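For readers who have never written one, here is what a minimal unit test looks like using Python’s built-in unittest framework. The function under test is invented for this illustration; the point is that each test pins down one small, verifiable behavior:

```python
import unittest

# A hypothetical function under test -- invented for this example,
# not taken from any real project.
def apply_discount(price, percent):
    """Return price reduced by the given percentage, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100.0), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(49.99, 0), 49.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Each test here takes seconds to run and documents an expectation; when the function later changes, a failing test names exactly which behavior broke, which is the regression-catching value proponents point to.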
What do software testers, QA engineers, project managers, and IT managers think about unit testing? Do you like it when the developers on your team do unit testing — does it improve the quality of the software being created? Should they do it only if they’re going to do it well — not to simply say they did it?
I’d like to know what you think.
In response to economic issues and as a way to encourage more companies to test applications for security, Ounce Labs has reduced the cost of its Static Application Security Testing suite.
The shift in the pricing and licensing models will lower costs and complexity for Ounce 6 customers, the company said.
In a prepared statement Ounce CEO Gary Jackson said:
We intend to accelerate enterprise adoption and make source code analysis more accessible for every company concerned with application security, from the smallest shops to the largest enterprises. The new pricing schedule will speed time to value and ensure that every organization can afford, deploy, and capitalize on source code scanning to protect their critical data.
The fee for the defined organization allows unlimited users or seats and unlimited product installations. Third-party contractors working for the organization will also have access to the products at no additional fee.
A “Site” is a single legal organization whose users are all within a 3-mile radius. An organization with fewer than 200 employees and operating revenue of $50 million or less is a Level A Site; it may be an independent company or the department of a larger company.
A “Business Unit” is a geographically dispersed single legal organization. An organization with operating revenue of $500 million or less is a Level C Business Unit.
Two more stories about e-voting machines were reported this week. The first is about a report from Princeton University that says an e-voting machine in New Jersey can be hacked in seven minutes.
In its report, the university says it is possible to hack the Sequoia AVC Advantage 9.00H DRE (direct-recording electronic) voting machine by loading fraudulent firmware.
Sequoia has responded to the Princeton study with a report of its own, rebutting many of the claims in the Princeton report.
The Princeton study, conducted over the summer as part of a New Jersey lawsuit, was cleared for public release only a couple of weeks ago. Both the report and the examination of the e-voting systems came so late because of how long it has taken a 2004 lawsuit against the state over its use of DRE machines to progress.
In 2004 a group of public-interest plaintiffs sued the State of New Jersey over the State’s use of DRE voting machines. The plaintiffs argued that the use of DRE voting machines is illegal and unconstitutional.
The case was dismissed in January 2005 by a trial court, but then appealed. While the appeal was pending, the state legislature passed — and the governor signed — a bill requiring that no later than January 1, 2008, any voting system in New Jersey must produce a voter-verified paper ballot.
In 2006 the Appellate Court reinstated the lawsuit and instructed the trial judge to monitor the progress of State election officials in meeting the legislature’s deadline. In 2008 the executive branch twice requested delays to the deadline and the legislature obliged.
Based on concern that the state would not meet the deadline, the lawsuit was allowed to continue, and the judge ordered the state to provide the plaintiffs’ expert witnesses with the voting machines, complete with their source code. The witnesses, who are the authors of the Princeton report, examined the voting machines and their source code during July and August 2008 and delivered their report to the court on Sept. 2. A court order permitted them to make their findings public 30 days later.
So the state of New Jersey had four years to improve its e-voting systems and put the lawsuit to rest, yet it did not. And now voters in that state are once again using machines that can be tampered with and don’t produce paper ballots, and once again face the possibility that their votes may not count.
E-voting problems in Finland
The other story being reported is that usability problems in Finland’s pilot e-voting system caused 2% of votes cast to be lost.
With that system, voters were required to insert a smart card to identify themselves, type the number of their selected candidate, press “OK,” check the candidate details on the screen, and then press “OK” again. Some voters removed their smart cards prematurely instead of pressing “OK” a second time, so their ballots were never cast.
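The failure mode is easy to see if you sketch the confirmation flow as a tiny state machine. The event and state names below are invented for illustration and do not come from the actual Finnish system:

```python
# Illustrative sketch of the two-step confirmation flow described above.
# State and event names are invented; this is not the real system's code.

BALLOT_CAST = "ballot cast"
BALLOT_LOST = "ballot not cast"

def run_voting_session(events):
    """Walk a voter's events through the confirmation flow.

    Happy path: insert_card, enter_number, ok, ok, remove_card.
    Removing the card before the second 'ok' abandons the ballot.
    """
    confirmed_once = False
    committed = False
    for event in events:
        if event == "ok":
            if confirmed_once:
                committed = True       # second OK actually casts the ballot
            else:
                confirmed_once = True  # first OK only shows the review screen
        elif event == "remove_card":
            return BALLOT_CAST if committed else BALLOT_LOST
    return BALLOT_LOST  # session ended without the card being removed

# The usability trap: one OK feels like enough, but it is not.
careful = run_voting_session(["insert_card", "enter_number", "ok", "ok", "remove_card"])
hasty = run_voting_session(["insert_card", "enter_number", "ok", "remove_card"])
```

The hasty session silently loses the vote, which is exactly the 2% failure the pilot reported: the system behaved as designed, but the design let a plausible user action discard a ballot without warning.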
Since writing about the Florida voting experience, I’ve learned that the state of Florida commissioned an independent expert review of the remote voting software being used in Okaloosa County. A team from Florida State University’s (FSU) Security and Assurance in Information Technology (SAIT) Laboratory reviewed the Pnyx.core ODBP 1.0 remote voting software developed by Scytl.
The software is for use in the Okaloosa Distance Balloting Pilot (ODBP), which will test remote e-voting for about 1,000 overseas voters whose permanent residence is in Okaloosa County. It replaces other absentee voting mechanisms for participating overseas voters.
Under this pilot, voters will enter their votes electronically, those votes will be transmitted over the Internet, and the votes will be tabulated electronically.
The state of Florida, which certified this system at the end of September, always certifies its voting technology and processes, and in the past independent reviews were done of the then-named Diebold systems. What makes this review stand out is the vendor’s willingness to cooperate and provide a full build environment for the source code.
“Scytl provided VMWare virtual machine images containing a full build environment, scripts to drive the build process, and step-by-step documentation describing how to initiate the build process,” according to the team’s report.
Doing that saved the team “significant” time and made it possible to apply static analysis tools to the software. The team used reports from two static analysis tools:
- Fortify SCA, which Fortify donated, was used by the team.
- Klocwork Insight was used by the Florida Division of Elections.
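To give a flavor of the kind of defect such scanners flag, here is a hedged Python sketch of a classic tainted-data finding: user input flowing unescaped into a SQL query. The table and function names are invented, and neither Fortify SCA nor Klocwork Insight is needed to run it:

```python
import sqlite3

# Illustrative only: the kind of data-flow issue source-code scanners
# report. Table, data, and function names are invented for this example.

def lookup_voter_unsafe(conn, name):
    # Tainted input is spliced straight into the SQL string -- a classic
    # injection finding for a static analysis tool.
    return conn.execute(
        "SELECT id FROM voters WHERE name = '%s'" % name
    ).fetchall()

def lookup_voter_safe(conn, name):
    # Parameterized query: the driver treats the value as data, not SQL.
    return conn.execute(
        "SELECT id FROM voters WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voters (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO voters VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob")])

# A crafted name turns the unsafe query into "return every row".
crafted = "x' OR '1'='1"
leaked = lookup_voter_unsafe(conn, crafted)     # both rows leak
contained = lookup_voter_safe(conn, crafted)    # no rows match
```

A scanner traces the `name` parameter from its entry point into the query string and reports the unsafe path; the parameterized version produces no finding.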
Additionally, the team participated with the vendor in an online question-and-answer exchange that “proved invaluable to the study.”
The team’s final report was mixed: it noted strengths but also uncovered weaknesses. Overall, though, the system passed review and was certified by the state.
But the important thing to take from this was the process and the cooperation of the vendor. Hopefully this marks the start of how things will be done for the 2012 election.
“There are very few developers engaging with vendors such as [Klocwork] or state-sponsored programs to make their code usable in four years time or eight years time,” said Gwyn Fisher, Klocwork’s CTO.
Brendan Harrison, director of marketing at Klocwork, said it’s hoped that this review is used as a model going forward.
“The e-voting marketplace burst on the scene, and what we see happening is that the e-voting vendors are going to have to change how they develop software and work more cooperatively with the authorities,” he said.
The e-voting market needs to transition to one that is regulated in order to enforce good standards, a high-quality process, and a secure development lifecycle.
On SearchSoftwareQuality.com we’re asking readers if they think voting machines (their usability, bugs, or security concerns) will cause problems for next month’s U.S. elections. What do you think? Take the poll and let us know.
Understanding what your stakeholders want in an application can be challenging, to say the least. You need to know what questions to ask and get your stakeholders to explain their needs and wants. It requires not just eliciting requirements but also validating that what you’re ready to send to the developers is in fact what your stakeholders want. Communication with the stakeholders and with the developers is essential.
That’s just one headache business analysts often have. Others that I’ve heard people talk about:
- Transitioning from legacy requirements or documents (Word, Excel files) to use cases
- Transitioning from per-project requirements to per-system requirements
- Being asked to specify many of the user interface details as requirements (Too much UI detail in the requirements constrains the designers and takes away from the functional requirements)
- Managing requirements across the enterprise
- Managing requirements for reusable components (How to achieve effective reuse across the enterprise)
Additionally, more people are asking about how to manage and define software requirements in agile environments. How do you handle changing software requirements?
Do any of those issues cause headaches for you? Are there other things related to software requirements that create problems for you or you need more information about? Tell me about your pains — be as specific as you want.
Think of me as your doctor: Tell me where it hurts, and I’ll try to help you get rid of the pain. Only I won’t charge you for an office visit.
We’re a few days into early voting for the U.S. presidential election, and some West Virginia voters are voicing concerns with e-voting machines from Election Systems and Software. In Putnam County and Jackson County, voters reported that their electronic votes for Democrats were switched to Republicans.
Clerks in both counties say the incidents are isolated, and they blamed voters for not being more careful. Additionally, West Virginia Secretary of State Betty Ireland says she is confident the machines are up to the task, despite the numerous reports detailing problems with the company’s machines.
The machines being used, however, do not provide paper receipts, so it’s important that voters double-check their votes before leaving the voting booth. Voters who notice a discrepancy need to tell a poll worker so the mistake can, one hopes, be corrected. Once a ballot is cast, the voter will not be allowed to vote again.
If you’re responsible for making sure stakeholders get the software that they want, then you’re probably all-too-familiar with the four aspects of software requirements: elicitation, elaboration, validation, and acceptance.
Increasingly I hear how an iterative approach is best and how tools can help. One tool that sounds like it could help is Blueprint’s Requirements Center, which provides a single environment for all four activities, so analysts never have to switch tools.
The elicitation tool provides “rapid requirements capture.” You can use it to identify and capture the relationships between the different requirements, you can capture images to include, you can capture data definitions, and you can import requirements from Excel spreadsheets.
The elaboration tool helps you start to make sense of everything. You use it to start to model the business process and the applications. It provides a GUI center to show interfaces of the software. And because it records the traceability of requirements, you can see what is impacted if a requirement is changed.
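The impact-analysis idea behind requirements traceability is easy to sketch. The toy Python below walks a traceability graph to find every downstream artifact affected when a requirement changes; the data model and names are invented and are not Blueprint’s actual API:

```python
from collections import deque

# Toy traceability map, invented for illustration: each key's artifacts
# are derived from it (requirement -> use cases -> screens and tests).
traces = {
    "REQ-1": ["UC-1", "UC-2"],
    "UC-1": ["SCREEN-LOGIN"],
    "UC-2": ["SCREEN-REPORT", "TEST-7"],
}

def impacted_by(artifact, trace_map):
    """Return every downstream artifact affected if 'artifact' changes."""
    seen = set()
    queue = deque([artifact])
    while queue:
        current = queue.popleft()
        for child in trace_map.get(current, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

impacted = impacted_by("REQ-1", traces)
```

Changing REQ-1 flags both use cases plus every screen and test derived from them, which is the "see what is impacted" capability described above, stripped to its essentials.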
When it comes to validating what you have with stakeholders, you can create an end-to-end workflow diagram. You create a simulation to review with the stakeholders, and then gather their feedback. That feedback is entered directly into the center. You also have the option of passing the simulation around, and stakeholders can enter their own comments.
When the stakeholders give their OK, signaling that you’ve got it right, you can then generate the standard documents required for signoff. And those signoffs can be recorded on the server. When all is said and done, you’re giving the designers “a very comprehensive and complete diagram of what stakeholders want,” said Tony Higgins, vice president of products at Blueprint.
Additionally, the requirements center can generate all of the functional tests that correspond to the requirements. These are “ready-to-run” tests, Higgins said.
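As a rough illustration of the requirements-to-tests idea (not Blueprint’s actual output format), the sketch below turns a list of requirement records into runnable test callables:

```python
# Toy example: derive one executable check per requirement record.
# The record format and behaviors are invented for this illustration.
requirements = [
    {"id": "REQ-10", "given": 2, "action": lambda x: x * 2, "expect": 4},
    {"id": "REQ-11", "given": 5, "action": lambda x: x + 1, "expect": 6},
]

def generate_tests(reqs):
    """Build one callable test per requirement record."""
    tests = []
    for req in reqs:
        def test(req=req):  # bind the current record to this test
            result = req["action"](req["given"])
            assert result == req["expect"], req["id"]
            return req["id"], "pass"
        tests.append(test)
    return tests

results = [t() for t in generate_tests(requirements)]
```

The appeal of generated tests is traceability: each failing test names the requirement it was derived from, so a regression points straight back to the stakeholder need it violates.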
Last week Blueprint released new features for the requirements center. Blueprint Requirements Center 2009 Feature Pack Two introduces the Blueprint Resource Center. It provides analysts with instructional materials such as videos, samples, and best practices; company-specific templates and guides; advice from Blueprint experts; and syndicated articles and tips from Web communities and blogs.
Feature Pack Two also enhances integration with HP Quality Center. Requirements definition metadata (including visual requirements, GUI prototypes, security requirements, and data elements) is now seamlessly integrated with HP Quality Center’s Requirements Management and Test Management modules. In addition, HP Quality Center users can import and leverage assets within Blueprint’s elicitation module to provide early visibility and to speed IT development and quality assurance teams.
Want to see how the various modules work? Blueprint provides online demonstrations of its Requirements Center.