Expected Results: How to persuade developers to document them

Tags:
Software testing
What is the value proposition I can offer our developers and customers when persuading them to do a better job of documenting expected and actual results in their test cases? We exist in an environment where developers perform the tests up to UAT, developers and customers are domain experts, there is little turnover, and many have the luxury of allowing the customer to identify new requirements via testing. These facts cancel most of the benefits I can enumerate. The one remaining benefit I can argue is that they will avoid disciplinary action when a Sarbanes-Oxley audit faults them for failing to produce independently verifiable test cases.
ASKED: May 23, 2007  4:08 PM
UPDATED: May 24, 2007  6:35 PM

Answer Wiki


So the question I have for you is: why do you *want* there to be a value proposition? I understand that you want to achieve SOX auditability, but I’ve spent more than a little time investigating SOX compliance and doing SOX-compliant testing, and I see no reason why recording actual results and recommendations (for example) wouldn’t count as auditable.

In fact, the easiest way to make it all auditable is to simply use a screen recorder to record all testing sessions and annotate the recordings with defect report numbers in appropriate locations.
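
For illustration, here is a minimal sketch in Python of the annotation side of that idea. The file name, defect IDs, and log format are assumptions, not anything prescribed by SOX or by any particular recorder; the point is only that each defect report number gets a timestamp that can be matched against the session recording:

    import time
    from datetime import datetime, timezone

    # Hypothetical log file name; in practice it would sit alongside the
    # screen recording of the same testing session.
    SESSION_LOG = "test-session-annotations.txt"

    session_start = time.monotonic()

    def annotate(defect_id, note=""):
        # Record the wall-clock time and the offset into the recording so
        # an auditor can jump straight to the moment a defect was observed.
        offset = time.monotonic() - session_start
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        with open(SESSION_LOG, "a", encoding="utf-8") as log:
            log.write("%s  +%7.1fs  %s  %s\n" % (stamp, offset, defect_id, note))

    # Example: the tester spots a defect partway through the recorded session.
    annotate("DEF-1234", "Save button stays disabled after a validation error")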

The point is that there is a strong case to be made that expected results actually cause inattentional blindness in testers (i.e., they look ONLY for what they were instructed to look for and miss significant defects as a result). It would seem to me that a little “out-of-the-box” thinking could produce a way to resolve the auditability question without the risk of sacrificing the quality of the testing.


Scott Barber
President & Chief Technologist, PerfTestPlus, Inc.
Executive Director, Association for Software Testing
www.perftestplus.com
www.associationforsoftwaretesting.org

“If you can see it in your mind…
you will find it in your life.”

Discuss This Question: 2 Replies

 
  • PMOdoff
    Scott, if I read into your approach, "inattentional blindness" as an argument against documenting expected results sounds like an invitation to turn testing into a "fishing expedition." Given a tester motivated to break the software, maybe that's an acceptable contrarian view. After all, is there anything more demotivating than compliance with SOX?

    Or is your approach simply to document the expected results after the test is executed? That would still achieve the standard of "independently verifiable" while avoiding the risk of inattentiveness. Given that the developers are not passing the software over to testers, but rather testing it themselves, this may be viable. But then again, attentiveness is all about motivation: whether I'm inattentive in designing a test or in looking at the results, the same risks would seem to apply. A little clarification on your point, please.
  • SBarber
    Documenting expected results while testing rather than during test design is certainly a step in the right direction, in my opinion, but I still think it is flawed in a fundamental way for a majority of tests. If, for instance, you are testing some kind of computational device and there is only one acceptable "result" for a series of inputs, then documenting that result may make sense. However, most of the time it doesn't make sense to design, build and execute a test simply to look for one *exact* response.

    What we testers *really* have (if we are doing our job well) is not a single expected result; we have a whole range of expectations that often take the form of oracles and heuristics. In that vein, documenting our expectations would be *better* than documenting an "expected result" (see the sketch after these replies). The problem is that our expectations are generally far too numerous to document in a way that anyone would find valuable. In fact, I can think of over 100 expectations I have for what will happen when I finish typing and press the reply button, without missing a beat.

    The reality is that you can document anything you want, as much as you want, but no test conducted by a human being who can judge goodness and badness, evaluate likes and dislikes, and make comparisons to their own personal experiences can be independently verified any further than to say "this test was conducted on this date, at this time, by this individual, on this machine, under these conditions." Which is why, rather than walking the fine line between grossly oversimplifying and patently lying via documentation, I recommend simply recording and filing the test execution. It's faster, more accurate, significantly harder to fake, and all it costs is some storage space. That is especially true given the number of times I have seen and heard about development projects where all those binders filled with well-documented test cases, expected results and actual results were never actually executed after they were written.

    Think about it: what SOX auditor would actually prefer to conduct all of the tests themselves by reading a document, as opposed to (more or less) watching a DVD movie at double speed to SEE all of the testing that was conducted?
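
To make the point about oracles and heuristics concrete, here is a minimal sketch in Python. The reply_to_question() function and the thresholds are hypothetical, chosen only to show several expectations being checked where a traditional test case would record one exact expected result:

    import time

    def reply_to_question(text):
        # Hypothetical stand-in for the system under test.
        return {"status": "posted", "body": text, "author": "SBarber"}

    def test_reply_meets_expectations():
        start = time.monotonic()
        result = reply_to_question("Documenting expected results while testing...")
        elapsed = time.monotonic() - start

        # A handful of the many expectations a tester carries implicitly,
        # expressed as separate checks rather than one exact expected value.
        assert result["status"] == "posted"        # the reply was accepted
        assert result["body"]                      # the text was not lost
        assert result["author"] == "SBarber"       # attribution is preserved
        assert elapsed < 2.0                       # responsiveness heuristic
        assert "<script>" not in result["body"]    # crude safety oracle

Even this short list documents five expectations for a single action, which is exactly why writing down every expectation for every test quickly becomes impractical.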
