Software Quality Insights


May 11, 2012  5:45 PM

The changing roles of Agile managers: Scrum Gathering Atlanta

Profile: Melanie Webb

At Scrum Gathering Atlanta, Julia Dillon, application services manager at Capital Group Companies, delivered an informative session titled, “Do Agile Teams Need Managers? A Series of Fortunate Events,” in which she first described what happened when the coach left her team and then opened the discussion for suggestions on how to handle management of an Agile team.

Dillon explained the various points of view involved; managers consider themselves necessary and great “doers,” yet they are confused about their role and can be disorganized. Team members view management as confused about Agile and sometimes out of touch. When the Agile coach is gone, the team may find itself lacking the personality, experience, confidence or some combination of these traits that the coach brought to the team.

In response, the team Dillon worked on created an action plan that entailed collaborating and sending some members to an Agile leadership workshop to gain clarity about roles, strategies and accountability.

Among other topics, her team learned about the differences between organizational design and command and control, picked up new tools and practices, and found ways to bridge the gap between disparate visions on the team. They discovered that the role of management encompasses engaging actively with the business side, being “people” people, handling portfolio management and creating the right environment.

In the new world view, management team roles break down into a senior manager in charge of budgeting, strategy and investment planning; a manager serving as an “Uber” Product Owner, who is the portfolio manager, aligning investment strategy and business value; and a manager serving as an “Uber” Scrum Master, who delivers large-scale program planning and execution as well as removes organizational impediments, according to Dillon. Furthermore, the performance objectives are different for each level of management.

She highlighted the most pressing needs of an Agile organization from its management team, including setting the bar high for goals, developing the talents of individual team members, designating someone whose task it is to “keep it fresh” and creating an environment in which team members can take risks and ask questions.

Participants in the session offered additional suggestions, explaining that since Scrum teams are self-organizing, managers play a key role in developing personnel, training, quality initiatives and communities of practice. Others mentioned that extensive management is not needed, as Scrum Masters act in a leadership role.


May 10, 2012  5:06 PM

Effective analysis and visualization of performance test data

Profile: Yvette Francino

Performance testing is a difficult and complicated skill to learn. Not only do you need to understand how to performance test applications, you also need to know how to make sense of the vast amount of data you gather, and then present it to engineers, managers and customers in a way that makes sense to them, so they can make informed decisions.

On May 8th, Mais Tawfik presented “Performance Data Analysis” at Denver’s SQuAD (Software Quality Association of Denver) meeting. In her session, she showed ways of presenting data in visual formats that highlight anomalies and help present a full picture when analyzing the results of a performance test effort.

Attributes of effective data analysis and visualization discussed in the session included:

  • Simple
  • Organized
  • Tells a story
  • Offers faster interpretation
  • Highlights what’s pertinent
  • Highlights trends and patterns
  • Designed for the target audience
  • Reflects the big picture
  • Suggests conclusions
  • Provides context
  • Maintains data integrity
  • Accounts for gaps in data

Though Tawfik used a real-life case study to show how this is done with performance test data, these techniques are useful whenever analyzing and presenting large amounts of data.
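To make those attributes concrete, here is a minimal sketch in Python, assuming response-time samples like those a load tool would export (the data and the 95th-percentile cutoff are hypothetical, not from Tawfik's case study). It plots one run and flags anomalies so they stand out at a glance:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical response-time samples (seconds) from one test run.
rng = np.random.default_rng(42)
response_times = rng.gamma(shape=2.0, scale=0.15, size=300)
response_times[120:125] += 2.0  # an injected slowdown, for illustration

# Flag the slowest 5% of requests as anomalies worth investigating.
threshold = np.percentile(response_times, 95)
anomalies = np.flatnonzero(response_times > threshold)

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(response_times, color="steelblue", label="response time")
ax.scatter(anomalies, response_times[anomalies],
           color="red", zorder=3, label="anomalies (> p95)")
ax.axhline(threshold, linestyle="--", color="gray", label="95th percentile")
ax.set_xlabel("request #")
ax.set_ylabel("seconds")
ax.set_title("Response times with anomalies highlighted")
ax.legend()
plt.tight_layout()
plt.show()

A single annotated chart like this tells a story, highlights what's pertinent and is designed for its audience, which is exactly what the list above asks for.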

Listen in to the video clip below to hear from Mais Tawfik about her presentation and the benefits of visualization of performance test data.

[kml_flashembed movie="http://www.youtube.com/v/u2h2lHC4R_I" width="425" height="350" wmode="transparent" /]


May 9, 2012  9:16 PM

Improve daily standups with lessons learned from improv

Profile: Melanie Webb

The connection between improv games and the daily standup may seem like a stretch at first, but as presenters Arlen Bankston and Jessie Shternshus demonstrated at their Scrum Gathering Atlanta session, “Standout Standups with Preparation, Practice and Applied Improv,” the same improvisational games used by improv actors can facilitate communication between project team members in a non-threatening way.

After briefly summarizing the Daily Scrum process, they led session participants in creating a list of impediments through a game where each group member begins by saying, “And to make matters worse…” Within a few moments, one problem had grown into a string of several problems. Participants came up with impediments including technical difficulties, lack of team cohesion, issues with tools, lack of preparation and team communication issues such as rambling team members and people distracted due to multi-tasking.

Bankston and Shternshus then led a number of activities, some for pairs and others that worked well for larger groups. One partner activity required that while one person spoke, the other listen to the very end of the sentence, so that their response could begin with the final letter of the partner’s sentence. This challenged partners to listen without interrupting or assuming what the other would say.

It also stopped the natural process of preparing what to say next and shifted the focus onto the speaker. “Every person comes to a conversation with an agenda,” Jessie Shternshus pointed out.

Another game was “Color/Advance,” which helps alleviate rambling. One partner told a story, and the other could encourage more detailed information by saying, “Color,” or could prompt the speaker to move on by saying, “Advance.” This game provided feedback in a harmless and fun way. Participants suggested that this could be useful in a standup or in a retrospective.

Presenters elicited further feedback from participants about what helps improve communication during standups, and session attendees discussed ideas such as using a progress board with sticky notes for at-a-glance updates and passing around a physical object to indicate whose turn it is to speak.

Improvisational games promote active participation, build team engagement and generate positivity and laughter, according to the presenters. “We’re trying to make active listening possible,” summarized Arlen Bankston.


May 7, 2012  11:03 PM

Scrum Gathering Atlanta: Activities for co-creating a shared vision

Profile: Melanie Webb

In his presentation at Scrum Gathering Atlanta, “Creating a Shared Vision: From Compliance to Commitment,” Senior Coach and Vice President of agile42 Brad Swanson discussed the importance of having a vision and offered five strategies for building a shared vision.

In defining the word “vision,” Swanson gave some inspiring examples, such as Kennedy’s vision of landing a man on the moon by the end of the 1960s and the vision of creating a portable digital music player—the iPod. Inspiring visions include concrete, measurable details, he explained.

This interactive session gave participants opportunities to explore different strategies for building a shared vision. Groups prepared posters and gave presentations based on the five strategies for building a shared vision, from the book The Fifth Discipline Fieldbook by Peter Senge, et al: Telling, Selling, Testing, Consulting and Co-Creating. Groups practiced Innovation Games techniques such as “Remember the Future” and “Product Box.”

Team members had just minutes to apply techniques and produce analyses and product brochures. “The process you went through as a team is more important than what you ended up with on paper,” Swanson pointed out during the wrap-up.

Participants highlighted key takeaways from the presentation, including “applying the co-creation techniques generated passion and purpose.” One of the participants said that he liked how the product brochure activity, one that required the creation of a physical item, spurred participation from everyone in the group. Another participant commented on how these techniques helped sharpen focus, when so often people can get caught up in the details of a project.

How effective is your organization at implementing a shared vision?


April 25, 2012  2:01 PM

Can exploratory testing be automated?

Profile: Yvette Francino

Some might argue that exploratory testing, by its very definition, is the opposite of automated testing; however, testers are learning ways in which automation and exploratory testing can complement one another.

At STAREAST 2012, I spoke to one participant who used tools to facilitate her exploratory test sessions, allowing collection of logs and other information that helped with troubleshooting. Though this is not “automated testing,” automation is aiding exploratory testing through the tool.
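Her exact tooling wasn't named, but the idea is easy to approximate. This is a purely hypothetical sketch in Python of a timestamped session log, so a tester's free-form notes can later be lined up against application and server logs:

import logging

# Minimal session notebook: timestamps free-form tester notes to a file,
# so an anomaly can later be matched against application or server logs.
logging.basicConfig(
    filename="exploratory_session.log",
    level=logging.INFO,
    format="%(asctime)s %(message)s",
)

def note(message):
    """Record one observation or action taken during the session."""
    logging.info(message)

note("charter: explore checkout flow with expired coupons")
note("BUG? order total not recalculated after coupon removed")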

In her keynote, Dot Graham talked about “automated exploratory testing,” giving an example of how it was used to test Bing. I asked Bob Galen, who presented “Session-Based Exploratory Testing,” about mixing automation and exploratory testing. He warned against the thinking that automation could replace exploratory testing, but gave an example of a case study in which a repository created from exploratory test documentation helped generate ideas for automated tests.

Listen in below to Dot Graham and Mark Fewster, authors of Experiences of Test Automation, as well as to Bob Galen, as they explain their thoughts on the mixture of exploratory testing and automation.

[kml_flashembed movie="http://www.youtube.com/v/yE4DvqKhsRQ" width="425" height="350" wmode="transparent" /]
[kml_flashembed movie="http://www.youtube.com/v/yAYo2JDiviE" width="425" height="350" wmode="transparent" /]


April 23, 2012  7:56 PM

Test automation success: Breaking down managers’ misconceptions

Profile: Melanie Webb

One popular objective for test automation is to automate 100% of manual tests, according to independent test consultant and author Dorothy Graham. However, while some tests are better automated, others are better performed manually. In her keynote at STAREAST 2012, “What Managers Think They Know about Test Automation—But Don’t,” Graham discussed the various misconceptions managers can have about automation and identified ways to set realistic goals.

Objectives are vital, as they determine a team’s direction, funding and whether the project is judged a success or failure, she explained. Unrealistically high goals ensure failure. One measurement of success is ROI, which some managers conflate with the many benefits of automation: tests run more frequently, take less time and require less human effort.

To measure ROI, one can use the simple calculation (benefit – cost)/cost. More information is available on Graham’s website. Even without quantifying ROI this way, good ROI can be achieved, but it is essential that the benefits be made visible.
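As a quick worked example of that formula, with made-up figures rather than numbers from Graham's talk:

# ROI = (benefit - cost) / cost
# Hypothetical figures, in dollars, over one release cycle.
benefit = 40_000  # tester effort freed up by the automated suite
cost = 25_000     # tooling, script development and maintenance

roi = (benefit - cost) / cost
print(f"ROI: {roi:.0%}")  # prints "ROI: 60%"

A negative result would mean the automation currently costs more than it returns, which is one more reason to make the benefits visible.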

“Automation does not find bugs; tests find bugs,” said Graham. Automation in itself is not a cure-all for an organization’s testing needs. Furthermore, it does not replace testers. Automation, like other testing tools, supports the efforts of testers.

Graham emphasized the importance of implementing high quality testware architecture, as she cited poor architecture as the leading cause of abandoned automation efforts. The testers, test execution tools and positive relationships between developers and managers must all work in concert to produce successful results. “Good automation takes time and effort. It doesn’t just come out of the box,” said Graham.

She encouraged testers to educate their managers on the realities of test automation, as their support is critical to project success.

For more on test automation from Dorothy Graham and Mark Fewster, authors of Experiences of Test Automation, see Test automation: Exploring automation case studies in Agile development.

For comprehensive conference coverage, see our Software Testing Analysis and Review conference page.


April 20, 2012  10:25 PM

Tools and techniques for performance testing with Scott Barber

Profile: Melanie Webb

Many people think performance testing is a difficult effort riddled with complicated tools and measurements. Scott Barber, Chief Technologist at PerfTestPlus, Inc., suggested leaving the truly complex aspects to the specialists; there are easy parts that people who are just getting started with performance testing can take on. His presentation at STAREAST 2012, “Simple and Informative Performance Tests You Can Do Now,” offered several entry points to this often misunderstood aspect of testing.


He discussed the definitions of performance testing floating around, from “performance testing makes websites go fast” to “performance testing optimizes software systems by balancing cost, time-to-market and capacity while remaining focused on the quality of service to system users.” Furthermore, he clarified definitions of some common testing terms. Load testing is about expected, anticipated conditions. Stress testing, conversely, is about outcomes you don’t expect, finding when and where something might happen or break.


Performance testing plays an important role in numerous possible objectives: determining compliance with requirements, evaluating release readiness, assessing user satisfaction, estimating capacity, validating assumptions and generating marketing statements. As such, it hardly makes sense for performance testing to come only at the end of production, explained Barber.


“Performance testing helps stakeholders make decisions regarding product value and project risk; specifically value and risk related to speed, scalability and the stability attributes of a system and its components throughout the product lifecycle,” he said.


He advocated raising the visibility of performance testing within your organization by asking questions (which often boil down to “What’s the real goal?”), generating acceptance criteria and setting priorities. Report and talk about performance often, he suggested. He also recommended taking a few minutes to spot-check the performance of competitors’ sites; not many organizations do this, but a competitive analysis does not take much time and offers some useful data.


As far as getting started with actual performance testing, he provided links to several online tools:


Tools for determining speed:

https://browsermob.com/benchmarks/overview

http://www.websitepulse.com/help/tools.php

http://www.gomez.com/benchmarks/


Tools for making use of performance snapshots:

http://www.webpagetest.org

http://www.websiteoptimization.com/services/analyze/

https://developers.google.com/pagespeed/

http://www.softwareqatest.com/qatweb1.html


These tools can offer data, often with accompanying graphics, in a rather short amount of time, and they aren’t the only ones; more tools become available every day.
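In the same spirit, a do-it-yourself spot-check takes only a few lines. This sketch is my own illustration rather than a tool from Barber's session, and the URLs are placeholders; it times plain HTTP GETs against a list of pages, which could include a competitor's home page:

import time
import urllib.request

# Hypothetical URLs; substitute your own pages and a competitor's.
sites = ["https://example.com/", "https://example.org/"]

for url in sites:
    samples = []
    for _ in range(5):  # a handful of samples smooths out noise
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()  # include download time, not just first byte
        samples.append(time.perf_counter() - start)
    median = sorted(samples)[len(samples) // 2]
    print(f"{url}: median {median:.2f}s over {len(samples)} requests")

Bear in mind this measures only the raw HTML fetch; the hosted tools above also account for images, scripts and rendering.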


Check out this video of Scott Barber at STAREAST 2012:

[kml_flashembed movie="http://www.youtube.com/v/IQp5lstLC5Y" width="425" height="350" wmode="transparent" /]


April 20, 2012  4:31 PM

Software testing is an “intentional investigation”

Profile: Melanie Webb

When testers “compare the product to specifications,” what they often identify is intentions, and the intentions differ from the product. While the tester’s job may be to gather information, this becomes an intentional investigation, according to STAREAST 2012 keynote speaker Michael Bolton of DevelopSense, Inc.

His speech, “Evaluating Testing: The Qualitative Way,” explored qualitative research methods, compared the job of software testers to that of investigative journalists and offered the notion that testing is ultimately a human activity.

Read more in-depth coverage of this insightful speech here: Software testers as qualitative researchers: STAREAST 2012 keynote.

[kml_flashembed movie="http://www.youtube.com/v/QHfYqclaZHA" width="425" height="350" wmode="transparent" /]


April 19, 2012  1:28 AM

Don’t wait until it’s too late to start performance testing

Profile: Yvette Francino

As software quality professionals, we all know that the earlier we find defects, the easier they are to fix. However, all too often, performance testing is not done until the very end of a release cycle. In fact, some organizations wait until after the code has been deployed, figuring they can simply monitor performance and fix problems as they arise, rather than testing up front.

Eric Gee is facilitating a session here at STAREAST 2012 titled, “Performance Testing Earlier in the Software Development Lifecycle.” In the short video clip below, Gee gives a preview of his session and explains how performance test engineers can partner with architects and developers, testing at the component level to ensure that performance issues are found and addressed as early as possible.
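As a rough idea of what component-level performance testing can look like, here is a hypothetical pytest-style check; the function and the 50 ms budget are invented for illustration, not taken from Gee's session:

import time

def component_under_test(n):
    # Stand-in for whatever unit the team wants to keep fast.
    return sum(i * i for i in range(n))

def test_component_meets_latency_budget():
    # Assumed budget agreed with architects and developers: 50 ms.
    start = time.perf_counter()
    component_under_test(100_000)
    elapsed = time.perf_counter() - start
    assert elapsed < 0.050, f"took {elapsed:.3f}s; budget is 0.050s"

A check like this can run in the regular build, so a component that drifts past its budget fails fast, long before end-of-cycle load testing.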

[kml_flashembed movie="http://www.youtube.com/v/3YPTzQ3eL0E" width="425" height="350" wmode="transparent" /]


April 19, 2012  1:05 AM

Exploratory testing: Tools to help recreate the problem

Profile: Yvette Francino

Here at STAREAST 2012, we’ve been hearing quite a lot about the importance of exploratory testing. This technique allows testers to really dig into areas of high risk, using the knowledge they gain as they test to creatively uncover problems that aren’t on the “happy path.” However, when they do run across an anomaly or defect, they need to recreate that problem so that the developer can troubleshoot it and find the root cause.

I asked conference attendee Thora Commins which tools she used to easily recreate the issues she uncovered during exploratory testing. Listen in to hear her response:

[kml_flashembed movie="http://www.youtube.com/v/4J0pI60E0l8" width="425" height="350" wmode="transparent" /]

