Companies face growing pressure to deliver applications quickly and continuously. “We’re living in a time where everything is a service, and as a result, the speed and innovation is accelerating. In fact, we see it as an exponential curve,” said Matthew Morgan, vice president of Product Marketing, Applications, Software, HP. In a recent interview, he discussed HP’s new product offerings and explained how they support DevOps and cloud innovation.
HP announced enhanced versions of HP Application Lifecycle Management (ALM) and HP Performance Center (PC) that facilitate continuous delivery of applications. As more and more organizations adopt DevOps, products that enable visibility, collaboration and quality are essential.
“Application innovation is hindered by the silos that exist between development, testing and operations teams, leading to delays, missed opportunities and potential application defects,” said Subbu Iyer, vice president, Product and Strategy, Applications, Software, HP. “By integrating information and processes from IT operations into application lifecycle management, HP provides a critical foundation for DevOps, enabling organizations to drive business results through the continuous delivery of innovative applications.”
In addition to the updated ALM tools, HP announced new testing capabilities with HP LoadRunner 11.5, which simplifies the scripting process, and HP Sprinter, which offers new scanning capabilities and automates recurring test scripts to ease manual testing.
“We take a heterogeneous approach from a technology and infrastructure perspective, meaning that our technology works and adds value if you’re using traditional deployments, private cloud, hybrid cloud – we are not tied to one or the other,” explained Morgan. He continued, “Our capabilities are incredibly diverse, not just from an infrastructure perspective, but from a technology perspective. We don’t care if you’re building a mobile app, Web app, HTML5 game app or a client-server packaged application…We can monitor them all, we can test them all, we can validate for performance every last one.”
CollabNet recently announced the release of a video scribe that highlights five principles for adopting and scaling cloud development practices. I spoke with Jim Ensell, chief marketing and strategy officer, and Guy Marion, vice president and general manager of CollabNet’s cloud division, about this announcement and how their CloudForge offering evolved.
Ensell highlighted the three major trends that they are seeing in the IT industry right now, first identifying Agile adoption in the enterprise, and then noting, “Another big megatrend we see is the convergence of the development and operations world into what today is called DevOps. And then thirdly is really the cloud, which has become a big accelerator toward this movement toward DevOps.” In response to these trends, CollabNet has been moving to Enterprise Cloud Development services.
CollabNet is responding to the need for hybrid cloud services by offering its private cloud service, TeamForge, and its public cloud service, CloudForge. Customers can take advantage of aspects of both platforms.
Customers can create an account in minutes and start using the service, which is open to any platform or framework. The most common use case is Web development, including websites, Facebook apps and other Internet applications, followed by development for mobile platforms such as iOS and Android. There is also a great deal of enterprise development, primarily in Java and increasingly in languages like PHP. CollabNet provides a set of open services that let customers plug in their own development environments and hook into the code repository it hosts for them.
CollabNet is refocusing around enterprise cloud with CloudForge, a development platform as a service (dPaaS) that supports cloud development teams.
According to the video scribe, “Enterprise Cloud Development is a powerful new trend that enables development teams to leverage the power of Agile processes and integrate development and operations teams, all while harnessing the benefits of hybrid cloud computing to provide the enterprise with a centralized view of productivity, cost management and compliance. Think of it as the orchestration of application lifecycle management and DevOps across any platform, framework or cloud.”
CollabNet’s video scribe, “Five Steps to Software Development and DevOps in the Hybrid Cloud,” is available for viewing on the company’s site.
At Scrum Gathering Atlanta, Julia Dillon, application services manager at Capital Group Companies, delivered an informative session titled, “Do Agile Teams Need Managers? A Series of Fortunate Events,” in which she first described what happened when the coach left her team and then opened the discussion for suggestions on how to handle management of an Agile team.
Dillon explained the various points of view involved; managers consider themselves necessary and great “doers,” yet they are confused about their role and can be disorganized. Team members view management as confused about Agile and sometimes out of touch. When the Agile coach is gone, the team may find itself lacking the personality, experience, confidence or some combination of these traits that the coach brought to the team.
In response, the team Dillon worked on created an action plan that entailed collaboration and attendance of some members at an Agile leadership workshop to get some clarity about roles, strategies and accountability.
Among other topics, her team learned about the differences between organizational design and command and control, learned some tools and practices and learned how to bridge the gap between disparate visions on the team. They discovered that the role of management encompasses engaging actively with the business side, being “people” people, handling portfolio management and creating the right environment.
In the new world view, management team roles break down into a senior manager in charge of budgeting, strategy and investment planning; a manager serving as an “Uber” Product Owner, who is the portfolio manager, aligning investment strategy and business value; and a manager serving as an “Uber” Scrum Master, who delivers large-scale program planning and execution as well as removes organizational impediments, according to Dillon. Furthermore, the performance objectives are different for each level of management.
She highlighted the most pressing needs an Agile organization has of its management team, including setting the bar high for goals, developing the talents of individual team members, designating someone whose task it is to “keep it fresh” and creating an environment in which team members can take risks and ask questions.
Participants in the session offered additional suggestions, explaining that since Scrum teams are self-organizing, managers play a key role in developing personnel, training, quality initiatives and communities of practice. Others mentioned that excessive management is not needed as Scrum Masters act in a leadership role.
Performance testing is a difficult and complicated skill to learn. Not only do you need to understand how to performance test applications, you also need to make sense of the vast amount of data you gather and then present it to engineers, managers and customers in a way that makes sense to them, so they can make informed decisions.
On May 8th, Mais Tawfik presented “Performance Data Analysis” at Denver’s SQuAD (Software Quality Association of Denver) meeting. In her session, she showed ways of presenting data in visual formats that highlight anomalies and give a full picture when analyzing the results of a performance test effort.
The attributes of effective data analysis and visualization she discussed included:
- Tells a story
- Offers faster interpretation
- Highlights what’s pertinent
- Highlights trends and patterns
- Designed for the target audience
- Reflects the big picture
- Suggests conclusions
- Provides context
- Maintains data integrity
- Accounts for gaps in data
Though Tawfik used a real-life case study to show how this is done with performance test data, these techniques are useful whenever you analyze and present large amounts of data.
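As one illustration of “highlights what’s pertinent,” a chart that flags outliers draws the eye straight to anomalies. The sketch below is not from Tawfik’s talk; it is a minimal, hypothetical example of picking out response-time samples that sit far from the mean, the kind of points a visualization would call out:

```python
import statistics

def flag_anomalies(samples_ms, z_threshold=2.5):
    """Return samples more than z_threshold standard deviations
    from the mean -- candidates to highlight on a chart."""
    mean = statistics.mean(samples_ms)
    stdev = statistics.stdev(samples_ms)
    if stdev == 0:
        return []
    return [s for s in samples_ms if abs(s - mean) / stdev > z_threshold]

# Mostly ~100 ms responses with one slow outlier.
timings = [98, 102, 95, 101, 99, 103, 97, 100, 96, 450]
print(flag_anomalies(timings))  # [450]
```

A real dashboard would plot the series and color these points, but the selection logic is the same.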
Listen in to the video clip below to hear Mais Tawfik discuss her presentation and the benefits of visualizing performance test data.
The connection between improv games and the daily standup may seem like a stretch at first, but as presenters Arlen Bankston and Jessie Shternshus demonstrated at their Scrum Gathering Atlanta session, “Standout Standups with Preparation, Practice and Applied Improv,” the same improvisational games used by improv actors can facilitate communication between project team members in a non-threatening way.
After briefly summarizing the Daily Scrum process, they led session participants in creating a list of impediments through a game where each group member begins by saying, “And to make matters worse…” Within a few moments, one problem had grown into a string of several problems. Participants came up with impediments including technical difficulties, lack of team cohesion, issues with tools, lack of preparation and team communication issues such as rambling team members and people distracted due to multi-tasking.
Bankston and Shternshus then led a number of activities, some done in pairs and others that worked well for larger groups. One partner activity required that while one person spoke, the partner listen to the very end of the sentence, so they could begin their response with the final letter of the partner’s sentence. This challenged partners to listen without interrupting or assuming what the other would say.
It also stopped the natural process of preparing what to say next and shifted the focus onto the speaker. “Every person comes to a conversation with an agenda,” Jessie Shternshus pointed out.
Another game was “Color/Advance,” which helps alleviate rambling. One partner told a story, and the other could encourage more detailed information by saying, “Color,” or could prompt the speaker to move on by saying, “Advance.” This game provided feedback in a harmless and fun way. Participants suggested that this could be useful in a standup or in a retrospective.
Presenters elicited further feedback from participants about what helps improve communication during standups, and session attendees discussed ideas such as using a progress board with sticky notes for at-a-glance updates and passing around a physical object to indicate whose turn it is to speak.
Improvisational games promote active participation, build team engagement and generate positivity and laughter, according to the presenters. “We’re trying to make active listening possible,” summarized Arlen Bankston.
In his presentation at Scrum Gathering Atlanta, “Creating a Shared Vision: From Compliance to Commitment,” Senior Coach and Vice President of agile42 Brad Swanson discussed the importance of having a vision and offered five strategies for building a shared vision.
In defining the word “vision,” Swanson gave some inspiring examples, such as Kennedy’s vision of landing a man on the moon by the end of the 1960s and the vision of creating a portable digital music player, the iPod. Inspiring visions include concrete, measurable details, he explained.
This interactive session gave participants opportunities to explore different strategies for building a shared vision. Groups prepared posters and gave presentations based on the five strategies for building a shared vision, from the book The Fifth Discipline Fieldbook by Peter Senge, et al: Telling, Selling, Testing, Consulting and Co-Creating. Groups practiced Innovation Games techniques such as “Remember the Future” and “Product Box.”
Team members had just minutes to apply techniques and produce analyses and product brochures. “The process you went through as a team is more important than what you ended up with on paper,” Swanson pointed out during the wrap-up.
Participants highlighted key takeaways from the presentation, including “applying the co-creation techniques generated passion and purpose.” One of the participants said that he liked how the product brochure activity, one that required the creation of a physical item, spurred participation from everyone in the group. Another participant commented on how these techniques helped sharpen focus, when so often people can get caught up in the details of a project.
How effective is your organization at implementing a shared vision?
Some might argue that exploratory testing, by its very definition, is the opposite of automated testing; however, testers are learning ways in which automation and exploratory testing can complement one another.
At STAREAST 2012, I spoke to one participant who used tools to help facilitate her exploratory test sessions, allowing collection of logs and other information that would help with troubleshooting. Though this is not “automated testing,” automation is aiding exploratory testing through such tools.
In her keynote, Dot Graham talked about “automated exploratory testing,” giving an example of how it was used to test Bing. I asked Bob Galen, who presented “Session-Based Exploratory Testing,” about mixing automation and exploratory testing. He warned against the thinking that automation could replace exploratory testing, but did give an example of a case study in which the repository that was created from exploratory test documentation helped to generate automation test ideas.
Listen in below to Dot Graham and Mark Fewster, authors of Experiences of Test Automation, as well as to Bob Galen, as they explain their thoughts on the mixture of exploratory testing and automation.
One popular objective for test automation is to automate 100% of manual tests, according to independent test consultant and author Dorothy Graham. However, while some tests are better automated, others are better performed manually. In her keynote at STAREAST 2012, “What Managers Think They Know about Test Automation—But Don’t,” Graham discussed the various misconceptions managers can have about automation and identified ways to set realistic goals.
Objectives are vital, as they determine a team’s direction, funding and assessment of the project as a success or failure, she explained. Unrealistically high goals will ensure failure. One measurement of success is ROI, which some managers conflate with the many benefits of automation: tests run more frequently, take less time to run and require less human effort.
To measure ROI, one can use a simple calculation: (benefit - cost) / cost. More information on this is available on Graham’s website. Even without quantifying ROI this way, good ROI can be achieved, but it is essential that the benefits be made visible.
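As a worked example of the calculation Graham described, using purely hypothetical figures:

```python
def automation_roi(benefit, cost):
    """ROI as described: (benefit - cost) / cost.
    Benefit and cost must be in the same units, e.g. hours or dollars."""
    return (benefit - cost) / cost

# Hypothetical figures: automation saves 300 hours of manual-test effort
# against 120 hours spent building and maintaining the automation.
print(automation_roi(benefit=300, cost=120))  # 1.5, i.e. a 150% return
```

A result above zero means the benefit outweighed the cost; the point of making it a number is that the benefit becomes visible to managers.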
“Automation does not find bugs; tests find bugs,” said Graham. Automation in itself is not a cure-all for an organization’s testing needs. Furthermore, it does not replace testers. Automation, like other testing tools, supports the efforts of testers.
Graham emphasized the importance of implementing high quality testware architecture, as she cited poor architecture as the leading cause of abandoned automation efforts. The testers, test execution tools and positive relationships between developers and managers must all work in concert to produce successful results. “Good automation takes time and effort. It doesn’t just come out of the box,” said Graham.
She encouraged testers to educate their managers on the realities of test automation, as their support is critical to project success.
For more on test automation from Dorothy Graham and Mark Fewster, authors of Experiences of Test Automation, see Test automation: Exploring automation case studies in Agile development.
For comprehensive conference coverage, see our Software Testing Analysis and Review conference page.
Many people think performance testing is a difficult effort riddled with complicated tools and measurements. As for those aspects that are complex, Scott Barber, Chief Technologist at PerfTestPlus, Inc., suggested leaving them to the specialists; there are easy parts that people who are just getting started with performance testing can take on. His presentation at STAREAST 2012, “Simple and Informative Performance Tests You Can Do Now” offered several entry points to this often misunderstood aspect of testing.
He discussed the definitions of performance testing floating around, from “performance testing makes websites go fast” to “performance testing optimizes software systems by balancing cost, time-to-market and capacity while remaining focused on the quality of service to system users.” Furthermore, he clarified definitions of some common testing terms. Load testing is about expected, anticipated conditions. Stress testing, conversely, is about outcomes you don’t expect, finding when and where something might happen or break.
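To make the load-versus-stress distinction concrete, here is a minimal sketch (not from Barber’s talk) of a harness that drives a target function from several threads at once. Run it at the concurrency you expect in production and it is a load test; push the concurrency well past the system’s sizing and the same harness becomes a stress test:

```python
from concurrent.futures import ThreadPoolExecutor

def run_load(target, concurrency, requests_per_worker):
    """Call `target` from `concurrency` threads at once and
    collect every result."""
    def worker():
        return [target() for _ in range(requests_per_worker)]
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = [pool.submit(worker) for _ in range(concurrency)]
    results = []
    for f in futures:
        results.extend(f.result())
    return results

# Stand-in target; a real test would issue an HTTP request here.
results = run_load(lambda: "ok", concurrency=4, requests_per_worker=5)
print(len(results))  # 20 calls issued in total
```

The interesting output in practice is not the results themselves but how latency and error rates change as concurrency rises.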
Performance testing plays an important role in numerous possible objectives: determining compliance with requirements, evaluating release readiness, assessing user satisfaction, estimating capacity, validating assumptions and generating marketing statements. As such, it hardly makes sense for performance testing to come only at the end of the development cycle, explained Barber.
“Performance testing helps stakeholders make decisions regarding product value and project risk; specifically value and risk related to speed, scalability and the stability attributes of a system and its components throughout the product lifecycle,” he said.
He advocated raising visibility about performance testing within your organization by asking questions, which often boil down to “what’s the real goal?”; generating acceptance criteria and setting priorities. Report and talk about performance often, he suggested. He also recommended taking a few minutes to spot-check the performance of competitors’ sites; not many organizations do this, but a competitive analysis does not take much time and offers some useful data.
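A competitor spot-check can be as simple as timing one full page download. The sketch below is an illustration, not a tool Barber named; the helper names are hypothetical:

```python
import time
import urllib.request

def timed(action):
    """Run a zero-argument callable and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = action()
    return result, time.perf_counter() - start

def fetch(url, timeout=10):
    """One full download of a page body; a rough spot-check only."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

# Example spot-check (requires network access):
# body, seconds = timed(lambda: fetch("https://example.com"))
# print(f"{len(body)} bytes in {seconds:.2f}s")
```

A single sample like this is noisy; repeating it a few times at different hours gives the kind of quick competitive data Barber described.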
As far as getting started with actual performance testing, he provided links to several online tools, including tools for determining page speed and tools for making use of performance snapshots.
These tools can offer data, often with accompanying graphics, in a rather short amount of time, and they aren’t the only ones; more tools become available every day.
Check out this video of Scott Barber at STAREAST 2012:
When testers “compare the product to specifications,” what they often find is that the specifications identify intentions, and those intentions differ from the product. While the tester’s job may be to gather information, this becomes an intentional investigation, according to STAREAST 2012 keynote speaker Michael Bolton of DevelopSense, Inc.
His speech, “Evaluating Testing: The Qualitative Way,” explored qualitative research methods, compared the job of software testers to that of investigative journalists and offered the notion that testing is ultimately a human activity.
Read more in-depth coverage of this insightful speech here: Software testers as qualitative researchers: STAREAST 2012 keynote.