It’s conference season again, and I’m looking forward to a couple of software quality conferences coming up soon right here in the Denver area. The Software Quality Association of Denver (SQuAD) conference will be March 9-10, and Mile High Agile is April 7. I’ll venture a bit farther for STAREast in Orlando in May. However, I have yet to make it out of the US for a software conference.
Fortunately, SSQ contributor and Agile expert Lisa Crispin was able to attend Belgium Testing Days in mid-February and report back on the experience. Not only did she share highlights from several of the presentations she attended, but she also noted some of the cultural differences between the Belgian conference and those she has attended in the US. She writes:
Maybe it’s the fact that cultures mix more frequently in Europe, or that their software industry seems quite progressive, but attending Belgium Testing Days was a new and rewarding experience for me.
Other news across the ocean includes the expansion of Software Quality Systems’ (SQS) Belfast facility. In an interview with Rob McConnell, SQS Regional Director for Northern Ireland, I learned that the expansion is due to increased demand for “near-shore” outsourcing solutions from US and European clients.
This month we’ve explored automation throughout the application lifecycle, and our contributors have revealed that the role of the software tester is changing.
In his tip, Is automated testing replacing the software tester?, software consultant David Johnson discusses how both Agile methodologies and automation contribute to the changing responsibilities of the software tester.
There are several forces responsible for the changing role of testers, at least for on-shore testing resources. These include:
- Agile development techniques that integrate test automation into the development process.
- Mature test automation tools that simplify test creation, automation, and execution.
In all cases the need for testing has not decreased, but the responsibility has shifted to either development (i.e., TDD) or functional test automation. In either case, the demand for traditional testing resources is significantly reduced, if not eliminated.
So what skills do traditional testers need to deal with this transition?
In a recent expert response, Lisa Crispin answers the question: What kind of automation skills should a tester have and what’s the best way to get them? Check out her advice on ways that test engineers can build their knowledge and skills with test automation.
With the onset of cloud computing and utility-based pricing, “as-a-service” solutions are becoming more popular. HP has recently added Testing-as-a-Service as an outsourcing alternative for complex testing needs.
For the past 10 years or so, HP has provided Managed Testing Services, which offer a variety of testing services ranging from staff augmentation to full-service testing solutions, with pricing based on people-hours. The difference with the TaaS solution is that the pricing is “consumption-based.”
I spoke with Paul Ashwood of HP Enterprise Services, who explained:
The “as-a-service” model is all about consumption of service, which means you need to have a lot of repeatable activity. And when you’re doing the development of application, there’s a lot of new stuff there that you can’t really price. Where we apply TaaS is more around applications that are already in production where you need to do regression testing. Regression testing is well served by test automation. Test automation and test regression is all very repeatable.
The services that HP is offering in their TaaS Solution include:
1) Accelerated Test Service – Regression test automation. Pricing is based on test cases rather than the cost of people. The service also includes the maintenance and execution of the automated test cases.
2) Performance Test Service – Pricing is based on Performance Test Units (PTUs). PTU values are based on factors such as the number of steps in the test case or the number of virtual users. Clients pay for what they use, rather than having to purchase the tools themselves.
3) Application Security Testing – Leverages WebInspect for penetration testing and Fortify for code scanning. Pricing is per assessment or per website. The client doesn’t need to pay for the tools.
4) SAP Test Service – Accelerated Test and Performance Test specifically for SAP. HP has built out common business scenarios and common test cases for SAP.
The advantage to the client is paying only for what is used, rather than paying for people or tools that may not be needed during certain downtime periods.
This type of solution is typically used post-production for mission-critical applications. As Ashwood points out, the solution does not depend on a relationship with specific people, and the success of their model includes having “test factories” in various countries and the ability to move resources to where the need is.
What is key to these models is the concept of a factory approach. So rather than the client saying, ‘I’m buying 20 of your test automation engineers for the next two years,’ instead what they’re buying is 20,000 executions of automated test cases over the next two years. So the people are not tied to a client anymore.
Obviously, this is a different approach than the tight coupling of developers and testers that is encouraged in Agile development models. Ashwood agrees that this model is not conducive to new development work, but, again, it is cost-effective for specialty testing of mission-critical applications.
Rona Shuchat, Director of Application Outsourcing Services at WW IDC Services Research Group, has this to say about TaaS solutions:
The reality around testing as a service or on demand testing as some refer to it, is that different vendors are positioning different offerings under this term “TaaS” or “On Demand Testing”. Companies like SOASTA, for example, are pushing a cloud based test service, also on demand, that is focused on performance testing and load testing – where multiple load generators can be combined to enable scalability out to hundreds or thousands of virtual users. SOASTA promotes its real-time BI engine to capture and report out on performance metrics. This service can be purchased for x hours, or monthly or annually.
A company like Skytap is also offering cloud based on demand test lab environments, and are positioning their cloud service as on demand virtual data centers that can drive parallel dev/test requirements or positioning as a test environment on demand to support unit testing, system testing, UAT or performance testing. We also see prepackaged domain solutions being offered under the term “TaaS” – for SAP, or other ERP solutions (e.g. Infosys, HP, etc. ). Perfecto Mobile offers automated testing solutions for mobile devices, enabling mobile developers to test mobile apps and content on a range of real devices over the web and so on.
For more about Testing-as-a-Service, read Testing-as-a-Service: Outsourcing your specialized software testing.
Recently, WhiteHat Security announced Sentinel PL (PreLaunch), a service for website security testing done before an application is released to production.
451 Group Senior Analyst Wendy Nather says:
With Sentinel PL, WhiteHat is addressing the growing need to move security testing farther upstream in the software development lifecycle. Catching problems earlier means lower remediation costs and better-informed risk management in the release process.
It’s undisputed that the earlier in the software development cycle that defects are found, the easier and cheaper it is to fix them. However, security testing is often only done post-release, if at all. While there is still the need to test security post-release, it makes sense to test whatever can be tested pre-release as well.
The other day I spoke with WhiteHat CSO Bill Pennington and product specialist Ravi Iver about the service and the announcement. Until now, their service began once code was complete and deployed in a production environment. Customers told them the service could be improved if some testing could be done in a pre-production environment as well.
I asked about some of the differences in testing in pre-production versus post-production.
Pennington answered that there were several tradeoffs that needed to be made for production testing to ensure availability, stability, performance and security.
To read more about application security testing, check out our security expert responses, as well as the Security Lesson: Beating Web application security threats. For more information about Testing-as-a-Service (TaaS), read Testing-as-a-Service: Outsourcing your specialized software testing.
The other day I attended Rally Software’s Agile Cafe in Boulder, giving me the opportunity to hear more stories of “real world” Agile. I had just attended the Boulder Agile Software Meetup group the week before and heard about some of the obstacles to Agile acceptance. Two of the speakers at the Agile Cafe were real world success stories, but both of them did experience obstacles with their transitions.
Bill Holst, President & Principal Consulting Software Engineer at Prescient Software, talked about two projects done for Colorado Springs Utilities, one done with waterfall and one with Agile. Though the Agile project was successful, it started with a lot of problems. A key factor in its success was that the team stopped halfway through and the customer group, in this case field engineers, reworked the requirements.
[kml_flashembed movie="http://www.youtube.com/v/LDq5mlLebfg" width="425" height="350" wmode="transparent" /]
The second presenter was Adam Woods, Director of Product Development at StoneRiver. Woods described a phased transition to Agile for a major development effort. One of the biggest obstacles encountered was the lack of buy-in and understanding of Agile principles from executives, project leaders and delivery teams. Woods, too, attributes a key to the eventual success to taking the time to stop and regroup when necessary. As the team did this, they inspected and adapted, going back to basic principles. Recognizing the importance of engagement from the product owners, they changed their model, and eventually gained the acceptance needed from all stakeholders.
Jean Tabaka, author of Collaboration Explained, followed up with a keynote, describing important factors for success in Agile transition. I had a chance to catch up with Tabaka after the presentation for a quick summary.
[kml_flashembed movie="http://www.youtube.com/v/yR-dccJMEzQ" width="425" height="350" wmode="transparent" /]
This month on SearchSoftwareQuality.com we are talking about automation in ALM. As I read through the various content, I’m recognizing what a vast amount of automation is involved in software development. In the context of software testing, I used to think of “automation skills” as skills with record and playback tools, such as HP’s QuickTestPro or maybe the popular open source tool, Selenium. However, there are many more types of automation and a variety of tasks that automation can perform.
Not only is automation used for multiple tasks, but often a specific type of automation can serve multiple purposes.
As Agile Testing co-author Lisa Crispin writes in Automated test scripts: The Smartphones of testing:
Automated test scripts also serve multiple purposes. We write tests up front so that we know what code to write and when it is finished. Creating the tests is a team activity which enhances collaboration and communication. Automated test scripts provide up-to-date documentation about how the system behaves. Automating our regression tests not only provides quick feedback as we check in code changes, keeping regression failures out of production, it frees time for all-important exploratory testing to find out whether the software delivers the expected business value.
In the end, that’s what we all want isn’t it? We want to automate the tedious, repetitive tasks, and to use our minds for the critical thinking.
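To make Crispin’s point concrete, a regression check in this spirit can be as small as a few lines: it documents expected behavior and gives fast feedback on every code change. This is a minimal sketch; the `slugify()` helper and its expected outputs are hypothetical, purely for illustration.

```python
# A minimal automated regression test: the assertions double as
# up-to-date documentation of how the code is supposed to behave,
# and running them on every check-in keeps regressions out of production.
import re

def slugify(title: str) -> str:
    """Turn an article title into a URL slug (hypothetical helper)."""
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics
    return slug.strip("-")

def test_slugify_regression():
    # Each expectation captures behavior the team has agreed on.
    assert slugify("UI Testing: Automated & Exploratory") == \
        "ui-testing-automated-exploratory"
    assert slugify("  Test automation: When, how and how much  ") == \
        "test-automation-when-how-and-how-much"

if __name__ == "__main__":
    test_slugify_regression()
    print("regression checks passed")
```

A suite of such checks, run automatically in a build pipeline, is what frees testers’ time for the exploratory work Crispin describes.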
It’s a new month and with that, we have a new theme at SearchSoftwareQuality.com: Automation in ALM. Throughout the lifecycle, there are ways that project teams are automating their processes and their work to provide efficiency so they can focus on those tasks that require critical thinking.
I’m excited about our lineup for the month, starting with these three pieces we published this week about automation in ALM:
Test automation: When, how and how much
Test automation has often been touted as an important part of an organization’s quality strategy. However, it’s not a silver bullet. In this tip, consultant David Johnson describes important considerations in determining when to invest in test automation, how to implement the program, and how much of your application space should be automated.
UI Testing: Automated and Exploratory
Should user interface (UI) testing be automated or exploratory? In this tip, SSQ contributor Chris McMahon explains that both automated and exploratory testing can be used for UI testing, and that they are most effective when combined. McMahon explains when each type of testing is appropriate and how, used together, they complement one another to provide the most comprehensive UI test coverage.
Six Tips for Selecting Automated Test Tools for Agile Development
Software consultant Nari Kannan describes the differences between agile application lifecycle management tools and traditional lifecycle management software. Agile ALM tools are more tightly integrated, easier to use, support distributed teams and seamlessly integrate many of the traditional lifecycle management functions.
Promoting Agile internally was the topic at last week’s Boulder Agile Software Meetup Group.
[kml_flashembed movie="http://www.youtube.com/v/5gK5Nq40n3Y" width="425" height="350" wmode="transparent" /]
In this video, Matt Weir speaks about transitioning to Agile and how he learned the most from the skeptics. He found that by listening and working together to address concerns, the skeptics became his best allies in the transition.
Want to read more about the challenges of gaining buy-in when your team is transitioning to Agile and how those are being addressed? Check out Real world Agile: Gaining internal acceptance of Agile methodologies on SearchSoftwareQuality.com.
The January theme for SearchSoftwareQuality has been “Large-scale Agile.” It seems everyone is jumping on the Agile bandwagon, but is that bandwagon big enough to hold a very big team coding an enterprise application? Or will too many people weigh it down?
Thought leaders are studying issues associated with large-scale Agile and banding together to come up with solutions. Check out these five recent titles from expert SSQ contributors and find out how to address the challenges that come along with large-scale Agile.
Lisa Crispin offers advice on planning for agile in both big and small team environments.
Chris McMahon addresses issues of work flow in your ALM tool set for large development efforts.
David W. Johnson discusses ways to maximize your testing return on investment in agile environments.
Nari Kannan gives advice on how outsourcing can be best managed in a large agile environment.
Matt Heusser recommends a step-by-step approach to transitioning to Scrum.
Crowdsourced testing company uTest announced on January 18th the release of a new mobile application for their customers and 30,000+ tester community. The application allows testers to use their iPhone or iPad to enter bugs, communicate with the uTest community and customers, accept invitations to projects, upload screenshots or videos, and view their earnings.
As a member of the uTest community, I went ahead and downloaded the app to my iPhone, installed it, and in just a few short minutes was able to access my uTest account from my iPhone.
The new iOS apps were tested, of course, by the uTest community, the very group who had been asking for these apps. In this case, they had a vested interest in the application being user-friendly, as they were both testers and soon-to-be users.
Having just spoken with Arxan about the importance of security on mobile devices, I was particularly interested in security and testing on mobile applications. I spoke with uTest’s Matt Johnston, who said:
[We needed to consider both] security of customer’s data and tester’s data. We took every available precaution in working with our outside provider on this app. And of course, uTest tested the app to make sure it held up. So absolutely security was baked in from the beginning, from conception, through development, through testing. The other [security consideration] is IP protection. Again, we made sure the IP was secure for uTest, but more importantly, we made sure it’s a secure and private experience for testers and customers.
I wondered if uTest was thinking of entering the tools market, but Johnston said that at this time the tools are focused on catering to the needs of the testers and customers of uTest, allowing them to work more productively by tying into the uTest platform. Though the primary objective has always been to address the needs of the uTest testers and customers, an added benefit has been the interest and addition of new testers for uTest’s ever-expanding tester community. Johnston explained:
It’s only been available for a couple of days, but we’re seeing upwards of 20 new testers a day who are signing up through the iPhone and iPad apps, finding it in the iTunes store or reading about it in an article and coming to us through those applications as opposed to signing up through the Web interface. So we’ve been very pleasantly surprised that it’s been such a good thing for tester recruiting so quickly.
In less than two days, it was downloaded more than 500 times (with an even split between iPhone and iPad users), which is quite high for a specialized B2B application. Across those 500 downloads, the app store shows a rating of 4.5 out of 5 stars. Johnston attributes this to the “in the wild” testing that was performed versus lab testing. He elaborated:
In doing the four test cycles, we tested on four separate continents with a team varying between 10 and 20 people, and getting a lot of OS coverage and carrier coverage. It’s been battle tested as much as it can be, ahead of launch.
How does the uTest community like the new app? All reviews on the iPhone app were rated 4 or 5 stars with only positive comments. In looking at the comments on uTest’s blog, again, they were unanimously positive. If there was any dissatisfaction, it may be from those using other mobile platforms who are wondering when their favorite platform will be supported. However, even those who are without an iPhone or iPad are positive, sending kudos to uTest for their hard work.