Catherine Powell of Abakus is another SSQ contributor who I had the pleasure of meeting in person here at STPCon in Dallas. Catherine led an Open Jam session, an interactive learning event in which we played games and learned about the different personality types a tester might have. Catherine explains more about the personality types in Software test professionals: Five tester personality types and about the Open Jam session in the following short video clip:
[kml_flashembed movie="http://www.youtube.com/v/bmuE2Q9pTQU" width="425" height="350" wmode="transparent" /]
We hear a lot about teams transitioning to Agile, but what does this mean for the tester? Robert Walsh of Excalibur Solutions presented, “Adapting Conventional Testing Strategies for an Agile Environment” this morning at STPCon.
[kml_flashembed movie="http://www.youtube.com/v/ugh6m5fAOjg" width="425" height="350" wmode="transparent" /]
Rob started with a list of myths or misconceptions:
- Agile doesn’t test.
- Agile doesn’t need testers.
- There’s no place in Agile for manual testing.
- Agile doesn’t write documentation.
- Agile doesn’t plan.
- Agile has a public release at the end of each iteration.
Agile methodologies do encourage planning, documentation and testing; however, unlike traditional methodologies, with Agile we want to do just enough, but not so much that, when things change, we have to do it all over again.
If we are going to write that documentation, let’s do it once, and make it valuable, so we don’t need to do it again. Agile believes in ‘just enough’ but no more. If we don’t see a need for it, don’t do it.
Traditional methodologies are notorious for requiring detailed documents throughout the lifecycle. Most of us know the pain of writing these documents and wondering if they’re ever even read. Walsh said that he heard of someone who stuck a brownie recipe in the middle of a document just to check if anyone was actually reading it. He was met with understanding nods from the audience.
Though there were many other examples that Walsh gave of working in a more Agile way, the anecdote that I enjoyed most was one that I’ve often used as well.
Walsh told the story of the family that cut the end off of the roast for generations without really knowing why. It was simply the way it had always been done. Upon investigation, it was learned that the practice started because Great Grandma’s roast pan was too small to fit a full roast.
“Once upon a time there was a reason. If it’s not valuable, don’t do it.”
We need to use our critical thinking skills and question why we are doing what we do. Is there a better way? We should never do things simply because it’s the way it’s always been done. And, in fact, Agile tells us to always be looking for ways to do things better.
Crowdsource test organization uTest has partnered with Mozilla to create a new test management system, CaseConductor™. This system will allow testers to collaborate on test case creation, management and execution, regardless of where they are physically located.
CaseConductor beta is now available for download in a github.com repository, where developers and QA managers are encouraged to start leveraging the tool, view the source code, provide feedback on usability and ultimately help advance the development of this test case management (TCM) system.
I caught up with Chief Marketing Officer Matt Johnston, who delivered the lunchtime keynote at this year’s STPCon, “In-the-wild testing: Your survival may depend on it.” He had this to say about the announcement:
[kml_flashembed movie="http://www.youtube.com/v/LRaAPHPjDWc" width="425" height="350" wmode="transparent" /]
But that’s not all. Another announcement made here at STPCon was the expanded partnership with SOASTA to include uTest’s “JumpStart” service for the SOASTA CloudTest Platform. Now customers using SOASTA CloudTest Lite can conduct both performance and automated functional testing using uTest’s crowdsourced service.
When Karen Johnson told a barista that she was working on a data warehouse, the barista said, “Oh, where is it located?”
While most of us who work with software realize that a data warehouse is sort of a mega-database, not a physical building, we still might feel a little lost when tasked with testing data on a data warehouse project.
“A BI [Business Intelligence] project is like a foreign language. You can’t really see it. If you’re in management and you’re trying to staff for it, you’re not sure how to hire for it,” said Johnson, talking about the difficulty of trying to explain the skills needed for testing data on BI/DW systems.
I attended Johnson’s presentation this morning at STPCon titled, “Data Testing on Business Intelligence and Data Warehouse Projects.” She gives a quick overview in the short clip below.
[kml_flashembed movie="http://www.youtube.com/v/5QZYWwQ-yq4" width="425" height="350" wmode="transparent" /]
Johnson described a lot of the basics, including the term ETL: Extract, Transform, Load. She explained that this really was all about sucking data out of systems, transforming it to a common format, and then loading it into a database which will allow for some sophisticated analytics. One simple example she gave was with date formats. If you are working with multiple systems that are using different date formats, you would first extract the appropriate data, use an algorithm to transform the data into a consistent format, and then load it into a table in that common format so that reporting could be done.
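Johnson’s date-format example can be sketched in a few lines of Python. This is an illustrative toy, not any particular ETL tool; the source records, format strings and function names here are all invented for the sketch:

```python
from datetime import datetime

# Hypothetical source records: two systems storing dates in different formats.
SOURCE_A = [{"order_id": 1, "order_date": "03/15/2011"}]   # MM/DD/YYYY
SOURCE_B = [{"order_id": 2, "order_date": "2011-03-16"}]   # ISO 8601

def transform_date(raw, fmt):
    """Normalize a raw date string into the warehouse's common ISO format."""
    return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")

def etl(sources):
    """Extract rows from each source, transform the dates, load one table."""
    warehouse = []
    for rows, fmt in sources:
        for row in rows:  # extract the appropriate data
            warehouse.append({
                "order_id": row["order_id"],
                # transform into the consistent format
                "order_date": transform_date(row["order_date"], fmt),
            })
    return warehouse  # load (here, just an in-memory table)

table = etl([(SOURCE_A, "%m/%d/%Y"), (SOURCE_B, "%Y-%m-%d")])
```

Once both sources land in the common format, reporting can treat the dates uniformly, which is exactly the point of the transform step.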
So what should a tester look for when testing data? Here are some bug opportunities:
- data type mismatches
- boundary conditions
- data accuracy
- data currency (is the data up to date?)
- data refreshes
- data security
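Several of the bug opportunities above lend themselves to automated checks. A minimal sketch follows; the table, column names and thresholds are all invented for illustration, not drawn from Johnson’s talk:

```python
from datetime import date, timedelta

# Hypothetical warehouse rows after a nightly load.
ROWS = [
    {"customer_id": 101, "balance": 2500.00, "last_refresh": date(2011, 10, 25)},
    {"customer_id": 102, "balance": -50.00,  "last_refresh": date(2011, 10, 25)},
]

def check_types(rows):
    """Data type mismatches: every balance should have loaded as a number."""
    return all(isinstance(r["balance"], float) for r in rows)

def check_boundaries(rows, low=-10_000.0, high=10_000.0):
    """Boundary conditions: values outside the expected range are suspect."""
    return all(low <= r["balance"] <= high for r in rows)

def check_currency(rows, today, max_age_days=1):
    """Data currency and refreshes: did the nightly refresh actually run?"""
    cutoff = today - timedelta(days=max_age_days)
    return all(r["last_refresh"] >= cutoff for r in rows)

results = {
    "types": check_types(ROWS),
    "boundaries": check_boundaries(ROWS),
    "currency": check_currency(ROWS, today=date(2011, 10, 26)),
}
```

Checks like these can run after every load, turning the checklist into a regression suite rather than a one-time manual pass.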
What about when you’re working in an Agile environment? Check out the interview I had with Ken Collier about his book, Agile Analytics: A Value-Driven Approach to Business Intelligence and Data Warehousing.
Going to industry conferences is wonderful, not just because of the additional learning and professional development that’s available, but because of the connections we make. We go beyond recognizing a name; we have the opportunity to really get to know some of the people who we’ve heard speak or whose blogs or books we’ve read.
I had the opportunity to really get to know one of the conference speakers, Agile test consultant Lanette Creamer. I’d done two pre-conference interviews with Lanette about her sessions, one about pairing programmers with non-programmers, and one about Agile games.
[kml_flashembed movie="http://www.youtube.com/v/b64QOflYSuc" width="425" height="350" wmode="transparent" /]
But in order to personally get to know someone, I am always interested in finding out how they started in their careers and what led them to the place they are now. I asked Lanette about her background and what series of events led her to the place she is today. She answered:
In the early 1990s I was involved in a local bulletin board system and made a group of friends. They helped me find second-hand PC parts at the computer swap meet for a little more than $200, all I could afford at the time, and the sysop (owner of the local BBS) put it together and taught me enough DOS to get online to chat with my friends and play TeleArena.
Back in those days, it was pretty unusual for females to be interested in technology. In a few years, when I graduated with my AA from the community college, I went to Western Washington University, majoring in Graphic Design, but my favorite part of design was in the use of the computer programs, especially desktop publishing and photo editing. I didn’t enjoy the competitive nature of graphic design, and frankly, I learned that while I wasn’t a terrible designer, I wasn’t that great either. Not awful, just not talented enough to make the cut at that time, in 1995. I wasn’t sure what to do.
Honestly, I knew that I didn’t want to be a Graphic Designer at that point. I just wanted to be on the computer all day, every day, and I knew that. To this day my design background gives me an advantage in testing UI and usability aspects of software. I can identify font inconsistencies, and why something is attractive or not, in more description than before I had experience with graphic design. While I didn’t end up in the field of my major, I credit my love of the arts for getting me over the first barrier to move into a technology career. The creative part of testing remains my favorite.
After college, I was working at a charity bingo parlor, and I was known as the “computer whiz” who made our manual spreadsheet into a self-calculating Excel workbook. I was the person who, when there was nothing but a “flashing C:” on the screen, would tell the panicked manager, “Type ‘win’ and press Enter.” Of course, in college I’d already fallen in love with the Mac OS, but something like that was far out of my bingo budget. It took me six months to save up my money to buy my used car in cash, and I’d made myself ill with food poisoning twice my last year of college trying to eat cheaply off of leftovers for too long. I make it sound awful, but besides the stress of not enough money, being poor wasn’t much of a problem. Those were some of the happiest years in my life to date. I had friends and family all around me, and I was in love and newly married. Best of all, I had good health. I felt good most days, and those I didn’t, I believed were temporary.
I started in testing working on Adobe InDesign in 1999. I worked on several Adobe products, and ended up leading workflow testing across the Creative Suite products at Adobe. In 2010, I worked my first consulting job for Sogeti, on a project for a large Seattle-based coffee company. It was quite a shift to go from a company that makes software to a company that makes something else, but needs the software to have a good user experience to efficiently deliver coffee. When the project for the coffee purveyor was completed, I had a great opportunity to work as a consultant and Agile testing coach for a great company working in the medical field. That is when I started Spark Quality LLC, my own consulting company! I’m currently consulting for a great company in California called Silicon Publishing, where I’m working with some amazing developers to create custom solutions for clients in the publishing, printing, and design industries.
In consulting, I’m finding that meeting with a client to get a clear picture of their needs is a wonderful advantage if it can be arranged. The more we understand about our users, the easier it is to delight them and target our tests to reflect their top priorities in addition to risk. Getting feedback as a result of your testing from the client is what I love about working with a small company. As much as I loved working on the Adobe Creative Suites, there are some corporate reasons why it is more difficult legally to share demos with customers and have a free and open conversation when you are representing a publicly held company. In the context of a small business, we can demo real code for the client just as soon as it is ready with fewer privacy concerns.
Quality is imperative, yet it can be time-consuming and expensive to ensure on mission-critical applications. Keynote DeviceAnywhere offers public and private cloud options for its Software-as-a-Service model, which provides testing through automated processes as well as real-time monitoring services.
The TCE Monitoring service has enabled TomTom to provide geographically specific monitoring to their market in France. Keynote DeviceAnywhere’s hardware integration solution allows TomTom to upgrade their own software without having to do re-integration on their devices and without redoing their test sets each time.
Ian Hammond discussed the expansion of these TomTom offerings to the U.S. and other countries, and how they will be able to continue monitoring performance in different geographical areas. He explained, “We don’t have offices everywhere, so this kind of solution allows us to put our devices wherever we want to put them and monitor them in any geography, which is invaluable when we’re trying to do full diagnoses.”
For more information about this partnership, check out Real-time performance monitoring for mobile apps.
Mobile application development is undeniably a hot and growing market and with it comes new challenges for the test community. This month, SSQ is focusing on mobility testing with a series of tips and stories covering everything from determining scope for your mobile test efforts to implementing an automated test strategy.
We started the month with a series from DeveloperTown’s Rick Grey, who was tasked with creating a mobile application test strategy in a resource-constrained environment. Grey reminds us that risk and mission criticality need to be taken into account when determining scope and creating a strategy. In the situation he describes, there was not a need for a highly available mobile application since the functionality was available through a website. Additionally, security risks were low. This allowed for a different approach than one in which availability and security are greater risks.
Be sure to check out his series and other recent mobility test coverage by reading the tips below:
Mobility application testing: Mobile devices on a budget
Mobile application testing: Cost effective strategies
Mobile Web applications: Monitoring test triggers
Mobile testing: Nine strategy tests you’ll want to perform
In the latest from a recent series of interviews with companies offering creative new mobile apps, SSQ spoke with Manish Jha, Vice President of Business Development at SwitchPoint, about their new Queuing Alert app, which enables service-oriented businesses to contact customers directly on their mobile phones to provide details about wait times and service availability.
With the Queuing Alert app, a restaurant can alert patrons when their table is ready, rather than relying on a handheld buzzer, which is limited to that area and can only flash and buzz. Instead, customers can leave the immediate location and receive updates via text message or voice mail; they can also look at the business’s schedule online to see where they are on the waiting list.
These capabilities are available not just to restaurants but to all types of hospitality businesses, such as spas and salons, which can add the Queuing Alert app to their existing online services as a convenience to their customers. Since it is a Web-based application, the testing process does not entail testing across different platforms such as iOS, Android and BlackBerry.
Furthermore, SwitchPoint utilized a developer platform in the cloud that is hosted by the GENFuzion Developer Community called the “Sandbox” to develop and test the Queuing Alert application. This facilitates the process for developers, as system maintenance is performed by GENBAND A2.
Manish Jha offered some insights into the possibilities inherent in the combining of voice technologies and mobile apps:
When you go to the GenApps page, you can see the different ways in which we are using voice technology. There is a queuing alert system, but there is also time to open up the kind of data that the switching has in it so that customers can control access to their own line, like if they have two minutes or twenty seconds of time, they can actually block numbers on their cell phones themselves—I’m talking about controlling their home phones through their cell phones. There is an enormous potential in voice technology to do things other than just talk, because it’s a very secure, parallel technology.
If you’re interested in reading other recent posts about mobile apps, check out:
For related articles on SearchSoftwareQuality.com, check out:
On October 4th, the cloud-based performance test organization SOASTA announced its integration with Selenium, the popular open source testing tool. This will allow for functional and performance test results to be combined for enhanced analytics, as well as the ability for results to be fed back into continuous integration servers such as Hudson and Jenkins.
But perhaps even more interesting for users are the extensions SOASTA has added to Selenium’s functionality, including a visual test creation environment that allows testers to create tests without traditional coding or scripting.
I spoke with Tal Broder, VP of Engineering at SOASTA, who said:
We have significantly enhanced the capability of Selenium in terms of recording the detection of what element was actually interacted with on the page. We added a visual test environment, which we already had for load testing, and we also enriched analytics. We believe with our offering, even though we are using all the power of Selenium for driving browsers, we have a much faster and easier test creation without having to write a single line of code.
This announcement comes on the heels of two other recent announcements in the ALM test tool market: Replay Solutions and Coverity. Both of these announcements also were about improved automated testing, enhanced analytics, ALM tool integration and feeding results back into a continuous integration tool. I asked Broder about what was driving this trend. He answered:
I think this is all driven by Agile and this whole DevOps movement where people want to build very, very often and release small chunks of code into the user community in a very fast and efficient way. They need the tools that will allow them to find bugs in an automated way and test the performance before you push it, or even after you push it, into production so that you can protect your users from functional problems, from performance problems, and I think that’s why we’re seeing a lot of automation in the industry. I think that trend will continue.
I asked Gartner analyst Tom Murphy to compare and contrast the three announcements. He explains that each tool catches bugs in different ways, but they are very complementary:
There are a number of different places to find defects or ways to find them. Coverity is focused on the analysis of source code to find defects, Replay is focused on identifying defects that occur while the application is running by capturing what is happening in the environment and SOASTA is just “reading” functional testing to their story line, which is a way to automate tests from a user perspective. So rather than looking at the source like Coverity, I use SOASTA (or say HP QTP, or others), to drive the application and monitor for deviations from the expected behavior. I use Replay Solutions while I run tests to provide a detailed recording to the developer so that when a defect is found they can identify what is going wrong faster (ie. I don’t have to reproduce the defect), and I can also use this in production to capture crash info, etc. Now with their most recent product, ReplayLightning, there is more of a connection between what is happening at execution time, which is kind of a Dynamic Source Analysis rather than the Static Analysis that Coverity and others provide. This is important because in dynamic languages there are things you don’t know until run time.
In short, the tools are all complementary to each other.
SSQ spoke with Jeff Kuligowski, who recently became Senior Vice President of Sales and Marketing at MobileCause, a Web service that enables mobile giving, engagement, CRM and donor communication for non-profit organizations. Using their Software as a Service platform, MobileCause has facilitated thousands of campaigns that allow donors to give to non-profit agencies via text message pledges as well as on-the-spot event donating.
Kuligowski explained the importance of testing in regard to non-profit brand management:
In particular, when you’re looking at the non-profit marketplace as your customer base, testing becomes very important. I equate it to sort of being the quarterback of a football team; you probably get too much credit when everything is going well, and you get too much blame when it goes bad. The same case is true for the brand of a non-profit. If somebody has a bad experience interacting with the organization through a tool—like a mobile donation tool—the brand is actually the one that gets the blame. And the same thing is true when you do a good job; you create value and add to the brand.
Testing informs many of the decisions made at MobileCause. For example, they chose the SaaS model to avoid the slippery slope of testing on the numerous available devices. The drawback to delivering services in this manner is that they do not take advantage of the many capabilities and aesthetically novel features that new smartphone models have to offer; their site is simple and straightforward.
However, the advantage is reliability. Containing the software to one manageable domain ensures quality, availability and integrity without running the risks inherent in running applications on multiple, frequently changing devices. This consistency of quality is imperative for a sector that relies so heavily on reputation and positive image.