Software Quality Insights

November 9, 2011  5:38 PM

Gorilla Logic announces automated testing tool for Android

Melanie Luna

Gorilla Logic recently announced the release of its newest automated testing tool, FoneMonkey for Android. I spoke with Stu Stern, Gorilla Logic CEO, about the new tool, and he explained the FoneMonkey tool suite in general, which is available free and open source. Android developers and testers now have access to the same automated testing features that iOS developers have already been using, including record-and-playback functional testing.

According to Stern, one of the features that differentiates the FoneMonkey testing tools from others is high-level action recording. He described how this works:

We record an actual action. So rather than saying, ‘Oh, you dragged your finger on the screen,’ we are recording, ‘Oh, you scrolled the table to row 3.’ Obviously, that is much more readable; you can also create a command from scratch, rather than recording it, to specify a scroll command manually… This high-level action recording works because we are native, and we live, so to speak, right inside the application that is being tested; we’re able to interpret a swipe gesture and understand that you are scrolling a table.
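The distinction Stern draws between raw gestures and semantic actions can be sketched with hypothetical data structures (the class and field names below are illustrative only, not FoneMonkey's actual recording format):

```python
from dataclasses import dataclass

# A low-level recorder captures raw touch coordinates, which are
# fragile: they break if the layout or screen size changes.
@dataclass
class LowLevelEvent:
    action: str           # e.g. "drag"
    start: tuple          # (x, y) in pixels
    end: tuple

# A high-level recorder, running natively inside the app, can resolve
# the same gesture into the semantic action the user actually performed.
@dataclass
class HighLevelCommand:
    component: str        # e.g. "TableView"
    action: str           # e.g. "scrollToRow"
    argument: int         # e.g. target row index

raw = LowLevelEvent(action="drag", start=(160, 420), end=(160, 180))

# In-app instrumentation knows which widget received the drag, so it
# records "scroll the table to row 3" instead of pixel deltas.
recorded = HighLevelCommand(component="TableView", action="scrollToRow", argument=3)

print(f"{recorded.component}.{recorded.action}({recorded.argument})")
```

The payoff is readability and robustness: a semantic command survives layout changes that would invalidate recorded coordinates.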

He also mentioned how these tools aid the growing practice of collaboration between testers and developers in Agile development. The FoneMonkey tools have an interactive point-and-click console that records actions against the app in a straightforward manner, requiring no prior programming knowledge, which in many cases enables testers to become a more active part of the team. The tests also generate native code, and increasingly JavaScript, appealing to a broader audience.

With this testing tool, developers and QA testers can create tests that perform user operation sequences and confirm the results. It is also compatible with continuous integration environments.

For other recent SSQ articles related to mobile application testing, see:

Smart phones: Implementing a test automation architecture

Mobile application testing: Five differences testers must take into account

Mobile applications: Testing and monitoring using SaaS solutions

October 27, 2011  8:45 PM

Software Test Professionals Conference (STPCon) coverage

Yvette Francino

You know how when you go on a cruise you feel stuffed from all that delicious food that’s served around the clock? Well, that’s sort of how my brain feels at the end of a conference: happy and full!

I’ve been able to meet with so many test leaders here at STPCon and listen to some interesting presentations. As usual, I’ve been running around with my little camera, shooting videos of speakers. This time I wasn’t the only one with a camera. Stanton Champion of uTest was at the event, too, and I must admit, I think his videos outnumber mine. In fact, he even shot a video of me! It was kind of fun being on the other side of the camera!

Check out this short video clip to see Stanton’s view of the conference:

And for more, take a look at our STPCon page for conference videos, exclusive interviews, and these tips from well-known experts Peter Walen, Matt Heusser, Scott Barber and Catherine Powell.

October 27, 2011  3:11 AM

Catherine Powell on testers’ personality types

Yvette Francino

Catherine Powell of Abakus is another SSQ contributor whom I had the pleasure of meeting in person here at STPCon in Dallas. Catherine led an Open Jam session, an interactive learning event in which we played games and learned about the different personality types a tester might have. Catherine explains more about the personality types in Software test professionals: Five tester personality types and about the Open Jam session in the following short video clip:

October 26, 2011  1:26 AM

STPCon: The tester’s role on an Agile team

Yvette Francino

We hear a lot about teams transitioning to Agile, but what does this mean for the tester? Robert Walsh of Excalibur Solutions presented, “Adapting Conventional Testing Strategies for an Agile Environment” this morning at STPCon.

Rob started with a list of myths or misconceptions:

  • Agile doesn’t test.
  • Agile doesn’t need testers.
  • There’s no place in Agile for manual testing.
  • Agile doesn’t write documentation.
  • Agile doesn’t plan.
  • Agile has a public release at the end of each iteration.

Agile methodologies do encourage planning, documentation and testing; however, unlike traditional methodologies, with Agile we want to do enough but not so much that we will have to do it again.

If we are going to write that documentation, let’s do it once, and make it valuable, so we don’t need to do it again. Agile believes in ‘just enough’ but no more. If we don’t see a need for it, don’t do it.

Traditional methodologies are notorious for requiring detailed documents throughout the lifecycle. Most of us know the pain of writing these documents and wondering if they’re ever even read. Walsh said that he heard of someone who stuck a brownie recipe in the middle of a document just to check if anyone was actually reading it. He was met with understanding nods from the audience.

Though there were many other examples that Walsh gave of working in a more Agile way, the anecdote that I enjoyed most was one that I’ve often used as well.

Walsh told the story of the family that cut the end off of a roast for generations without really knowing why; it was simply the way it had always been done. Upon investigation, it was learned that the practice started because Great Grandma’s roast pan was too small to fit a full roast.

“Once upon a time there was a reason. If it’s not valuable, don’t do it.”

We need to use our critical thinking skills and question why we are doing what we do. Is there a better way? We should never do things simply because it’s the way it’s always been done. And, in fact, Agile tells us to always be looking for ways to do things better.

October 25, 2011  10:24 PM

uTest and Mozilla partner to create a test case management tool

Yvette Francino

Crowdsourced testing organization uTest has partnered with Mozilla to create a new test management system, CaseConductor™. This system will allow testers to collaborate on test case creation, management and execution, regardless of where they are physically located.

CaseConductor beta is now available for download in a repository, where developers and QA managers are encouraged to start leveraging the tool, view the source code, provide feedback on usability and ultimately help advance the development of this TCM system.

I caught up with Chief Marketing Officer Matt Johnston, who delivered the lunchtime keynote at this year’s STPCon, “In-the-wild testing: Your survival may depend on it.” He had this to say about the announcement:

But that’s not all. Another announcement made here at STPCon was the expanded partnership with SOASTA to include uTest’s “JumpStart” service for the SOASTA CloudTest Platform. Now customers using SOASTA CloudTest Lite can conduct both performance and automated functional testing using uTest’s crowdsourced service.

October 25, 2011  9:37 PM

Data testing on data warehouse projects

Yvette Francino

When Karen Johnson told a barista that she was working on a data warehouse, the barista said, “Oh, where is it located?”

While most of us who work with software realize that a data warehouse is sort of a mega-database, not a physical building, we still might feel a little lost when tasked with testing data on a data warehouse project.

“A BI [Business Intelligence] project is like a foreign language. You can’t really see it. If you’re in management and you’re trying to staff for it, you’re not sure how to hire for it,” said Johnson, talking about the difficulty of trying to explain the skills needed for testing data on BI/DW systems.

I attended Johnson’s presentation this morning at STPCon titled, “Data Testing on Business Intelligence and Data Warehouse Projects.” She gives a quick overview in the short clip below.

Johnson described a lot of the basics, including the term ETL: Extract, Transform, Load. She explained that this is really all about pulling data out of source systems, transforming it to a common format, and then loading it into a database that allows for sophisticated analytics. One simple example she gave involved date formats. If you are working with multiple systems that use different date formats, you would first extract the appropriate data, use an algorithm to transform it into a consistent format, and then load it into a table in that common format so that reporting can be done.
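Johnson's date-format example can be sketched as a minimal transform step (the source formats and sample values here are assumed for illustration):

```python
from datetime import datetime

# Formats the source systems are assumed to emit.
SOURCE_FORMATS = ["%m/%d/%Y", "%d-%m-%Y", "%Y.%m.%d"]

def transform_date(raw: str) -> str:
    """Normalize an extracted date string to ISO 8601 before loading."""
    for fmt in SOURCE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue  # not this format; try the next one
    raise ValueError(f"Unrecognized date format: {raw!r}")

# Extract -> transform -> load into a common-format staging table.
extracted = ["10/25/2011", "25-10-2011", "2011.10.25"]
staging_table = [transform_date(d) for d in extracted]
print(staging_table)  # all three rows load as "2011-10-25"
```

Once every row lands in the staging table in one format, reporting queries no longer need to know which system a date came from.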

So what should a tester look for when testing data? Here are some bug opportunities:

  • data type mismatches
  • rounding
  • truncation
  • boundary conditions
  • data accuracy
  • data currency
  • data refreshes
  • data security
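Several of these bug opportunities lend themselves to simple automated checks. A minimal sketch, with column names and validation rules assumed for illustration:

```python
# Hypothetical row pulled back from a loaded fact table.
row = {"customer_id": "1042", "amount": 19.994999, "region": "US"}

issues = []

# Data type mismatch: IDs should load as integers, not strings.
if not isinstance(row["customer_id"], int):
    issues.append("type mismatch: customer_id loaded as string")

# Rounding: monetary amounts should carry exactly two decimal places.
if round(row["amount"], 2) != row["amount"]:
    issues.append("rounding: amount carries extra precision")

# Truncation / boundary: region codes must fit the target column width.
if len(row["region"]) > 2:
    issues.append("truncation risk: region exceeds 2 characters")

for issue in issues:
    print(issue)
```

Running checks like these after each data refresh catches load-time defects before they surface in a report.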

What about when you’re working in an Agile environment? Check out the interview I had with Ken Collier about his book, Agile Analytics: A Value-Driven Approach to Business Intelligence and Data Warehousing.

October 25, 2011  2:26 AM

STPCon: Meet Lanette Creamer

Yvette Francino

Going to industry conferences is wonderful, not just because of the additional learning and professional development that’s available, but because of the connections we make. We go beyond recognizing a name; we have the opportunity to really get to know some of the people who we’ve heard speak or whose blogs or books we’ve read.

I had the opportunity to really get to know one of the conference speakers, Agile test consultant Lanette Creamer. I’d done two pre-conference interviews with Lanette about her sessions, one about pairing programmers with non-programmers, and one about Agile games.

But in order to personally get to know someone, I am always interested in finding out how they started in their careers and what led them to the place they are now. I asked Lanette about her background and what series of events led her to the place she is today. She answered:

In the early 1990s I was involved in a local bulletin board system and made a group of friends. They helped me put together a second-hand PC from parts at the computer swap meet for a little more than $200, all I could afford at the time. The sysop (owner of the local BBS) assembled it and taught me enough DOS to get online to chat with my friends and play TeleArena.

Back in those days, it was pretty unusual for females to be interested in technology. A few years later, when I graduated with my AA from the community college, I went to Western Washington University, majoring in Graphic Design, but my favorite part of design was using the computer programs, especially desktop publishing and photo editing. I didn’t enjoy the competitive nature of graphic design, and frankly, I learned that while I wasn’t a terrible designer, I wasn’t that great either. Not awful, just not talented enough to make the cut at that time, in 1995. I wasn’t sure what to do.

Honestly, I knew that I didn’t want to be a Graphic Designer at that point. I just wanted to be on the computer all day, every day, and I knew that. To this day my design background gives me an advantage in testing UI and usability aspects of software. I can identify font inconsistencies, and explain why something is attractive or not, in more detail than I could before I had experience with graphic design. While I didn’t end up in the field of my major, I credit my love of the arts for getting me over the first barrier to move into a technology career. The creative part of testing remains my favorite.

After college, I was working at a charity bingo parlor, and I was known as the “computer whiz” who made our manual spreadsheet into a self-calculating Excel workbook. I was the person who knew, when there was just a “flashing c,” to tell the panicked manager, “Type ‘win’ and press Enter.” Of course, in college I’d already fallen in love with the Mac OS, but something like that was far out of my bingo budget. It took me six months to save up my money to buy my used car in cash, and I’d made myself ill with food poisoning twice my last year of college trying to eat cheaply off of leftovers for too long. I make it sound awful, but besides the stress of not enough money, being poor wasn’t much of a problem. Those were some of the happiest years in my life to date. I had friends and family all around me, and I was in love and newly married. Best of all, I had good health. I felt good most days, and those I didn’t, I believed were temporary.

I started in testing working on Adobe InDesign in 1999. I worked on several Adobe products, and ended up leading workflow testing across the Creative Suite products at Adobe. In 2010, I worked my first consulting job for Sogeti, on a project for a large Seattle-based coffee company. It was quite a shift to go from a company that makes software to a company that makes something else, but needs the software to have a good user experience to efficiently deliver coffee. When the project for the coffee purveyor was completed, I had a great opportunity to work as a consultant and Agile testing coach for a great company working in the medical field. That is when I started Spark Quality LLC, my own consulting company! I’m currently consulting for a great company in California called Silicon Publishing, where I’m working with some amazing developers to create custom solutions for clients in the publishing, printing, and design industries.

In consulting, I’m finding that meeting with a client to get a clear picture of their needs is a wonderful advantage if it can be arranged. The more we understand about our users, the easier it is to delight them and target our tests to reflect their top priorities in addition to risk. Getting feedback from the client as a result of your testing is what I love about working with a small company. As much as I loved working on the Adobe Creative Suites, there are some corporate reasons why it is more difficult legally to share demos with customers and have a free and open conversation when you are representing a publicly held company. In the context of a small business, we can demo real code for the client just as soon as it is ready with fewer privacy concerns.

October 17, 2011  5:09 PM

Keynote DeviceAnywhere provides customized performance monitoring to GPS vendor TomTom

Melanie Luna

In keeping with SearchSoftwareQuality’s focus on mobility testing this month, I spoke with the VP of Marketing at Keynote DeviceAnywhere, Leila Modarres, and the VP of Technical Operations for TomTom, Ian Hammond, regarding the recent implementation of a performance monitoring solution for TomTom’s in-vehicle GPS devices as well as the iPhone app. Keynote DeviceAnywhere provides a real-time monitoring product, Test Center Enterprise Monitoring™ (TCE Monitoring), which is accessed over a cloud-based platform.

Quality is imperative, yet it can be time-consuming and expensive to ensure quality on mission-critical applications. Keynote DeviceAnywhere offers a public cloud or a private cloud option for Software-as-a-Service models that provide testing through automated processes as well as real-time monitoring services.

The TCE Monitoring service has enabled TomTom to provide geographically specific monitoring to their market in France. Keynote DeviceAnywhere’s hardware integration solution allows TomTom to upgrade their own software without having to do re-integration on their devices and without redoing their test sets each time.

Ian Hammond discussed the expansion of these TomTom offerings to the U.S. and other countries, and how they will be able to continue monitoring performance in different geographical areas. He explained, “We don’t have offices everywhere, so this kind of solution allows us to put our devices wherever we want to put them and monitor them in any geography, which is invaluable when we’re trying to do full diagnoses.”

For more information about this partnership, check out Real-time performance monitoring for mobile apps.

October 13, 2011  9:56 PM

Mobile testing strategies

Yvette Francino

Mobile application development is undeniably a hot and growing market, and with it come new challenges for the test community. This month, SSQ is focusing on mobility testing with a series of tips and stories covering everything from determining the scope of your mobile test efforts to implementing an automated test strategy.

We started the month with a series from DeveloperTown’s Rick Grey, who was tasked with creating a mobile application test strategy in a resource-constrained environment. Grey reminds us that risk and mission criticality need to be taken into account when determining scope and creating a strategy. In the situation he describes, there was no need for a highly available mobile application, since the functionality was available through a website. Additionally, security risks were low. This allowed for a different approach than one in which availability and security are greater risks.

Be sure to check out his series and other recent mobility test coverage by reading the tips below:

Mobility application testing: Mobile devices on a budget
Mobile application testing: Cost effective strategies
Mobile Web applications: Monitoring test triggers
Mobile testing: Nine strategy tests you’ll want to perform

October 11, 2011  4:48 PM

SwitchPoint’s new Queuing Alert app signals mobile phone possibilities

Melanie Luna

In the latest from a recent series of interviews with companies offering creative new mobile apps, SSQ spoke with Manish Jha, Vice President of Business Development at SwitchPoint, about their new Queuing Alert app, which enables service-oriented businesses to contact customers directly on their mobile phones to provide details about wait times and service availability.

With the Queuing Alert app, a restaurant can alert patrons when their table is ready, rather than relying on a handheld buzzer, which is limited to that area and can only flash and buzz. Instead, customers can leave the immediate location and receive updates via text message or voice mail; they can also look at the business’s schedule online to see where they are on the waiting list.

These capabilities are available to all types of hospitality businesses in addition to restaurants, such as spas or salons, which can choose the Queuing Alert app as an add-on to their existing online services, which they then provide as a convenience to their customers. Since it is a Web-based application, the testing process does not entail testing across different platforms such as iOS, Android and Blackberry.
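The flow described above — a business queues a customer and the service pushes a text or voice notification when their turn comes — could be sketched like this (the class, method names, and notifier hook are hypothetical, not SwitchPoint's actual API):

```python
from collections import deque

class WaitList:
    """A hypothetical sketch of a queuing-alert wait list."""

    def __init__(self, notifier):
        self.queue = deque()
        self.notifier = notifier  # e.g. an SMS or voice-message gateway

    def add(self, name: str, phone: str) -> int:
        self.queue.append((name, phone))
        return len(self.queue)  # position shown on the online schedule

    def seat_next(self) -> str:
        name, phone = self.queue.popleft()
        # Instead of a buzzer limited to the lobby, notify the phone.
        self.notifier(phone, f"{name}, your table is ready!")
        return name

# Capture outgoing messages in a list to stand in for a real gateway.
messages = []
waitlist = WaitList(notifier=lambda phone, text: messages.append((phone, text)))
waitlist.add("Alice", "+1-555-0100")
waitlist.add("Bob", "+1-555-0101")
waitlist.seat_next()
print(messages)  # Alice, first in the queue, is texted first
```

Because the notification goes to the customer's own phone, the business's reach is no longer bounded by a buzzer's radio range.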

Furthermore, SwitchPoint developed and tested the Queuing Alert application on a cloud-based developer platform called the “Sandbox,” hosted by the GENFuzion Developer Community. This facilitates the process for developers, as system maintenance is performed by GENBAND A2.

Manish Jha offered some insights into the possibilities inherent in the combining of voice technologies and mobile apps:

When you go to the GenApps page, you can see the different ways in which we are using voice technology. There is a queuing alert system, but there is also time to open up the kind of data that the switching has in it so that customers can control access to their own line, like if they have two minutes or twenty seconds of time, they can actually block numbers on their cell phones themselves—I’m talking about controlling their home phones through their cell phones. There is an enormous potential in voice technology to do things other than just talk, because it’s a very secure, parallel technology.  

If you’re interested in reading other recent posts about mobile apps, check out:

History on the Go: New mobile apps engage students

MobileCause and software testing: SaaS platform ensures quality for non-profits

For related articles, check out:

Mobile software testing: Similar tests, different environments

Mobile application testing: Five differences testers must take into account

Mobility application testing: Mobile devices on a budget
