Software Quality Insights


April 8, 2010  9:05 PM

A modern way of gathering requirements: Visualizations

Profile: Yvette Francino

If a picture’s worth a thousand words, a visualization’s worth a thousand pictures. A “visualization” is a functional software prototype. This form of rapid user interface (UI) prototyping is now being used by business analysts as an effective method for gathering customer requirements.

In the software development lifecycle, it’s the business analyst’s responsibility to work with business stakeholders to determine the requirements of a software application. There are many ways that this can be done. Traditionally, after many long meetings and countless interviews, a big, thick requirements specification would be written, attempting to describe the requirements of the system.  This would then get passed to a design team who would create a functional specification which developers would ultimately use to write code. The end result would be an application that often looked very different from what the business had originally envisioned.

Some reasons this approach leads to an end product so different from what the business wants:

  • It’s difficult to describe exactly what you want when you don’t have a starting point.
  • Requirements can change over the lifecycle of a product.
  • The details of what the business wants need to be continually clarified as the project evolves.
  • It’s much more difficult to describe a user interface and functionality in words than it is to work with the actual screens.

These problems are some of the reasons the waterfall methodology has gotten such a bad rap and why so many people are switching to an agile methodology. It’s become recognized that more effective collaboration and communication with the business is required in order to accurately understand exactly what business users want.

But switching to an agile methodology isn’t the only solution to this problem. Another technique that is being used is to gather requirements using visualizations.

SearchSoftwareQuality met with iRise’s Mitch Bishop last week to discuss the company’s product line. iRise’s tools are meant for the business analyst, not the developer. Working with the iRise applications, business analysts are able to collaborate with stakeholders to create working models that go beyond mock-ups or wireframes. Visualizations can be integrated with data to provide an actual functional preview of a finished application.

A quick search revealed this informative blog post listing 41 prototyping tools that can be used for rapid UI generation. iRise was included in the list, described as “A very complex tool used to model business process and prototype application interfaces.” It’s not surprising to me that the tool is described as “complex,” as it allows quick prototyping for a variety of product types, including Web 2.0, mobile applications and SAP Extensions, across several industries.

Being the devil’s advocate that I am, I asked Bishop whether the problem they were trying to solve was already solved by the trend toward agile teams. In an agile environment, the product owner works on the same team as developers and testers, producing functional code in short sprints. This cross-functional team provides the improved collaboration and communication with the business and allows for continual product review throughout the lifecycle.

Bishop answered that rapid prototyping tools can be used in an agile environment as part of the short sprint. He’s finding that all teams, regardless of methodology, are effectively using visualizations to help better define requirements.

It’s good to know that the industry is finding effective ways to gather requirements. Let’s hope the days of thick requirements specifications are quickly coming to an end.

April 7, 2010  10:40 PM

Taking a peek at hardware QA

Profile: Rick Vanover

This week, I had the opportunity to attend the HP StorageWorks Tech Day in Houston. This social media event connected bloggers across technology fields to focus on various server and storage products. One of the focal points of the event was a visit to two of the quality assurance labs at HP’s campus in Houston. There we met with various people who manage and implement all aspects of HP’s lab functions for internal product support.

The first quality assurance lab was for Enterprise Backup Solutions, or EBS. In this lab, the entire matrix of supported data protection configurations is available for HP’s testing, along with the partner ecosystem. The EBS lab’s main objective is to prepare the support matrix for various data protection products, storage systems and partner software. The lab builds up and breaks down the various permutations of products and software to continually produce the support matrix. With the vast array of product lifecycles across servers, storage and software, this becomes a big challenge.
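
To get a feel for the scale, here is a toy sketch in Java of how quickly such a matrix grows. The component names are invented, not HP’s actual matrix or tooling; the point is that every new server, storage system or software revision multiplies the number of configurations to qualify.

    import java.util.Arrays;
    import java.util.List;

    public class SupportMatrix {
        public static void main(String[] args) {
            // Hypothetical component choices; a real matrix also multiplies in
            // firmware revisions, OS versions and partner products.
            List<String> servers  = Arrays.asList("server-A", "server-B", "server-C");
            List<String> storage  = Arrays.asList("tape-library-1", "disk-array-2");
            List<String> software = Arrays.asList("backup-X", "backup-Y", "backup-Z");

            int count = 0;
            for (String srv : servers)
                for (String sto : storage)
                    for (String sw : software) {
                        System.out.println(srv + " + " + sto + " + " + sw);
                        count++;
                    }
            // 3 servers x 2 storage arrays x 3 software packages = 18 cells already.
            System.out.println(count + " configurations to test");
        }
    }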

The second quality assurance lab we visited was the StorageWorks lab. This area provided a different level of support, down to disk array, drive and controller endpoints. Here, a number of engineering QA functions were in action during our visit, the foremost of which was firmware testing. For storage systems, there were a lot of protocol analyzers in place; throughout much of the equipment in the lab, Serial Attached SCSI (SAS) protocol analyzers were hooked up to drive slots or blade chassis backplanes. Figure A below shows a SAS protocol analyzer hooked up to a SAS drive slot:

Figure A: A SAS protocol analyzer hooked up to a SAS drive slot

Another quality assurance function that caught my interest was the ability to get to some very low-level functionality within a storage array. While in the storage lab, I noticed that many of the popular storage products such as the MSA P2000 had additional connectivity compared to a normal customer installation. For the MSA P2000, the lab has a special I/O device attached to the controller shown in Figure B below:

Figure B: The HP internal I/O device attached to the MSA P2000 controller

This device is quite the storage utility knife. First of all, a red circuit board on any device in the lab indicates one of a series of device states, the primary ones being prototype and internal tool. The I/O device shown in Figure B is an HP internal tool and is not available to customers. The CAT-5 cable attached to the end of the interface gives engineers access to the device for low-level functions, such as firmware updates (which are done daily) and command-line options. The device also allows engineers to step through the storage processor’s commands in a debug capacity, letting HP diagnose how the storage processor handles a command step-by-step by jogging it through the sequencing of the current command set.

This opportunity to see a number of the quality assurance labs for HP was an interesting experience. While I do not have anything to compare it to, it was an impressive operation.

Disclosure: The event organizer provided my attendance at the event, including meals, airfare and accommodations. The opinions above are the result of an in-person demonstration of the technologies discussed. Read my full blogger disclosure here.


April 6, 2010  8:49 PM

Can iRise, an enterprise visualization tool, aid business analysts?

Profile: Daniel Mondello

The age-old expression “we are on two different pages” may no longer apply to analysts discussing application functionality with business stakeholders. It is a common problem in the software industry: an analyst attempts to describe the needs of business executives in documentation, and the development team ends up deploying a product completely different from what those executives envisioned. In that regard, there is another old saying that “a picture is worth a thousand words,” but what, then, is the value of a fully functional software model containing all (or most) of a finalized application’s features?

This is exactly the type of situation iRise has built its business on: situations where complex answers just won’t suffice and a visual representation is needed. iRise has been the choice of 200 Fortune 1000 companies, including the likes of General Motors, FedEx and UPS. In addition, the company has another 200 customers of varying sizes in its client base.

To cater to a growing need for visualization before builds, iRise has recently added new features and functionality to its tooling for software business analysts. These features are available in iRise 8 and include drag and drop, content modules, API plug-ins and more.

The idea of visualization for software apps came from a slew of issues identified in software board rooms and a need to solve problems before an application is fully constructed. “We designed this as an answer to the confusion of business stakeholders — people heavily embedded in the finances of companies. These people will often approve ideas that sound promising, and then when they see the end result they are perplexed; it just wasn’t what they had in mind. Then come the changes, and making changes in a fully coded piece of software can be expensive and risky — if you change one thing, chances are you’ll be required to change a lot more,” says Mitch Bishop, CMO of iRise.

iRise software is very similar to what Computer-Aided Design (CAD) was for physical product design in the 1980s and 1990s, except this time the idea has been applied to software applications. Now stakeholders can see what to expect from a proposed application before the time, money and effort have been spent developing and testing it.

According to the company, the next “big thing” in software application visualization for iRise will be predesigned templates: built, functional templates that will serve as the basis for available features according to industry demand. Basically, if your company wants an application visualized before building it and you want an eCommerce feature built in, you can purchase that feature from iRise and drag and drop it into your application interface.

For more on iRise, visit their website, http://www.irise.com/, and check out this blog post on our sister site: Enterprise Visualization: from requirements to specification.


March 31, 2010  5:12 PM

Software tool and quality certifications: Valuable or useless?

Profile: Yvette Francino

There are a bunch of software quality certifications, with more popping up all the time. Some of the classics are those offered by ASQ (American Society for Quality), IIST (International Institute for Software Testing), ISTQB/ASTQB (International/American Software Testing Qualifications Board) and QAI (Quality Assurance Institute). Then there are certifications offered for learning particular tools, put out by organizations such as HP, IBM and Microsoft, to name a few. There are also Scrum Master and Scrum Product Owner certifications for the agile crowd. With so many different certifications to choose from, which are the most valuable for QA professionals? Or are any of them valuable? It seems as though many people feel they are useless.

I asked agile practitioner Mark Levison in a recent podcast what he thought about certifications. Mark felt it was best to stay away from certifications that encouraged a “throw-it-over-the-wall” mentality or those certifications that recommended expensive tools.  He also didn’t feel certification was necessary for agile testers or developers and thought the same skills could be picked up from reading.

I, myself, have recently gotten a Scrum Master certification, so I’m quite interested in this controversy. Personally, I think any kind of industry training or professional development is beneficial. I learned a lot from the class and think that people who are new to Scrum would benefit from attending. However, I agree that the “certification” label does not mean much. The test is very straightforward, and if you fail, you can simply take it again. Basically, this particular certification just proves I took a two-day class. It doesn’t prove whether or not I’d be able to apply the skills I learned. I don’t put listings for all of the university classes I’ve taken on my resume, and those are much more comprehensive than the Scrum class. Why should a two-day class carry enough weight on a resume to make or break whether you get past HR’s first filter?

This lack of substantial proof of skill seems to be the basic issue for those who argue against certification. Does that make certifications useless? A recent press release by ASTQB showed that 89% of software testers surveyed felt they were more valuable to their organization after ISTQB certification. That makes sense to me. If you take a class, one would hope you would gain some knowledge and become more valuable to your organization. I don’t think anyone would argue with that. And if I were taking a survey asking if taking a class made me more valuable to an organization, I would absolutely answer ‘yes.’ But this doesn’t really quantify how much more valuable one is after getting certified. And a “certification” loses some of its ability to impress if it’s attained simply by passing a test.

But what else can be done? There are a couple of networks available to help those wanting to gain additional agile skills. One of these is called the Agile Skills Project. Using a wiki, this group has been applying agile methodologies to define an Agile Skills Inventory, identifying the skills necessary to work in an agile environment as well as offering suggestions about how to get those skills. Another group that offers a network, particularly for those interested in distributed agile, is the network I facilitate, Beyond Certification. The network, powered by Ning, allows those interested in this topic to discuss and share ideas for gaining agile skills.

In general, I think taking classes and then taking exams to test your knowledge is one way to learn. Getting certified in a particular skill proves that you took the initiative to take a class or study a body of knowledge and pass a test. Though I don’t think passing a test should be a primary factor in gaining employment, it does show that you have enough interest in a topic to want to learn. After you’ve gotten certified, work toward getting experience. If you’re not able to do this in your workplace, then check into local user group meetings. Check the search engines and find a software test and QA community to join. Volunteer to join a Scrum team for a non-profit or as a hobby. There are always opportunities to learn.

So, no, I don’t think quality certifications are useless. It’s a first step. Just don’t stop there. Apply the knowledge and keep on learning.


March 29, 2010  5:12 PM

James Whittaker on exploratory testing and scenario-based exploration

Profile: Yvette Francino

Last fall I listened to James Whittaker talk about his book, “Exploratory Software Testing,” at a virtual presentation hosted by uTest. I have since been able to read the book and follow up with Whittaker to pick his brain a little more about his view of exploratory testing.

“What is the definition of exploratory testing?” I asked, in an attempt to get a definition I could use for TechTarget’s popular “WhatIs” column. Whittaker was quick to note that his view of exploratory testing may not be the same as others’, but he took a stab at it: “Testing without a pre-defined test plan.”

Whittaker explained that the adjective “exploratory” is used in the same way it would be used to describe “exploratory surgery.” “There’s something wrong. You’re not sure what it is and so you explore. Eventually exploratory surgery will lead to other types of treatment.” Similarly, Whittaker explains, exploratory testing is used to explore problem areas so that further, more directed tests can be done. “Exploratory testing helps us document some of the primary testing paths through the software and discover problematic areas so we can explore further testing.”

Whittaker stressed that he felt exploratory testing was a means to an end and not the end of the testing, and said that this is where his opinion may differ from those of his colleagues.

“Using it as the actual end is very naive. It’s one part of the testing process. Don’t assume anything about the solution until you know more about the problem.”

Whittaker also noted that he felt there were some problems with exploratory testing.

“Exploratory testing doesn’t always apply. Some applications require automation to test thoroughly. Concepts like tours are very important. Exploratory surgeons have techniques, a certain way to enter depending on whether you’re operating on the pancreas or the heart or a toe.” Similarly, there are different ways you test depending on the technology. The tours can help structure your testing efforts without scripting them.

“Isn’t exploratory testing simply unscripted testing?” I asked. Whittaker agreed but pointed out that, as he states in Chapter 5 of his book, there are “hybrid” exploratory testing techniques which can include a combination of scripted tests or scenarios and unscripted tests. These can complement one another and provide for something Whittaker refers to as “scenario-based exploration.” Traditional scenario testing takes the tester down a pre-defined path reflecting expected usage of the application. But scenario-based exploration more closely imitates users  who often stray from the expected path.
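
As a rough illustration of the difference (my own toy sketch, not an example from Whittaker’s book), here is a small Java program that walks a scripted scenario but randomly strays from the expected path, the way scenario-based exploration imitates a real user:

    import java.util.Arrays;
    import java.util.List;
    import java.util.Random;

    public class ScenarioExploration {
        public static void main(String[] args) {
            // The scripted "happy path" a traditional scenario test would follow.
            List<String> scenario = Arrays.asList(
                "open login page", "log in", "search catalog",
                "add item to cart", "check out");
            // Detours a real user might take off the expected path.
            List<String> detours = Arrays.asList(
                "press the back button", "refresh the page",
                "paste a 5,000-character search term", "open a second session");

            Random random = new Random();
            for (String step : scenario) {
                System.out.println("scripted step: " + step);
                // Occasionally stray before moving on, then rejoin the script.
                if (random.nextDouble() < 0.4) {
                    String detour = detours.get(random.nextInt(detours.size()));
                    System.out.println("  exploratory detour: " + detour);
                }
            }
        }
    }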

Come check out six of the exploratory testing tours in this tip: Six tours for exploratory testing the business district of your application.


March 29, 2010  5:12 PM

Software development QA issues run parallel to automobile manufacturing mistakes

Profile: Daniel Mondello

Following a webcast session with Theresa Lanowitz earlier this week, Lanowitz and I entered the post-webcast debriefing room (basically a private phone conference line), where we had the opportunity to pick one another’s brains. Somehow, Toyota was brought up, and we ended up finding some common ground in our interests, even those outside of software development and testing: the automotive industry. Both Lanowitz and I agreed that the auto industry is (and has been) suffering from many of the same ailments we are concerned with in software.

The first spark was Toyota and the quality assurance disaster they are currently sorting through. Then a documentary we had both seen, “Who Killed the Electric Car?”, also found its way onto the discussion palette. Both topics were ones we were very passionate about. For me especially, it was the car angle: how the American auto manufacturer GM had really dropped the ball by killing a market it was the first to enter, the electric vehicle market.

GM was quick to recognize that the combustion engine was a motor we could not rely on indefinitely. They knew that one day oil and gas would be either depleted or so expensive that they would no longer be affordable to anyone in their consumer base. So they decided to build the first electric car, the EV1, which was based and sold on the Saturn platform.

After bearing the brunt of scrutiny and ridicule from their investors (mostly made up of oil and gas distributors), GM decided to pull the plug on the project and recall all 77 EV1s already sold and on the road. They based the recall on “potential software and electronic complications” uncovered in post-sale testing. Trucks arrived in front of every EV1 owner’s household and loaded up these forward-thinking vehicles. Their final destination was one of two places: a handful of fortunate EV1s made it into auto museums; the bulk of them (just under 75) were dismantled and buried in an Arizona desert.

Lanowitz was more or less baffled by GM’s plug-pulling on its production-line electric car, but the conversation quickly moved on to Toyota. There is a real debate going on in the media over whether the Toyota fiasco stems from a mechanical issue or from the drive-by-wire, computer-controlled throttle. It’s a good question, and no one seems to have all the facts, but it leads to a sensitive area of technology: with all these various software vendors building functionality into modern vehicle drive trains, braking systems, radios, communication systems and so on, could there be a problem of over-integrating systems?

Let me give you the benefit of my extensive background in the automotive space. Fifteen years ago, the level of engineering we see today was virtually non-existent. Sure, there were computers mounted in vehicles that controlled the basics of the engine by monitoring RPMs, transmission temperature and fluid levels, but they weren’t in constant contact with all the other aspects of the vehicle. They were very, very basic: if something broke, they signaled a light to flick on the dashboard, and that was about it.

These days, every system is interacting with every other system. The brakes communicate with the radio, the radio is speaking to the door-locking sensor, the door-locking sensor is dependent on the engine speed–it is like being in a crowded room with numerous conversations going on, all about different topics, and somehow everyone is directly involved. It will only get more complex.

As experts on SearchSoftwareQuality have been saying for years now, software is becoming increasingly complex, and automotive integration is mimicking that level of complexity. These days if there is a vehicle issue, you bring it to a mechanic, and this mechanic (believe it or not) probably spends just as much time in front of a diagnostic computer as he or she does under the hood. There may come a day, in the not-too-distant future, when a run-of-the-mill mechanic will need a Master’s degree in computer engineering just to become certified.

You software testers dealing with multiple platforms and continuous integration are probably plagued with similar issues and nodding your heads in agreement. But in either industry, if huge strides are taken to keep quality high in these software-based systems, then perhaps someday soon we can lay many of these modern quality and integration issues to rest in the Arizona desert alongside GM’s first electric vehicle.


March 22, 2010  2:51 PM

Follow SearchSoftwareQuality on Facebook and Twitter

Profile: Daniel Mondello

For most of you in the software development space, and for me in choosing online software journalism as a career, a tremendous amount of our time is spent in front of computers and online. If you are like me, being on your computer and having Facebook open are synonymous with one another. Few could dispute that Facebook and its legions of followers, friends and groups are replacing numerous aspects of what was once everyday life. The newspaper, mail, telephone, address books, Valentine’s Day cards and basic phone calls have all found themselves in social networking’s crosshairs.

That being the case, we would be foolish not to ride the Web 2.0 wave–so we have. SearchSoftwareQuality is now on Facebook. And while you may already be a daily reader of SearchSoftwareQuality.com, let me assure you that there are many fantastic reasons to “friend” us.

Think of SearchSoftwareQuality’s Facebook profile as a quick way to get daily updates on what is happening in your career, the software industry and cyberspace in general. We update our Facebook profile often, providing you with teasers and links to important software quality and testing stories (which are archived in the “Notes” portion of the page). We have a video library set up that will be updated LIVE from conferences and software venues–finally, you’ll get up-to-the-minute video coverage of the conference you weren’t able to expense through your company.

We are hoping that our presence on Facebook might also open up the communication flood gates that have remained fairly restricted on our other media channels. As our friend, we warmly welcome you to ask questions, submit software tips, “like” certain content and not hold back on telling us what you didn’t like. Communication with us on Facebook might be the ticket you need to have your company, application problem or solution story heard across our vast fanbase and readership.

To “friend” us, type “SearchSoftware Quality” into the search bar on Facebook and prepare yourself for a wealth of incoming software information that matters to you.

If Twitter is your thing, we’re there, too! Our content will automatically be tweeted, so that you’ll be able to see the latest information as soon as it’s available. Follow us at @softwaretesttt



March 19, 2010  9:55 PM

Too old to learn new technologies? Never!

Profile: Yvette Francino

Recently SearchSoftwareQuality published a Q&A with Navot Peled, CEO of Gizmox Ltd., entitled How new Web application platforms put dev/test pros’ careers at risk. It’s not often that I disagree with the experts, but as a former developer way past my spring chicken days, I have to say I was a bit insulted by the implication that older developers would not be able to learn new technologies.

Certainly, emerging technologies can be a challenge to keep up with. I’d just attended a SQuAD meeting last week in which the speaker, Igor Gershovich, detailed the difficulties of automated testing of Rich Internet Applications. However, I absolutely don’t buy into the notion that an older person is at a disadvantage in learning these new technologies. In fact, I would say it’s often just the opposite. Most of us have lived through many shifts in technology and are used to the constant changes that our industry throws at us. We don’t run from it. We thrive on it!

The most brilliant technologist is not well-versed in every technology. There are just too many programming languages, operating systems, databases, tools and systems to become an expert in all of them. It’s important to understand what is happening and what is changing in the world and to update our skills to meet the needs of the industry, but does that mean that if we don’t know AJAX we’re over the hill? Those of us that have been around software for years and years usually are very quick to pick up new skills.  I can’t speak for the entire older set, but I can tell you that I absolutely love progress and when some new technology comes out, I want to be the first to jump on the new bandwagon and test it out. One of the great things about my job at SearchSoftwareQuality.com is that I get to be on the forefront of new developments, reading about the latest industry trends.

Not knowing a particular new technology is not what will hurt us in the job market.  What will hurt us is if we stop wanting to learn. If we stay stuck in a world where we only know one way of coding and we refuse to be open to the wonderful changes that surround us, we are indeed going to limit our potential. Instead, we need to read, learn, grow and embrace change.  If we do that, whether we are 20 or 90, we will be a valuable resource to any employer.  There most likely will come a day when we will want to retire,  but as long as we keep learning, there will never come a day when we are unemployable.


March 18, 2010  9:41 PM

TSS Java Symposium 2010: Dependencies, complexity make software QA tricky

Profile: Jan Stafford

QA managers, take heart! Not all Java developers think that software quality processes are unnecessary overhead. The proof? This week’s strong attendance, and the busy Q&A sessions during and after TheServerSide Java Symposium session, Software Quality: The Quest for the Holy Grail.

Defining a project’s basic requirements and viewing dependencies as an integrated part of the project are critical elements in quality and ultimate success today, said speaker Jesper Pedersen, core developer for JBoss by Red Hat and project lead for JBoss JCA, a Java Connector Architecture container.

Because development platforms have more business-specific code and platforms are a larger piece of the pie, finding where issues are located is more difficult, Pedersen said. It has become more important to do good integration testing. Also, dependencies must be managed well, as if they were part of the application.

I caught up with Pedersen after his session and asked him why managing dependencies is so tricky and why developers aren’t crazy about doing quality assurance processes. He answers those questions in this video.

On TheServerSide 2010 Java Symposium slideshare.net site, you can view Pedersen’s notes for this presentation.


March 17, 2010  6:51 PM

Automated testing of Rich Internet Applications (RIA)

Profile: Yvette Francino

Igor Gershovich, president and principal consultant of Connected Testing, Inc., spoke this month at the Software Quality Association of Denver (SQuAD) meeting. The topic was one the group had been clamoring for: automation of Web 2.0 Rich Internet Applications (RIA).

Gershovich started by talking about Web 2.0, sometimes known as “social software,” and the technologies used to create these types of applications. Popular RIA frameworks and toolkits include AJAX, Adobe Flash/Flex, Google Web Toolkit and Silverlight. Gershovich said there were hundreds more, but focused his presentation on AJAX, one of the most popular RIA technologies, probably due to its price tag: free!

AJAX stands for Asynchronous JavaScript and XML, and it uses techniques that combine and exploit long-standing Web technologies. Examples of the Web technologies AJAX uses are XHTML and CSS for structure and presentation and the Document Object Model (DOM) for displaying and manipulating objects. Gershovich went over some of the pros and cons of using AJAX and then explained how Google Web Toolkit (GWT) can be used to write AJAX front-end code in Java, which is then compiled into optimized, standalone JavaScript files.
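
To make the Java-to-JavaScript workflow concrete, here is a minimal sketch of a GWT entry point that fires an asynchronous request. The class names come from GWT’s public HTTP and widget APIs; the /data.xml endpoint and the page layout are invented for illustration.

    import com.google.gwt.core.client.EntryPoint;
    import com.google.gwt.event.dom.client.ClickEvent;
    import com.google.gwt.event.dom.client.ClickHandler;
    import com.google.gwt.http.client.Request;
    import com.google.gwt.http.client.RequestBuilder;
    import com.google.gwt.http.client.RequestCallback;
    import com.google.gwt.http.client.RequestException;
    import com.google.gwt.http.client.Response;
    import com.google.gwt.user.client.ui.Button;
    import com.google.gwt.user.client.ui.Label;
    import com.google.gwt.user.client.ui.RootPanel;

    public class AjaxDemo implements EntryPoint {
        public void onModuleLoad() {
            final Label result = new Label();
            Button fetch = new Button("Fetch data");
            fetch.addClickHandler(new ClickHandler() {
                public void onClick(ClickEvent event) {
                    try {
                        // Asynchronous GET: the page stays responsive while waiting.
                        new RequestBuilder(RequestBuilder.GET, "/data.xml")
                            .sendRequest(null, new RequestCallback() {
                                public void onResponseReceived(Request req, Response resp) {
                                    result.setText(resp.getText()); // update the DOM on arrival
                                }
                                public void onError(Request req, Throwable e) {
                                    result.setText("Error: " + e.getMessage());
                                }
                            });
                    } catch (RequestException e) {
                        result.setText("Request failed: " + e.getMessage());
                    }
                }
            });
            RootPanel.get().add(fetch);
            RootPanel.get().add(result);
        }
    }

GWT compiles Java like this into the optimized, standalone JavaScript mentioned above.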

Can automation tools such as HP’s QuickTest Pro (QTP) or Selenium be used to create automated scripts for AJAX? Gershovich says he is often asked this question. He says the real question is: does QTP (or Selenium) work with custom objects from various JavaScript toolkits? The answer is yes, but it’s not easy!

Gershovich described some of the technical challenges involved in automating GWT-based applications:

  • They use custom or third-party Web controls
  • They have no unique object properties
  • They require synchronization for AJAX calls
  • They rely on Cascading Style Sheets (CSS)
  • There is no common design framework between GWT applications
  • You can’t view the HTML using View->Source

Gershovich went on to show the technical details of how these obstacles can be overcome, but the bottom line is that advanced test automation expertise is required. Gershovich’s examples used QTP, but he said the same techniques could be used with other automation tools such as Selenium. Coordination with the development team is required as well, in order to gain insight into the objects and their properties.
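
As a rough sketch of how the synchronization obstacle is handled (mine, not Gershovich’s; his examples used QTP), here is a minimal Selenium WebDriver test in Java. The URL and element IDs are hypothetical; the point is to wait for the DOM update the AJAX call produces instead of asserting immediately after the click.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;

    public class AjaxSyncTest {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://example.com/app"); // hypothetical AJAX application
                driver.findElement(By.id("loadData")).click();

                // The click fires an asynchronous request, so the result is not
                // in the DOM yet; poll for up to 10 seconds for it to appear.
                WebDriverWait wait = new WebDriverWait(driver, 10);
                WebElement result = wait.until(
                    ExpectedConditions.presenceOfElementLocated(By.id("result")));

                System.out.println("AJAX content: " + result.getText());
            } finally {
                driver.quit();
            }
        }
    }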

Gershovich’s presentation, as well as past SQuAD presentations, is available for download.

