May 26, 2009 7:38 PM
Posted by: Rick Vanover
There is no doubt that test environments are the lifeblood of high-quality technology solutions. In a recent discussion with another IT professional, the issue of test environment licensing came up, and it is clearly a grey area that organizations handle differently. In the case of Microsoft licensing, larger enterprises have a distinct advantage in licensing test environments when enrolled in Microsoft Software Assurance (SA). While there may be slight differences in what differentiates a test environment from a QA environment, generally a QA system has a longer lifespan. Further, QA systems are usually fully licensed in all regards, like their production counterparts.
Fine print and vague or missing clauses are common in software licensing, and test environment licenses share those same ills. For organizations that do not maintain Microsoft SA, the base licensing can be very expensive to maintain for Windows Server, SQL Server and other systems. For the operating system, evaluation licenses can usually accommodate temporary use; for ongoing use, the practice and options become unclear. Microsoft SQL Server continues to offer a free edition, Microsoft SQL Server 2008 Express. From a test/QA standpoint, the issue becomes whether the Express edition is representative of the end configuration or product being tested.
This is, for the most part, not an issue for organizations utilizing free products end-to-end, such as Linux for the operating system and open source databases such as MySQL. There may be licensing considerations, however, if any commercial tools or other licensed software titles are used or tested.
How are you approaching this topic? Ideally all systems are licensed the same way, but the informal tiers of testing may be omitted from an organization’s larger licensing initiative. Share your comments below on test environment licensing.
May 21, 2009 1:53 PM
Posted by: Jan Stafford
Agile software development
Jon Kern, co-author of the Agile Manifesto, told me recently that many companies won’t adopt the agile development methodology soon. Why? Some companies are doing just fine with waterfall, he said, because it has worked in the past and is still working. Also, they see little chance that their business analysts and leaders will agree to get involved in the development process.
Intrigued by Kern’s assessment, I asked some software development and testing veterans for their views. I asked about their work with and opinions on why companies stick with waterfall development.
It’s true that waterfall methodologies can work quite well, said Mike Kelly, a software testing and development consultant and a fan of the agile methodology.
“I know this will surprise some people, but I’ve worked on several successful waterfall projects,” Kelly said. “It’s crazy, I know. Seriously, I’ve been a member of teams that do roughly the same thing, with minor variation, again and again. The business context is well understood, the requirements are mostly right upfront, and the team kind of knows what needs to be done to be successful. If it’s working, it might not make sense to change it.”
Bernard Golden — CEO of IT infrastructure consulting firm HyperStratus — has worked with development teams that don’t think the agile methodology can be effective in large-scale enterprise projects. They think agile is a good micro practice that should live within a good macro project management/planning process. To an extent he agrees, and he thinks agile is “best suited for scoped projects that have small user populations — built on top of robust system infrastructures built as traditional projects.”
Golden is also doubtful that having a “business person be part of the development process ensures that systems will achieve user satisfaction.” Thinking that “one guy can subsume all end user viewpoints and prevent end user political unhappiness strikes me as quite naive about the way organizations work.”
In this short video clip, Golden explains more about why agile may not stack up.
Kelly has also worked on projects where business managers just didn’t want to do more than turn in a list of requirements. “The business [people] didn’t get into business to talk about development methodologies, and to them, it’s often a distraction,” he said.
Requirements analysis expert Robin Goldsmith says business managers simply don’t believe that doing systems work is their job.
“Agile is very much driven from a programmer’s perspective, and business folks don’t identify, understand, or agree with it,” said Goldsmith, president of the consultancy Go Pro Management Inc. “As an analogy, I have many years of experience working in the most technical of programming roles within IT; but these days I have no interest in fooling around with software internals — I just want stuff to work, and that’s not unreasonable.”
Goldsmith sees the agile-versus-waterfall debate as the subtext behind a larger business-versus-IT cultural issue that causes continual problems for software projects.
Do you agree? I’ll be continuing this discussion with software testing and development experts, and your input would be very valuable. You can share your experiences and views by commenting below or writing to me at email@example.com.
May 15, 2009 2:18 PM
Posted by: Jan Stafford
Moving software development and testing to cloud environments wasn’t a hot topic at Microsoft TechEd 2009 in Los Angeles. Sure, there were some TechEd sessions on cloud computing, and
Microsoft cloud computing evangelist Steve Martin talked up Azure; but a good number of attendees, particularly in the software development field, said cloud isn’t on their agenda now.
“People are waiting to see if cloud computing has staying power and what its true importance will be,” Wayne Ariola, strategy vice president for Parasoft Corp., told me.
Then again, some attendees and vendors at the show are gung-ho about the cloud. For example, Greg Allen — First Financial Bank SCCM 2007 administrator — told me he’s keen on Microsoft’s virtualization and cloud technologies.
Let’s take a look at what people on both sides of the cloud had to say at TechEd.
Some attendees said that development teams with a wait-and-see cloud adoption strategy could be eating early adopters’ dust. They noted the advantages of using cloud environments for software testing, particularly the ability to test applications in a full production environment.
While the pro-cloud people I met at TechEd don’t advocate dropping development into cloud computing without due diligence, they do think that companies should prepare for and do pilots in cloud environments today.
“It’s not a future technology. It’s a now technology,” said Margaret Lewis, AMD director of commercial ISV marketing. Indeed, she thinks cloud computing is a disruptive technology with the potential to help the economy recover. She foresees the emergence of a bevy of boutique cloud providers for various vertical and horizontal markets. In this video, she explains why cloud preparation and usage should be on software companies’ agenda.
While Lewis sees cloud computing as a recession beater, others at TechEd said that the economy will hold up cloud adoption.
The recession has forced many companies to stall, scrap or not start development projects, and moving what’s left to the cloud isn’t a compelling objective now. “They’re also waiting because they don’t see cloud as operationally necessary or strategically competitive,” said Ariola. Also, he’s talked to quite a few companies that recently adopted virtualization to consolidate servers, particularly in their test/dev labs, and are happy with the results. They don’t feel an immediate need to make another move right now.
In my TechEd conversations, about a dozen people opined that cloud services aren’t mature or production-ready. Their views reminded me of my recent interview with Eugene Ciurana, director of systems infrastructure at LeapFrog Enterprises, a large U.S. educational toy company, in which he warned that cloud service-level agreements aren’t up to par.
Cloud providers have to work out some thorny issues before scores of development teams get on board, Ariola said. In his work in the field, Ariola has heard many ISVs express concerns about cloud security. Those fears may be well-founded, according to this week’s Forrester Research report citing problems early cloud adopters have had with customer privacy protection.
Ariola thinks dev/test could be a killer app for cloud computing at some point, but right now “it’s not on most application developers’ radar.” He likened the cloud’s clout today to that of service-oriented architecture (SOA) in its early days. “This is a major change, and it won’t take place overnight.”
Yes, the cloud adoption pros and cons debate will continue for a while. Watch this blog, as well as SearchSoftwareQuality.com and SearchCloudComputing.com, for more information. Meanwhile, check out the news and views in these recent articles and videos:
May 12, 2009 8:43 PM
Posted by: Jan Stafford
Developers should get their noses off the coding grindstone and spend time developing expertise in a future technology, consultant Ted Neward advised during the Microsoft TechEd panel discussion on “Surviving the downturn.” Too often, he said, developers are so focused on current projects that they don’t think about advancing their careers.
So, if developers should be studying future technologies, what should they choose? “Choose something that will make you rich and something that will make you happy,” said panelist and consultant Aaron Erickson.
In this post, I’ll share some of the career advice from the panel discussion, held at the TechEd conference in Los Angeles. Panelists included Microsoft MVP Rachel Appel; consultant Aaron Erickson; Headspring CTO Jeffrey Palermo and principal consultant Eric Hexter; and moderator Bryan Von Axelson, a Microsoft partner solutions advisor.
When choosing which technologies to study, do some homework so you’ll know which technologies are not going to be winners in the long term, Palermo said. For example, he ignored the first version of Windows Workflow. Other panelists laughingly agreed he’d made a good choice.
“Work on something that raises the bar,” said Hexter. Choose technologies that can help your company leap ahead of competitors.
Neward advised developers to learn a new language every year, even though it may be an exhausting process. Frankly, he said, developers must accept that they’ll always be in school. “If you’re not getting better, you’re getting worse,” he said. “If you’re not excited by a constant pace of learning, you’re not in the right field. The IT space has a higher pace of change than other industries.”
A good practice for ongoing learning is to first learn concepts more than details, Neward said. Once your interest is piqued, other panelists added, then drill down and become an expert on the how-to details.
Don’t just focus on development to do your job better, panelists advised. Beyond studying technologies, expose yourself to different segments of development and IT.
Just being an expert in one job, like programming, is short-sighted, Palermo said. Pay attention to what the other IT guys are doing.
“Go for something beyond your comfort zone,” Appel said. Learning about database management, for instance, could help a developer approach problems in a different manner. Others suggested shadowing data center, network and other managers.
Too many IT people focus on technologies and not business processes, Palermo said. Those who understand the fundamentals of solving business problems will develop better products that meet business requirements.
After the panel discussion, I interviewed Neward about issues relating to software testers. He said that being able to pitch the value — in dollars and cents — of your team’s role in development is critically important during an economic downturn. Here he answers questions about the recession’s impact on software testing, ways testers can demonstrate that value and career strategies.
May 12, 2009 8:36 PM
Posted by: Jan Stafford
The keynotes at Microsoft TechEd in Los Angeles today brought mixed reactions, according to the 23 IT pros I quickly interviewed as they exited. Most were enthused about promised new enhancements to Windows 7 and Windows Server 2008 R2’s virtualization features. Others wanted Microsoft to give them new technology that they can use today.
The happiest news, most attendees said, is that a new Windows 7 and Windows Server 2008 R2 RC may be released this year. The release date was sort of promised by Bill Veghte — senior vice president of the Windows Business at Microsoft — during his keynote. He said that testing and partner feedback indicates that “Windows 7 is tracking well for holiday availability.”
Microsoft’s planned enhancements to existing virtualization products are good news for Greg Allen, SCCM 2007 administrator for a large financial institution. He sees virtualization bringing great opportunities for savings on server costs and more configuration flexibility.
On the flip side, some attendees wanted more birds in the hand than promises of future releases. “I didn’t see or hear anything new or exciting,” Chris Aker of Reed Elsevier Inc. told me. “Mostly, they talked about products I’ve heard about before.”
Zennith Borrego of Ysleta School District’s Technology also came away from the keynote with a positive take on Windows 7. Here’s a video clip from my interview with her.
For in-depth news and updates, watch this space and TechTarget’s Microsoft TechEd news roundup.
May 12, 2009 7:46 PM
Posted by: MichaelDKelly
With Microsoft TechEd happening this week, I thought it might make sense to look at some of the content being presented. For those who can’t attend the conference, Microsoft has provided TechEd Online, a series of “channels” where you can watch talks and interviews related to the conference. It’s on the Developer Tools, Technologies, and Practices channel that I found Grigori Melnik’s interview of Keith Stobie, a Test Architect at Microsoft, on Spec Explorer for Visual Studio.
If you’re not familiar with model-based testing, it’s a flavor of software testing where test cases are generated based on a model that describes various aspects of the application. By creating a model of the application and using that to generate various sequences of calls and their expected results, you can come up with a large amount of varied and complex test cases. These types of tests lend themselves to finding edge conditions, long-sequence errors, and issues related to application or system state.
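To make the idea concrete, here’s a minimal sketch of model-based testing — not Spec Explorer itself, just an illustrative Python toy under assumed names (`ACTIONS`, `Counter` are invented for this example). A model of a bounded counter describes the expected effect of each action; call sequences are generated exhaustively from the model and replayed against the implementation, with the two compared after every step.

```python
import itertools

# The model: expected state transitions for a counter clamped to 0..3.
ACTIONS = {
    "increment": lambda s: min(s + 1, 3),
    "decrement": lambda s: max(s - 1, 0),
    "reset": lambda s: 0,
}

class Counter:
    """The implementation under test (deliberately trivial)."""
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value = min(self.value + 1, 3)
    def decrement(self):
        self.value = max(self.value - 1, 0)
    def reset(self):
        self.value = 0

def run_sequence(seq):
    """Drive the implementation with a generated action sequence,
    checking it against the model's expected state after each step."""
    impl, model_state = Counter(), 0
    for action in seq:
        getattr(impl, action)()
        model_state = ACTIONS[action](model_state)
        if impl.value != model_state:
            return False
    return True

# Generate every 4-step call sequence the model allows (3^4 = 81).
all_seqs = list(itertools.product(ACTIONS, repeat=4))
failures = [s for s in all_seqs if not run_sequence(s)]
print(len(all_seqs), len(failures))
```

Even this toy shows why the technique finds long-sequence and state-related bugs: the generator mechanically covers action orderings a hand-written test suite would likely skip.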
Stobie, who will be giving two talks on this topic this week at TechEd, points out that model-based approaches can show design issues up front. In one of his talks he’ll describe how Microsoft used the Spec Explorer plug-in for Visual Studio to verify over 50 of the Windows Protocols. His second talk builds on that to lay out how Spec Explorer can be used for both model exploration and for automatic test generation. Having spoken with Stobie on multiple occasions at peer workshops, I’m sure those talks will be worth the price of admission. I consider him one of the best minds in this space.
You can find Grigori’s interview of Keith on the Microsoft TechEd Online Developer Tools, Technologies, and Practices channel. For more on Spec Explorer for Visual Studio, check out Spec Explorer for Visual Studio by Su Llewellyn.
May 11, 2009 9:20 PM
Posted by: Jan Stafford
Testing software in the cloud: Pros and cons
Consultant Bernard Golden — author of the IT bestseller Virtualization for Dummies — is an advocate of virtualization and cloud computing technologies, but he doesn’t want anyone to approach either wearing rose-colored glasses. I met and talked with Golden recently about a slew of IT and development topics. In this post, I’ll share his thoughts on the pros and cons of software testing in cloud environments. In his day job, by the way, Golden is CEO of the IT consulting firm HyperStratus.
Let’s start out by meeting Golden – via video — and hearing about his experience with a software testing project on Amazon’s cloud.
Expanding upon this sound bite, Golden told me that doing software testing in a cloud environment makes sense for several reasons. For one thing, it’s easier and less costly to mirror a production environment in the cloud. Very few development labs actually have the exact server and software environment as does the data center that runs the application in production. The cost of that set-up would be astronomical. Along the same lines, development in the cloud makes it possible to scale up and scale down the application for testing of various load sizes…without incurring hardware costs. So, the cloud supports difficult testing requirements like load testing and scaling that many labs can’t support.
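The scale-up/scale-down point can be sketched in a few lines of code. This is only an illustration under assumed names (`fake_request` and `run_load_test` are invented here): a real cloud load test would provision instances and drive HTTP traffic at the deployed application, where this sketch just simulates latency locally and varies the concurrency level.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    """Stand-in for a call against the system under test; in a real
    cloud-hosted load test this would be an HTTP request to the
    provisioned environment."""
    time.sleep(0.01)  # simulated service latency
    return 0.01

def run_load_test(concurrency, total_requests):
    """Fire total_requests calls at the given concurrency level and
    report simple throughput/latency statistics."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(fake_request, range(total_requests)))
    return {
        "requests": total_requests,
        "concurrency": concurrency,
        "avg_latency": sum(latencies) / len(latencies),
        "elapsed": time.time() - start,
    }

# "Scaling up" here is just raising concurrency; in a cloud setting
# the worker pool would instead grow across provisioned instances,
# without buying hardware for the peak load.
for c in (1, 5, 10):
    print(run_load_test(c, 20))
```

The appeal Golden describes is that the outer loop — adding capacity for a bigger test run, then releasing it — costs only instance-hours in the cloud rather than permanent lab hardware.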
Golden pointed out some challenges in software testing in the cloud. First, there’s the tricky business of finding the right cloud services provider. Then, software testers will have to learn some new skills. Development teams have to be careful about integration with internal applications, as service endpoints are required and configuration will be different in a cloud setting.
Developers and software testers will also have to figure out how to handle application lifecycle management in cloud environments, Golden said. Expect to see lifecycle and architecture testing issues with web and application servers, the load balancer and databases.
Golden will be writing about his work in cloud and virtual lab environments in a SearchSoftwareQuality.com series. The first installment is Testing software with Amazon Web Services. In March, he wrote an article for SearchServerVirtualization.com on Choosing an application architecture for the cloud.
What are your concerns about, hopes for or experiences with software development and testing in virtual labs and/or cloud environments? Tell me your story via video or a simple email: firstname.lastname@example.org.
May 8, 2009 5:20 PM
Posted by: Jan Stafford
Software Quality Insights blogger Mike Kelly has kept this blog up to date on software testing. Rounding things out, we now welcome software quality assurance (QA) expert Stuart Yarost as a guest blogger. A computer engineer with numerous certifications, Yarost has worked in avionics testing, QA and development for more than two decades.
Yarost is also a devoted member of the American Society for Quality (ASQ) and vice chair of programs for ASQ’s Software Division. Currently, he is helping plan programs for the World Conference on Quality and Improvement (WCQI), ASQ’s yearly national conference. Running from May 18-20 in Minneapolis, the conference will offer over 100 sessions.
“There is something for you no matter your area of interest,” Yarost said. “As a software quality assurance engineer, I will be attending the Institute of Software Excellence (ISE), one of three special institutes being held concurrent with WCQI.” The ISE, run by ASQ’s Software Division, is being held for the first time in 2009.
All the ISE sessions pertain to areas that affect software quality. “They not only cover how to test, but also how to control the whole software development process, ensuring that the software product is developed with software quality from the beginning,” said Yarost. Topics covered will include Capability Maturity Model Integration (CMMI), lean practices and data-driven software management.
The conference’s presenters are “the movers and shakers in the software quality field,” Yarost said. For example, Bob Stoddard — who is presenting on “CMMI High Maturity Made Practical” — was part of the team that developed CMMI. Also, Linda Westfall — presenting “Software Is a Risky Business” — was the first person to receive ASQ’s software quality engineer certification.
There’s a full listing of ISE sessions on the ASQ conference site. Besides the presentations by Stoddard and Westfall, other sessions include the following:
- Kandy Senthilmaran – How to Set up IT Dashboards Using the Critical to Quality (CTQ) Process
- Zigmund Bluvband, Sergey Porotsky – Reliability Centered Lean Software Testing
- Timothy G Olson – How to Lean Processes and Procedures Using Best Practices
- Taz Daughtrey – Data-Driven Software Management
- Carol Dekkers – Are You Smarter Than a CSQE?
- Mark Paulk – Selecting and Implementing a Best Practice Framework
Yarost is one of the hosts of the Software Division hospitality suite on Monday, May 18. He’ll be blogging from the conference for Software Quality Insights.
May 8, 2009 12:03 AM
Posted by: MichaelDKelly
Earlier this week Micro Focus, best known for their Cobol application development tools, announced that they would be acquiring some product lines from Compuware Corp., as well as Borland. Both Borland and Compuware are makers of software testing and quality monitoring tools, so this news has created a buzz in our community.
The Borland deal brings Micro Focus a full suite of application lifecycle management products, including more testing products. Borland acquired Segue Software in 2006 and currently supports SilkCentral Test Manager, SilkTest and SilkPerformer — tools that compete in the marketplace with Compuware’s. It will be interesting to see what direction Micro Focus takes as it consolidates the offerings.
Compuware’s lines include some cool products:
- QADirector, Compuware’s test management solution for providing a framework for managing the entire testing process;
- DevPartner, Compuware’s award-winning debug, analysis, test and tuning applications for Microsoft Visual Studio and Java;
- TestPartner, Compuware’s flagship automated testing tool for traditional functional test automation;
- QALoad, Compuware’s load testing tool for load test development, execution, and performance analysis;
- and the less well known ApplicationVantage, which can help find and fix performance and infrastructure problems.
According to their press release, the acquisition moves Micro Focus into the $2 billion global application testing and automated software quality market. Micro Focus has made a number of acquisitions over the last three years, including Liant Software, which provided COBOL and PL/I tools; NetManage, which web-enabled legacy applications; and Relativity Technologies, which provided solutions for enterprise application modernization and application portfolio management.
The Compuware suite of tools appears to fill the quality assurance gap in the Micro Focus suite of tools.
I don’t know what the numbers are, but based on my experience I suspect Compuware is probably the fourth largest player in the automated testing, load testing and test management tool space, behind HP Mercury, IBM Rational and Microsoft Visual Studio Team System. It’s been my experience that HP Mercury is the juggernaut of the industry, and teams using the MS and IBM products are doing so because they are already using other tools in the vendor’s suite (such as WebSphere or Visual Studio). It will be interesting to see if Micro Focus can position the Compuware tools in a similar way.