Earlier this month, CollabNet announced Agile Assessment, a new consulting service that helps IT organizations assess the status of existing Agile practices and develop strategies to extend the benefits of Agile-based methodologies across the enterprise.
I had the opportunity to speak with David Parker, Vice President of CollabNet’s Scrum Business Line Unit, about the announcement.
SSQ: Can you tell us more about the Agile Assessment? Is it a series of questions like a survey, or does a team come in and observe? Does an organization get scored, and then are there recommendations about how the team could incorporate more Agile practices?
David Parker: CollabNet’s Agile Assessment is one of the most critical steps an organization can take to become more Agile. First, it clearly defines the goals you are seeking to accomplish while weighing the business context in which you operate. Second, it delivers a snapshot of your organization, helping you to determine the extent that you have embraced Agile. And finally, it delivers a crisp, prioritized plan that defines the practical steps you should take to become more Agile.
The process by which we build this assessment includes a detailed questionnaire, on-site observation of the development teams and free-form discussions with management stakeholders.
SSQ: The announcement says the service includes: “strategies to extend the benefits of Agile-based methodologies across the enterprise.” What type of “Agile-based methodologies” are we talking about? Scrum? XP? Or things like “collaboration” and “transparency”?
Parker: Although CollabNet has a long history of involvement with Scrum, we do recognize that 1) most organizations adopt a hybrid Agile approach that incorporates aspects of Agile and Waterfall and 2) new methodologies like Lean/Kanban are emerging. As such, our assessment is methodology-agnostic. As mentioned above, a critical part of the assessment is the prioritized action plan that helps companies determine the steps they need to take to become more Agile. From our perspective, it does not matter what flavor of Agile the company is adopting. Our goal is to simply help the organization become more Agile.
SSQ: Agile practices promote face-to-face communication and collaboration. Is the team makeup (co-located or distributed) part of the assessment? What are your thoughts on co-location vs. distributed teams?
Parker: Geographically distributed teams are a reality today. It is the rare enterprise that has not, to some degree, embraced a distributed development model. As such, scaling Agile to the enterprise necessitates using Agile in a distributed context. Having worked with hundreds of clients with distributed development models, we fully appreciate the challenges they bring. But the benefits of access to deep talent pools and of reduced cost (though to a lessening extent) make the net result a positive one for the enterprise. We have proven that Agile can add great value to distributed development organizations.
CollabNet’s Agile Assessment examines the interaction of distributed teams. We look at release planning, sprint planning, backlog management, and organizational context. In this way, the assessment helps organizations understand where things are working well and where there is room for improvement, and provides a plan for better-integrated distributed teams.
SSQ: Once again, on the question of “extending across the enterprise,” can you elaborate? Are we talking about using Agile practices for sales? Marketing? Governance? Operations?
Parker: Although CollabNet’s Agile Assessment focuses on the software development organization, we recognize that the development organizations do not work in a vacuum. The truly Agile enterprise brings multiple stakeholders into the development process, and into IT as a whole. When we think of extending Agile across the enterprise, we are looking at ways to better engage with stakeholders in other parts of the organization.
SSQ: Do the services include an evaluation of the organization’s ALM toolset and/or recommendation for tools?
Parker: Although tools can play an important role in a broad-scale Agile adoption, CollabNet’s Agile Assessment is tools-agnostic. Evaluation of specific tools is not part of this assessment.
SSQ: Are “best practices” for Agile ALM available?
Parker: With hundreds (thousands?) of large-scale ALM and Agile customer engagements, CollabNet has built up a huge repository of expertise in Agile practice, Agile process and ALM. We believe that we are one of the few vendors that can really combine these three for enterprise adoption. Each and every engagement we enter into leverages this vast experience. CollabNet case studies provide real-world examples of our implementations.
SSQ: Are these Agile development coaching services? If so, is a particular methodology such as Scrum recommended?
Parker: The CollabNet Agile Assessment is not a coaching engagement. Rather, it is designed to provide a snapshot of an organization’s adoption of Agile techniques and a prioritized set of recommendations for helping improve agility. We seek to be tools-agnostic, and we do not favor one flavor of Agile over another. That said, CollabNet does have deep expertise in Scrum. We’ve trained more ScrumMasters than any other vendor, and have on staff an impressive set of Certified Scrum Trainers and Coaches. If Scrum is the best fit for an organization, we are certainly well-equipped to help them adopt it.
Recently, SSQ created a quality metrics guide which includes a series of articles, tips and stories related to measuring software quality. It’s a complicated and controversial topic with no easy answers. We asked our readers to weigh in, and I wanted to share a couple of insightful responses we received.
Darek Malinowski, an ALM Solutions Architect at Hewlett-Packard, writes:
In my opinion, we should start from business requirements. First, we should estimate the weight (criticality), or business impact, of each requirement. Business impact relates to loss of money, loss of confidence, loss of market and so on. So the best way to measure quality is requirements coverage by tests weighted against requirements criticality. This can tell us whether the product we deliver to the business provides all the functionality required.
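Malinowski’s metric lends itself to a simple calculation. A minimal sketch, assuming a list of requirements with invented names and criticality weights (nothing here comes from his response beyond the idea of weighting coverage by criticality):

```python
# Hypothetical sketch of criticality-weighted requirements coverage:
# coverage by tests, weighted by each requirement's business impact.
# Requirement names and weights below are invented for illustration.

def weighted_coverage(requirements):
    """requirements: list of (criticality_weight, is_covered_by_tests)."""
    total = sum(weight for weight, _ in requirements)
    covered = sum(weight for weight, is_covered in requirements if is_covered)
    return covered / total if total else 0.0

reqs = [
    (10, True),   # "process payment": high business impact, covered
    (10, False),  # "refund order": high impact, no tests yet
    (3, True),    # "export report": low impact, covered
]

# A plain count says 2 of 3 requirements are covered (67%), but the
# weighted metric flags the uncovered high-impact requirement.
print(f"{weighted_coverage(reqs):.0%}")  # 57%
```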
Haya Rubinstein works as the Quality and Delivery team lead in SAP in the IMS NetWeaver MDM group. After reading Crystal Bedell’s article, What upper management should know about managing software testing processes, she gave a very detailed answer to our question about how her organization measured software quality.
In my experience, management is interested in the test KPIs even before the release is decided on. There is usually a quality standard that they wish to adhere to.
Following is an example of KPIs for a software release:
- 0% defects on the top ten priority functionalities.
- Less than 5% degradation in performance.
- All new features have been tested.
- All new “hard” software requirements were met.
- All quality standards were met or mitigations were agreed on with quality standard owners.
- Over 90% of tests on new features have passed.
- Over 95% of all unit tests have passed.
- Over 95% of all regression tests have passed.
- Over 95% of all quality standard tests have passed.
- No Very High or High priority defects remain open.
- All medium priority defects have been examined by development to ensure they are not a symptom of a more severe issue.
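KPIs like these are typically checked as pass/fail gates before a release decision. As a rough sketch (not Rubinstein’s actual process; the metric names and sample values are invented around the thresholds in the list above):

```python
# Hypothetical release-gate check mirroring a few of the KPIs above.
# The metrics dict and its field names are invented for illustration.

def evaluate_release_gates(metrics):
    """Return a dict mapping each KPI description to True (met) or False."""
    return {
        "no open Very High/High priority defects": metrics["open_high_priority_defects"] == 0,
        "over 90% of new-feature tests passed": metrics["new_feature_pass_rate"] > 0.90,
        "over 95% of unit tests passed": metrics["unit_test_pass_rate"] > 0.95,
        "over 95% of regression tests passed": metrics["regression_pass_rate"] > 0.95,
        "less than 5% performance degradation": metrics["perf_degradation"] < 0.05,
    }

metrics = {
    "open_high_priority_defects": 0,
    "new_feature_pass_rate": 0.93,
    "unit_test_pass_rate": 0.97,
    "regression_pass_rate": 0.96,
    "perf_degradation": 0.02,
}

results = evaluate_release_gates(metrics)
print("release candidate passes" if all(results.values()) else "release blocked")
```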
Rubinstein goes on to describe the detailed, disciplined process used throughout the development life-cycle, with test milestones starting with the planning of a feature through its delivery. She says that at the end of the test cycle, management receives weekly reports aimed at showing compliance with the KPIs, including:
- If there are any risks to KPI compliance, they are presented together with the proposed mitigations.
- Compliance with the KPIs above, with the current status (number or %, color-coded).
- Details of open Very High/High priority defects: defect #, summary of issue, created by, opened on date, assigned to, current status.
- Approval status for the relevant features according to hard software requirements and details of remaining defects that are still open on each of the features.
- Link to full defect report.
- Link to full test plan report.
- Notes exist for all open defects.
- An overall status (green, yellow or red).
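The color coding can be read as a simple rollup of the KPI results. A hypothetical sketch, assuming (my reading, not Rubinstein’s stated rule) that green means all KPIs met, yellow means every miss has an agreed mitigation, and red means an unmitigated miss:

```python
# Hypothetical rollup of KPI results into an overall green/yellow/red
# status. Assumed rule: green if every KPI is met, yellow if each
# missed KPI has an agreed mitigation, red otherwise.

def overall_status(kpi_results, mitigated):
    """kpi_results: dict of KPI name -> bool (met).
    mitigated: set of KPI names whose misses have agreed mitigations."""
    misses = [name for name, met in kpi_results.items() if not met]
    if not misses:
        return "green"
    if all(name in mitigated for name in misses):
        return "yellow"
    return "red"

print(overall_status({"unit tests": True, "regression": True}, set()))  # green
print(overall_status({"unit tests": True, "perf": False}, {"perf"}))    # yellow
print(overall_status({"unit tests": False, "perf": False}, {"perf"}))   # red
```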
What about your organization? What metrics do you use to measure quality?
Replay Solutions recently announced that they are expanding their performance monitoring tools to the mobile market. Larry Lunetta, CEO, and Jonathan Lindo, VP of Products and Technology at Replay, explained how the two new solutions work.
apmMOBILE, which is focused on production and is applied in a more lightweight manner, offers real user monitoring for enterprise help desk and customer support. It can be enabled on every mobile application, and operates in an “always on” mode or can be activated on demand. Lindo explained, “We are able to expose a level of information that we haven’t seen to date on the market — things like performance trending, deployment metrics and AppDex monitoring.”
An important aspect of these two offerings, according to Larry Lunetta, is the following:
It’s a very powerful ability to transport any issue, whether performance-related, security-related, or a logic bug, from a test environment back to the developer’s desktop, where he or she has a very rich set of tools that can be applied to understanding and fixing that problem.
For more on mobile performance, see Performance management for mobile devices: Solutions and strategies.
There was a time when two-way communication was limited primarily to those people with whom we already had relationships. Then social media came along bringing us a whole new way to communicate. Virtual networks are giving us access to more information than we ever thought possible. We have more than access to information and documentation, though. We have access to people – experts, team members, authors, vendors and customers. This ability to connect with anyone from around the world has helped in fostering stronger communication and collaboration in teams, regardless of where team members are physically located.
SSQ has published a number of articles and expert responses focused on how social media has changed the way businesses operate. In Social media: A guide to enhancing ALM with collaborative tools, you’ll find a collection of tips, stories and expert responses describing ALM tools that take advantage of social features and how distributed teams are taking advantage of social tools.
Recently, SSQ’s Agile expert Lisa Crispin spoke at our local SQuAD (Software Quality Association of Denver) meeting about distributed teaming and challenged us all to experiment with new ideas. Listen in and hear some of her thoughts about distributed teams and social media. “It’s really transforming how we work,” says Crispin, a big Twitter fan, of social media.
Today OpTier is announcing the rebranding of their APM product to back a new approach to application performance that reflects the needs of today’s always-on business environment. I spoke with Linh Ho, Vice President of Corporate Marketing at OpTier about this effort.
According to Ho, the challenges inherent in mobile performance testing include identifying where issues lie along the operation delivery chain, developing and testing for five or more platforms, as well as a myriad of devices, and taking into account service provider differences.
As far as testing across the many different devices, Ho advises:
If there is a need to prioritize, prioritize where the bulk of your customers are. Otherwise, your critical business application is your business, and if that’s not performing, you can’t ignore a particular device because there are only ten users using it. Those are ten customers.
She offered several additional tips for ensuring mobile performance, including:
- Be cautious of third-party content or components that could impact performance.
- Test early and very often.
- Make sure you are monitoring, measuring and reporting on application performance at all times.
As far as ideas for implementing an APM strategy, Ho recommends a cross-siloed solution that provides value to developers, testers, QA managers and operations team members. She also suggests that “when looking for a strategy, make sure your choice is complementary to what your organization has already invested in. A lot has already been invested in the monitoring space; add on to complement the older technologies.”
She further emphasized that monitoring must take place at all times, and while many older technologies are not always-on, it’s important in today’s economy that businesses and solutions move to this always-on approach.
For more on performance testing, check out these related stories from SSQ:
With the acquisition of Austria-based Ventum Solutions, IT process automation vendor UC4 has added automated enterprise application release process software to its ONE Automation platform. The acquisition was announced today.
Application release is a key pain point for DevOps teams who are rolling out applications in enterprise, services, cloud and the many other diverse environments available today, said Clark, CMO of UC4 (Bellevue, WA), in our pre-announcement interview.
Adding Ventum’s release planning and coordination capabilities to UC4’s deployment engine completes its central management and control offering. “We want to make sure organizations don’t have what they have today when rolling out applications,” Clark said, citing huge volumes of helpdesk calls that typically follow releases.
UC4’s goal with ONE Automation is to provide structure to DevOps processes. “The DevOps movement has real grass roots activity based on using open source tools,” said Clark. “But DevOps lacks an automation framework and engine and platform to make sure processes are scalable and implementable by operations in IT.”
Now integrated into ONE Automation, Ventum’s automated release management software brings features in the areas of application and component modeling, test environment scheduling, resource conflict identification and centralized approval processes.
Already integrated with UC4’s Application Release Automation, Ventum’s solution enables release managers to initiate a deployment using UC4’s automation engine from within Ventum’s UI. Tighter integration is currently being developed to provide even greater efficiencies and cost savings.
For more information on application release automation, check out these resources:
Automation from version control to deployment
Automating release management processes with continuous integration
UC4 white paper on Application Release Automation
“Only 12% of companies perform a thorough analysis before enabling mobile devices access to business applications,” according to a recent survey conducted by Dimensional Research and sponsored by HP. Yet as Gal Tunik, Senior Product Manager, HP Software, observes in a recent article, “mobile applications have revolutionized how people conduct business.” He discussed challenges and strategies for managing mobile application performance in a recent interview.
Challenges include server utilization issues, network issues and end-to-end performance testing problems, Tunik explains. HP and partnering organizations provide products that address each of these issues and integrate with each other.
Tunik offered the following pointers for deployment and mobile performance testing:
- Using your business goals as a guide, identify the devices your target market is most likely using; this is important because it is not feasible to test each and every device on the market.
- Consider developing with older models in mind.
- Enhance employee productivity by doing BYOD (“bring your own device”) functional testing.
- Test applications on each of the main operating systems, as these can affect performance differently.
- Test in real network conditions, on real carriers, which can vary widely.
For more information on testing management, see Real-time performance monitoring for mobile apps, Mobile testing: Nine strategy tests you’ll want to perform and Improving software performance: Mobile, cloud computing demand APM.
High performing teams are able to balance creative conflict and safe communication, according to Agile Coach Ryan Polk of Rally Software, whom the Atlanta Scrum Users Group hosted at their January 25 meetup. Polk illustrated the variations between teams with a continuum featuring “Conflict” on one end and “Harmony” on the other. At some point between “Creative Conflict” and “Fun/Play,” towards the center of this continuum, is where high-performing teams can be found.
He explained, “If you don’t have people working together, no amount of process or tools or framework will result in a high-performing Agile team.” Then he reviewed the typical Agile ceremonies, discussing how the standard practices break down when a team isn’t working together effectively.
The Agile practice that Polk highlighted most was retrospectives. He said they may be the most important activity, though many practitioners avoid retrospectives because team members lose interest in them. He offered several fun approaches to get started with retrospectives, including activities such as “Draw me a picture,” “Futurespectives” and “Break-up letter.” He also recommended the book Agile Retrospectives: Making Good Teams Great by Esther Derby and Diana Larsen.
In addition, he offered some suggestions about the estimation process, which he said should not be overly time-consuming, but rather an off-the-cuff activity, and he offered examples of estimation games that can take some of the pain out of this process, such as “Planning Poker” and “High-Low Showdown.” He emphasized that these activities “feel like games, and they enable estimation in a safe environment. They are also very accurate.”
More information from this presentation is coming soon on Ryan Polk’s blog.
Recent articles on SSQ related to Agile team work include:
Recently Keynote DeviceAnywhere™, which specializes in mobile monitoring and testing, announced its support for the Application Resource Optimizer (ARO) diagnostic tool from AT&T, which collects data and analyzes mobile application performance. Developers, telecommunications professionals, enterprises seeking to upgrade applications and mobile device consumers will be happy to learn that this timely tool is now available.
Leila Modarres, Senior Director of Marketing at Keynote DeviceAnywhere, explained how they are participating in this effort by offering access to ARO, which comes pre-loaded on actual mobile devices such as the Samsung Captivate, HTC Inspire 4G, HTC Aria, Samsung Infuse 4G, LG Thrill 4G 3D and Motorola Atrix 2, enabling developers to download apps in their testing environment and test directly on the device. Developers have on-demand, remote access to this service, which expedites app testing and can improve app ratings.
In a recent release, Faraz Syed, President of Keynote DeviceAnywhere, said, “The creation of AT&T ARO represents a win for everybody involved from consumers, developers and the carrier. Consumers will have a better experience, developers will have better-rated apps, and AT&T will have more efficient apps running over its network.”
Many companies are already using ARO, including Pandora and Zynga. Chief executives from each of these organizations made appearances at the AT&T Developer Summit held January 8 and 9, 2012, in Las Vegas and gave testimonials on how this tool is helping them improve their app performance.
One of the greatest advantages of this unique tool is that it enables optimization of hardware performance, specifically battery life and data usage.
Leila Modarres said:
As applications become richer and more convenient, it’s also going to mean that they are going to become more high maintenance. They are going to need a lot of memory usage; they are going to require a lot of battery. So initiatives such as this, which we intend to support, will help ensure that innovation is not limited because of a constraint on the hardware of a device.
Read about the Keynote DeviceAnywhere partnership with TomTom in our previous post: Keynote DeviceAnywhere provides customized performance monitoring to GPS vendor TomTom.
As we end 2011, we at SSQ are taking a look at our most popular stories of the year. Melanie Webb reported on our top ALM stories and today I’m going to fill you in on our top six Agile tips and stories.
You’ll see that four out of the six deal with questioning the popular methodology, specifically comparing it to the traditional Waterfall approach of developing software. Some people love Agile; others hate it. Though recent surveys show Scrum as the most popular Agile framework, more and more organizations are pulling together a number of Agile techniques, creating a customized methodology that works best for their organization. Wherever you fall on the Waterfall vs. Agile debate, it would be worth your while to check out these top articles and read about the varied opinions and perspectives.
Coming in at number six in our lineup is: Waterfall vs. Agile development: A case study. Though many people claim that Agile development provides better results than using the Waterfall methodology, it’s hard to prove. In this case study, one development team worked on two similar projects, but with one project using Waterfall, and with the second project using Agile. Though the development team was new to Agile development and had a shaky start, in the end, they were convinced of the benefits.
Though Agile development worked best in that instance, Consultant Nari Kannan makes a case for a hybrid approach in our number five story: Why hybrid Waterfall/Agile process lessens distributed software development problems. In this tip, Kannan describes mixing Waterfall and Agile techniques, claiming that the hybrid approach benefits distributed teams, by combining some of the discipline found in Waterfall with the flexibility found in Agile.
Kannan is also the author of our fourth most popular Agile story, Scaling Agile software development: Challenges and solutions. In this tip, Kannan addresses some of the difficulties with executing Agile development on large projects, and again, addresses the issue of distributed teams and offshore development. He talks about ways to organize projects and improve communication when working on large-scale projects.
SSQ contributor David W. Johnson takes a very logical look at the differences between Waterfall and Agile in his tip: Waterfall or Agile? – Differences between predictive and adaptive software methodologies. Johnson describes the relative strengths and weaknesses of both approaches and discusses how and when to leverage each, depending on the needs of your business. Johnson recognizes the heated debates over the merits of predictive vs. adaptive methodologies and recommends using the best aspects of both when deciding upon a methodology.
We certainly hear a lot from those singing the praises of Agile development, but is it really all it claims to be? The Web is full of blog posts from people who hate Agile. In Agile development: What’s behind the backlash against Agile?, SSQ’s Jan Stafford takes a hard look at the dark side of Agile. What are the naysayers’ reasons for speaking out against the methodology?
Finally, our number one Agile story for 2011 is Agile requirements: A conversation with author Dean Leffingwell. In this interview, I talk to Leffingwell about his book, Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise, and about some of the challenges with the requirements management process in Agile development. Leffingwell describes the differences between requirements processes in traditional and Agile environments and gives some advice on what teams should be looking for in Agile requirements tools.