Software Quality Insights


July 8, 2010  8:53 PM

SQL injection flaw leaves door wide open to valuable user information on a popular file-sharing site

Daniel Mondello Profile: Daniel Mondello

This week, a trio of hackers based in Argentina uncovered multiple entry points into the popular (and controversial) file-sharing site Pirate Bay through SQL injection flaws in the site. The infiltration gained them access to upwards of four million user profiles containing names, addresses, email accounts and other sensitive and (potentially) incriminating information.

As originally reported by Krebs on Security, the group gained access through SQL injection vulnerabilities in the site. The leader of the hacker group, Ch Russo, maintains that he and his accomplices did not crack the site for personal gain, though he did admit that, once inside, it dawned on him that some of the information uncovered would have been valuable to the Recording Industry Association of America and the Motion Picture Association of America. At the end of the day, though, they chose not to share the information with either organization. The group says it was only attempting to spread awareness that security vulnerabilities exist and that SQL injection flaws can still be readily found in today’s applications and websites.

Pirate Bay is a website that allows registered members to find and download many types of multimedia using the currently lawful method of file sharing. File sharing breaks a file into small pieces held by various sources, then downloads and compiles those pieces in sequence to recreate the full multimedia file.
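
A toy sketch makes those mechanics concrete. The snippet below is a simplified Python illustration (not the actual BitTorrent protocol; the split/reassemble helpers are hypothetical): a file is cut into fixed-size pieces, a hash of each piece is recorded the way a torrent manifest does, and the pieces are reassembled in sequence as though each had arrived from a different peer.

    import hashlib

    PIECE_SIZE = 4  # tiny, so the example stays readable

    def split(data: bytes):
        """Cut a file into pieces and record each piece's hash (the 'manifest')."""
        pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
        manifest = [hashlib.sha1(p).hexdigest() for p in pieces]
        return pieces, manifest

    def reassemble(pieces_from_peers, manifest):
        """Stitch pieces back together in order, verifying each against its hash."""
        out = b""
        for piece, expected in zip(pieces_from_peers, manifest):
            assert hashlib.sha1(piece).hexdigest() == expected, "corrupt piece"
            out += piece
        return out

    pieces, manifest = split(b"a full multimedia file")
    # Imagine each piece arriving from a different source:
    print(reassemble(pieces, manifest))  # b'a full multimedia file'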

Is it legal? Well, that is for the courts to decide. If it is deemed illegal, it would be a difficult crime to trace. Technically, downloading copyrighted material is unlawful, but because the media is compiled from numerous sources, it is very tricky to track and even harder to prosecute, since only a portion of the media is taken from each source. It would be like trying to charge someone with grand theft auto when all they stole was a steering wheel.

Either way, the hack has raised eyebrows about security exposure, and concern is growing over SQL injection and other types of security flaws. This hack demonstrates the kind of damage that is possible when SQL injection issues exist: in this case, had file sharing been deemed illegal, the hackers could have sold user information that might have provided evidence for lawsuits. SQL injection flaws are clearly no laughing matter.

SQL injection is, and has been, a major concern for quality and security departments: the flaws are difficult for developers and testers to find and eliminate, yet fairly easy for hackers to find and exploit. In this case, the hackers (had they been malicious ones) could have exposed and sold valuable personal information belonging to Pirate Bay users.

Hackers gain access to applications by entering their own Structured Query Language (SQL) statements into input fields on sites and in applications. These crafted statements cause the app or site to execute the attacker’s query and (in many cases) grant administrative or back-end access. Once access is gained, the sky is typically the limit on what database information becomes available and what changes can be made.
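
To make the attack concrete, here is a minimal sketch in Python against an in-memory SQLite database (the users table and login functions are hypothetical, not drawn from Pirate Bay’s code). The first version pastes user input straight into its SQL; the second uses parameterized placeholders, which keep input as data rather than executable query text.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

    def login_vulnerable(name: str, password: str) -> bool:
        # UNSAFE: user input is spliced directly into the SQL string.
        query = ("SELECT * FROM users WHERE name = '%s' AND password = '%s'"
                 % (name, password))
        return conn.execute(query).fetchone() is not None

    # The classic injection: the quote closes the string literal, and
    # "OR '1'='1'" makes the WHERE clause always true. No password needed.
    print(login_vulnerable("alice", "' OR '1'='1"))  # True: attacker is in

    def login_safe(name: str, password: str) -> bool:
        # SAFE: placeholders ensure input is treated as data, never as SQL.
        query = "SELECT * FROM users WHERE name = ? AND password = ?"
        return conn.execute(query, (name, password)).fetchone() is not None

    print(login_safe("alice", "' OR '1'='1"))  # False: injection fails

Parameterized queries such as the second version are the standard defense: the database driver treats the input strictly as data, so no quoting trick can change the query’s structure.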

While many vendors advertise SQL injection detection and fixes in their offerings, SQL injection remains a high-profile risk in many experts’ minds.

For information on how to protect your software applications and sites from similar security compromises, we recommend the security testing tips featured on SearchSoftwareQuality.

July 8, 2010  6:27 PM

Requirements debate continues: Are visualizations beneficial or dangerous?

Yvette Francino Profile: Yvette Francino

This month SSQ is focusing on software requirements and ways to overcome the challenges so that defects can be eliminated up front.

One method of requirements elicitation that I blogged about recently is the use of visualizations. Visualization software is used by business analysts to create a working simulation that can be used to help communicate with stakeholders when gathering requirements. This blog post sparked enough debate to motivate me to write a full feature story: Are visualizations the answer to gathering requirements? In this piece, I interview several project team members and analysts about their experiences with using visualization, exploring the benefits and drawbacks.

One of the biggest points of contention is the concern that visualizations will focus on the user interface design, or on ‘how’ the software should work, rather than ‘what’ the software should do. Requirements expert Robin Goldsmith describes how to “discover REAL business requirements” in Problem Pyramid: discovering REAL software requirements – Part 1.

I asked Goldsmith what he thought of introducing UIs as part of the requirements gathering process and in his expert response he answers rather adamantly, “Focusing on the UI, system use cases, prototypes, visualization, and other forms of product/system/software solution description in the name of requirements makes it virtually certain that the developers (which include analysts and testers as well as programmers) will not be paying sufficient attention to what is really needed.”

However, after talking to those who have experienced success with visualization software, my impression is that there can be a lot of benefit in introducing the UI at a high level as a tool for communication and collaboration. In my opinion, if you don’t talk about these things with the stakeholders, they are almost certainly going to end up with something they didn’t expect when the product is delivered.

What about you? What are your opinions and experiences?


July 8, 2010  4:57 PM

Mike Cohn previews his ADAPTing to Agile keynote

Yvette Francino Profile: Yvette Francino

Mike Cohn, one of the leading authorities on agile methodologies, just happens to live in Lafayette, CO, a Denver suburb. This is lucky for those of us who live in the Denver area, because it means we occasionally get to see him speak at the Denver Agile User Group meetings. Such was the case last week, when Cohn gave us a preview of the keynote he will be giving at the Agile 2010 Conference being held August 9-13 in Orlando, Fla.

In his presentation, Cohn reminded us of the progress that’s been made in agile. Though it’s not a silver bullet, organizations that are using agile have reported productivity improvements. “Agile is not something you become; it’s something you become more of,” Cohn stressed. He then added, “For most of us, it’s about becoming more than we were.” Cohn challenged us to “raise the bar on each other,” finding ways to incrementally improve. “It’s still about continuous improvement,” he said. Cohn then talked about how we should be ADAPTing to agile with:

  • Awareness
  • Desire to Change
  • Ability to work in an agile manner
  • Promote early successes to build momentum and get others to follow
  • Transfer the impact of agile through the organization so it sticks

Check out this video where Cohn tells us about ADAPTing to agile.


July 7, 2010  5:52 PM

Holiday hackers exploit cross-site scripting weakness on YouTube with adult pop-ups

Daniel Mondello Profile: Daniel Mondello

Over the holiday weekend, YouTube users fell victim to a series of coordinated attacks by a team of black hat hackers. Their antics ranged from redirecting unsuspecting YouTubers to malware sites to plaguing them with pop-up ads for adult websites. The group used a little-known weakness in YouTube’s comment field to plant poisoned HTML tags that caused the pop-ups to appear uncontrollably, eventually forcing Google (YouTube’s owner) to step in and shut down the comment feature and other aspects of the site.

Allegedly, the attack was coordinated by an online band of pranksters from 4chan. YouTube’s comment box normally restricts the amount of HTML allowed and strips out script tags. But 4chan members discovered that by adding a second, nested set of script tags, they could get past the filter: it stripped one set of tags and left the rest intact, allowing the hackers to do their worst to viewers of YouTube’s most popular videos.
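
YouTube never published its exact filter logic, so the sketch below is only an illustration of the general failure mode: a sanitizer that strips script tags in a single pass can be defeated by input whose surviving fragments splice back together into a live tag. In Python, with a hypothetical single-pass filter standing in for the real one:

    import re

    SCRIPT_TAG = re.compile(r"</?script[^>]*>", re.IGNORECASE)

    def naive_filter(comment: str) -> str:
        # One pass only: the filter never re-checks whether removing a
        # tag has reassembled another tag around the hole it left.
        return SCRIPT_TAG.sub("", comment)

    # A second script tag is nested inside the first. The filter strips
    # the inner tags, and the leftover fragments splice into an intact tag.
    payload = "<scr<script>ipt>alert('xss')</scr</script>ipt>"
    print(naive_filter(payload))  # -> <script>alert('xss')</script>

    def safer_filter(comment: str) -> str:
        # Re-apply until nothing changes (escaping output is better still).
        previous = None
        while previous != comment:
            previous, comment = comment, SCRIPT_TAG.sub("", comment)
        return comment

    print(safer_filter(payload))  # -> alert('xss')  (no executable tag left)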

Over the past year, SearchSoftwareQuality has featured a number of tips designed to help enterprises test for and prevent defects related to cross-site scripting (XSS) vulnerabilities. These tips describe professional ways to find and shut down potential XSS problems before Internet hackers do. As this weekend’s mischief shows, XSS vulnerabilities are a very real problem, and following these tips will help secure your enterprise applications against similar attacks.

XSS testing and prevention tips
Cross-site scripting (XSS) explanation
Cross-site scripting issues are a type of validation weakness in a Web form. Though XSS issues can be fairly easy to fix, avoiding them altogether is key.

Beating software’s cross-site scripting, authentication problems
Web security expert explains where security efforts are best placed. By checking for cross-site scripting and authentication mechanism weaknesses you can eliminate problems in your application.

Finding cross-site scripting (XSS) application flaws checklist
Cross-site scripting (XSS) is a major concern; it can be unpredictable, and testing for it requires multiple tools. An expert sheds light on the history of XSS issues and recommends tools to prevent XSS application issues.


June 30, 2010  3:47 PM

Lab management strategies for fibre channel networks

Rick Vanover Profile: Rick Vanover

When it comes to lab management, administrators and infrastructure managers frequently think of workload provisioning tools such as VMware’s Lab Manager, VMLogix LabManager or Windows Deployment Services for provisioning servers. On the hardware side of lab management, we frequently think of advanced management offerings in blade servers or integrated management processors like the Dell DRAC or HP iLO.

A well-rounded lab will also manage the connectivity between servers and storage. This is what I observed recently when touring the hardware integration lab at Hitachi Data Systems (HDS) in San Jose, CA, during HDS Geek Day 0.9. There, a wide array of HDS and other parties’ products are in use for a number of test functions. Part of the HDS offering includes software to help manage over 30 different storage-centric offerings for features such as replication, tiering and protection. The lab is over 36,000 square feet in size, and with testing focused primarily on storage, a very large amount of storage I/O is being managed there.

HDS’s approach is to utilize the Gale Technologies Lab Manager product for managing the storage connectivity. We focused on Ethernet and fibre channel interface management during our tour of the HDS lab, but RF/Coax, POTS/analog telecom, T1 and DSL connections are also available for connectivity provisioning. This lab management product, coupled with a collection of physical layer switching, can dynamically assign different servers and storage resources to each other via fibre channel. This means that lab operators can quickly disconnect a server’s fibre channel host bus adapter (HBA) from a switch port and its associated storage in favor of a new assignment, without visiting the server. A robust switching environment can move a port between switch zones with a soft connection, but each interface still requires its own switch port.

During the lab tour, storage expert Chris Evans and I were impressed with how rapidly a physical path could be assigned to the servers and storage in question. This can be used not only to add connectivity from a server to an associated storage device, but also to simulate dropped fibre channel links within a pool of links, which can exercise various multipathing functions to ensure there is no data loss.

Do you have a need for physical layer connectivity management in a lab capacity? Share your comments below.


June 28, 2010  4:49 PM

Coverity version 5 makes defect discovery quicker and less painful

Daniel Mondello Profile: Daniel Mondello

Earlier this year, Coverity announced the fifth release of its defect investigation software tool, now called Coverity 5. Coverity 5 is essentially a reworking of the existing code analysis engine, process tools and debuggers, with added features and improved interfaces.

Recently, I spoke with Behrooz Zahiri, Coverity’s self-proclaimed perfect storm of code engineer and market analyst. Zahiri summarized where Coverity 5 came from in three words: “our market research.” Coverity has been running audits of software development organizations to find out where common defects occur, which tools are used to discover those defects and how the issues are resolved. From this research, Coverity built a database of common defect areas and solutions, which it integrated into its latest software.

This new research-derived database serves two purposes: one, to work as a resource for developers so they can write code with fewer errors; and two, once Coverity’s analysis engine is engaged, to use the information in the database to find issues more quickly and recommend common solutions to the problems found.

“When we decided to revamp Coverity 4, we had two goals that we wanted to reach. One was to make certain that our product was scalable and could deliver quick results for teams regardless of the team’s size. But we also wanted to elevate the adoption of our tool by increasing the effectiveness of defect repair,” said Zahiri.

Coverity’s new defect manager is built on a Java platform with direct access to popular open source tools that have been approved and entered into its database, allowing plug-and-play use of these pre-approved tools. The interfaces have also been updated, and Coverity believes them to be more business-focused than in prior versions.

Also new is a feature Coverity calls Defect Impact. This feature evaluates Coverity’s analysis of source code for defects. If defects are found, it prioritizes how they should be resolved by grouping them under three levels of severity (high, medium and low).

“High-risk defects negatively affect multiple parts of an application,” said Zahiri. “Many of these cause unexpected behavior throughout the application and alter the application’s memory management. Medium-level concerns are often performance dampeners. These defects allow the application to run, but at less-than-optimal speeds. And lastly, there are the low priorities — these are usually warning tags or artifacts in the code. Sometimes low-priority defects can be left alone, as the cost to resolve them doesn’t reflect a true positive return on the investment and they normally don’t hurt the application enough to matter.”

Defect Impact spans two features: a static analysis tool that supports applications built with C++, C# and Java, and dynamic analysis, which currently supports only Java.

Static analysis uses checkers that map out common areas where defects occur and search for problems; this is the primary way that risks are assessed in the final report generated by the tool. Dynamic analysis, which, again, covers only Java at press time, looks for concurrency issues and lockups in the user interface.
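
To give a flavor of the bug class such dynamic concurrency checking hunts for, here is the textbook lock-ordering deadlock, sketched in Python purely for brevity (Coverity’s own analysis targets C++, C# and Java; this toy is an illustration, not the tool’s output):

    import threading
    import time

    lock_a, lock_b = threading.Lock(), threading.Lock()

    def worker_1():
        with lock_a:
            time.sleep(0.1)   # widen the race window
            with lock_b:      # blocks forever: worker_2 holds lock_b
                pass

    def worker_2():
        with lock_b:
            time.sleep(0.1)
            with lock_a:      # blocks forever: worker_1 holds lock_a
                pass

    # Acquiring the same two locks in opposite orders is the classic
    # deadlock pattern that concurrency analysis is built to detect.
    t1 = threading.Thread(target=worker_1, daemon=True)
    t2 = threading.Thread(target=worker_2, daemon=True)
    t1.start(); t2.start()
    t1.join(timeout=1); t2.join(timeout=1)
    print("deadlocked:", t1.is_alive() and t2.is_alive())  # True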

According to Zahiri, Coverity’s closest competitor is Klocwork, which he says is a similar offering that is priced and marketed very differently. Zahiri also admits Coverity has crossed paths with Polyspace and Parasoft from time to time.

For more information on Coverity check out these articles:

Coverity introduces build analysis tool, new Integrity Center (Apr. 14, 2009)
Coverity this week announced a new software build analysis product along with the bundling of all its software analysis products into one offering called the Coverity Integrity Center.
Coverity releases open source application architecture diagrams (Feb. 17, 2009)
Coverity’s new Scan library of open source software project “blueprints” can help software pros shave time off development and testing.
Coverity creates program to enforce code adherence (Nov. 25, 2008)
Coverity introduced Coverity Architecture Analyzer, which validates software architecture and detects potential security vulnerabilities.

Inside information from Coverity points to a large-scale announcement in early July, so stay tuned.


June 16, 2010  9:33 PM

Author Larry Klosterboer speaks on ITIL configuration and change management

Yvette Francino Profile: Yvette Francino

Are you having trouble implementing ITIL in your organization? Or perhaps you’re new to ITIL and want to learn more? Then you will benefit from two new books: Implementing ITIL Configuration Management and Implementing ITIL Change and Release Management.

I was able to talk to author Larry Klosterboer at the IBM Innovate 2010 conference last week to find out more about his books. In this video you’ll learn Klosterboer’s number one piece of advice for those organizations implementing ITIL Change Management.


June 14, 2010  11:42 PM

Sign up to participate in an Agile innovation game

Yvette Francino Profile: Yvette Francino

There is growing interest in agile, but beyond a short certification class, it can be hard for people to gain the skills and experience necessary to become successful, particularly if their work group is not practicing agile. The Agile Skills Project is a group that formed with the idea of creating an inventory of agile skills and building a community of those interested in both practicing and enhancing their skills. I’d created a similar group, called Beyond Certification, last December.

Recently, there has been some discussion of the future of the Agile Skills Project. So, Cory Foy suggested an agile innovation game, called Prune the Product Tree, in order to explore ideas for the group. Foy explained his goals, saying:

“Basically I want to get two-to-three groups of five-to-eight people each to play the online game. It takes about 30 to 45 minutes, depending on the conversations we have; no special software or cost. I’m a facilitator for the games, so I’ll host and run the game; just need the interest and time.”

Foy has scheduled three sessions:

  • Wednesday, June 16th, 2010 at 6 a.m. EST / 11 a.m. GMT
  • Wednesday, June 16th, 2010 at 10 p.m. EST / 3 a.m. GMT
  • Thursday, June 17th, 2010 at 3 p.m. EST / 8 p.m. GMT

If you are interested in participating, you are welcome to sign up via Foy’s game spreadsheet, or contact Foy via email.


June 11, 2010  6:21 PM

IBM Innovate: IBMers give their impressions of the conference

Yvette Francino Profile: Yvette Francino

Over the past week, I’ve been blogging about some of the highlights I’ve experienced while attending the IBM Innovate conference. The conference presented opportunities to both learn and play, and included IBM announcements, keynotes from high-profile speakers, over 350 technical sessions, excursions to Epcot and Disney’s Hollywood Studios, and last but not least, the opportunity to meet and talk with people who are energized by innovation and the role that software will play in shaping our future.

I caught up with several IBMers after the conference to ask about their impressions. Listen in to what they have to say:


June 11, 2010  4:02 PM

IBM Innovate: End of life for the waterfall methodology?

Yvette Francino Profile: Yvette Francino

Innovate 2010 wrapped up today with a keynote from Walker Royce focusing on “econometrics that are core to continuous improvement.”

Royce’s message was clearly one of “breakthrough agility,” speaking out against the waterfall approach, which he labeled “geriatric.” He promoted running integration tests before unit tests, claiming that this gives teams the ability to catch “malignant errors early in the cycle,” and said that key metrics such as defects, scrap and rework improved when using agile methods rather than conventional approaches.

To be honest, I was surprised that IBM took such a strong and public stance against the traditional waterfall methodology. I attended a Tuesday breakout session, “Quality in the Trenches Panel: Traditional? Agile? Something Else?” with Terry Quatrani and Scott Ambler, and in that session, too, the underlying message clearly screamed “agile is the better approach.” The debate was pure tongue-in-cheek, with Ambler in suit and tie arguing the waterfall approach and Quatrani, in Mickey Mouse ears and casual dress, arguing for agile, much in the style of the Mac vs. PC commercials. The humor, however, clearly mocked waterfall as an approach riddled with inefficient processes, an over-emphasis on documentation and a lack of collaboration.

Though I knew a lot of agile work was going on within IBM, Big Blue has more of a suit-and-tie reputation, so it surprised me that its message was not just pro-agile, but anti-waterfall.

At the keynote, Tim Lyons of Nationwide spoke about his implementation of an agile methodology.
“We thought we had standardized practices, but when we applied metrics they weren’t as standardized as we thought. Metrics provided the insights to dive down into multiple levels of standardizations and truly get into best practices,” said Lyons.

Nationwide adopted an onshore agile, lean model, operating with a CMMI level 3 team, and found it more cost-effective than an offshoring model.

Agile and CMMI combined? That was something I hadn’t expected to hear. Royce confirmed I was not alone in this viewpoint when he got back up on stage and said, “Many people think they’re opposing, but they can be used together.”

CMMI (Capability Maturity Model Integration) is a model that helps organizations attain continuous improvement of their software development processes. My experience with CMMI is that it’s quite rigid, requiring thorough documentation of standardized, repeatable processes. Agile is adaptable. The Agile Manifesto promotes “Individuals and interactions over processes and tools” and “Working software over comprehensive documentation.” With CMMI being documentation-heavy and strict about adherence to process, I have a hard time imagining it being used in a purely agile environment, which promotes adaptability and change. Nevertheless, Nationwide is using both and seeing positive results.

What do you think? Are agile and CMMI a good mix or are they opposing in nature? And what about waterfall? Is the methodology dying?

