Monday, August 24, 2009

Quality Assurance

Quality Assurance is a term dropped just about everywhere in the software field nowadays. While some companies have strong quality assurance departments built upon frameworks that allow for success, other organizations brand basic testing teams as quality assurance, teams that are often reduced to blindly following step-by-step test plans with no ability to influence quality issues at a corporate level, and still others have no testing teams whatsoever.

Let’s get this out of the way right now: quality assurance is not testing; testing is inherent in quality assurance. Somewhere along the way, this inaccurate perception skewed how QA is understood and implemented in organizations. Quality assurance reflects an ongoing commitment to the prevention of quality issues throughout the software development life cycle and beyond, while testing is independent of any process and is solely the examination or evaluation of the degree to which software satisfies the specified requirements after development is complete. Testing is part of the work a quality assurance team performs.

What is Quality, Anyways?

Quality Assurance is built upon the foundation of ensuring the quality of an organization’s application or service, internal processes and resources. Great, that sounds simple enough; it’s all about quality. But what is quality, anyways?

Quality is all about perception. Each individual user has their own definition of quality and they will apply it to your software or service when using it. QA must speak for the quality of the application on behalf of each and every user during the software development lifecycle. Each employee in your organization will have their own definition of quality. What’s yours?

I left you some blank space below to write it down.
------------------------------------------------------------------------------------------------------



------------------------------------------------------------------------------------------------------

Hard to do, wasn’t it?

“Quality is hard to define; impossible to measure, but immediately recognized if it is missing.”

Quality Assurance

A survey of three hundred U.S. and European senior information technology executives from large companies showed that eighty-five percent of the IT executives interviewed indicated that “application quality is either critical or very critical to their overall effectiveness in demonstrating business value.”

Another survey, of more than 150 software development organizations, found that thirty-eight percent of developers said their companies do not have adequate software quality assurance programs, and thirty-one percent said their companies had no quality assurance personnel at all.

With such an emphasis placed on quality being critical to the success of the business, why is quality assurance not emphasized within the organization? In fact, the perception among most developers is that most senior managers are satisfied with the quality of software their companies are producing. When is the last time you heard a developer say that the quality of the application they work on is poor, that the code they developed has defects and other issues? Never. Why would they admit that their code is not working and what they developed is of poor quality? The perception is ingrained that any negative reflection on application quality reflects on their performance as a developer rather than being a means to strengthen overall product quality. This inherently sets a course where development and quality assurance are on opposite sides of the fence. Finding and resolving defects is fine, but question the quality of the underlying application code and up go the defenses. Take the performance testing of a web application: the development team will look for every hole in how the test was executed in order to invalidate the results.

There are ingrained biases surrounding quality assurance, and they extend from developers through to senior management. Everyone thinks QA is easy, so less-skilled people are hired off the street, with no training, to fill QA positions at much lower salaries. Management can, and has, demoted underperforming developers into QA.

What is Software Quality Assurance?

All of the quality assurance books on the market today focus on software quality assurance and provide similar definitions for it. D. Galin defined software quality assurance as: “A systematic, planned set of actions necessary to provide adequate confidence that the software development process or the maintenance process of a software system product conforms to established functional technical requirements as well as with the managerial requirements of keeping the schedule and operating within the budgetary confines.” The definition seems to leave out critical parts. What about social engineering, user behavior, usability, client expectations and process improvement? As of now, the old definitions no longer exist; I’m throwing them out and writing one that makes sense. You cannot address software quality assurance without dealing with social engineering, human behavior and the processes that developed the software. I consider this a vital concept in quality assurance, and I will go door to door if I have to, but I’m gonna convince you I’m right.

Without the Why

Quality Assurance has been hopelessly stalled for the last 20 years. Moribund. Ossified. There is a very simple reason for this catastrophic and intractable state we are in: people think they know what QA is and stopped asking why a long time ago. Without the why, quality becomes theory at best, not practiced or even of critical concern. No process or methodology, regardless of complexity, is going to tell us why quality assurance has stayed the way it is. When the whys remain unanswered, there is no understanding. When there is no understanding, there is no progress. When there is no progress, there is no evolution. Without evolution, things just fade away. Welcome to QA.

Sunday, August 16, 2009

Quality Issue of the Day - August 17, 2009

The following issue was noted on the Microsoft Visual Basic registration page. The blue text in the image instructs the user to “check the first checkbox, above to recieve important information...”. However, the first checkbox on the page sits below the message.


A Year without the Yankees

If you are a baseball fan, there is a chance you might know the answer to this little piece of trivia: when was the last time the New York Yankees failed to reach the playoffs? The answer is 1993, the same year the Toronto Blue Jays won their second consecutive World Series. There were no wild card teams or Tampa Bay Devil Rays, the Florida Marlins were in their first year of existence, and Jim Abbott (born without a right hand) pitched a no-hitter against the Indians.

In 1993, Bill Clinton succeeded George H. W. Bush as President of the United States, Unforgiven won Best Picture at the Academy Awards, the Montreal Canadiens won their 24th Stanley Cup by beating the Los Angeles Kings in the finals, and Seinfeld was TV royalty. On the computer and technology front, it was a year that started the evolution of our lives: the World Wide Web was released by CERN (the European Organization for Nuclear Research), Windows NT 3.1, the first version of Microsoft’s line of Windows NT operating systems, was released to manufacturing, and id Software released Doom, a first-person shooter whose advanced 3D graphics set the standard for future games.

On the Quality Assurance front, Mercury Interactive (since bought by HP) had shipped the first versions of its products two years earlier and wouldn’t hit its stride financially for another six years. WinRunner had been released, and TestDirector (now Quality Center) and LoadRunner existed in only their first versions. There were no thoughts of automated web testing tools, as the World Wide Web had only just been released (the first such tools appeared in 1996). For QA process and methodology, “Testing Computer Software” by C. Kaner, J. Falk and H. Nguyen was republished as a 2nd edition (verbatim of the original 1988 version). This book was the standard reference for the testing field and included topics on test case design, test planning, project life cycle overview, software errors, boundary conditions, bug reports, regression testing, black box testing, software quality and reliability, managing test teams, printer testing, internationalization, and managing legal risk.

Inexplicably, 16 years later this classic book is still used as a reference guide (I have seen it used by non-QA resources and executives as a reference and a guide for QA), even though it is out of date and out of print. It uses examples of MS-DOS and testing dot-matrix printers. It even advocates a “wait and see” approach to the “Microsoft Test” application and the “fad” of automated testing. Yet automated testing is now a staple of any testing strategy.

For all the gains in technology, process, methodology, the changes in types of testing, and the use of automated testing, QA is hopelessly deadlocked and stalled because of this standard reference. The technology field has relied so heavily on this book that there is no longer any evolution, and development teams have assumed a dominant role in IT organizations, a la Darwin. Books like "What Would Google Do?" are swept off of shelves and read by product managers, search and advertising executives, human resources and of course engineering teams. These books evangelize changes in how to do business in the new media world, yet there is no matching book for quality (QA) in the new world. QA needs a new reference, a new standard, a game changer, and I'll write it if I have to.

Saturday, August 15, 2009

Quality Issue of the Day - August 15th, 2009

The following screenshot is from Facebook's Music/YouTube share panel. "To attach, select one of the songs below and click Post" is the instruction to the user; however, there is no Post button.



Friday, August 14, 2009

Warning: Science Content

As an April Fool’s joke, Google introduced the Gmail Custom Time feature, which allowed users to date and timestamp an email being sent so that it arrives “on time, every time”. Google informed users that they could only date messages back to April 1, 2004, the day Gmail was launched, due to a temporal paradox (sub-classification: grandfather). A temporal paradox is “a paradoxical situation in which a time traveler causes, through actions in the past, the exclusion of the possibility of the time travel that allowed those actions to be taken”. A grandfather paradox occurs when a time traveler goes back in time and kills his grandfather before his father is conceived. It is a paradox because, if this occurs, he will never be born and therefore never be able to travel back in time to kill his grandfather, thus allowing himself to be born. This example is one type of causality loop.

Now that we’ve had our Star Trek lesson for the day, how does a paradox relate to Quality Assurance?

In society, a paradox is “an apparently true statement or group of statements that leads to a contradiction or a situation which defies intuition; or it can be, seemingly opposite, an apparent contradiction that actually expresses a non-dual truth” (cf. Koan).

In Quality Assurance, it is called the “paradox of excellence”. Outside the walls of IT, software development is hard to understand; few realize how hard it is to release software with only a few bugs. When software has lower quality, users will identify and pick up on bugs and other issues with the software. When the software has high quality, users don’t perceive the absence of bugs.

The “paradox of excellence” states that QA doesn’t get credit for issues that don’t happen. When QA does their job right, they don’t get acknowledgement or kudos for the lack of issues in a release; in fact, no one notices. When an issue arises and the development team scrambles to fix it, they typically get the kudos for the prompt solution while QA gets the blame for not catching it in the first place.

The typical argument I hear from co-workers is that QA is just doing their job, and they don’t need kudos for that. As a senior executive, how often do you reward developers, project managers and business analysts for delivering a project on time? When was the last time you acknowledged the job QA is doing (and not just as a member of a project)?

Data Quality: Cleaning up our Toxic Lake

Data Quality: two words that make your eyes glaze over. No one wants to think about it; after all, data is invisible. It is the bytes and bits of the information we use to do business; it is accessible through SQL queries or reports; it is manipulated to send to vendors and stored for future use. Most people don’t think about data or data quality too often or too hard. Instead, we go about our daily lives. We take orders and deliver goods and services to customers. We bill them and collect revenue. We improve existing goods and services and market and sell them. We manage the organization. We try to figure out e-business. We make decisions and we plan. We report on how we are doing and try to do better. We try to gain a competitive edge. Data underpins everything we do, but we still don’t think about it. Besides, who wants to worry about data quality if no one is complaining? We have real work to do: customers to satisfy, production schedules to meet, decisions to make, strategies to map out and a demanding board to answer to.

In our rush to deliver, we forget that customers recognize poor quality all too readily. They are sensitive to conversion errors, billing errors, improperly addressed mail and claims that turn out to be incorrect. In many cases, users get fed up, which leads to money being spent elsewhere. How many times have you gotten frustrated at improperly addressed mail because a company got your name wrong?

In today’s day and age, the world focuses on carbon footprints and cleaning up our environment, which leads to a great analogy. A data store, whether it is a SQL database, an Access database or an Excel spreadsheet, can be considered a lake. The lake water represents the data, and the streams represent the flow of information out of the lake. Factories upstream introduce new sources of pollution, in this case the input of poor-quality data into the lake, which eventually flows downstream, contaminating the streams and creating poor conditions for matching logic. So how do we clean up our lake?

By cleaning up existing conditions and preventing future contaminants! We spend small amounts of time each week picking up the garbage on the shore. However, correcting existing data by itself will not increase downstream quality, as the factories will continue to introduce more polluted water into the lake, creating an endless cycle. The root causes of the bad data need to be identified and eliminated. This shifts the focus from detecting and correcting data errors to preventing future errors from being introduced.
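
To make the shift concrete, here is a minimal sketch in Python of what moving upstream can look like. The record fields, rules and function names are hypothetical, not taken from any particular system: one function scrubs records already sitting in the lake (detection and correction), while the other validates records at the point of entry so polluted data never reaches the lake at all (prevention).

```python
# Hypothetical sketch: correcting existing data vs. preventing bad data at the source.

import re

US_STATES = {"NY", "CA", "TX", "FL", "ON"}  # sample whitelist, for illustration only
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def scrub_existing_record(record: dict) -> dict:
    """Downstream cleanup: normalize a record already sitting in the 'lake'."""
    cleaned = dict(record)
    cleaned["name"] = " ".join(record.get("name", "").split()).title()
    cleaned["state"] = record.get("state", "").strip().upper()
    cleaned["email"] = record.get("email", "").strip().lower()
    return cleaned

def validate_at_ingest(record: dict) -> list[str]:
    """Upstream prevention: reject a record before it ever enters the lake."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is missing")
    if record.get("state", "").strip().upper() not in US_STATES:
        errors.append("state is not a recognized code")
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("email is malformed")
    return errors

if __name__ == "__main__":
    incoming = {"name": "  jane   doe ", "state": "ny", "email": "JANE@EXAMPLE.COM"}
    problems = validate_at_ingest(incoming)
    if problems:
        print("Rejected at the source:", problems)  # the factory stops polluting
    else:
        print("Stored clean record:", scrub_existing_record(incoming))
```

Whatever the store actually is, the idea is the same: the cheaper fix is almost always the validation rule at the point of entry, not the weekly scrub of the shoreline.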