How to Blend Quality Control and Scope Validation for Project Success
One of the most successful projects I have worked on was one in which we, as a management team, decided to blur the lines between the Control Quality and Validate Scope activities.
Now don’t get me wrong. I am not advocating, as some do, that these two processes are one and the same. They are not. However, as I will show shortly, huge gains in the progress of a project are made when these two distinct processes are judiciously blended or overlapped.
First, a reminder of the differences between the Control Quality and Validate Scope activities. The following description is taken from my previous article, “Why You Should Always Implement Quality Control”:
It is also important when defining quality control, not to confuse it with … scope validation. As noted earlier, quality control is about inspecting the product being developed to determine whether it meets the quality requirements established and agreed upon at the beginning of the project. …
… Quality control and scope validation are sometimes (erroneously) considered to be synonymous. Think of quality control as an internal process. Scope validation, on the other hand, is an external process. The client team conducts its own review to validate that the deliverable is acceptable and can be signed off.
Next, a refresher on two other terms that are often used interchangeably: “verification” and “validation.” These, likewise, are not the same activities. (I wrote about this in the article, “Why You Should Be Prudent in Engaging IV&V.”)
Think of verification as answering the question, “Are we developing the system right?” That is, every phase of the software lifecycle must fulfill the requirements for that phase.
Validation, on the other hand, answers the question, “Are we developing the right system?” The system must meet the client’s objectives that were established in the project charter.
To summarize, “verification” is an aspect of the Control Quality activities, while “validation” is associated with the Validate Scope activities. Many of the papers I researched state, or at least imply, that the validation process does not involve the IT project team. In the writers’ minds, there is a clear demarcation between verification and validation.
As I studied these writings, I got a clear visual of the IT team completing a deliverable, subjecting it to an internal verification process (Control Quality), and then tossing it over the transom to the client team for review and acceptance (Validate Scope).
This is where I differ (somewhat) with these experts. This is where the project I referred to at the beginning of this article blurred the lines between Control Quality and Validate Scope. And to our great benefit, I might add.
Some of what I am about to describe may be par for the course on Agile projects. However, for the waterfall development of a system involving more than 200 team members over a period of four years, this was a great success.
I believe the validation process begins before the formal project is even initiated. As one of the leads in the Project Management Office, I was assigned the task of building the client project team. I asked the primary stakeholder to staff her team with full-time end-users loaned to the project for the duration. They were to come from the various departments that would be served by the new system. They were to be the organization’s best and brightest.
Not only was I granted this request, but I was given permission to interview all applicants to determine which of them would be most suitable for the challenges ahead.
As soon as the vendor was selected and the contract awarded, we inserted the client team members among the IT provider’s teams to begin the requirements-gathering tasks. As the IT business analysts facilitated the work groups, the end-users were immersed in the process alongside them. They corrected misunderstandings, added missing elements, interpreted policies and procedures, and reviewed the detailed notes of what had been discussed.
Periodically, the IT team would take stock of the requirements produced to date and conduct cross-team verification sessions. The end-users were co-contributors to this process. Later, as the requirements definition sessions wound down and the final deliverable was being produced, the end-users were invited to participate in the provider’s internal quality control process. They provided further input into finalizing the requirements deliverable.
But wait a minute! Wasn’t the client team just doing the IT provider’s work for them?
Not at all. By this point, the client team felt a deep sense of responsibility to develop an excellent system for their colleagues in the field who would one day use it to perform their daily jobs. They also wanted to ensure for themselves that the many hours spent gathering and defining requirements were accurately reflected in the final product. From the IT provider’s point of view, the client team’s immersion in the requirements helped the end-users understand what was being built and prepared them for the upcoming task of scope validation (i.e., acceptance of the requirements deliverable).
Don’t misunderstand me. When the IT provider submitted the final deliverable, the client team did not “rubber stamp” its acceptance. They combed through it to find additional errors, omissions, and areas needing more definition. The deliverable was accepted on the second submission, an exceptional outcome for one this massive and complex.
This blurring of the verification/validation lines did not stop there. As the IT provider began designing the technical and programming specifications, the end-users were frequently called on to clarify functional requirements for incorporation into the design specifications. Working prototypes of the user interface were presented to the end-users before the look and feel of the new system was finalized. Initial system test scripts were executed in the presence of client team members to ensure that calculations and data on the screens were presented accurately.
Later in the project, as this same client team executed its user acceptance test, members of the IT provider’s team were embedded in the client team, not to run the scripts, but to interpret results and get a firsthand view of what the client team was seeing when they reported defects.
Was it the client’s job to assist the IT provider to this extent? Perhaps not. But what resulted was a fully functioning system developed on time, UNDER budget, and with far fewer defects than the industry standard for a system of this size and complexity.
Yes, we blurred the lines between verification and validation. While verification is an internal process, it was aided by input from the client team. While validation is an external process, it was aided by the client team’s deep involvement in the IT provider’s development and quality control activities.
Both verification that the system was being developed right, and validation that the right system was being developed, were satisfied by this blurring of the lines.
Quality was achieved.
The system was accepted.