
You’ve got processes, but . . .

Friday, September 23rd, 2011

A friend who consults in program, project, and risk management (typically to parachute in and save wayward technology projects) is working with a client whose project is dreadfully behind schedule and over budget, and, not surprisingly, has yet to deliver anything the end-client or their users can put their hands on.  It doesn’t help that his client isn’t exactly known for being technology heroes.  In fact, this is not the first time his client has tried to get this project off the ground.

Looking everywhere but in the mirror, my buddy’s client decided to put the developer under a microscope.  After all, reasoned the client, they had hired the developer based on, among other attributes, its touting a CMMI Maturity Level 3 rating!  So, they had the developer and the product undergo a series of evaluations (read: witch hunts), including a SCAMPI (CMMI) appraisal.  Sadly, this tactic isn’t unusual.

Afterwards, trying to help his client make sense of the results, my pal asked me to review the appraisal report, which had been fairly and (more or less) accurately produced by someone else (not us).  The appraisal was quite detailed and revealed something very interesting.

Lo and behold, the company had processes!

However, the development organization nonetheless failed to demonstrate that they actually performed the Maturity Level 3 (ML3) practices they claimed to operate with!  In other words, they had processes, but they were still not ML3!  In fact, they weren’t even Maturity Level 2 (ML2)!

How could this be?

While the details revealed some very acute issues, what was more interesting were the general observations easily discernible from far away and with little additional digging.  The appraisal company created a colorful chart depicting the performance of each of the practices across all of ML3.  And, as I noted, there were important practices in particular areas with issues that would have precluded the achievement of ML2 or ML3; but what was more interesting were the practices that were consistently poor across all areas, as well as the practices that were consistently strong across all areas.

One thing was very obvious: the organization did, in fact, have many processes.  Most of the processes one would expect to see from a CMMI ML3 operation.  And, according to the report, they even had tangible examples of planning and using their practices.

What could possibly be going on here?

Seems an awful lot like the development group had and used processes.  How could they not rate better than Maturity Level 1 (ML1)?!  Setting aside the specific gaps in some practices that would have sunk their ability to demonstrate anything higher than ML1 (this isn’t where the interesting stuff shows up, and even had those practices been performed, they still would have rated under ML2), what the report’s colorful depiction communicated was something far harder to address than specific gaps.  The developer’s organization was using CMMI incorrectly.  A topic I cover in at least the following posts: here and here.

In particular, they were using CMMI to “comply” with their processes, not to improve their processes.  And *that* is what caused them to fall far short of their claimed CMMI ML3.

How could I tell?

Because of where the practices were consistently good and where they were consistently weak.

I was reviewing the report with my friend on the phone.  As I did so, he commented, “Wow!  You’re reading that table like a radiologist reads an X-ray!  That’s very cool!”  The story the chart told me was that despite having processes, and policies, and managing requirements and so on, the company habitually failed to:

track and measure the execution of their processes to ensure that the processes actually were followed through as expected from a time and resource perspective,

objectively evaluate that the processes were being followed, were working, and were producing the expected results, and

perform retrospectives on what they could learn from the measurements (they weren’t taking) and evaluations (they weren’t doing) of the processes they used.

It was quite clear.

So, here’s the point of today’s post… it’s a crystal-clear example of why CMMI is not about process compliance, and of how that shows up.  There are practices in CMMI that definitely help an organization perform better.  But if the practices that exist to ensure that the processes are working and the lessons are being learned aren’t performed, then the entire point of following a process has been lost.  In Shewhart’s Plan-Do-Check-Act cycle, this would be akin to doing P & D without C & A.
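To make that concrete, here’s a minimal sketch of Shewhart’s cycle as a loop.  This is my own illustration, not anything from the appraisal; the Process class and its numbers are entirely hypothetical.  Run all four steps and the measured results improve each cycle; skip Check and Act and every cycle simply repeats the same result:

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """Toy model of a process under PDCA (hypothetical names and numbers)."""
    target: float = 10.0      # planned output per cycle
    actual: float = 6.0       # what the process really produces
    adjustment: float = 2.0   # how much each Act step closes the gap
    history: list = field(default_factory=list)

    def plan(self) -> float:
        return self.target                        # P: set the expectation

    def do(self) -> float:
        self.history.append(self.actual)          # D: execute and record the result
        return self.actual

    def check(self, planned: float, done: float) -> float:
        return planned - done                     # C: objectively measure the gap

    def act(self, gap: float) -> None:
        self.actual += min(gap, self.adjustment)  # A: improve based on what was learned

p = Process()
for _ in range(3):
    planned = p.plan()
    done = p.do()
    gap = p.check(planned, done)  # drop these two lines -- C and A -- and
    p.act(gap)                    # history stays [6.0, 6.0, 6.0]: compliance only

print(p.history)  # [6.0, 8.0, 10.0] -- improvement happens only when C and A run
```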

The only thing that approach gets you is compliance.  There’s no chance for improvement that way except by accident.

CMMI is not about “improvement by accident”.  (Neither is Agile, for that matter.)

Interestingly enough, while there were clearly issues with the developer’s commitment to improvement, there may not necessarily have been any clear issues with either the product or the results of their processes.  While the experience may not have been pleasant for the developer, I don’t know that my buddy’s client can claim to have found a smoking gun in their supplier’s hands.  Maybe what the client needs is a dose of improving how they buy technology services, which they might find in CMMI for Acquisition.