Archive for the ‘SCAMPI’ Category

CMMI On One Leg

Tuesday, December 18th, 2012

I’m not sure, but I’m told some famous sage back in Biblical times was once asked to explain the point of the Pentateuch (aka, the Torah, aka, The Five Books of Moses) while "standing on one leg".  

I now undertake a task, possibly no less daunting, regarding CMMI.  And, if there ever were anyone more appropriate to try it, I doubt I’ve met them.

Seriously though, much has been written here and in many other places (not to mention eons of conference and user group content) about a number of "universal truths" regarding CMMI.  Let’s get these out there first, but without dwelling on them:

  • There are no "processes" in CMMI, only practices, and there’s a difference.
  • The practices in CMMI are "what" but not "how".
  • These practices are used to improve your processes, not to define them.
  • The CMMI does not require the SCAMPI appraisal to be effective.  You can use CMMI to improve your operation without ever using the SCAMPI to appraise your use of CMMI.
  • 42.  OK.  Not really.

However, not a single one of these "truths" explains the point of CMMI, or how to actually use CMMI.  So, here goes:

Each one of the practices in CMMI improves some aspect of your organization’s performance resulting from how you do your work.  It doesn’t matter whether it’s providing a service or developing a product.  And, it doesn’t matter whether you do so using so-called traditional development methods or Agile approaches.  If you have performance issues in an area of your operation (called "Process Areas" in CMMI), check each of the practices in that area for activities in your operation that might be causing those performance issues.

It’s assumed, then, that if you don’t have any issues covered by a practice, you don’t need to do anything about that practice, because you’re already doing it.  This says nothing of how well you do it, why you do it, how you do it, whether you recognize that you do it, or whether the fact that you do it is a complete coincidental freak of nature.  But if you read a practice, understand the risk it avoids, and you don’t encounter that risk, you’re somehow performing that practice.  Pretty simple.

I’ll repeat and summarize that two-step thought experiment:

  1. Look in the process areas for practices that address performance issues you’re experiencing with the operation of your work.  When you encounter a practice (or more than one), the absence of which can explain why you’re seeing those issues, make appropriate changes to your operation so that you incorporate that/those practice(s) into your operation.  Rinse and repeat.
  2. Practices that address risks or issues you’re not seeing are (pretty much, by definition) practices you’re somehow managing to accomplish.  Don’t bother with them — unless you notice that you don’t like something about how you do them, but that’s a different priority/matter.
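For the programmers in the audience, the two-step loop above can be sketched as a simple triage routine.  This is purely illustrative — the practice names, the "risk" field, and the issue-matching logic are all hypothetical stand-ins, not anything from the model text itself:

```python
# Hypothetical sketch of the two-step triage loop described above.
# Practice names and risks are made-up examples, not CMMI model content.

def triage(practices, observed_issues):
    """Return the practices worth adopting: those whose absence
    plausibly explains a performance issue you're actually seeing."""
    to_adopt = []
    for practice in practices:
        # Step 1: does the risk this practice avoids match an observed issue?
        if practice["risk"] in observed_issues:
            to_adopt.append(practice["name"])
        # Step 2: no matching issue -> you're somehow already covering it;
        # leave it alone (unless you dislike how you do it -- different matter).
    return to_adopt

practices = [
    {"name": "Estimate effort", "risk": "schedule overruns"},
    {"name": "Manage requirements changes", "risk": "scope churn"},
]
print(triage(practices, {"schedule overruns"}))  # -> ['Estimate effort']
```

The point of the sketch is the filter, not the data: you prioritize practices by the issues they would explain, and "rinse and repeat" as your observed issues change.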

Keep in mind, this says nothing of

  • whether what you do/don’t do will suffice as "evidence" for an appraisal
  • how well you perform the practices (regardless of whether or not you perform them or believe you can use them to improve),
  • what it takes to incorporate practices or make change, in general, happen in your operation,
  • whether an appraisal team will concur with whether you do/don’t perform practices, or
  • whether you interpret practices in constructive ways.

Nonetheless, if you internalize the significance of the above 2 steps, you can (I dare say, "will") save yourselves a lot of time and grief when using CMMI.  This approach can certainly help you prioritize the practices to focus on, appraisal or not.  And, if you do take this approach towards preparation for an appraisal, keep in mind the bulleted caveats and don’t try this alone.

Process In the Fabric

Monday, November 21st, 2011

Say you’re in a truly disciplined, lean and agile operation and your processes are so deeply ingrained in what you do that putting your finger on tangible evidence is a challenge, and not for lack of process.  Just lack of being able to step back far enough from the canvas to see the whole picture.  What do you do when it comes to demonstrating your practices for a CMMI appraisal, for example?

Well… the best advice I can give companies in such situations is to work early and closely with a consultant and/or lead appraiser to elicit the best evidence for the appraisal long before the appraisal event itself is planned or carried out.  It’s important to be clear about what the evidence is, and, you want the appraiser and the appraisal team on board with how the evidence will “show up”.  This is not something you want to surprise anyone with come appraisal time.

Working early and closely with a lead appraiser will not only help everyone understand the context and provide an opportunity to strengthen practices and identify operational risks, it will also give you a good idea about whether or not the lead appraiser has the wherewithal to think broadly about practices and to assemble the contextual picture for how practices would “show up” in the context of your operation.

Sadly, not all appraisers have this skill set.  In fact, in my experience, the great majority do not have the skills to make contextually relevant model interpretations such that actual, naturally-occurring evidence from an operation can take its most natural form and still be recognized as implementation of CMMI practices.  In my experience, most lead appraisers expect evidence to come in very specific shapes, sizes, and colors, and they don’t recognize the evidence when it doesn’t meet their pre-conceived notions of what particular evidence should look like.

That being said, this does not give carte blanche for not having evidence.  I’m not saying that the evidence isn’t there, I’m just saying that the evidence may not be what’s traditionally thought of as evidence from larger or more traditional development operations.

Process evidence from operations whose processes are deeply ingrained can often show up as very clear, obvious artifacts.  Especially from traditional development operations.  However, in small, lean, and agile operations, the evidence can be much less obvious.  It takes a special skill set to recognize the outputs of such operations as evidence of CMMI practices.  Organizations are well served to work with the lead appraiser early to determine whether or not their operation produces evidence, as well as whether or not the appraiser can see more broadly than the evidence they’re used to seeing from traditional operations.

Since few organizations know how to pick a lead appraiser, perhaps this “litmus test” for a lead appraiser can serve to help them through the process.  The alternative could be a disastrous paper-chase to create evidence on top of the evidence that’s already there.

You’ve got processes, but . . .

Friday, September 23rd, 2011

A friend who consults in program, project and risk management (typically to parachute in and save wayward technology projects) is working with a client whose project is dreadfully behind schedule and over budget, and, not surprisingly, has yet to deliver anything the end-client or their users can put their hands on.  It doesn’t help that his client isn’t actually known for being technology heroes.  In fact, this is not the first time his client has tried to get this project off the ground.

Looking everywhere but in the mirror, my buddy’s client decided to have the developer put under a microscope.  After all, reasoned the client, they had hired the developer based on, among other attributes, the developer’s touting that they were rated at CMMI Maturity Level 3!  So, they had the developer and the product undergo a series of evaluations (read: witch hunts) including a SCAMPI (CMMI) appraisal.  Sadly, this tactic isn’t unusual.

Afterwards, trying to help his client make sense of the results, my pal asked me to review the report of the appraisal which was fairly and (more or less) accurately performed by someone else (not us).  The appraisal was quite detailed and revealed something very interesting.

Lo-and-behold, the company had processes!

However, the development organization nonetheless failed to demonstrate the necessary performance of the Maturity Level 3 (ML3) practices they were claiming they operated with!  In other words, they had processes, but they were still not ML3!  In fact, they weren’t even Maturity Level 2 (ML2)!

How could this be?

While the details bore some very acute issues, what was more interesting were the general observations easily discernible from far away and with little additional digging.  The appraisal company created a colorful chart depicting the performance of each of the practices in all of ML3.  And, as I noted, there were important practices in particular areas with issues that would have precluded the achievement of ML2 or ML3; but what was more interesting were the practices that were consistently poor in all areas, as well as the practices that were consistently strong in all areas.

One thing was very obvious: the organization did, in fact, have many processes.  Most of the processes one would expect to see from a CMMI ML3 operation.  And, according to the report, they even had tangible examples of planning and using their practices.

What could possibly be going on here?

Seems awfully much like the development group had and used processes.  How could they not rate better than Maturity Level 1 (ML1)?!  Setting aside the specific gaps in some practices that would have sunk their ability to demonstrate anything higher than ML1 – because this isn’t where the interesting stuff shows up, and because even had these practices been performed, they still would have rated under ML2 – what the report’s colorful depiction communicated was something far harder to address than specific gaps.  The developers’ organization was using CMMI incorrectly.  A topic I cover at least in the following posts: here and here.

In particular, they were using CMMI to “comply” with their processes but not to improve their processes.  And, *that* is what caused them to fall far short of their acclaimed CMMI ML3.

How could I tell?

Because of where the practices were consistently good and where they were consistently gap-worthy.

I was reviewing the report with my friend on the phone.  As I was doing so, he commented, “Wow!  You’re reading that table like a radiologist reads an X-ray!  That’s very cool!”  The story the chart told me was that despite having processes, and policies, and managing requirements and so on, the company habitually failed to:

  • track and measure the execution of their processes to ensure that the processes actually were followed through as expected from a time and resource perspective,
  • objectively evaluate that the processes were being followed, were working, and were producing the expected results, and
  • perform retrospectives on what they could learn from the measurements (they weren’t taking) and evaluations (they weren’t doing) of the processes they used.

It was quite clear.

So, here’s the point of today’s post… it’s a crystal clear example of why CMMI is not about process compliance and how it shows up.  There are practices in CMMI that definitely help an organization perform better.  But, if the practices that are there to ensure that the processes are working and the lessons are being learned aren’t performed, then the entire point of following a process has been lost.  In Shewhart’s cycle, this would be akin to doing Plan & Do without Check & Act.

The only thing that approach can achieve is compliance.  There’s no chance for improvement that way except by accident.

CMMI is not about “improvement by accident”.  (Neither is Agile for that matter.)

Interestingly enough, while there were clearly issues with the developer’s commitment to improvement, there may not necessarily have been any clear issues with either the product or the results of their processes.  While the experience may not have been pleasant for the developer, I don’t know that my buddy’s client can claim to have found a smoking gun in their supplier’s hands.  Maybe what the client needs is a dose of improving how they buy technology services – which they might find in CMMI for Acquisition.