Archive for the ‘Improvement’ Category

Forget CMMI!

Tuesday, November 15th, 2011

This is probably the most important blog entry I’ve ever posted.

The video is the longest video I’ve ever posted on the blog, and for that reason, I’ll keep the text content to a minimum. 

Here’s why you should watch the video:  CMMI may be entirely wrong for you, and you may not know it!

The video explains an epically crucial reality about CMMI that many agile (and other) teams are not aware of, leading them unknowingly down a path of self-defeat and damage.  All of which could be avoided with this one super-critical piece of knowledge.

You’ll thank me later.


The lure of seemingly limitless opportunities can be quite strong, obviously.  And, especially in tough economic times, succumbing to that lure can cause even the best of businesses to act unwisely.  Such is the lure of CMMI ratings.

Well, anything that’s very alluring can cause unwise behavior, I suppose.  Whether it’s something as apparently harmless as indulging in a luscious dessert, spending money on unnecessary luxuries, or any of the equally limitless opportunities to make bad choices, doing what we want instead of doing what’s right shows up even when working with CMMI.

This blog is full of examples of such bad CMMI choices, but there’s one bad choice I haven’t mentioned much.  That’s the choice to even try to use CMMI.

When working with a knowledgeable, concerned, trustworthy CMMI consultant, an organization should be steered away from CMMI when their circumstance doesn’t align well with model-based improvement using CMMI.  In some cases, it may be a matter of steering towards the right CMMI constellation (e.g., for Development, or, for Services).  However, just as whether or not CMMI is right for an organization ought to be discovered before too much energy is put into it, so should the decision about a particular maturity level within the constellation.

No CMMI constellation should be attempted if/when the organization doesn’t control the work that it does.  That is, when the work it does is controlled by another organization, such as a customer.  Or, put the other way, CMMI should only be used if/when the processes used by the people doing the work are controlled by the same organization using CMMI to improve them.

At Maturity Level 2 (ML2), almost any type of work can use the practices in that level to improve its performance and to demonstrate that the practices are in place.  However, at Maturity Level 3 (ML3), you have to be doing the type of work in the particular constellation in order to be able to use the practices in it.  If you’re not doing that type of work, the practices will be irrelevant.  Attempting to use the practices when there’s no such work being done will only cause the practices to get in the way and add nothing but frustration.

In particular, if you’re not doing work that involves structured engineering analysis, CMMI for Development at ML3 will be truly unwieldy.

Adding practices for work you’re not doing is an example of the bad behavior many organizations exhibit when they’re chasing a level rating rather than being hot on the trail of performance improvements.  It’s these sorts of behaviors that are somehow rationalized as being beneficial when, in fact, they are anything but.  They are a colossal waste of time and money, and detrimental to morale and productivity.

You really need to carve out about 11 minutes to watch the video.

Agile is a Service: You May Be Improving the Wrong Things

Sunday, October 9th, 2011

So much about software development (in particular, and product development in general) as a business has less to do with technology than it has to do with keeping customers happy.  What do customers really care about?  While they say they want their product on time, on budget and doing what they asked of it to do, most of the time, managing their expectations has little to do with time, cost, features, functions, or quality.  What they experience is more about how the developer treats them as a customer.  In other words, what they perceive as the developer’s business as a service is what customers react to.

Of course, customers aren’t typically happy when the product is late, doesn’t do what they need it to do, and/or costs more than they were expecting to pay – scope creep notwithstanding.  Be that as it may, agile development and management practices recognize the importance of customer involvement (and all stakeholders, in general).  In fact, while the “traditional” development and management world has long espoused the importance of an integrated team for product and process development, it’s the agile development and management movement that actually made it work more smoothly with more regularity.

(Before anyone from the “traditional” development camp jumps down my throat, keep in mind: I came from the traditional camp first.  I saw attempts at IPPD (Integrated Product and Process Development) and saw how difficult it was to get it going, keep it working, and eliminate the competition and other organizational stress that IPPD continues to experience in the traditional market.  And, I’m not saying it doesn’t work in traditional settings, just that it worked much better, much faster, and with much more regularity in the agile settings.)

From the beginning, agile practices understood the importance of the customer and of being a service to the customer.  Kanban (more recently) even refers to different types of work as “classes of service”.  In fact, if we look at the most common pains in development work (e.g., staffing, time, agreement on priorities and expectations), we see that they are seldom technology or engineering issues.  They’re issues more aligned with the developers’ abilities to provide their services.

[NOTE: For the remainder of this post, I’m going to assume the development operation actually knows its technology and knows what real engineering development looks like.  This is a big assumption, because we all know that there are development operations a-plenty whose technical and engineering acumen leave much to be desired.]

Let’s now look at another important facet of all development, agile notwithstanding.  Much of it happens after the initial product is released!  Once the product is released, there is precious little actual development going on.  The ongoing support of the product includes enhancements and other updates, but very little of that work requires any engineering!  Furthermore, what is worked on comes in through a flow of requests, fixes, and other (very often unrelated) tasks.

After a product has been released, the operation of a development shop resembles a high-end restaurant far more closely than it resembles a production floor.  Once the menu has been “developed”, from that point forward, patrons merely ask for items from the menu and for modifications to items on the menu.  Even were there to be a “special order” of something not at all on the menu, the amount of “development” necessary to “serve” it is minimal.  And, when something truly off-the-wall is requested, the chef knows enough to respond with an appropriately apologetic, “Sorry, we can’t make that for you right now.  Please let us know in advance and we’d be happy to work something up for you.”  At which point, they would set about developing the new product.

Meanwhile, the vast majority of the work is actually just plugging away at the service.  In the service context, development is often not the majority of the work.  In that context, engineering plays an important role much less often than the ability to deliver services, manage transition of services, ensure continuity of service, handle incidents, manage resources, and so on.

What does this mean for agile teams, and, what does this have to do with CMMI?

Well, maybe much of the perceived incompatibility between CMMI (for Development) and agile practices is due not to incompatibilities between CMMI and agile, but to incompatibilities between the business of agile and the improvement of development.  In other words, maybe the perceived incompatibilities between CMMI and agile arise because CMMI for Development (CMMI-DEV) is meant to improve development, and many agile teams aren’t doing as much development as they are providing a service.  Perhaps it’s just that the business models presumed by the two approaches are not aimed at making progress in the same way.

When agile teams are doing actual development, CMMI-DEV should work well and can help improve their development activities.  But, agile teams are often not doing development as much as they are providing a service.  They establish themselves and operate as service providers.  Most of the agile approaches to development are far more aptly modeled as services.

CMMI for Services defines services as follows*:

  • A product that is intangible and non-storable.
  • Services are delivered through the use of service systems that have been designed to satisfy service requirements.
  • Many service providers deliver combinations of services and goods. A single service system can deliver both types of products.
  • Services may be delivered through combinations of manual and automated processes.

*Glossary CMMI® for Services, Version 1.3, CMMI-SVC, V1.3, CMU/SEI-2010-TR-034

Many requests made of many agile teams have more to do with supporting the product than developing a product.  While the product is still under development, then, by all means, CMMI for Development is apropos.  But after the initial development (where more product-oriented money is spent), the development is hard to see and harder to pin down.

Maybe development is not the right thing to improve.  Perhaps agile teams could look at how they handle “development as a service” for their improvement targets.  Maybe CMMI for Services is a much better fit for agile teams.

Could a switch from CMMI-DEV to CMMI-SVC benefit agile teams?  Could a switch from CMMI-DEV to CMMI-SVC make achieving CMMI ratings easier and more meaningful?

I believe the answer to both is a resounding: ABSOLUTELY!

ATTENTION AGILE TEAMS: You need a CMMI rating?  Look at CMMI for Services.  It might just make your lives easier and actually deliver more value right now!

[NOTE: I have an essay, Are Services Agile?, in this book on this topic.  Since you can “look inside” you might be able to read it without buying it.  Furthermore, the essay has been published online in some places.  You might be able to find it out there.]

You’ve got processes, but . . .

Friday, September 23rd, 2011

A friend who consults in program, project and risk management (typically to parachute-in and save wayward technology projects) is working with a client whose project is dreadfully behind schedule and over budget, and, not surprisingly, has yet to deliver anything the end-client or their users can put their hands on.  It doesn’t help that his client isn’t actually known for being technology heroes.  In fact, this is not the first time his client has tried to get this project off the ground.

Looking everywhere but in the mirror, my buddy’s client decided to have the developer put under a microscope.  After all, reasoned the client, they had hired the developer based on, among other attributes, its touting a rating of CMMI Maturity Level 3!  So, they had the developer and the product undergo a series of evaluations (read: witch hunts), including a SCAMPI (CMMI) appraisal.  Sadly, this tactic isn’t unusual.

Afterwards, trying to help his client make sense of the results, my pal asked me to review the appraisal report, which was performed fairly and (more or less) accurately by someone else (not us).  The appraisal was quite detailed and revealed something very interesting.

Lo-and-behold, the company had processes!

However, the development organization nonetheless failed to demonstrate the necessary performance of the Maturity Level 3 (ML3) practices they were claiming they operated with!  In other words, they had processes, but they were still not ML3!  In fact, they weren’t even Maturity Level 2 (ML2)!

How could this be?

While the details bore some very acute issues, what was more interesting were the general observations easily discernible from far away with little additional digging.  The appraisal company created a colorful chart depicting the performance of each of the practices in all of ML3.  And, as I noted, there were important practices in particular areas with issues that would have precluded the achievement of ML2 or ML3; but what was more interesting were the practices that were consistently poor in all areas, as well as the practices that were consistently strong in all areas.

One thing was very obvious: the organization did, in fact, have many processes.  Most of the processes one would expect to see from a CMMI ML3 operation.  And, according to the report, they even had tangible examples of planning and using their practices.

What could possibly be going on here?

Seems awfully much like the development group had and used processes.  How could they not rate better than Maturity Level 1 (ML1)?!  Setting aside the specific gaps in some practices that would have sunk their ability to demonstrate anything higher than ML1 (because that isn’t where the interesting stuff shows up, and because even had those practices been performed, they still would have rated under ML2), what the report’s colorful depiction communicated was something far harder to address than specific gaps.  The developers’ organization was using CMMI incorrectly.  A topic I cover at least in the following posts: here and here.

In particular, they were using CMMI to “comply” with their processes but not to improve their processes.  And, *that* is what caused them to fall far short of their acclaimed CMMI ML3.

How could I tell?

Because of where the practices were consistently good and where they were consistently gap-worthy.

I was reviewing the report with my friend on the phone.  As I was doing so he commented, “Wow!  You’re reading that table like a radiologist reads an X-ray!  That’s very cool!”  The story the chart told me was that despite having processes, and policies, and managing requirements and so on, the company habitually failed to:

  • track and measure the execution of their processes to ensure that the processes actually were followed through as expected from a time and resource perspective,
  • objectively evaluate that the processes were being followed, were working, and were producing the expected results, and
  • perform retrospectives on what they could learn from the measurements (they weren’t taking) and evaluations (they weren’t doing) of the processes they used.

It was quite clear.

So, here’s the point of today’s post… it’s a crystal clear example of why CMMI is not about process compliance and how that shows up.  There are practices in CMMI that definitely help an organization perform better.  But, if the practices that are there to ensure that the processes are working and the lessons are being learned aren’t performed, then the entire point of following a process has been lost.  In Shewhart’s cycle (Plan-Do-Check-Act), this would be akin to doing P & D without C & A.

The only thing that approach can produce is compliance.  There’s no chance for improvement that way except by accident.

CMMI is not about “improvement by accident”.  (Neither is Agile for that matter.)

Interestingly enough, while there were clearly issues with the developer’s commitment to improvement, there may not necessarily have been any clear issues with either the product or the results of their processes.  While the experience may not have been pleasant for the developer, I don’t know that my buddy’s client can claim to have found a smoking gun in their supplier’s hands.  Maybe what the client needs is a dose of improving how they buy technology services – which they might find in CMMI for Acquisition.