Twenty years ago, when a change program came in, it came in a three-ring binder. Today it is more likely to be a Confluence page or a division-wide email. No matter the delivery format, they all seem to have about the same actual chance of making a lasting impact, somewhere between slim and none. I’m not trying to be a downer here. The effort to create the new program is usually noble. Most of these programs address real problems and organizational pain. We just can’t figure out how to make them stick.
Today I’ll talk about how to get there.
Not change management!
There is an entire discipline devoted to change management. It has its own conferences and its own language. Within the change management space I’m a fan of the work of John P. Kotter, who talks about creating a sense of urgency, building a coalition, and so on.
And that is not what I am going to talk about today.
I’m not going to give you a roadmap. Not going to suggest kickoff meetings, retrospectives, project plans, time-boxed experiments, or anything else. Those are all great things. Instead, I’m going to offer a simple observation.
If people don’t want to change, your change program is going to fail, even if you do all that stuff right.
If people do want to change, your new program will succeed.
One major goal of the work should be to put the systems in place that make it possible to succeed.
Over the past several years, the best tool I have found for that is Six Boxes Analysis.
Six boxes made simple
Say you want to introduce something new into an organization, like Test-Driven Development (TDD). Six Boxes gives us six simple questions to ask about the change.
Here’s a quick rundown.
#1 – Are the expectations clear?
#2 – Do people have the tools and resources to do the job?
#3 – What incentives and consequences do people have to make the change?
#4 – Do people have the skills and knowledge to try the new change?
#5 – Do the employees have the right aptitude/role fit?
#6 – How is morale impacting performance?
Six Boxes in Practice
Looking again at TDD, it is usually framed as an experiment to try on some low-risk pieces of code, perhaps in the team’s spare time. The team may be given some training that is often based on “canned” exercises, not legacy code. Programmers who slow down and struggle with the code to make it work will show poor performance, at least in the short term. Programmers who ignore the training and do what they have always done will be rewarded by getting to production faster.
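For concreteness, here is a minimal sketch of the red-green cycle that such training usually teaches, in Python. The `parse_price` function and its tests are hypothetical, not from any real codebase: in TDD the tests would be written first, run to fail (red), and only then would the implementation be written to make them pass (green).

```python
import unittest

# Hypothetical function being brought under test.
def parse_price(text):
    # Strip whitespace and a leading currency symbol, convert to cents.
    cleaned = text.strip().lstrip("$")
    dollars = float(cleaned)
    return int(round(dollars * 100))

class TestParsePrice(unittest.TestCase):
    # In TDD these tests exist before parse_price does; each new
    # behavior starts as a failing test.
    def test_plain_number(self):
        self.assertEqual(parse_price("19.99"), 1999)

    def test_leading_currency_symbol(self):
        self.assertEqual(parse_price("$5.00"), 500)

    def test_whitespace_is_ignored(self):
        self.assertEqual(parse_price("  $0.10 "), 10)
```

The canned exercises in training look like this, which is exactly the problem: a greenfield function with no dependencies is easy to test-drive, while the legacy code the team actually works in is not, and the training rarely bridges that gap.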
The incentives are wrong. The expectations are not clear. The team may have the knowledge, but they never really experienced the skills transfer. The team may or may not have good tools to make the change.
Before trying this big change effort, management (and the trainer) would have done well to sit down, examine the boxes, and figure out how to improve the odds of success.
What do you do to improve the odds of success of your change initiatives?
It might be time to start.