The BaldrigePlus Newsletter
Issue 9, Sunday March 11th, 2000



Five minute masterclass
Leadership
“… in every organization there is a formal management structure – the one you see on the org chart – as well as a different, so-called emergent structure based on individual abilities and actual power relationships, which determines how the organization really gets work done.”
Harvard Business Review, March-April 2000, p8 editorial.

So, you’ve got some ‘quality’ responsibilities in your outfit. How significant are the back-channel structures? Are you sure you’ve got support where it matters? If your sponsor is not in the real power loop, is your personal commitment likely to be career-limiting? How do you know?

Let’s assume you’ve got a handle on the politics. What are the ‘Baldrige’ implications? Think disconnects. Potentially very damaging – both for your application and your organisation – disconnects are the mismatches between what supposedly happens and what really happens, between intent and outcome, between what the CEO says to the board and what the road warriors actually do. Disconnects can sink an application (and an organisation).

How about making the emergent structure real? Here’s what happens in a unit of city government recently case-studied here at BaldrigePlus:

“The CPG’s organisation chart – a flat but conventional inverted tree – reflects the formal council-approved delegations necessary in a unit of local government. Five divisional managers answer to the group manager and take responsibility for specific functional and/or geographical areas. But three cross-divisional leadership teams – Management, Business Directions and Staff Consultative – actually run the CPG, with web-like organigraphic relationships …”

“Team principles (structure, functions, declarations, ground rules – in effect, team constitutions) are updated annually and posted as charter statements on meeting room walls, distributed as hard copies and published on the group’s intranet. Semi-formal and temporary special-purpose “spoke teams” are also set up when needed, with members drawn from across the CPG's five divisions.”
Performance Excellence in Local Government, The award-winning City Planning Group, Auckland City Council, New Zealand

An approach like this probably won’t account for all the informal power relationships, but it is more likely to approximate how work really gets done, it acknowledges that the org chart is not the whole picture, and it continually mixes people in different combinations, unsettling the old boys’ networks.

Footnote: ‘Disconnects’ sometimes just means the internal inconsistencies in organisation descriptions – in Baldrige applications, for example. But it’s a concept that can also usefully be applied to organisations.

Workplace realities
Total Quality in two hours
Take a newly formed team, just hired for a seasonal job in a service industry. Some veterans, some newcomers. A range of ages, but mostly young. A variety of educational backgrounds. They’ll all be dealing directly with customers. You’ve got two hours to teach them something worthwhile about quality. Where do you start? How about this?

Get them all together in one room. Warm up with a heavily prompted (everybody in) discussion about what team members understand ‘quality’ to be. Use the language of their workplace. Conclude that it’s about both means and ends, processes and products, them and their customers.

Talk about customers. What does ‘quality is what your customers tell you it is’ mean? Conclude that quality is a ‘customer’ thing. Ask ‘why do you come to work? Is it about a pay packet, or about customers who pay?’ Ask ‘what does standing in your customers’ shoes mean?’ Lead and prompt a discussion. Maybe do a quick role-play.

Pause, review. There’s good and bad quality, right? Discuss. Things will go well, things will go not so well, every day. Especially in the early days when everyone’s learning. When things go well, customers are happy, and vice versa.

Ask how the team will make sure their customers are always happy – that things will always go well. Work harder, or work smarter? Introduce some ways to work smarter.

(1) Bouquets and Brickbats
Suggest that every day at the same time – maybe twice a day for the first few days – the team gathers for a ten minute stand-up meeting. Everyone – no exceptions. Explain how this works, with real-world examples.

Suggest they keep the meeting agenda simple, and always the same. On a whiteboard or flipchart the team leader makes a list of what went well today. Celebrate the successes, maybe with a vote for a jokey Chocolate Fish Award.

Make a separate list of things that didn’t go well – label it ‘Opportunities for improvement.’ Depending on what the OFIs are, have a quick discussion about how to get them off the list. Don’t point fingers or look for scapegoats. OFIs are system faults, not people faults – explain. Keep a record of the OFIs.
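
If someone on the team is comfortable with a spreadsheet or a few lines of code, the OFI record can double as a tally that shows which items keep coming back. Here is a rough sketch in Python – the OFI labels are invented for illustration, not taken from any real team:

# Tally OFIs recorded at the daily stand-up meetings and flag repeat offenders.
from collections import Counter

# One entry per OFI noted at each meeting (hypothetical labels).
ofi_log = [
    "late delivery", "till mismatch", "late delivery",
    "stockout", "late delivery", "till mismatch",
]

tally = Counter(ofi_log)
for ofi, count in tally.most_common():
    flag = "  <-- keeps appearing, worth fixing" if count >= 3 else ""
    print(f"{ofi}: {count}{flag}")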

If the same or similar OFIs keep appearing, do something about them. Here’s how.

(2) If it’s broke, fix it
The whole team will be busy. It may be difficult to get them to take a deep breath, slow down, and fix what's broke. It will depend on how urgent the need is. If they decide there’s a need, here’s a process:

Decide ‘what’s the mischief?’ What, exactly, needs to be fixed? Then do a quick stand-up brainstorming session – calling out all possible solutions and writing them down; no debate, no criticism, just solutions. As many as possible, silly or sensible. Try writing separate ideas on large Post-It Notes and sticking them on the wall. Maybe group similar notes together, or put them one above the other, with the ‘most likely to work’ at the top. Anyone can move any note, put it anywhere in the sequence they like. Run the game until all the notes are where everyone agrees they should be.

Take the top idea – the ‘most likely to work’ suggestion – and the next day, try it. At the next meeting ask how it went. Make changes if required, try it again. When it’s working, and if it’s an improvement, it becomes the new way that thing gets done. Explain that this is a traditional PDCA cycle, sometimes called a paduca, used all over the world in all sorts of workplaces.

(3) Process mapping – ‘everything around here is a process’
Ask the team to think forward. It’s weeks later. The daily meetings now run just a few minutes, and there are only occasional OFIs. It’s time to take the next step – from fixing OFIs to Total Quality. From the ambulance at the bottom of the cliff to the fence at the top. Recall, review (and illustrate) earlier discussions about quality means and quality ends. Now, team, you’re going to work on the means:

Suggest they start by making flow charts of how work gets done – the sequences that deliver services to customers. Use the whiteboard on the lunchroom wall – show the team some examples of flow charts. Once everyone’s happy that their charts explain what happens, break them up for each person or each unique job.

Now look at each part critically. Are there unnecessary steps? Are there several ways to do the same job (and if so, which one is best)? Are there some that could be done faster, better, safer? Use the familiar Fix It process to rank possible changes. Run a paduca.

When new team members arrive, use the charts (fancy name – process maps) as a training tool. Perhaps have the map for each job posted right where that job gets done – as a reference. Keep them for next season.

If the team has a member or two who like working with numbers, think about encouraging them to start measuring how long things take, or how variable some processes are, or how much the workload fluctuates (and how effective the team’s response is). As a rule of thumb, measure only what’s important to customers. Use these data to suggest further process improvements.
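
For the number-minded, the arithmetic can start very simply: a daily average and a standard deviation. A minimal sketch in Python, using made-up serving times in minutes:

# Summarise one day's customer serving times (hypothetical data, in minutes).
from statistics import mean, stdev

serving_times = [4.2, 5.1, 3.8, 6.5, 4.9, 12.0, 4.4, 5.3]

print(f"average: {mean(serving_times):.1f} min")
print(f"standard deviation: {stdev(serving_times):.1f} min")
print(f"slowest: {max(serving_times):.1f} min, fastest: {min(serving_times):.1f} min")

Tracked day by day, the average shows whether service is getting faster and the standard deviation shows whether it is getting more predictable – both things customers notice.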

How would you approach this challenge? Answers on a postcard!

The continuing story …
With apologies to all our new subscribers. Catch up here.

Six sigma IV
Rip Stauffer (Senior Consultant at BlueFire Partners in Minneapolis, MN) emailed, in part: “It was interesting to note the debate between Steve Prevette and Grant Blair concerning Six Sigma. I am very familiar with Steve's work [and] respect and admire both Steve and Grant. It's great to follow their dialogue.

“You might point out that Wheeler has a new book coming out soon around this very question. As part of his research for the book, he looked at 1,143 distributions … from U-shaped to normal, and he found that 1,038 of them had more than 98% of their area within 3 standard deviations of the mean. 644 had better than 99% within 3 standard deviations; only 10 had less than 97% within 3 standard deviations of the mean.

“His conclusion in the paper I saw: ‘Since it is the extreme tails of these probability distributions that result in less than 99% coverage, and since finite data sets tend to under represent the extreme tails of a probability model, the empirical rule (Approximately 99% to 100% of the area will be within 3 standard deviations of the mean) has been hedged in favor of what will generally happen in practice, rather than merely reporting what is seen in the mathematical models.’

“Although Six Sigma is obviously based on and derived from the area under a normal curve, it is used as a communications metric, and not a scientific one. It's a way to express the approximate aggregate capability of a process. Understanding it that way requires that you … add a new definition for ‘sigma’ to your vocabulary, but once you do that, it actually becomes a … useful communications vehicle.

“It is unfortunate that they used the word ‘sigma,’ because it has such precise meanings in the statistical world (or not, depending upon whom you read).”
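
Curious readers can check both ideas for themselves. The sketch below (Python, simulated data – not Wheeler’s 1,143 distributions) estimates 3-standard-deviation coverage for three textbook distributions, then converts a defect rate into a ‘sigma level’ using the customary Six Sigma convention of adding a 1.5-sigma shift to the normal quantile. That shift is background convention, not something discussed in the emails above:

# Rough check of the empirical rule: how much of each distribution falls
# within 3 standard deviations of its mean?
import random
from statistics import NormalDist, mean, stdev

random.seed(1)
samples = {
    "normal":      [random.gauss(0, 1) for _ in range(100_000)],
    "uniform":     [random.uniform(0, 1) for _ in range(100_000)],
    "exponential": [random.expovariate(1) for _ in range(100_000)],
}
for name, xs in samples.items():
    m, s = mean(xs), stdev(xs)
    inside = sum(1 for x in xs if abs(x - m) <= 3 * s) / len(xs)
    print(f"{name:12s} within 3 standard deviations: {inside:.1%}")

# The conventional 'sigma level' of a process, from its defect rate
# (normal quantile plus the customary 1.5-sigma shift).
dpmo = 3.4  # defects per million opportunities
sigma_level = NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5
print(f"{dpmo} DPMO is about a {sigma_level:.1f} sigma process")

Whatever the shape, nearly all of the area sits within three standard deviations of the mean – which is Wheeler’s point – and the last two lines show why 3.4 defects per million opportunities ends up labelled ‘six sigma’.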

Those numbers II
In Newsletter 8 Steve Prevette offered a ‘Deming’ answer to the question ‘What charts should a CEO see?’ and argued that composite numbers – averaged, aggregated, whatever ... like those in IBM Rochester’s Quality Dashboard – don’t tell a reliable story and contravene the 11th of Deming’s 14 points for management.
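
For new subscribers, the charts in a ‘Deming’ answer are usually control charts – one per measure – rather than a single composite. Here is a rough sketch of the common individuals (XmR) chart calculation; the monthly figures are invented and the 2.66 factor is the standard XmR scaling constant, none of it taken from the Dashboard itself:

# Individuals (XmR) control chart limits for one monthly measure.
from statistics import mean

monthly_values = [96, 94, 97, 95, 93, 96, 98, 95, 94, 97, 96, 86]

centre = mean(monthly_values)
moving_ranges = [abs(b - a) for a, b in zip(monthly_values, monthly_values[1:])]
mr_bar = mean(moving_ranges)

upper = centre + 2.66 * mr_bar
lower = centre - 2.66 * mr_bar
print(f"centre line {centre:.1f}, limits {lower:.1f} to {upper:.1f}")
for month, value in enumerate(monthly_values, start=1):
    if not lower <= value <= upper:
        print(f"month {month}: {value} is a signal worth investigating")

A point outside the limits is a signal worth a manager’s attention; points inside are routine variation.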

Steve Hoisington, an architect of IBM Rochester’s Quality Dashboard (and now VP of Quality at Johnson Controls) has joined the debate. Here, in part, is his response:

“You know I had to comment on Steve Prevette's assessment... The IBM Rochester Quality Dashboard contains a small set of measurements that reflect customers’ perceptions of quality. Deming followers often miss the customer aspect of measuring quality results.

“Quality professionals also need to understand their internal customer requirements. If management wants a simple, one-page report card on quality status, and you cannot convince them otherwise, then the Quality Dashboard provides a comprehensive consolidation.

“… it is impossible to portray control charts for all 19 measurements included on the Quality Dashboard, and meet the one page requirement. The goal [of a quality professional interpreting these results for management] should be management awareness and understanding of the results, and a willingness to review and provide resources for situations that warrant their attention.

“… the targets on the chart were set two ways. Some were based on customer requirements. What right do I [as a quality professional] have to tell customers they are wrong? The rest are based upon financial performance commitments of the organization … performance factors such as planned warranty expenses or headcount.

“If you work backwards from company-set financial objectives, these ultimately translate to the targets shown on the Quality Dashboard.

“I do not think Deming meant that an organization should not set goals or targets. If I have no goals for myself or my organization, the results I achieve are the results I deserve, and are likely [due to] chance. This implies an environment full of variability and surprises. There is a distinct difference between a quota and a goal.”

Steve Prevette’s motivation for entering this debate was ‘peer review’ – the critical appraisal of his professional colleagues – and so is Steve Hoisington’s.

“I welcome any input to improve upon processes I instilled at IBM, and have transported to Johnson Controls, Inc.,” Steve Hoisington wrote. “I would hope the legacy I left behind at IBM, and the legacy I create at Johnson Controls, is a company that takes a ‘systems’ approach to quality management – with Baldrige as the model, properly employing a plethora of tools to the right situation – as opposed to implementing a narrow approach in a ‘one-size-fits-all’ environment.”

“… the message to quality professionals is not to get hung up on the use of a single tool or methodology such as ‘Deming’ or ‘Six-Sigma,’” Steve wrote in a later email. “I like to think of my quality skills as a toolbox of tools, to be applied to the right situation. If all I have in my toolbox is a hammer, every opportunity starts to look like a nail...”

Footnote: Steve Hoisington has co-authored (with Earl Naumann) ‘Customer Centered Six-Sigma: Using Customers to Drive Continuous Improvement,’ to be published in about six months by ASQ. “The entire book discusses alignment of customer needs with business operations, processes, and performance,” Steve says.

Benchmarking
Beryl Anne Sanders (bsanders@lgu.ac.uk), who teaches business studies at London Guildhall University, would like to make contact with anyone interested in benchmarking in tertiary institutions. Contact her direct.

Spelling
This Newsletter has an international readership. It originates in New Zealand (which uses British English) but often refers to North American material (where American English rules – organization vs organisation, center vs centre, for example). We try to respect the spelling conventions of the origin of our material, but sometimes slip up.