Ensuring Flexible Lean Six Sigma Process Improvement Culture

Last session of the day is Jennifer Thompson of Royal Bank of Canada on process improvement culture within an organization: how to maintain a relevant Six Sigma culture in changing economic times, and how to keep people engaged in the program. RBC is Canada’s largest bank, with 80,000 people in 50 countries; they were a customer of mine for a couple of years in the past through a somewhat drawn-out BPM and process improvement project, so it’s good to see them getting a bit more rigor around this. They’ve embraced Kaizen, and the idea that you can make significant change in a short period of time, which created quite a bit of interest from the business, then followed this with a reworking of their Lean Six Sigma program into a business performance excellence program for their operations areas. More recently, they’ve taken these ideas and pushed them beyond the operations areas, deploying Six Sigma in areas across the organization, such as IT and finance. They maintain a small core group of trained LSS people: 3 master black belts and 20 black belts in the center of excellence, and 105 belts across the enterprise; they measured $35M in benefits in 2009, making this a definitely worthwhile undertaking.

Effectiveness of the program is a function of both the quality of what is produced and the acceptance of the program, or E = f(Q,A). Their real challenge is in acceptance: becoming and staying relevant, especially considering that they’re applying what could be seen as a manufacturing technique within a financial services organization. Project metrics need to be relevant to the business that you’re in; in RBC’s case, that’s profits and savings, not green belt utilization ratios (which I can’t believe they even considered a useful metric). You need to engage and partner with the business, and read their feedback properly to fine-tune the skills and offerings of your LSS program.

RBC came to the realization that one size does not fit all, and developed a flexible methodology that leads with Lean, then works towards the ideal methodology as the business needs are understood. It’s important to recognize the size and complexity of the project as you get into it, and adjust the governance to match the project effort. They’ve developed a toolbox of training courses for various stakeholders, from executive/champion training to green belt to Kaizen to their internal business performance excellence programs. They no longer offer separate black belt training, only an upgrade path from green belt, due to the high level of overlap in the required skills.

She covered some points on keeping people engaged in the LSS program: a bi-annual LSS forum and monthly lunch-and-learns for the belts; a monthly executive dashboard for sharing best practices; and a champion roundtable for champions and sponsors. The LSS program at RBC is not driven from the top down; there is no unequivocal top-down mandate for it, and one of their challenges is getting people to adopt it when they have no management-driven motivation to do so. That has required the LSS program to show value on multiple programs, then use that to generate interest in other areas by positioning it as a way to improve their business, not as a corporate directive.

I’m glad that I dropped in on these sessions, and I’ll be back tomorrow to see more of the ones on wrapping this back into automated processes and BPM. The whole conference is really tiny, only about 30-40 attendees, but the quality of the presentations and conversations is really high. With no open wifi (grumble, grumble) but in my home data network area, I promise that I’ll remember my iPhone tether cable tomorrow so that I can post as I go.

TRIZ, Six Sigma and Transactional Processes

Tom Kling, who has the most magnificent mustache this side of the 19th century, talked to us about TRIZ: an acronym that most of us in the room had never heard of, much less used. I confess, I Googled it before the session, so knew that it was related to innovation – it’s also known as “systematic innovation” – and is a Romanized acronym of the Russian phrase “Теория решения изобретательских задач”, or “theory of inventive problem solving”. Kling is from Dow Chemical, a company that knows a bit about innovation as well as chemistry. They’ve had a Six Sigma program since 1999, having trained more than 10,000 green and black belts, and having it fully integrated into all major improvement and innovation processes. I sat with him at lunch, and heard more about how Six Sigma can be applied to research and development areas such as his, as well as to more operational and manufacturing areas, which are its more common applications. This presentation turns that around a bit: taking a technique used primarily for research innovation, and applying it to transactional processes.

This gets past the issue that I referred to in the previous post, where Lean and Six Sigma are seen as only for incremental improvement, not disruptive change: TRIZ starts with a description of a perfect, hypothetical solution state, then applies a number of techniques such as importing solutions from other fields as well as more incremental improvement based on application of technology. Starting by envisioning an ideal solution can result in more out-of-the-box solutions than working forward from the current state; although the perfect solution likely won’t be achieved, at least it opens people’s eyes and gets them thinking about what’s possible. There are sets of generalized TRIZ problems and solutions, with operators to map problems to solutions: a specific problem is generalized to a TRIZ problem using contradiction characteristics, mapped to a general TRIZ solution, then translated back into a specific solution. There are several methods for doing this, using contradiction characteristics such as Altshuller’s characteristics or Mann’s business TRIZ features, and problem-solving techniques such as Altshuller’s 40 inventive principles.

He used a very funny example of this: he uses a comb on his ‘stache during warmer weather but a brush during dry, cold weather, and doesn’t like that the combination of comb and brush doesn’t fit well in his pocket; using a contradiction matrix with increased area being a good thing but increased volume being a bad thing, the suggested set of solutions includes nesting, spheroidality or new dimensions, which leads to his actual solution, a folding brush/comb combo. A more realistic example for the rest of us is a survey, where adding more questions increases the amount of information gathered but worsens the communication flow when not all questions are applicable to all respondents; the solution set includes having a dynamic survey that removes irrelevant questions based on earlier responses.
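
To make the mechanics a bit more concrete, here’s a minimal sketch of how a contradiction-matrix lookup might look in code; the parameter and principle numbers follow the standard Altshuller numbering, but the single matrix cell below is my own illustrative simplification, not copied from the real 39×39 matrix.

```python
# A minimal sketch of a TRIZ contradiction-matrix lookup. Parameter and
# principle numbers follow the standard Altshuller numbering; the matrix
# cell itself is illustrative only, not taken from the real 39x39 matrix.

INVENTIVE_PRINCIPLES = {
    7: "Nesting",
    14: "Spheroidality / curvature",
    17: "Another dimension",
}

# (improving parameter, worsening parameter) -> suggested principle numbers.
# 6 = area of a stationary object (improving), 8 = volume of a stationary
# object (worsening): roughly the comb-and-brush contradiction.
CONTRADICTION_MATRIX = {
    (6, 8): [7, 14, 17],
}

def suggest_principles(improving, worsening):
    """Generalize a specific contradiction and return general inventive principles."""
    return [INVENTIVE_PRINCIPLES[n]
            for n in CONTRADICTION_MATRIX.get((improving, worsening), [])]

print(suggest_principles(improving=6, worsening=8))
# ['Nesting', 'Spheroidality / curvature', 'Another dimension']
```

The facilitator’s job is then to translate those general principles back into a specific solution, such as the folding comb/brush.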

TRIZ, then, is an algorithmic approach to innovation. This departs from our somewhat fantastical notion of a scientist having an “a-ha” moment at some point in every innovation, or an unbounded brainstorming session generating all the new ideas; now, what’s necessary is a skilled facilitator who can explore how the inventive principles apply to the specific problem in order to generate possible solutions. There’s also a lot of skill required to map between the specific and general problem and solution spaces, as well as to evaluate potential solutions (and even generate new hybrid solutions) using something like a Pugh concept evaluation matrix.
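
For anyone who hasn’t seen one, here’s a minimal sketch of how a Pugh evaluation is tallied; the criteria, weights and concepts are entirely made up for illustration.

```python
# A minimal sketch of a Pugh concept evaluation: each candidate concept is
# scored +1 / 0 / -1 against a baseline (datum) concept on each criterion,
# and the weighted totals give a rough ranking. Criteria, weights and
# concepts here are hypothetical.

CRITERIA_WEIGHTS = {"cost": 3, "portability": 2, "durability": 1}

CONCEPTS = {  # scores relative to the datum: +1 better, 0 same, -1 worse
    "folding comb/brush combo": {"cost": 0, "portability": 1, "durability": 0},
    "two separate tools":       {"cost": 1, "portability": -1, "durability": 0},
}

def pugh_score(scores):
    return sum(CRITERIA_WEIGHTS[criterion] * score for criterion, score in scores.items())

for name, scores in sorted(CONCEPTS.items(), key=lambda kv: pugh_score(kv[1]), reverse=True):
    print(f"{name}: {pugh_score(scores):+d}")
```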

At Dow, they’ve embedded TRIZ into their Design For Six Sigma (DFSS) processes for designing chemical plants, allowing for better integration across designs of all the other processes surrounding the chemical plant, such as rail shipping. This doesn’t always result in a disruptive change, but sometimes results in a group of interrelated changes that make a big difference: in one case, reducing shipping transit times and costs, thereby increasing customer satisfaction.

There are many other TRIZ-related tools, applicable to both technological and transactional situations. Some of these, such as mind maps, didn’t originate with TRIZ but are commonly used during TRIZ innovation projects; many have been adopted into the TRIZ body of knowledge maintained by the Russian TRIZ group. It’s also possible to mix several of the TRIZ techniques with Six Sigma techniques to good effect.

Applying Lean Six Sigma Methodology to Transactional Processes

Next up was a panel discussion with David Haigh of Johnson & Johnson, Sabrina Lemos of United Airlines, and Gary Kucera of Kaplan Higher Education, moderated by Charles Spina of e-Zsigma.

United Airlines has a unique project going on in one of their freight-related operations: they decided to outsource the operation in order to be able to completely remake the process and have it meet specific KPIs, but also decided to allow the existing people to bid on their own jobs. This would have the effect of shifting them out of their current ways of doing things and proposing the best possible way to do it, since they will be in a competitive bidding situation with outsiders. Lemos also spoke about the importance of getting to the real data. She did an exercise of tracing a particular biweekly report – which took several hours to compile – up to the VP and what he reports on, then tracked what he actually reports on back down to the reports and metrics that are being gathered at the lower levels. Not surprisingly, she found that there was zero alignment: nothing in the biweekly reports was used by the VP in his report, or anywhere else in the chain of command. She spoke about using gauge R&R, walk-the-process, and value stream mapping techniques to analyze processes, and the necessity of coming to agreement on the meaning of things such as process start points.

Haigh spoke about accounts payable processes at J&J Canada, and how an in-depth review of those processes was triggered by someone actually forgetting to pay the electricity bill, and showing up at the office one day to find a notice that the power would be cut if the bill weren’t paid immediately: not that they didn’t have the money to pay the bill, just that the process to do so wasn’t working. Accounts payable is often one of those processes in companies that is ignored when looking at major process improvement because it’s not revenue generating, but it’s important to recognize that enormous cost savings can be found through taking advantage of early payment discount levels, and avoiding any late penalties or service disruptions. They have found that doing some amount of the work onsite where the business processes are being done is helpful, since the process participants can see what’s involved in their process overall. They use the same techniques as discussed by Lemos, plus Kaizen Blitz and some activity-based costing.
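
As a rough illustration of why those discounts matter (using assumed standard “2/10 net 30” payment terms, not anything specific to J&J), paying 20 days early to capture a 2% discount works out to a surprisingly high annualized return on the cash:

```python
# A rough illustration of the value of early-payment discounts, assuming
# standard "2/10 net 30" terms (2% discount if paid within 10 days, full
# amount due in 30 days). These terms are assumed for illustration only.

def annualized_discount_rate(discount=0.02, discount_days=10, net_days=30):
    """Simple annualized return from paying early to capture the discount."""
    days_early = net_days - discount_days
    per_period = discount / (1 - discount)   # return on the cash paid early
    return per_period * 365 / days_early     # simple, non-compounded annualization

print(f"{annualized_discount_rate():.1%}")   # roughly 37% per year
```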

Kucera spoke about aligning the corporate and executive goals with efforts at all levels, and how Jack Welch suggested tying some percentage of your bonus to your process improvement savings in order to incent people to align their behavior and metrics with the ultimate goals. He spoke about some of the modeling and display tools that they use, such as fishbone and Pareto diagrams, and how doing these early and engaging with the business management can greatly speed the process improvement efforts. In many cases, since they’re dealing with simple transactional processes, they can use fairly simple analysis tools, but they have some of the more sophisticated tools and techniques available as required.
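
The Pareto part of that is simple enough to show in a few lines; the defect categories and counts below are hypothetical, just to illustrate how the “vital few” fall out of the cumulative percentages:

```python
# A minimal sketch of the Pareto analysis behind those diagrams, using
# hypothetical defect categories and counts: sort by frequency and find the
# "vital few" categories that account for roughly 80% of the total.

defect_counts = {   # hypothetical data, for illustration only
    "missing data": 120,
    "wrong routing": 80,
    "duplicate entry": 45,
    "late approval": 30,
    "other": 25,
}

total = sum(defect_counts.values())
cumulative = 0.0
for category, count in sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count / total
    print(f"{category:16s} {count:4d}   cumulative {cumulative:.0%}")
    if cumulative >= 0.8:
        break   # the remaining categories are the "trivial many"
```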

They all had examples of process improvement efforts that have had a direct customer impact. Lemos had a great example of processing freight insurance claims, where they had a metric of processing five claims per day, resulting in the claims people cherry-picking claims in order to meet their quota; enforcing first-in, first-out claims processing resulted in an immediate and dramatic improvement in customer satisfaction. Listening to her stories of their paper-based inefficiencies, where emails are printed, signed and passed around, reminds me so much of the processes in some of my financial services and insurance customers.

In all cases – and I think that this is a key criticism of Lean and Six Sigma – they’re looking for incremental process improvements, not completely disruptive reengineering that would discover new ways to do business. However, in many of today’s standard transactional processes, incremental improvement is the only alternative.

Lean Six Sigma & Process Improvement: David Brown of Motorola

I missed the first morning of the IQPC Lean Six Sigma & Process Improvement conference in Toronto today, but with my usual impeccable timing, showed up just in time for lunch (where we had to explain the rules of curling to the American attendees). The first session this afternoon is with David Brown, a black belt at Motorola, where the term “Six Sigma” was first coined and is still used to make their processes more effective, efficient, productive, and transparent.

There has been a transformation in how they analyze their processes, moving from just looking at transactions to high-level intelligence that includes complex simulations and forecasting. Since they run SAP for their ERP, they have a number of SAP business intelligence products (Xcelsius and Business Objects), although their most complex analysis is done with Oracle Crystal Ball.

Brown’s presentation was short – less than 10 minutes – and the rest of the session was an interactive one-on-one interview with questions from Charles Spina of e-Zsigma, the conference chair. The Q&A explored much more about how Motorola uses business analytics tools, and opened it up to the (small) audience for their experience with analytics. Not surprisingly, there has been quite a bit of success through the introduction of analytics to process improvement teams: sometimes it’s the black belts themselves, sometimes it’s a separate analytics group that works closely with them to develop the reports, analysis, and more complex intelligence based on the large volumes of data collected as part of any process improvement project.

Reporting tools range from Excel – for simple needs – through more complex solutions that include ETL from multiple data sources and regularly scheduled reports, such as Crystal Reports and Xcelsius. Legacy systems can make that a bit of a challenge; often these end up as extracts to Excel or Access, which are then remixed with other sources. Extracts like this can be really problematic, as I’ve seen first-hand with many of my customers, since there’s no way to keep the data completely in sync with the underlying systems, and typically no single legacy system has all the relevant data, so there can be a real problem in matching up related data from multiple systems. Brown underlined that the key is to get all of your data into a central data warehouse in order to determine whether your data is complete and clean, and to facilitate reporting and analytics. This is especially important for process engineers when trying to do time studies over long periods of time: if you don’t have some consistent representation of the processes over the time period in question, then your analysis will suffer.
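
Here’s a minimal sketch of the matching problem that he was describing, assuming two hypothetical legacy extracts that key the same order differently; the file and column names are mine, purely for illustration:

```python
# A minimal sketch of matching data across legacy extracts, assuming two
# hypothetical CSV files whose names and columns are invented for illustration:
# the same order is keyed differently in each system, so records have to be
# joined on whatever common field exists before any time study is possible.

import pandas as pd

orders = pd.read_csv("legacy_orders.csv")        # order_id, customer, created_at
shipments = pd.read_csv("legacy_shipments.csv")  # order_ref, shipped_at, carrier

merged = orders.merge(
    shipments,
    left_on="order_id",
    right_on="order_ref",
    how="left",   # keep unmatched orders so the data gaps stay visible
)

# The unmatched records are exactly what makes cross-system analysis unreliable.
unmatched = merged[merged["shipped_at"].isna()]
print(f"{len(unmatched)} of {len(merged)} orders have no matching shipment record")
```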

Motorola is using their data analytics to improve operational processes, such as order shipping, but also for what-if scenarios to inform salespeople on the impact of discount levels on the bottom line. In many cases, this is an issue of data integration: Sabrina Lemos from United Airlines (who will be on the panel following) shared what they were able to recover in late container fees just by integrating their container tracking system with a database (Access, alas) that generates their invoices. Interestingly, I wouldn’t have thought of this as a process improvement initiative – although it is – but rather just as an artifact of doing some clever system integration.

They also discussed the challenges with presenting the results of analytics to the less numerically inclined, which often entails rolling data up to some simpler charts that can be drilled into as required, or just presented in a PowerPoint or PDF file. The real ROI may come from more interactive tools, however, such as dashboards that show operational alerts, or real-time what-if analysis to support human and automated decisions. Since Lean and Six Sigma tools are inherently analytical, this isn’t a new problem for the people in this audience; this is a matter of building relationships early with the non-analytical business managers, getting some early successes in projects to encourage adoption, and using different presentation and learning styles to present the information.

Because of the nature of this audience, the analytics that they’re discussing are typically for human consumption; in the BPM world, this is more and more moving to using the analytics to generate events that feed back into processes, or to inform automated decisioning. Either way, it’s all about improving the business processes.

Forrester Day 1: Automating Business Processes panel

Connie Moore moderated a panel on automating business processes, featuring David Knapp of Ford, Pamela Rucker of Philip Services and Theo van den Hurk of ABN AMRO. The room is considerably less crowded than this morning, so I’m guessing that there’s some golfing going on (probably all the CIOs whose golf games were rained out last week at the Gartner conference in Orlando). I really like how they use the big projection screens to show live video of the panelists; I’m sitting way over on the side to score some power for the laptop, so can’t see two of the presenters directly.

Moore talked about Forrester’s BPM maturity framework, which I’ve never seen before but it’s similar enough to BPMM and others that I’ve seen: moving from process knowledge to process efficiency to process consistency to business optimization to business transformation, where most companies are in the efficiency stage, moving towards consistency.

Each of them told a bit about what they’re doing with BPM:

  • Ford is another Lombardi customer that’s just getting their implementation started (it’s sort of ironic, considering that Ford pioneered the concept of the assembly line, that they’re only now getting around to business process automation). They’re a big Six Sigma shop, and are looking at getting some automation and metrics in place, then driving towards optimization using BPM. They’re using BPM in large part for orchestration of their existing legacy systems.
  • Philip Services has a mandate to innovate, but no extra budget to do so, which is a common problem in organizations; they don’t use BPM software but are effectively building their own in code or within the enterprise systems.
  • ABN AMRO is using SAP and Oracle as their enterprise systems, and as far as I could tell, they’re not using a BPM suite to orchestrate that but are relying on the processes within those enterprise applications.

Knapp showed an interesting slide of how BPM bridges the gap between end user computing and full-on IT application development; I think that there’s also an overlap between mashups and BPM at some part of that spectrum. Ford has an enterprise process committee that looks at process management across the organization, especially focussing on the discontinuities (hand-offs between functional silos), and decides which processes to implement; however, they’re still narrowing down which processes to implement first.

Rucker said that two major issues for them were having the business take ownership of business process management, and getting away from siloed process optimization (like the accounting department) to look at end-to-end processes (like order to cash). They even got the CEO involved to drive these points home to people.

van den Hurk talked about the complexities introduced by having several outsourced vendors involved in their systems as well as their own IT people; just getting all the stakeholders to sit down together was a challenge. ABN AMRO looked at heat maps for operational budget areas to figure out where the money was being spent, as well as what the business reported as the pain points.

There was a question about metrics, monitoring and dashboards: Ford is designing them into their systems; Philip put them in after the fact, when they realized that processes weren’t improving and they had no visibility into why; ABN AMRO is also building them in, based on the business needs.

As panels go, this was pretty conversational rather than a series of mini-presentations: good to attend, but harder to blog about in a coherent fashion.

IQPC BPM Summit: David Haigh

Last speaker of the day — and of the conference — was David Haigh, Global Director of Continuous Improvement at W.E.T. Automotive Systems, discussing Lean Product Development. It’s actually refreshing to be at a BPM conference where, as far as I heard (I missed Jodi Starkman-Mendelsohn’s talk this morning), I was the only person who talked about the technology.

They previously tried out a lot of different quality programs, including ISO 9000, Six Sigma, Lean, BPR and other techniques, but these were always initiated by the subsidiaries and didn’t really catch on, so in 2006 they started on a global program that included the shop floor, logistics and product creation. Whereas they had always focused on the production/fulfillment value stream, they expanded the scope to include the entire order-to-cash cycle, particularly to include the design portion of the cycle, which has the smallest cost element but the largest cost influence.

I loved his analogy for hand-offs in the business process: it’s like the telephone game that we played as kids, whispering a message from one person to the next to see how the message changes by the time it reaches the end; any hand-off results in a reduction in information clarity, as well as being a big time-waster.

Since he’s in an engineering manufacturing environment, there are some interesting ideas that at first seem unique, but have value in many other areas: set-based design, for example, where you spend the engineers’ time researching and pushing boundaries on the technology that underlies customer solutions, rather than spending the time building one-off customer solutions. The equivalent in the BPM world would likely be having them focus on building out the service layer, not assembling the services using a BPMS. He also spoke about Toyota’s practice of streaming engineers up to higher levels of engineering rather than “promoting” them to sales or management — I always tried to do that when I ran a company, since there are always some people who just want to stay technical, and don’t want their career to suffer for it.

They’ve built a “workflow” and project planning tool in Excel that has some interesting concepts: no dependencies between tasks, just points of integration, and the team sets the deadlines (can you say “collaboration”?). This helped them by providing tools for visualizing waste in the process, and driving to reduce the waste, which is the main focus of Lean.

This has been an interesting conference, although the attendance is quite a bit less than I had expected, but that makes for a much better environment for asking questions and networking. And speaking of networking, I think that I just have time to run home before the Girl Geek Dinner tonight…

Is Six Sigma going the way of TQM?

Completely non-Enterprise 2.0, except for the fact that I heard this at lunch today from someone who works for a large financial services organization: 3M may ditch their Six Sigma program because they find that it stifles creativity. To quote Charles O’Reilly, a Stanford Graduate School of Business management professor, “If you take over a company that’s been living on innovation, clearly you can squeeze costs out. The question is, what’s the long-term damage to the company?” If this trend continues, Six Sigma training might join a list like this.

Gartner Day 2: Bruce Williams

The second keynote speaker of the day is Bruce Williams of Savvi International, author of Six Sigma for Dummies (and the accompanying workbook) and Lean for Dummies, speaking on What BPM Means to Business Innovation. Funny, at last year’s Gartner BPM summit, everything was about Six Sigma; this year, this is the first time that I’ve heard it mentioned.

He points out one view of BPM, that it’s just a faster, better treadmill, but we’re still doing the same old things. BPM is more than that: not just operational efficiencies and defect reductions, but measurements and activity monitoring, process controls, and integration between systems and services. Furthermore, he goes on to say that the biggest value from BPM is in business innovation, not process improvement: the introduction of something new and useful and the process by which it is brought to life.

But why is innovation important? Why not just milk the cash cows? The answer is pretty obvious, although ignored by many traditional organizations: the lifecycle of every product or service eventually comes to an end, often because someone else introduces a disruptive product or service to the marketplace that obsolesces the old way of doing things. As James Morse of the Harvard Business Review said many years ago (a quote that I have referred to many times), “the only sustainable competitive advantage comes from out-innovating your competition.” Ultimately, innovation trumps optimization.

Williams continues on with a lot of stuff about why the innovation cycle looks like it does, but there’s really nothing new here: this is just the classic stuff for why products or services pass their peak: fatigue, customer demands, market redirections, competitive pressures, technological changes, globalization effects, organizational changes, demographic shifts, regulatory constraints, economic effects, supply drifts and many other factors. He does point out, however, that most US firms have no program in place for fostering innovation, and don’t even have a clear idea of how to become more innovative. Tom Davenport did a study last year that showed that companies are focussing primarily on product innovation, and mostly ignoring things like business model innovation, or even business process innovation; Williams added some things that didn’t even make the list, like innovation in accounting practices or risk management.

He went through some of the different dimensions of innovation — reactive versus proactive; incremental/sustaining versus radical/disruptive; formal versus informal — and looked at how these dimensions mapped onto some specific cases. When he referred to Americans as the kings of innovation, however, it made me doubt his world view overall and left me with a bit of a bad taste: it came across as ethnocentric flag-waving that has no place at a business conference. I recognize that Americans lead innovation in a number of areas, but there are many other countries in the world that are leaders in their own areas of innovation. He’s also under the deluded notion that everyone wants what Americans have, driving SUVs full of consumer goods back to their monster homes in the suburbs, and laughingly pointed out a survey that he had done that concluded that if everyone in the world lived like he did, we’d need over 7 planets’ worth of resources to accommodate them. Yikes.

At the end of it all, although he had a pithy quote about how BPM is the grand unification theory for business (which is apparently trademarked?!), Williams had very little to say about BPM, but a lot to say about innovation: one of the prime motivators for why you might be considering BPM.

Two disappointing BPM webinars

Maybe it’s the January blahs. Maybe I’ve seen too many of these things. Maybe they actually were as bad as I perceived. Maybe I’m just cranky because I’m missing Mashup Camp. In any case, I was not impressed by either of the two webinars that I sat in on today.

Making Your People More Effective Through Business Process Management (BPM) Technology

This was the 2-hour web marathon that included a keynote from Colin Teubner of Forrester, followed by several Microsoft partners who all have BPM technology that also ties into Microsoft in some way. I’ve enjoyed hearing Colin speak before, both on webinars and in person, but this time was definitely nothing special. The material seemed to be all retreaded stuff from past presentations by Colin or by Connie Moore: how human-centric BPM evolved from workflow, how integration-centric BPM evolved from EAI; I think I wrote that stuff already.

He did make a very cute analogy between services and Lego for the (presumed) business-oriented audience. He also made the critical point that “BPM doesn’t need SOA; SOA needs BPM”, which really needs to be understood by everyone involved in both of these fields, especially as they start to merge into one (several years after Gartner called them all one when they actually weren’t).

Then he made an uncomfortable segue into using Microsoft Office applications as part of the whole BPM picture (I wasn’t convinced), as a prelude to what promised to be several vendor presentations; I lasted through part of the first one before I bailed out due to the double frustration of an over-large screen shared on the meeting (didn’t anyone tell the presenters to scale down to 1024×768?) and a fairly significant lack of content.

BPM 302: BPM and Six Sigma

This was one of the Appian webinars from the BPM Basics site. I’ve commented on these before: some good basic material if you’re just getting started with BPM, but today’s was an exception in that I would not recommend it due to the errors in definition around Six Sigma. If you can’t even define your subject matter accurately, then you shouldn’t be speaking about it.

It all started going downhill when the first speaker said “There are six levels of Sigma”, as if it were some sort of CMM-like certification. That shows such an incredibly fundamental misunderstanding of Six Sigma and statistical measures in general that I’m practically speechless. Not meaning to sound like too much of a techie snob, I should have figured that there could be a problem when she was introduced as having an arts degree, and I don’t think that statistics was her major or minor. She also said “sigma is the number of defects per million”, which isn’t at all correct either; sigma (for all of us who did suffer through stats class) represents standard deviation, and you can check any number of other sources to find that Six Sigma’s goal is to have six standard deviations between the process mean and the specification limit. In other words, a higher sigma level means that more of the data (whatever it is that you’re measuring) fits between the mean and the limit (beyond which things are classified as defects).
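
For the record, here’s roughly what the correct definition works out to numerically; this is a quick sketch using the conventional 1.5-sigma long-term shift that Six Sigma tables assume, which is where the well-known 3.4 defects per million figure comes from.

```python
# What a sigma level actually means numerically: the one-sided tail area
# beyond (sigma level - shift) standard deviations, expressed as defects per
# million opportunities (DPMO). Six Sigma tables conventionally apply a
# 1.5-sigma long-term shift, giving the famous 3.4 DPMO at six sigma.

from math import erfc, sqrt

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a one-sided specification limit."""
    z = sigma_level - shift
    return 1_000_000 * 0.5 * erfc(z / sqrt(2))   # normal tail probability x 1M

for level in (3, 4, 5, 6):
    print(f"{level} sigma: {dpmo(level):>9,.1f} DPMO shifted, "
          f"{dpmo(level, shift=0.0):>10,.3f} DPMO unshifted")
```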

The second speaker was talking more about BPM technology and seemed to know what he was talking about, but was horribly unprepared and oscillated between overly-fast reading of a prepared script and pause-laden bits where he was either winging it or totally losing his concentration.

Six Sigma and Proforma

Day 2 of the Proforma conference included three additional customer presentations, one from a partner, then all the exciting stuff about the upcoming product release.

Following on the heels of the panel at the end of day 1, in which Paul Harmon and Geary Rummler slammed Six Sigma, Deb Berard from Seagate spoke about their successes with Six Sigma and Proforma. Seagate has been using Six Sigma since 1995, and has been seeing a lot of success with it and Lean — not surprising for a manufacturing organization, which is where Six Sigma originated. They use the Six Sigma framework in ProVision, and their initial process analysis and modelling efforts led to the improvement of some of their product development processes. Based on that success, they then pushed it out to an enterprise-wide initiative.

The only thing that I really had an issue with was her calling ProVision a business process management system (BPMS), which it’s not: it’s a modelling suite. Although BPM still doesn’t have a fully accepted definition, I believe that BPMS has a very specific meaning.