BPM Success at BlueCross BlueShield of Tennessee

Rodney Woods of Tennessee BCBS started out talking about their 18-month history with Pegasystems SmartBPM by stating that you would have to pry Pega out of his cold, dead hands to get it away from him.

His laws of BPM success:

  1. The most important activity in business is improvement: improvement ensures competitiveness; your job is to drive improvement; if you improve the right things, in the right sequence, your business will take care of itself.
  2. Setbacks are not failures; failure is staying the same. Success requires setbacks; winning daily firefights is not progress.

There are four areas of improvement to consider: profit, product, process and people. His key is to make this sort of innovation second nature, so that it occurs routinely within your organization.

He had some good points about identifying the right BPM project, including:

  • Make sure that it’s related to a key strategic business issue: e.g., not just process efficiency, but tied to more effective customer service
  • Get customer and stakeholder input on the issue
  • State the problem as a threat or a need, not as a solution
  • Define the process owner and key stakeholders
  • Focus on the process that is most critical and/or contributes the most

Most of his comments were about organizational issues rather than technical ones: strategy, reporting relationships, continuous improvement, and executive support. Many of these were not specific to BPM projects, but apply to any potentially transformational business-technology project. In fact, except for his initial comment, he didn’t really talk about their Pega solution at all; instead, he offered lots of great advice that holds regardless of your technology selection.

That’s it for me at the Gartner BPM summit 2011 in Baltimore; there are vendor hospitality suites tonight and a half-day of sessions tomorrow, but I’m headed home after a week on the road.

The Great Case Management Debate

With a title like that, how could I miss this session? Toby Bell (ECM), Kimberly Harris-Ferrante (insurance vertical) and Janelle Hill (BPM) took the stage for what was really a live research session rather than a debate. Is it a process pattern covered by BPM? Is it functionality within ECM? Is it an industry-specific vertical application? Gartner is still evolving their definition of case management (as are many people), and currently publishes the following definition:

Case management is the optimization of long-lived collaborative processes that require secure coordination of knowledge, content, correspondence and human resources and require adherence to corporate and regulatory policies/rules to achieve decisions about rights, entitlements or settlements.

The path of execution cannot completely be predefined; human judgment and external events and interactions will alter the flow.

Harris-Ferrante said that we need to first create industry-specific definitions or examples of what a case is, then this definition can be presented in that context in order to make sense.

Bell made the distinction between content-triggered automation (e.g., paper invoice scanning and processing), collaborative content-rich processes (e.g., specific projects such as construction), and case management: there’s a bit of a spectrum here, based on a variety of factors including cost, complexity, people involved and time to completion. Case management is distinguished from the others by (human) decisions supported by information: Hill felt that this decision-support nature of case management is a defining feature. Harris-Ferrante talked about the cost and risk factors: case management is used in situations with compliance requirements, where you need to be able to show how and why you made a particular decision. She also pointed out that rules-based automated decisioning is really standard BPM, whereas rules-supported human decisioning falls into case management.
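
That last distinction is concrete enough to sketch. Below is a minimal Python illustration – the claim fields, thresholds and worklist are all invented – of the difference between rules that fully determine an outcome (automated decisioning, standard BPM territory) and rules that only recommend, leaving the decision to a person with the rule output attached as support:

```python
def evaluate_rules(claim: dict) -> dict:
    """Run the business rules; return a recommendation plus the reasons for it."""
    reasons = []
    if claim["amount"] > 10_000:
        reasons.append("amount exceeds auto-approval limit")
    if claim["prior_claims"] > 3:
        reasons.append("claimant has frequent prior claims")
    return {"recommend_approve": not reasons, "reasons": reasons}

def route(claim: dict, worklist: list) -> str:
    result = evaluate_rules(claim)
    if result["recommend_approve"]:
        # Rules-based automated decision: no human involved.
        return "auto-approved"
    # Rules-supported human decisioning: the case goes to a knowledge worker
    # with the rule output attached as decision support; the recorded reasons
    # are what later show how and why the decision was made.
    worklist.append({"claim": claim, "support": result})
    return "queued for case worker"

worklist: list = []
print(route({"amount": 2_500, "prior_claims": 0}, worklist))   # auto-approved
print(route({"amount": 50_000, "prior_claims": 5}, worklist))  # queued for case worker
```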

They showed a slide that talked about a continuum of business process styles, ranging from unstructured to structured; looks vaguely familiar. Okay, they use “continuum” rather than “spectrum”, have five instead of four categories, and put structured on the right instead of the left, but I am a bit flattered. Their continuum includes unstructured, content collaboration, event driven, decision intensive, and structured activities; they went on to discuss how case management is the most common example of an unstructured process style. I found that wording interesting, and aligned with my ideas: case management is a process style, not something completely different from process. Business process management, in its most generic form, doesn’t mean structured process management, although that’s how some people choose to define it.

Looking at the issue of products, they showed a slide that looked at overlaps in product spaces, and put BPM in the structured process/data quadrant, with case management far off in the opposite quadrant. As Hill pointed out, many of the BPM vendors are extending their capabilities to include case management functionality; Bell stated that this might fit better into the ECM space, but Hill countered (the first real bit of debate) that ECM vendors only think about how changes in content impact the case, which misses all of the rules and events that might impact the case and its outcome. She sees case management being added to ECM as just a way that the relatively small market (really just four or five key vendors) is trying to rejuvenate itself, whereas the case management advances from BPM vendors are much more about bringing the broad range of functionality within a BPMS – including rules and analytics – to unstructured processes.

Hill stated that Gartner doesn’t have an MQ for case management because there are so many different styles of case management: content-heavy, decision-heavy, and industry-specific packaged solutions. Besides, that way they could sell three reports instead of one. Not that they would think that way. Harris-Ferrante discussed the challenges to case management as an industry application, including the lack of shared definitions of both cases and case management, and Bell stated that buyers just don’t understand what case management is, and vendors are rejigging the definition to suit the customer context, so aren’t really helping in this regard.

In spite of stating that they don’t have a case management MQ, they did finish up with a slide showing the critical capabilities that customers are asking for in case management, such as a balance of content, collaboration and process services, and a highly configurable case-based user interface. They lay these out against four styles of case management – collaborative forms-based case management, knowledge workers collaborating on internal content, regulated customer-facing file folders and data, and costly processes initiated by customers – and indicate how important each of the factors is for each style. I definitely see the beginnings of an MQ (or four) here. They did state that they would be issuing a research report on the great case management debate; I’ll likely be giving my take on this topic later this year as the industry track chair at the academic BPM 2011 conference.

It’s clear that the definition of case management needs to firm up a bit. As I asked in a tweet during the session: case management: is it a floor wax or a dessert topping? As any old Saturday Night Live fan knows, it’s both, and that could be part of the problem.

Selecting a BPMS

Janelle Hill of Gartner gave a short presentation on selecting a BPMS. Some of her points:

  • The coolest BPMS may not be appropriate. Take advantage of the model-driven development environment that is appropriate for your business people rather than just what’s the most fun for the developers. A typical feature-function evaluation may not be the best way to go about it, since the functionality can vary widely while providing the same business capability.
  • A BPMS is a suite of technologies for supporting the entire lifecycle of process improvement: discovery, modeling, execution, monitoring and optimization. It’s a platform that includes both design-time and runtime. She showed the classic Gartner “gears” diagram showing all the components in a BPMS, and pointed out that you probably don’t need to do a deep dive into some of the components such as business rules, since that’s typically not the deciding factor when selecting a BPMS. A BPMS is a composition environment rather than a full development environment, where pre-existing building blocks from outside the BPMS are graphically assembled, together with functionality built within the BPMS, into a process application. As a composition environment, the registry and repository are important for being able to locate and reuse assets, whether created inside or external to the BPMS.
  • A BPMS is not the same as a SOA suite: the latter is used to create services, while the former consumes those services at a higher level and also provides user interaction. As I’ve said (usually in front of my service-oriented friends), a BPMS provides the reason that the SOA layer exists.
  • A BPMS provides visibility, adaptability and accountability particularly well, so you should be considering how a BPMS can help you with these business capabilities.
  • If the business (or a combination of business and IT) needs to be able to manage process change, or if processes change frequently, then a BPMS is a good fit. If process changes are purely under the control of IT and the processes change infrequently, then more traditional development tools (or an ERP system) can be considered. She talked about frequently changing processes as being served by systems that are built to change, and less frequently changing processes by systems that are built to last, but pointed out that “built to last” often translates to brittle systems that end up requiring a lot of workarounds or expensive changes.
  • She presented Gartner’s top four BPMS use cases: a specific process-based solution, continuous process improvement, redesign for a process-based SOA, and business transformation. Their latest MQ on BPMS has more information on each of these use cases; if you’re not a Gartner customer, it’s available through the websites of many of the leading BPMS vendors.

She then moved into some specific evaluation criteria:

  • Know your dominant process patterns: straight-through, long-running with human involvement, dynamically changing process flows, or collaboration within processes. She categorized these as composite-heavy, workflow-heavy, dynamic-composite-heavy and dynamic-collaborative-heavy, and showed some of the tools that they provide for helping to compare products against these patterns. She stated that you might end up with three different BPMSs to match your specific project needs, something that I don’t completely agree with, depending on the size of your organization.
  • Don’t pick a BPMS because it’s a “safe” vendor or enterprise standard, or because of price, or because the developers like it.
  • Do pick a BPMS because it enables business-IT collaboration, because its capabilities match the needs of a defined process, it supports the level of change that you require, and it interoperates well with your other assets.
  • Do an onsite proof of concept (POC), 2-3 days per vendor where your people work side-by-side with the vendor, rather than relying on a prepared demo or proposal. She had a lot of great points here that line up well with what I recommend to my clients; this is really necessary in order to get a true picture of what’s required to build and change a process application.
  • Check for system scalability through reference checks, since you can’t do this during the POC.

She ended with some recommendations that summarize all of this: understand your requirements for change to determine if you need a BPMS; understand your resource interaction patterns to define the features most needed in a BPMS; ensure that your subject matter experts can use the tools; and have a POC to evaluate the authoring environment and the ease of creating process applications.

BPM and ERP at AmerisourceBergen

Gartner BPM always includes sessions for the vendor sponsors, and most of them are smart enough to put one of their customers on stage for those presentations. This afternoon, I listened to Manoj Kumar of AmerisourceBergen, a Metastorm (OpenText) customer, discuss how they used BPM as an alternative to customizing their SAP system, as well as to streamline and improve their processes and enforce compliance. He went through how they built their business case: demonstrating the BPM tool, surveying departments on their business processes and how they might benefit from BPM, and some analysis to wrap it all up. He also covered the business and IT drivers for creating a BPM center of excellence, with a focus on alignment, shared resources and reusability.

Building the execution team was key; with a model-driven tool, he didn’t really want “hard-core developers”, or even people who had used the tool before, but rather those who could adapt quickly to new environments and use model-driven concepts to drive agile development. Having a focus on quick wins was important, rather than getting bogged down in a long development cycle when it’s not necessary.

They also had considerations about their server infrastructure, and since they were using BPM across a wide variety of decentralized and non-integrated groups, they decided on separate virtual machines that could be taken down without impacting anything beyond the specific departmental process. This seems to indicate that they didn’t do much end-to-end work, but focused on departmental solutions; otherwise, I would have expected more integration and the requirement for shared process engines. When he showed his process stats – 200 different processes across 3000 users – it seemed to reinforce my assumption, although they are doing some end-to-end processes such as Procure To Pay.

He strongly encourages taking advantage of the BPM tool for what it does best, including change management for processes. They’ve obviously done a good job of that, since they’re managing their entire BPM program with 4 people on the implementation team. He recommends not allowing developers to write any code until you’ve prototyped what you can in the BPM tool, or else their tendency will be just to rewrite the BPMS functionality themselves; I am 100% behind this, since I see this happening on many BPM implementation projects and it’s a constant battle.

With an SAP/BPM integration like they’ve done at AmerisourceBergen, you need to be careful that you don’t get too carried away in the BPM tool and rebuild functionality that’s already in SAP (or whatever your ERP system is), but using BPM as a tool for orchestrating atomic ERP functions makes a lot of sense in terms of agility and visibility, and also provides the opportunity to build processes that just don’t exist in the ERP system.
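
To make that orchestration idea concrete, here’s a minimal Python sketch of the pattern; the function names and the call_erp() helper are hypothetical stand-ins rather than real SAP APIs. The point is that sequencing, visibility and change management live in the BPM layer, while each step remains an atomic ERP function:

```python
def call_erp(function: str, **params) -> dict:
    """Stand-in for invoking one atomic ERP function (e.g., via a web service)."""
    print(f"ERP call: {function} {params}")
    return {"status": "ok", "function": function, "params": params}

def procure_to_pay(requisition: dict) -> list:
    """The orchestrating process: each step is one atomic ERP call, and the
    flow logic stays here, where it is visible and easy to change."""
    steps = [call_erp("create_purchase_order", requisition=requisition)]
    steps.append(call_erp("post_goods_receipt", purchase_order=steps[0]))
    if requisition.get("invoice_received"):
        steps.append(call_erp("post_invoice", purchase_order=steps[0]))
        steps.append(call_erp("release_payment", invoice=steps[-1]))
    return steps

procure_to_pay({"material": "X-100", "qty": 10, "invoice_received": True})
```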

Advancing BPM Maturity

Janelle Hill of Gartner presented on how to advance your BPM maturity, starting with the concept that not only isn’t there one path to get to BPM maturity, but there’s more than one maturity destination. There are many different mind-sets that organizations have about their BPM programs, ranging from simple automation and improvement efforts up to strategic business optimization; how you think about BPM will have an enormous impact on the potential value of BPM within your organization. This is really an excellent point that is rarely explicitly stated: if you think of BPM as a low-level tool to do some automation – more of a developer tool than a business tool – then you can see benefits, but they’ll be limited to that domain. Conversely, if you think of BPM as a tool/methodology for transforming your business, your use of BPM will tend to be more aligned with that. The tricky part is that BPM is both (and everything in between), and you don’t want to lose sight of its use at a variety of levels and for many different sorts of benefits: as fashionable as it is to see BPM as purely a strategic, transformational methodology, there are also a lot of practical BPM tools that are used for automation and optimization at a more tactical level that have huge benefits.

Gartner’s business process maturity model – the same, I think, as the OMG BPMM – passes through five levels from process-aware, to coordinated processes, to cross-boundary process management, to goal-driven processes, to optimized processes. In line with this, benefits move from cost and productivity improvements at the low levels; to cycle time reductions, capacity and quality gains at the middle levels; to revenue gains, agility and predictability at the higher levels.

Advancing maturity requires work along six major dimensions:

  • Organization and culture
  • Process competencies
  • Methodologies
  • Technology and architecture
  • Metrics and measures
  • Governance

She then showed a mapping between the maturity levels and these dimensions, with the level of effort required for each and the critical transition points highlighted. There are some interesting transition points, such as the effort required for organization and culture increasing right up until you are well-entrenched in level 5 maturity, at which time the organization and culture aspects become systemic and mostly self-sustaining, and the explicit effort required to maintain them decreases sharply.

She broke out each of the dimensions in more detail, showing within the organization and culture dimension how the roles and responsibilities must be developed as the maturity level increases through education, establishing a BPCC and becoming goal-aligned. Some dimensions, such as process competencies, methodologies and technology/architecture, follow fairly logical paths of increased effort as the maturity level increases, although there will be decisions within those, such as which particular methodologies to develop within your organization, and your tools may change as your maturity level increases. Metrics and measures tend to be more aligned with the maturity levels, changing from individual lagging indicators to shared real-time metrics tied to strategic objectives and SLAs, and are also heavily supported by technology. Governance is the most difficult of the dimensions, with a collection of very different initiatives, and probably won’t even properly start until you’re transitioning from level 1 to level 2. A lot of what she covered here is centered around the process governance committee, and some level of centralized stewardship for end-to-end processes: otherwise, it’s impossible to fund and push forward with processes that span functional (and budgetary) boundaries. It’s also necessary to create incentives to support this, so that the entire process doesn’t end up sub-optimized when one of the functional subprocesses is optimized.

Gartner’s research has shown the impact of a BPCC on achieving business process maturity, and in turn, delivering more successful BPM projects across the organization; I definitely agree with this, although I believe that you need to grow your BPCC more organically on the back of a BPM project rather than making it an independent project of its own. The BPCC should not be part of IT; although it contains some technical people with skills in the tools, it’s really about operational processes and should be under the auspices of the COO or other business leader.

She finished up with a contrast between functionally-driven and process-driven organizations in terms of roles and responsibilities, visibility, hand-offs, cost accounting, risk analysis and other areas, plus a great chart summarizing the linkages between maturity levels and the dimensions.

Excellent talk, and lots of great practical advice on what you need to do to increase your BPM maturity level.

Selling BPM to your Organization

Starting into the breakout sessions here at Gartner BPM 2011 in Baltimore, Elise Olding, with some help from Joel Kiernan of Altera, gave a presentation on selling BPM within your organization. This is about selling that first project internally as well as expanding your BPM initiative beyond the first project: leveraging your success so far and your business-focused BPM definition to see how it can be applied with other opportunities. Like any good sales pitch, you need to have content that is relevant, compelling and repeatable. I wrote about expanding BPM adoption within your organization in a recent article series for Global 360, and covered some of the same issues about generalizing beyond that first project into a BPM program.

Kiernan discussed their own case study at Altera (a semiconductor company), starting with how they had to understand their key business processes and communicate this to the steering committee responsible for the business process projects. They’re early in their journey, but have put together the storyline for how BPM will roll out in their organization: identify the right processes, do some as-is and to-be process analysis including external best practices, implement process/system changes, then move into ongoing process improvement.

As Olding discussed, there will need to be different messages for different internal audiences: senior executives are interested in how BPM will improve performance, competitiveness and operational flexibility; line of business managers are interested in operational goals including reducing errors and rework, and gaining visibility into processes for themselves and their management; front-line workers want to know how it will make their work easier, more interesting and more effective.

As an aside, I get the feeling that Gartner presenters have been coached by someone who really likes complex analogies woven throughout the presentation: in the keynote, Ken McGee used a courtroom analogy throughout the presentation, and here Olding is using a film-making analogy with “trailers”, “setting” and “engaging the cast”. It was also a bit of a strange segue to involve the Altera person for only about two minutes when they were really just starting in their process, although I have to give her credit for sharing the stage with a customer, since that’s pretty rare at any Gartner events that I’ve attended in the past. Would have been great to hear from someone further along in the process, and maybe a bit more from them than just two slides.

She covered some of what you actually want to communicate, as well as the who and how of the communication, stressing that you need to achieve buy-in (or at least understanding) from a lot of different stakeholders in order to reach that tipping point where BPM is seen by your organization as a key enabler for business improvement. She changed the format a bit to get people working on their own process issues, giving everyone time to jot down and discuss their challenges in each of the steps of selling BPM internally, then calling on a couple of audience members to share their thoughts with the room. This format shift caused a bit of loss of focus (and a bit of down time for those of us who aren’t really into this form of audience participation), although she was able to bring the experiences of the audience members in alignment with the material that she was presenting. Not surprisingly, one of the key messages is on the business process competency center (what Gartner calls the center of excellence) and the methodology that they employ with customers to make a BPCC successful within an organization. Success, in that case, is measured completely by how well you can sell BPM inside the organization.

Gartner BPM 2011 Kicking Off

I’m at my first Gartner BPM show in a while: a couple of years ago, I noticed a lot of repeated information from one summit to the next and decided to sit a few out, but decided that there was enough fresh content by now, and that it was a good chance to catch up with a lot of people who I only ever see at these conferences.

The show kicked off with Michele Cantara, joined by Elise Olding, giving some opening remarks and introducing the winners of the Gartner BPM Excellence awards: Lincoln Trust, UPS, Carphone Warehouse, NY State Taxation, and Maximus.

The keynote was delivered by Ken McGee, Gartner fellow, who opened with the statement that this is the time for the business process professional. He backed this up with a look at the economic growth forecast, including some optimistic survey numbers from businesses stating that their revenues and IT spending are going to increase this year. This was a fairly general presentation on the impact of the economy on business environments and the need to seize new opportunities; not at all specific to BPM, except for one slide of the APQC process framework that didn’t really seem to fit with much else.

Gartner has obviously released a report on the Money-Making CIO recently, and that’s what he spent part of his presentation on: looking at the six styles of money-making CIOs (entrepreneur, cost optimization, revenue searching, innovation, business development, and public serving). He mentioned other Gartner research, such as pattern-based strategy, and told us that social networking and cloud computing are important (duh); this seemed like a bit of a grab-bag of concepts that could have been given to any IT audience at any conference.

I understand that it’s important to have presentations that show the larger context at a tightly-focused event like this BPM summit, but this didn’t have the cohesiveness or inspiration required to elevate it beyond just a summary of this year’s Gartner research.

Getting Business Process Value From Social Networks #GartnerBPM

For the last session of the day, I attended Carol Rozwell’s presentation on social network analysis and the impact of understanding network processes. I’ll be doing a presentation at Business Rules Forum next month on social networking and BPM, so this is especially interesting even though I’ll be covering a lot of other information besides social graphs.

She started with the (by now, I hope obvious) statement that what you don’t know about your social network can, in fact, hurt you: there are a lot of stories around about how companies have and have not made good use of their social network, and the consequences of those activities.

She posited that while business process analysis tells us about the sequence of steps, what can be eliminated and where automation can help, social network analysis tells us about the intricacies of working relationships, the complexity and variability of roles, the critical people and untapped resources, and operational effectiveness. Many of us are working very differently than we were several years ago, but this isn’t just about “digital natives” entering the workforce, it’s about the changing work environment and resources available to all of us. We’re all more connected (although many Blackberry slaves don’t necessarily see this as an advantage), more visual in terms of graphical representations and multimedia, more interactively involved in content creation, and we do more multitasking in an increasingly dynamic environment. The line between work and personal life blurs, and although some people decry this, I like it: I can go to many places in the world, meet up with someone who I met through business, and enjoy some leisure time together. I have business contacts on Facebook in addition to personal friends, and I know that many business contacts read my personal blog (especially the recent foodie posts) as well as my business blog. I don’t really have a lot to hide, so I don’t have a problem with that level of transparency; I’m also not afraid to turn off my phone and stop checking my email if I want to get away from it all.

Your employees are already using social media, whether you allow it within your firewall or not, so you might as well suck it up and educate them on what they can and can’t say about your company on Twitter. If you’re on the employee side, then you need to embrace the fact that you’re connected, and stop publishing those embarrassing photos of yourself on Facebook even if you’re not directly connected to your boss.

She showed a chart of social networks, with the horizontal axis ranging from emergent to engineered, and the vertical axis from interest-driven to purpose-driven. I think that she’s missing a few things here: for example, open source communities are emergent and purpose-driven, that is, at the top left of the graph, although all of her examples range roughly along the diagonal from bottom left to top right.

There are a lot of reasons for analyzing social networks, such as predicting trends and identifying new potential sources of resources, and a few different techniques for doing this:

  • Organizational network analysis (ONA), which examines the connections amongst people in groups
  • Value network analysis (VNA), which examines the relationships used to create economic value
  • Influence analysis, a type of cluster analysis that pinpoints people, associations and trends

Rozwell showed an interesting example of a company’s organizational chart, then the same players represented in an ONA. Although it’s not clear exactly what the social network is based on – presumably some sort of interpersonal interaction – it highlights issues within the company, in that some managers have no working relationship with their direct reports, while one person who was low in the organizational chart was a key linkage between different departments and people.

She showed an example of VNA, where the linkages between a retailer, distributor, manufacturer and contract manufacturer were shown: orders, movements of goods, and payments. This allows the exchanges of value, whether tangible or intangible, to be highlighted and analyzed.

Her influence analysis example discussed the people who monitor social media – either within a company or their PR agency – to analyze the contributors, determine which are relevant and credible, and use that to drive engagement with the social media contributors. I get a few emails per day from people who start with “I read your blog and think that you should talk to my customer about their new BPM widget”, so I know that there are a lot of these around.

There are some basic features that you look for when doing network analysis: central connectors (those people in the middle of a cluster), peripheral players (connected to only one or two others), and brokers (people who form the connection between two clusters).
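
Those three features map onto standard graph measures: degree centrality picks out connectors and peripherals, and betweenness centrality picks out brokers. Here’s a small sketch using the networkx library, with an invented eight-person network and illustrative thresholds:

```python
import networkx as nx

G = nx.Graph()
# Two tight clusters joined only through "dana".
G.add_edges_from([
    ("ann", "bob"), ("ann", "carl"), ("bob", "carl"),    # cluster 1
    ("carl", "dana"), ("dana", "eve"),                   # the bridge
    ("eve", "fred"), ("eve", "gail"), ("fred", "gail"),  # cluster 2
    ("eve", "hank"),                                     # a hanger-on
])

degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)

# Central connectors: many direct ties within a cluster.
central_connectors = [n for n, d in degree.items() if d >= 3]
# Peripheral players: connected to only one other person.
peripheral_players = [n for n, d in degree.items() if d <= 1]
# Brokers: many shortest paths run through them despite few direct ties.
brokers = [n for n in G if betweenness[n] > 0.4 and degree[n] <= 2]

print(central_connectors)  # ['carl', 'eve']
print(peripheral_players)  # ['hank']
print(brokers)             # ['dana']
```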

There are some pretty significant differences between ONA, VNA and business process analysis, although there are some clear linkages: VNA could have a direct impact on understanding the business process flows, while ONA could help to inform the roles and responsibilities. She discussed a case study of a company that did a business process analysis and an ONA, and used the ONA on the redesigned process in order to redesign roles to reduce variability, identify roles most impacted by automation, and expose critical vendor relationships.

Determining how to measure a social network can be a challenge: one telecom company used records of voice calls, SMS and other person-to-person communications in order to develop marketing campaigns and pricing strategies. That sounds like a complete invasion of privacy to me, but we’ve come to expect that from our telecom providers.

The example of using social networks to find potential resources is something that a lot of large professional services firms are testing out: she showed an example that looked vaguely familiar where employees indicated their expertise and interests, and other employees could look for others with specific sets of skills. I know that IBM does some of this with their internal Beehive system, and I saw a presentation on this at the last Enterprise 2.0 conference.

There are also a lot of examples of how companies use social networks to engage their customers, and a “community manager” position has been created at many organizations to help manage those relationships. There are a lot of ways to do this poorly – such as blasting advertising to your community – but plenty of ways to make it work for you. Once things get rolling in such a public social network, the same sort of social network analysis techniques can be applied in order to find the key people in your social network, even if they don’t work for you, and even if they primarily take an observer role.

Tons of interesting stuff here, and I have a lot of ideas of how this impacts BPM – but you’ll have to come to Business Rules Forum to hear about that.

Fujitsu process discovery case study #GartnerBPM

I first saw Fujitsu’s process discovery offering last year, and it looked pretty useful at the time, but it didn’t have much of a track record yet. Today’s session brought forward Greg Mueller of Electro Scientific Industries (ESI), a manufacturer of photonic and laser systems for microengineering applications, to talk about their successes with it.

Basically, Automated Process Discovery (APD) uses log files and similar artifacts from a variety of systems in order to derive a process model, analyzing the frequencies of process variations, and slicing and dicing the data based on any of the contributing parameters. I’ve written a lot about why you would want to do process discovery, including some of the new research that I saw at BPM 2009 in Germany last month.
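
The core idea is straightforward to sketch, even though Fujitsu’s actual algorithm is certainly more sophisticated: group events into per-case traces, count distinct traces as process variants, and count directly-follows pairs to build the flow graph. A minimal Python sketch on an invented opportunity-to-order log:

```python
from collections import Counter

events = [  # invented sample log, in timestamp order within each case
    ("c1", "opportunity"), ("c1", "quote"), ("c1", "order"), ("c1", "shipment"),
    ("c2", "opportunity"), ("c2", "quote"), ("c2", "order"),
    ("c2", "order change"), ("c2", "shipment"),
    ("c3", "opportunity"), ("c3", "quote"), ("c3", "order"),
    ("c3", "quote"), ("c3", "order"), ("c3", "shipment"),  # loopback to quote
]

# Group the activities of each case into a trace.
traces: dict[str, list[str]] = {}
for case_id, activity in events:
    traces.setdefault(case_id, []).append(activity)

# Each distinct trace is one process variant.
variants = Counter(tuple(t) for t in traces.values())
# Directly-follows counts: how often activity a is immediately followed by b.
dfg = Counter(pair for t in traces.values() for pair in zip(t, t[1:]))

print(f"{len(variants)} unique flows across {len(traces)} cases")
print(dfg.most_common(3))
```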

ESI wanted to reduce inventory and improve manufacturing cycle time, and needed to understand their opportunity-to-order process better in order to do that. They used APD to determine the actual process flows based on about 15 months of data from SAP and other systems, then validated those flows with the teams that actually work them. They wanted to look at variations based on business unit and other factors to figure out what was causing some of their cycle time and inventory problems.

They assumed a relatively simple four-step process of opportunity-quote-order-shipment, possibly with 3-4 additional steps to allow revisions at each of these steps; what they actually found when they looked at about 11,500 process instances is that they had over 1,300 unique process flows. Yikes. Some of this was cycling through steps such as order change: you would expect an order to be changed, but not 120 times as they found in some of their instances. There were also loopbacks from order to quote, each of these representing wasted employee time and increased cycle time. They found that one task took an average of 58 days to complete, with a standard deviation of 68 days – again, a sign of a process out of control. They realize that they’re never going to get it down to 25 unique process flows, but they are aiming for something far lower than 1,300.
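
That last statistic deserves a second look: a standard deviation on the same order as the mean is the statistical signature of a task that is out of control. In miniature, with invented sample durations:

```python
from statistics import mean, stdev

# Days a hypothetical task took across ten instances (sample numbers invented).
durations = [3, 10, 12, 25, 40, 58, 70, 95, 150, 210]

m, s = mean(durations), stdev(durations)
print(f"mean {m:.0f} days, stdev {s:.0f} days, coefficient of variation {s / m:.2f}")
# A coefficient of variation near or above 1 means the variation swamps the
# typical case: some instances finish in days, others take most of a year.
```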

They did a lot of data slicing and analysis: by product, by region, by sales manager and many other factors. APD allows for that sort of analysis pretty easily (from what I saw last year), much like any sort of dimensional modeling that you would do in a data warehouse.

They observed that less than 20% of their opportunities followed the happy path, and the rest were taking too long, duplicating efforts, having too many rework loopbacks, and sometimes not even shipping after a great deal of up-front work.

In their process improvement phase, they established 22 projects including a number of improvements such as automating processes to reduce repeated steps, improving entry flow to reduce time intervals, and requiring the entry of initial data early in the process in order to reduce loopbacks and rework. Since their business runs on SAP, a lot of this was implemented there (which raises the question of who did such a crappy SAP implementation for them in the first place such that they had problems like this – seriously, insufficient required data entry at the start of a process?), and they’re able to keep extracting and analyzing the logs from there in order to see what level of improvement they are experiencing.

After a much too short presentation by ESI, Ivar Alexander from Fujitsu gave us a demo of APD with ESI’s basic process; I’ve seen a demo before, but it’s still fascinating to see how the system correlates data and extracts the process flows, then performs detailed dimensional analysis on the data. All of this is done without having to do a lot of interviews of knowledge workers, so it is non-invasive from both a people and a system standpoint.

It’s important to recognize that since APD is using the system logs to generate the process flows, only process steps that have some sort of system touch-point will be recorded: purely manual process steps will not. Ultimately, although they can make big improvements to their SAP-based processes based on the analysis through APD, they will probably need to combine this with some manual analysis of off-system process steps in order to fully optimize their operations.

Dynamic BPM versus agility #GartnerBPM

Jim Sinur led a session this morning on dynamic BPM and how to deal with the demands for change. He started with the statement that dynamic BPM is more than just another type of BPM technology, it’s a requirement for a transformational advantage, and took a look at how BPM will become more dynamic in the future.

Change is driven by unexpected exceptions in processes, and patterns of these unexpected events can indicate trends in your business environment that the processes need to accommodate. Typical change cycles in IT, however, tend to be slow and steady, which doesn’t at all match either the business dynamics or the external forces that shape them. Being able to handle these spiky demands drives the requirement for more dynamism in how processes and rules are managed, and drives the requirement for the business to be able to manage these directly rather than having to engage IT for all changes.

Gartner’s definition of dynamic BPM is the ability to support process change by any role, at any time, with very low latency. Change agents include everyone from customers and business people through business and process analysts, and on to architects and developers; if the people at the business end of this spectrum aren’t allowed to make process changes, then they’ll just work around it and invent their own processes using their own tools. This isn’t just about each individual’s personal preferences for how they work, however: if knowledge workers can make changes to their processes, they will tend to make them more efficient and effective, which has enterprise benefits.

A significant part of this is the inclusion of explicit rules within processes, so that scenario-driven rule sets can detect and respond to conditions, even without the process participants having to make those changes themselves: the basis of what James Taylor was saying in his presentation this morning. What used to be monolithic lumps of code can be split into several parts, each of which has the potential to be agile: user interface is managed by portals and the web; decision points are handled by rules engines; paths of execution are managed by BPMS; and data definitions are handled in databases or XML data representations. All of those parts used to be under the control of the developers, but turning it inside out and using more agile technologies allows people to customize their UI, change their rules on a daily basis, modify their processes, and define their own data structures. Dynamic BPM isn’t just about changing process models, it spans requirements, recompilation, data binding, loading and versioning.
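
The decision-point piece of that split is the easiest to illustrate: if the routing rules live in data rather than in compiled process code, a business user (or an event-driven rule update) can change them at runtime without redeploying the process. A minimal Python sketch, with an invented rule format and field names:

```python
# Rules as data: evaluated in order, editable while the process keeps running.
rules = [
    {"when": lambda o: o["amount"] > 25_000, "route": "senior_review"},
    {"when": lambda o: o["region"] == "EMEA", "route": "emea_desk"},
]
DEFAULT_ROUTE = "standard_queue"

def decide(order: dict) -> str:
    """The decision point the process calls; the logic itself is external."""
    for rule in rules:
        if rule["when"](order):
            return rule["route"]
    return DEFAULT_ROUTE

print(decide({"amount": 30_000, "region": "APAC"}))  # senior_review
print(decide({"amount": 12_000, "region": "APAC"}))  # standard_queue

# Low-latency change: tighten the threshold with no recompile or redeploy
# of the process that consumes this decision.
rules.insert(0, {"when": lambda o: o["amount"] > 10_000, "route": "senior_review"})
print(decide({"amount": 12_000, "region": "APAC"}))  # senior_review
```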

There was quite a bit about services composition environments and CEP that I felt didn’t really belong in a presentation on dynamic BPM: yes, you need to have services and CEP in order to build agile processes in the first place, but it seems like filler.

There was one brief slide on “Web 2.0”: really just a quick laundry list of enterprise social software aspects that could impact BPM, including collaborative process design and execution, but no meat. Sinur merely read the list and pointed out that there are vendors at the showcase showing some of these capabilities. That was a bit of a disappointment, considering that the term “dynamic BPM” is being used by many (including Forrester and several vendors) to describe collaborative processes that are created or modified at runtime by the user.

He finished up with some sensible advice about separating rules and other application components from the processes in order to push towards more agile processes, although not different from the message that we’ve been hearing for quite a while now.

This wasn’t a new presentation: it was mostly recycled material that I had seen in previous Gartner presentations (either at conferences or on webinars) about agile BPM using rules, services and complex event processing. There’s been some new verbiage put around it and a few new slides, but only the briefest nod to the type of user-created ad hoc collaborative processes that represent the most dynamic form of BPM.