Can BPM Save Lives? Siemens Thinks So

My last session at Gartner BPM 2013 is a discussion between Ian Gotts of TIBCO and their customer Tommy Richardson, CTO of Siemens Medical Solutions. I spoke with Siemens last year at Gartner and TUCON and was very interested in their transition from the old iProcess BPM platform (which originally came from TIBCO’s Staffware acquisition) to the newly-engineered AMX platform, which includes BPM and several other stack components such as CEP. Siemens isn’t an end-user, however: they OEM the TIBCO products into their own Soarian software, which is then sold to medical organizations for what Richardson refers to as “ERP for hospitals”. If you go to a hospital that uses their software, a case (process instance) is created for you at check-in, and is maintained for the length of your stay, tracking all of the activity that happens while you’re there.

With about 150 customers around the world, Siemens offers both hosted and on-premise versions of their software. Standard processes are built into the platform, and the hospitals can use the process modeler to create or modify the models to match their own business processes. These processes can then guide the healthcare professionals as they administer treatment (without forcing them to follow a flow), and capture the actions that did occur so that analytics can determine how to refine the processes to better support patient diagnosis and treatment. This is especially important for complex treatment regimens, such as when an unusual infectious disease is diagnosed and requires both treatment and isolation actions that may not be completely familiar to the hospital staff. Data is fed to and from other hospital systems as part of the processes, so the processes are not executing in isolation from all of the other information about the patient and their care.

For Siemens, BPM is a silver bullet for software development: they can make changes quickly since little is hard-coded, allowing treatment processes to be modified as research and clinical results indicate new treatment methods. In fact, the people who maintain the flows (both at Siemens and their customers) are not developers: they have clinical backgrounds, so they are actually subject matter experts, although they are trained on the tools and work in a process analyst role rather than a medical practitioner role. If more technical integration is required, then developers do get involved, but not for process model changes.

The Siemens product does a significant amount of integration between the executing processes and other systems, such as waiting for and responding to test results, and monitoring when medications are administered or the patient is moved to another location in the hospital. This is where the move to AMX is helping them, since there’s a more direct link to data modeling, organizational models, analytics, event handling from other systems via the ESB, and other functionality in the TIBCO stack, replacing some amount of custom software that they had developed as part of the previous generations of the system. As I’ve mentioned previously, there is no true upgrade from iProcess to AMX/BPM since it’s a completely new platform, so Siemens actually did a vendor evaluation to see if this was an opportunity to switch the product that they OEMed, and decided to stay with TIBCO. When they roll out the AMX-based version in the months ahead, they will keep the existing iProcess-based system in place for each existing client for a year, with new patient cases being entered on the new system while allowing the existing cases to be worked in place on the old system. Since a case completes when a patient is discharged, there will be very few cases remaining on the iProcess system after a year, and those can then be transferred manually to the new system. This migration strategy is far beyond what most companies do when switching BPM platforms, but necessary for Siemens because of the potentially life-threatening (or life-saving) nature of their customers’ processes. This also highlights how the BPMS is used for managing the processes, but not as a final repository for the persistent patient case information: once a case/process instance completes on patient check-out, the necessary information has been pushed to other systems that maintain the permanent record.
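That parallel-run strategy is simple to express in code; here’s a minimal sketch of the routing logic, with invented names and dates since Siemens hasn’t published theirs: new cases go to the new platform, and the old platform drains as patients are discharged.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of the parallel-run migration: new patient cases
# start on the AMX-based platform, while existing cases drain on the old
# iProcess-based platform until the patient is discharged.

CUTOVER = date(2013, 6, 1)  # assumed go-live date, for illustration only

@dataclass
class Case:
    patient_id: str
    check_in: date
    discharged: bool = False

def platform_for_new_case(check_in: date) -> str:
    # All cases created on or after cutover are routed to the new platform.
    return "amx" if check_in >= CUTOVER else "iprocess"

def open_legacy_cases(legacy_cases: list[Case]) -> list[Case]:
    # A year after cutover, only long-stay patients remain open on the
    # old platform; these few cases are transferred manually.
    return [c for c in legacy_cases if not c.discharged]

# e.g., a patient checking in after go-live lands on the new platform:
assert platform_for_new_case(date(2013, 7, 15)) == "amx"
```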

Modernizing healthcare information systems in the way that Siemens is doing also opens up the potential for better sharing of medical information (subject to privacy regulations, of course): the existence of an ESB as a basic component means that trusted systems can exchange information, regardless of whether they’re in the same or different organizations. With their hosted software, there’s also the potential to use the Siemens platform as a way for organizations to collaborate; although this isn’t happening now (as far as I can tell), it may be only a matter of time before Siemens is hosting end-to-end healthcare processes with participants from hospitals, specialty clinics and even independent healthcare professionals in a single case to provide the best possible care for a patient.

The Neuroscience Of Change

We wrapped up day 2 of Gartner BPM 2013 with David Rock, author of Your Brain at Work: Strategies for Overcoming Distraction, Regaining Focus, and Working Smarter All Day Long, on neuroleadership and the neuroscience behind organizational change. Neuroleadership deals with how leaders make decisions and solve problems, regulate emotions, collaborate, and facilitate change; this last one was the key focus of his presentation today. In order to create change effectively using a brain-based model, we need to create a “toward” state, facilitate new connections, and embed new habits. Basically, our brains are really bad at doing things that we’ve never done before, because that requires using the relatively small prefrontal cortex. In other words, if we have to think about something, it’s hard. Furthermore, if you’re threatened or stressed, the capability of the prefrontal cortex decreases, meaning that you’re only going to be able to do simple tasks that you’ve done before.

He outlined three levels of thinking: level 1 tasks are simplistic things that you’ve seen/done a lot before, such as deleting emails; level 2 tasks are things that you’ve seen less often, such as scheduling meetings; and level 3 tasks are more complex concepts that you’ve never seen before, such as writing a business plan. When you’re really stressed, you’re pretty much only good for doing level 1 tasks, although peak performance does happen when you’re under a bit of stress.

Change requires a lot of cognitive processing, but when change is perceived as a threat, cognitive processing function decreases. Keeping change from being perceived as a threat requires creating a toward state, that is, something that is rewarding; since our brains are deeply social, to the point where social pain registers in the brain the same as physical pain, social rewards can be used to create that toward state. The five domains of social pain/pleasure are:

  • Status: your perception of your position relative to others
  • Certainty: uncertainty arouses the limbic system
  • Autonomy: the brain likes to predict and have a say in the future, and having some degree of choice can reduce stress levels
  • Relatedness: categorizations of similar/different to decide who’s on your team and shares your goals
  • Fairness: unfairness is the same as pain, to the brain

Having higher levels of these social rewards reduces stress, and we protect against the threat of them decreasing. Change, however, can create threats in all of these domains, and you need to find offsetting rewards in one or more of them in order to get people thinking about the future rather than just mentally cowering in a corner.

Once a toward state is created by addressing the social reward domains, you can facilitate new connections in people’s brains by creating an environment that permits them to have insights, which starts to form those new pathways that lead to habits.

Thought-provoking talk about the neurological motivations behind change, and a good way to end the day.

Tonight, I’m off to a TIBCO customer event — as a matter of disclosure, TIBCO provided me with one of their conference passes to be here, although I paid my own travel expenses — and I’ll only have time for one or two sessions in the morning before I head for the airport.

Empowering Business Roles For Dynamic BPM

It’s the end of day 2 at Gartner BPM 2013, and I’m in my first session with Janelle Hill — hard to believe, because usually I gravitate to her topics since she’s such an insightful speaker. She admitted that she is feeling like it’s day 4 (of a 2.5-day conference), so I’m glad to know that I’m not the only one experiencing a bit of conference fatigue. Of course, this is my third week in a row at conferences, so that could be a contributor, too.

She’s doing one of their short “To The Point” sessions, only 30 minutes including Q&A, with a quick review of dynamic BPM and what it means to change a process in-flight. There are a number of things that can be done to change a process, ranging from small changes such as reassigning work during runtime, deleting outdated activities, or changing a monitoring dashboard; to mid-range changes such as adding new performance metrics or changing a business rule; to large changes such as major resequencing or mapping to different underlying services. This was a bit of a confusing classification, since some of these were clearly runtime changes to a specific process instance or set of instances, while others were more design-time template changes that would impact all process instances created from that point on. Regardless, it comes down to what kind of things you might want to change in your process, and what could be considered as changes that business could make directly or could collaborate on with IT. And, as soon as process changes are made by the business — or made at all — there need to be changes to the mindset: developers no longer should think about building to last, but rather building to change. This seems like just good practice for most of us, but there are still a lot of enterprise developers who don’t think about building on a modular service-oriented architecture or using declarative business rules to enforce constraints.
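To make the “build to change” point concrete, here’s a small sketch contrasting the two mindsets — the rule table, field names and thresholds are all invented for illustration, not from the session:

```python
from dataclasses import dataclass

@dataclass
class Order:
    amount: float

# "Build to last": the constraint is hard-coded, so any change means a
# code release.
def route_hardcoded(order: Order) -> str:
    return "manager_approval" if order.amount > 10_000 else "auto_approve"

# "Build to change": the constraint lives in a declarative rule table
# that the business can edit without redeploying the process.
RULES = [
    # (field, operator, threshold, outcome) -- data, not code
    ("amount", ">", 50_000, "vp_approval"),
    ("amount", ">", 10_000, "manager_approval"),
]

OPS = {">": lambda a, b: a > b, "<=": lambda a, b: a <= b}

def route(order: Order, rules=RULES) -> str:
    for field, op, threshold, outcome in rules:
        if OPS[op](getattr(order, field), threshold):
            return outcome  # first matching rule wins
    return "auto_approve"

assert route(Order(amount=25_000)) == "manager_approval"
```

The point is that the second version turns the constraint into data that a business analyst can edit, while the first needs a developer and a release cycle for the same change.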

She finished up with some must-haves for enabling dynamic BPM, which were all technology-based; this was a bit disappointing since she only briefly addressed the topic of what cultural and role/responsibility changes need to be made in order to have business people actually make the changes that the technology now allows them to make. The technology in this area is becoming fairly mature, but I find that the mindset required for IT to relinquish control of process changes, and business to assume responsibility, is the part that’s lagging. She did point out that business people are becoming more comfortable with being involved with model-driven design, but it’s way more than just knowing how to use the tools.

BPM COE At Pershing

Barbara Fackelman and Regina DeGennaro of Pershing (a BNY Mellon subsidiary providing outsourced financial transaction services) presented at Gartner BPM 2013 about their BPM initiative, as it grew from reengineering a broken process related to federal reserve exchanges — saving them $1M/year — to a BPM center of excellence (COE). As I often recommend for growing a COE, they built theirs as an offshoot of their initial project by building a reusable BPM framework along the way, then communicated that out to the rest of the organization to uncover other potential spots for process improvement.

They started to identify sub-processes and functions that are reusable across different processes, such as document rendezvous, which impacted document scanning and handling processes as well as the downstream transaction processing. With that in their portfolio, they were able to implement additional BPM projects with significant savings, making the BPM COE a very popular service inside Pershing.

Their BPM COE reports up to the executive committee, and gets input from a number of other sources internally:

  • Architecture review board
  • Technology prioritization committee
  • Dedicated programming groups and QA
  • Process owners
  • Quality management office
  • BPM solutions team

They have a number of key roles in the BPM COE:

  • Executive sponsor
  • Process owner
  • Process architect
  • Product owner
  • Governors
  • Development leads
  • Quality assurance
  • Product manager
  • Business analyst
  • Process librarian
  • Metrics master (BI architect)

With all of this in place, they have a mature COE that supports process optimization and innovation, and reviews new technologies to support the enhanced vision. Interestingly, they treat their BPM COE like any other process project: having defined and implemented it, they are constantly monitoring what/how their COE is doing, and continuously optimizing it. As an outsourcing firm, their main focus is on maximizing straight-through processing (STP), and they can measure the performance of the COE since STP is a specific mission of the COE. As they have found, nothing succeeds like success: their STP process improvements to date have led to more collaboration and projects in other areas of their organization.

They’re using a lot of homegrown stuff, plus IBM BPM and Pega; like most big financial services organizations, they are piecing together a lot of this themselves to make it work best for them.

BPM And MDM For Business Performance Improvement

Andrew White presented a session at Gartner BPM 2013 on how process, applications and data work together, from his perspective as an analyst focused on master data management (MDM). He was quick to point out that process is more important than data 😉 but put forward MDM as a business-led discipline for maintaining a single version of the truth for business data. The focus is on how that data is created and maintained in business applications, assuring the integrity of the processes that span those applications. Since his background is in ERP systems, his view is that processes are instantiated by applications, which are in turn underpinned by data; however, the reality that I see with BPMS is that data resides there as well, so it’s fair to say that processes can consume data directly, too.

Master data is the common set of attributes that are reused by a wide variety of systems, not application-specific data — his example of master data was the attributes of a specific inventoried product such as size and weight — but there is also shared data: that grey area between the common master data and application-specific data. There are different tiers of systems identified in their pace layering, with different data access: systems of record (e.g., ERP) tend to consume enterprise master data and transaction data; systems of differentiation (e.g., CRM) consume master data, analytic data and rich data; and systems of innovation (e.g., Facebook app) consume analytic data, rich data and cloud-sourced data that might be someone else’s master data. End-to-end business processes may link all of these systems together, and be guided by different data sources along the way. It all makes my head hurt a little bit.

MDM programs have some of the same challenges as BPM programs: they need to focus on specific business outcomes, and focus on which processes need improving. And like the Fight Club reference that I heard earlier today (“the first rule of process is that you don’t talk about process”), you want MDM to become transparent and embedded, not be a silo of activity on its own. Also in common with some BPM initiatives is that MDM is often seen as an IT initiative, not a business initiative; however, just like defining business processes, it’s up to the business to identify their master data. MDM isn’t about data storage and retention; it’s about how data is used (and abused) throughout the business lifecycle. In my opinion, we still need better ways to model the data lifecycle at the same time as we model business processes; BPMN 2.0 added some provisions for data modeling, but it’s woefully inadequate for a whole data lifecycle model.
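As a rough illustration of what a data lifecycle model would need to capture beyond BPMN 2.0’s data objects, here’s a hypothetical sketch — the states and transitions are invented, not from any standard — expressing a lifecycle as a state machine that could be checked alongside the process model:

```python
# Hypothetical sketch: a data lifecycle as a state machine over an entity,
# the kind of information that BPMN 2.0 data objects don't capture well.
LIFECYCLE = {
    # state: set of states this entity may legally move to next
    "created":  {"verified"},
    "verified": {"in_use", "rejected"},
    "in_use":   {"updated", "archived"},
    "updated":  {"in_use", "archived"},
    "rejected": {"archived"},
    "archived": set(),  # terminal; retention policy applies from here
}

def transition(state: str, new_state: str) -> str:
    if new_state not in LIFECYCLE[state]:
        raise ValueError(f"illegal lifecycle transition: {state} -> {new_state}")
    return new_state

state = transition("created", "verified")   # fine
# transition("created", "archived")         # would raise: a step was skipped
```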

White presented a number of things that we need to think about when creating an enterprise data model, and best practices for aligning BPM and MDM. The two initiatives can be dovetailed, so that BPM provides priority and scope for the MDM efforts. Business processes (and not just those implemented in a BPMS) create and consume data, and once a process is defined, the points where data is created, viewed and updated can be identified and used as input to the master data model. From an EA standpoint, the conceptual, logical and physical models for data and process (or column 1 and column 2, if you’re a Zachman follower) need to be aligned.
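One concrete way to capture those create/view/update points is a CRUD matrix of processes against data entities; here’s a minimal sketch (invented processes and entities) of how the identification of master data candidates might be mechanized:

```python
# Hypothetical CRUD matrix: which processes create (C), read (R) or
# update (U) which data entities. Entities touched by several processes
# are candidates for the master data model.
CRUD = {
    ("order_to_cash", "customer"): "RU",
    ("order_to_cash", "product"):  "R",
    ("onboarding",    "customer"): "C",
    ("procurement",   "product"):  "CU",
}

def master_data_candidates(crud, min_processes=2):
    touched = {}
    for (process, entity), _ops in crud.items():
        touched.setdefault(entity, set()).add(process)
    # Entities shared across multiple processes are likely master data.
    return [e for e, procs in touched.items() if len(procs) >= min_processes]

print(master_data_candidates(CRUD))  # -> ['customer', 'product']
```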

Process Intelligence And Real-Time Visibility At Electrolux With SoftwareAG

Jean Lastowka of Electrolux and Dave Brooks of SoftwareAG presented at Gartner BPM 2013 on process intelligence and visibility; apparently, SoftwareAG chose to include a white paper that I wrote for them in the conference handouts (which I neglected to keep), so if you’re here, check out the package.

Electrolux makes home and professional appliances — best known in North America for vacuum cleaners, but they have a much broader repertoire — and were looking to do some internal alignment in order to serve customers better. To meet this goal, they established their BPM practice a week before last year’s Gartner BPM conference, created a BPM framework, did collaborative process modeling and launched a new ERP system for their new end-customer distribution channel over the next five months, then brought in SoftwareAG’s iKnow product for process visibility in October, and launched it to their business community in November.

Their BPM efforts were initially around end-to-end process mapping of the new processes in ARIS, allowing business and IT to have a shared knowledge of the processes; they are not using a BPMS to automate processes, but the processes are encapsulated in the ERP system implementation and procedural knowledge. Unfortunately, with these new processes and a new ERP system, people were still trying to manage the processes in the old ways (including Excel), causing a lot of customer dissatisfaction. iKnow allowed them to take their process models, connect up event feeds (I assume) from the ERP system (and presumably other systems), and show real-time order tracking and KPIs overlaid on the process model. This allowed for predictive analytics, providing advance warning of potential lead time failures based on inventory levels, for example, and let them track order trends and provide a single view of on-hand and in-transit inventory. Best of all, the visualizations — inventory displayed on a geographic map, for example, as well as real-time alerts based on KPIs — allowed the business to consume the data more easily than in the textual format that they had previously received.
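The predictive piece can be as simple as a threshold rule evaluated over the event feed; here’s a hedged sketch of what such a rule might look like — the field names, SKU and safety-stock threshold are all mine, and this is not the iKnow API:

```python
# Hypothetical sketch: a predictive KPI rule over an ERP event feed,
# flagging lead-time risk before orders actually slip.
SAFETY_STOCK = 100  # assumed per-SKU reorder threshold

def lead_time_risk(event: dict, open_orders: list[dict]) -> str | None:
    """event: {'sku': ..., 'on_hand': ..., 'in_transit': ...}"""
    available = event["on_hand"] + event["in_transit"]
    demand = sum(o["qty"] for o in open_orders if o["sku"] == event["sku"])
    if available - demand < SAFETY_STOCK:
        # would surface as a real-time alert overlaid on the process model
        return (f"lead-time risk for {event['sku']}: "
                f"{available} available vs {demand} demanded")
    return None

print(lead_time_risk(
    {"sku": "EL-1234", "on_hand": 80, "in_transit": 40},
    [{"sku": "EL-1234", "qty": 60}],
))  # -> lead-time risk for EL-1234: 120 available vs 60 demanded
```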

This was a good example of what BPM looks like without a BPMS automation project: collaborative process modeling, processes implemented in some other system (an ERP system, in this case), then metrics and KPIs gathered and displayed relative to the process model in a dashboard, with items requiring action flagged and pushed to the appropriate people. Bracketing the ERP system with process modeling and monitoring transforms it into a process-centric BPM initiative that drives process improvement and provides actionable information.

There are a couple of vendors in this part of the BPM technology business, providing tooling to let you see the processes that are running in your other (non-BPMS) systems in real time. For many organizations, this is the only option since they have core ERP, CRM and legacy systems that run their business, but that provide neither good visualizations nor explicit process models. Process visibility is the first step to process excellence.

BPM Skills And Roles

It’s day 2 at Gartner BPM 2013, and after a fun night out at a Pegasystems customer dinner, then breakfast hearing about Oracle’s new BPM release, I’m in Bruce Robertson’s session on the skills required for BPM, and the roles that need to be developed. Again, this is a “crossing the chasm” issue, where you just can’t get to level 3 in BPM maturity unless you deal with any BPM skills shortage that you have, and build out your BPM center of excellence (or competency center, as Gartner calls it). Based on Gartner’s research, BPM is a part-time job for most people related to a BPM project, with 74% spending less than 50% of their time on it across both business and IT. This is not surprising, since this includes subject matter experts, technical experts and others whose main job is not BPM, but who provide some specific non-BPM expertise on the project.

So people are doing their regular day job, then also need to have some combination of operational skills to identify and execute process change, technical skills to build and evolve software solutions, and (most importantly) transformational skills to motivate people to change. A lot of organizations focus on building the operational and technical skills since that’s a bit more straightforward, but are lacking in the more evangelical transformational skills such as business vision, communication and change management. Some of these skills will be grown internally by training your existing staff, some may be available in other parts of your organization (such as HR), some will be acquired with new hires, and some will be rented from consultants like me.

Robertson showed a good chart of basic, intermediate and advanced skills for each of the operational, technical and transformational categories; he advises getting some of the advanced operational skills in as soon as possible to provide overall guidance. He listed the key BPM roles — BP director, BP architects, BP analysts, process owners, BP consultants (internal or external), subject matter experts — and listed what they do and why they’re important to your BPM efforts. There are a number of other roles, but these are the critical ones; he did, however, highlight the growing importance of data experts for both developing metrics and ensuring that analytics are properly in place. I’ve been talking about the necessary integration of process and data for some time, and fully agree with this; there’s a talk later today on BPM and MDM that I’ll be at to see more of what Gartner is seeing happening here.

He went back to the survey data that he showed at the start of the presentation indicating what BPM skills were most lacking in organizations, and overlaid the roles that would meet those skills on the chart: a good indicator of what roles you need to develop in order to address your skills gap. Skills might be in different roles, or combined, depending on the size of your BPM efforts and the skills of the individuals involved. He showed a sample RACI chart cross-referencing roles with specific BPM activities; again, a good tool for ensuring that you have all the activities covered by someone, and that they’re assigned to the right people.
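A RACI chart like the one he showed is easy to mechanize; here’s a minimal sketch (invented roles and activities, not Robertson’s chart) of the coverage check he described — every activity should have exactly one person Accountable and at least one Responsible:

```python
# Hypothetical RACI matrix: activity -> {role: 'R'|'A'|'C'|'I'}
RACI = {
    "define_process_metrics": {"bp_analyst": "R", "bp_director": "A",
                               "process_owner": "C"},
    "approve_process_change": {"process_owner": "A", "bp_architect": "R",
                               "sme": "C", "development_lead": "I"},
}

def check_coverage(raci):
    problems = []
    for activity, assignments in raci.items():
        roles = list(assignments.values())
        if roles.count("A") != 1:
            problems.append(f"{activity}: needs exactly one Accountable")
        if "R" not in roles:
            problems.append(f"{activity}: no one is Responsible")
    return problems

assert check_coverage(RACI) == []  # both activities are properly covered
```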

He then pulled the skills/roles ideas into the need for a BPM COE (BPCC) as you gain process maturity as an organization; this has been covered by Gartner and many others (including me, at a presentation at DST’s conference last week) so I’ll just sum it up with Robertson’s top-level benefits:

  • Internal consultancy and expertise focal point
  • Improve project results
  • Better and more repeatable skills
  • Focus across business boundaries
  • Improved technology investment leverage

Only 34% of organizations surveyed have a BPCC, so it’s not surprising that 80% of organizations have not achieved level 3 maturity in spite of stated objectives to become process-driven. He presented some best practices for getting started with a BPCC — targeted around sponsorship, staffing, communicating, methodology and services — and a map for growing the BPCC over time from supporting/guiding projects to defining programs to providing input to strategy.

This presentation was a good refresher on some of the Gartner BPM skills/roles/COE discussions from past years, which had seemed a bit stagnant lately.

Banco Supervielle’s Excellence Award: Business Outcomes Driven By BPM

In the last session of day 1 at Gartner BPM 2013, I sat in on a case study by the Argentina-based Banco Supervielle, who won a Gartner excellence award for best business outcomes driven by BPM. They were experiencing poor quality of service, and having to deal with changing regulations and a complex systems integration environment. They had no process models for what they did, and were too focused on products rather than customers.

They laid out goals for their process-centric transformation: improve customer service through quality-oriented processes, including organizational changes and linking of process models to corporate strategy. They wanted to bring in innovative technologies to support this process-centric view, and strive for continuous improvement. They looked at different ways to do this: bottom-up, through more of an iterative process improvement methodology; and top-down, through complete process redesign with the processes linked to corporate strategy. Daunting as it might seem, they went with the top-down approach, similar to that described by Elise Olding earlier today from a program versus project standpoint, but also with some of the same goal-linking between processes and strategy as Betsy Burton discussed this morning. They used a combination of internal and external methodologies to help guide the program, and implemented using IBM BlueworksLive and IBM BPM.

They experienced a number of challenges:

  • Change management, from the board of directors through the project team to the end users, to have them take on a process-centric view
  • Anxiety control (an excellent descriptive term!) through the same levels, with the particular admonishment to not let your board of directors have lunch with the vendor sales reps since all the BoD will remember is that the vendor said “new process = 2 months” 🙂
  • Sponsorship, and the necessity to get as senior a sponsor as possible in order to provide the best level of air cover
  • Denial and negative influencers, often due to people not wanting to change

They had a number of results: some that they were expecting, some that they were not expecting, and some that just didn’t happen the way that they wanted. In the “expected” category were the usual items of flexibility, cost/time savings and automation, but I always find that the interesting benefits are those that were not expected. In their case, that included moving their business architecture team from a documentation role to a high-value provider; faster adoption among remote users than central office users; and a single process to sell all products to all clients through all channels (a dream of all of my financial services clients). What didn’t happen on schedule was mostly the change management and adoption. It’s necessary to constantly communicate throughout the project in order to sell the ideas and the organizational change that’s required, and do everything possible to change the minds of the negative influencers or get them off the project.

Making Process Governance Work

Samantha Searle presented some of Gartner’s research on how to set up effective process governance and ownership. She started with the definition of a process owner, and reinforced that it’s necessary to have someone accountable for delivering the business outcomes of an end-to-end process. A process owner is typically at the executive level, but doesn’t necessarily have all process participants reporting up to them; they’re not the process police, they’re more like an orchestra conductor, guiding skilled professionals to work together.

She identified a number of best practices for process ownership and governance:

  1. Identify clear responsibilities for BP owners, setting expectations, establishing objectives and agreeing on key responsibilities.
  2. Establish BP governance for a BPM decision framework, creating a RACI matrix (for example) mapping actions against roles.
  3. Set goals and gather data to improve process decision making, using a BP analyst in a support role.
  4. Get commitment to process ownership through incentives, since these people are rarely fully dedicated to that role.
  5. Assign collective responsibility for business outcomes, empowering the community and having each person understand their contribution.


SAP BPM At Bank Of America

At Gartner BPM, there are always a few sessions given over to the sponsors and their clients to present case studies; since it’s been a while since I looked at what SAP is doing with BPM, I decided to sit in on John Cuomo of Bank of America talking about what B of A is doing with SAP BPM.

SAP NetWeaver Process Orchestration includes BPM and business rules, but also UI composition, monitoring and analytics, SOA governance, EAI, and B2B collaboration. When I initially reviewed SAP BPM a few years back on its release, I said that it wasn’t the most innovative BPMS on the market (although it was quite good), but that their goal was to be the best BPMS for existing SAP customers through direct integration with their ERP solutions. B of A uses SAP ERP for their invoice processing with BPM on top, with the classic A/P drivers of integrating multiple systems, having flexible processes and providing better control over processes and rules. They receive over a million invoices per month, but only 2% require human intervention/approval and are escalated into BPM. They make heavy use of business rules to dynamically assign approvers to any specific invoice depending on the content, rather than having an identical process flow for every invoice or requiring manual assignment.
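Content-based assignment like that is typically expressed as an ordered set of rules over the invoice attributes; here’s a hedged sketch of the idea — the attributes, thresholds and roles are invented for illustration, not B of A’s actual rules:

```python
# Hypothetical content-based approver assignment: the 2% of invoices
# escalated into BPM are routed by their attributes, not a fixed flow.
ASSIGNMENT_RULES = [
    # predicate over the invoice -> approver role; first match wins
    (lambda inv: inv["amount"] > 250_000,            "divisional_cfo"),
    (lambda inv: inv["vendor_risk"] == "high",       "vendor_risk_officer"),
    (lambda inv: inv["category"] == "capital_asset", "fixed_assets_manager"),
]

def assign_approver(invoice: dict) -> str:
    for predicate, role in ASSIGNMENT_RULES:
        if predicate(invoice):
            return role
    return "ap_supervisor"  # default for all other escalated invoices

invoice = {"amount": 12_000, "vendor_risk": "high", "category": "services"}
print(assign_approver(invoice))  # -> vendor_risk_officer
```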

They started their SAP BPM initiatives in 2010, working with their process flows that had been defined in ARIS and moving that into the BPM automation environment. It’s pretty common for organizations to have some process flows mapped out already, but no automation.

They’re now expanding their BPM use outside of invoice processing, although all of it still surrounds their SAP ERP usage, including general ledger, fixed assets, provisioning point, and audit processes. In short, they’re using SAP BPM for the exception handling for SAP ERP, providing a much more flexible and rules-driven approach to exception handling processes. This is a perfect use case for SAP BPM in conjunction with their ERP; now that SAP BPM allows much better access to and orchestration of granular ERP functions, they should be able to expand the usage further to deeply integrate the two systems.

Peter McNulty of SAP provided some additional information on SAP’s newer capabilities, specifically their Operational Process Intelligence, which monitors process events from a variety of platforms, both theirs and third-party, uses HANA to do the big data analytics, then displays the results in a consolidated dashboard.