Planning the Fall BPM Conference Lineup

It’s been a quiet summer for travel – I haven’t been on a plane since June – but my fall travel schedule awaits:

  • September 13-16 is BPM 2010, the 8th International Conference on Business Process Management, which is the academic BPM conference that I’ve attended in Germany and Italy the past two years. This year, I get to go all the way to New Jersey, instead, for its first North American appearance. I’m not presenting, but will be blogging from there.
  • September 27-29 is IRM’s BPM conference in London, where I’m delivering a half-day workshop on the BPM technology landscape, presenting a session on collaboration and BPM, and facilitating a roundtable discussion on transforming business process models into IT requirements.
  • October 17-21 is Building Business Capability 2010, which is Business Rules Forum plus the new Business Process Forum and Business Analysis Forum, in Washington DC. I’ll be doing the same three presentations as at the IRM BPM conference the previous month.
  • November 19 is a new one-day event, BPM World Convention, in London. I’ve agreed to keynote if it goes ahead as planned.

As of now, I don’t have any other definite conference plans for the fall, but there are lots of possibilities and tentative invitations:

  • Intalio|World, October 5-6 in San Francisco
  • Forrester’s Business Process and Application Delivery Forum, October 7-8 in Washington DC
  • SAP TechEd Berlin, October 12-14 and TechEd US, October 18-22 in Las Vegas
  • IBM Information On Demand, October 24-28 in Las Vegas
  • IBM CASCON, November 1-4 in Toronto – it’s not BPM-specific, but it’s a good conference and is local for me

I keep a calendar of all the BPM-related events that I hear about, mostly for my own reference, but I have made it public:

If you have something that you’d like to add to the calendar, add a comment here or email me directly. I typically do not include webinars or other online-only events, since that would tend to crowd out the physical events.

Note that you can view this in Week, Month or Agenda view using the controls at the top right. If you’re a Google calendar user, you can add it to your list of calendars using the button at the bottom right, which will allow you to see it overlaid on your own calendar.

Speaking at Business Process Forum in October

A bit early for this, but maybe you’re already starting to organize your fall calendar. The business process track that was at the Business Rules Forum last year has grown, and become its own Business Process Forum. I believe that we need a good independent BPM conference – the ones run by the large analysts are too focused on their own viewpoints to be considered really independent – and this could be the start of something significant.

I’ll be speaking at the conference, which runs October 17-21 in Washington, DC. Look for me presenting a half-day tutorial on the BPM technology landscape, as well as facilitating a peer discussion session on transforming business process models into IT requirements.

Update: Forgot one: I’m also doing a presentation on Social BPM. The full conference agenda is here for all three tracks: business rules, business analysis and business process.

The super early bird registration ends tomorrow, and saves you $300.

21st Century Government with BPM and BRM #brf

Bill Craig, a consultant with Service Alberta, discussed their journey with process and rules to create agile, business-controlled automation for land titles (and, in the future, other service areas such as motor vehicle licensing) in the province of Alberta. They take an enterprise architecture approach, and like to show alignment and traceability through the different levels of business and technology architecture. They run a number of mainframe-based legacy applications, and this project was driven initially by legacy renewal – mostly rewriting the legacy code on new platforms, but still with a lot of code. It quickly turned to the use of model-driven development for both processes and rules, in order to greatly reduce the amount of code (which just creates new legacy code) and to put more control in the hands of the business.

They see 21st century government as having the following characteristics:

  • customer service focus
  • business centric
  • aligned
  • agile
  • assurance
  • managed and controlled
  • architected (enterprise and solution)
  • focused on knowledge capture and retention
  • collaborative and integrative
  • managed business rules and business processes

BPM and BRM have been the two biggest technology contributors to their transformation, with BRM the leader because of the number of rules that they have dealing with land titles; they’ve also introduced SOA, BI, BAM, EA, KM and open standards.

In spite of their desire to be agile, it seems like they’re using quite a waterfall-style design; this is the government, however, so that’s probably inevitable. They ended up with Corticon for rules and Global 360 for process, fully integrated so that the rules were called from tasks in their processes (which for some reason required the purchase of an existing “Corticon Integration Task” component from Global 360 – not sure why this isn’t done with web services). He got way down in the weeds with technical details – although relevant to the project, not so much to this audience – then crammed a description of the actual business usage into two minutes.

One interesting point: he said that they tried doing automated rules extraction from their mainframe applications to load into Corticon, but the automated extraction found mostly navigation rules rather than business rules, so they gave up on it. It would be interesting to know what sort of systems that automated rule extraction works well on, since this would be a huge help with similar legacy modernization initiatives.

Smarter Systems for Uncertain Times #brf

I facilitated a breakfast session this morning discussing BPM in the cloud, which was a lot of fun, and now I’m in the keynote listening to James Taylor on the role of decision management in agile, smarter systems. Much of this is based on his book, Smart (Enough) Systems, which I reviewed shortly after its release.

Our systems need to be smarter because we live in a time of constant, rapid change – regulations change; competition changes due to globalization; business models and methods change – and businesses need to respond to this change or risk losing their competitive edge. It’s not enough just to be a smarter organization, however: you have to have smarter systems because of the volume and complexity of the events that drive businesses today, the need to respond in real time, and the complex network of delivery systems by which products and services are delivered to customers.

Smarter systems have four characteristics:

  • They’re action-oriented, making decisions and taking action on your behalf instead of just presenting information and waiting for you to decide what to do.
  • They’re flexible, allowing corrections and changes to be made by business people in a short period of time.
  • They’re forward-looking, being able to use historic events and data to predict likely events in the future, and respond to them proactively.
  • They learn, based on testing combinations of business decisions and actions in order to detect patterns and determine the optimal parameters (for example, testing pricing models to maximize revenue).
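The last characteristic – learning by testing decision variants and keeping the best – can be sketched in a few lines. This is a minimal illustration only; the demand curve and candidate prices are invented, not from the talk:

```python
# Minimal sketch of "learn by testing": try candidate pricing decisions,
# measure the outcome of each, and adopt the one that maximizes revenue.
# The demand function is a made-up stand-in for observed customer response.

def simulated_demand(price: float) -> float:
    """Toy linear demand curve: higher price, fewer units sold."""
    return max(0.0, 1000.0 - 8.0 * price)

def best_price(candidates):
    """Test each candidate decision and return (price, revenue) for the best."""
    results = {p: p * simulated_demand(p) for p in candidates}
    winner = max(results, key=results.get)
    return winner, results[winner]

price, revenue = best_price([40.0, 60.0, 80.0, 100.0])
```

In a real smarter system, the demand response would come from live measurement of the tested decisions rather than a fixed formula.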

Decision management is an approach – not a technology stack – that allows you to add decisioning to your current systems in order to make them smarter. You also need to consider the management discipline around this, that will allow systems to not just become smarter, but begin to make decisions and take actions without human intervention.

James had a number of great examples of smarter systems in practice, and wrapped up with the key to smarter systems: have a management focus on decisions, find the decisions that make a difference to your business, externalize those decisions from other systems, and put the processes in place to automate those decisions and their actions.

Business Rules and Business Events: Where CEP Helps Decisions #brf

To finish off the second day of Business Rules Forum, Paul Vincent of TIBCO spoke about events and event-driven architecture as a useful way of dealing with business rules. TIBCO is best known for their messaging bus (although some of us know it more for the BPM side), and events are obviously one of the things that can be carried by the bus, or generated from other messages on the bus. The three major analysts who presented here this week – Jim Sinur of Gartner, Mike Gualtieri of Forrester, and Steve Hendrick of IDC – all stressed the importance of events and CEP; in fact, Gualtieri stated that CEP is the future of business rules in his breakfast roundtable this morning.

Going back to the basics of business rules, rules can be restrictions, guidelines, computations, inferences, timings and triggers; the last two are where events start to come into play. Rules are defined through terms and facts; some facts may be events, and rules are enforced as events occur. Business rules drive process definitions and the decisions made within business processes, and mapping between rules, processes and decisions is most easily done from an event perspective. Events are key to business rule evaluation and enforcement, acting as triggers for both processes and the rules that determine the decisions within those processes: an event triggers a process, which in turn calls a decision service; or an event triggers a change to a rule, which in turn changes the decisions returned to a process. In fact, there’s a fine line between business processes and event processing if you consider how an event might impact an in-flight event-triggered process, and Paul declared that BPM is really just a constrained case of CEP.
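The chain that Paul describes – an event triggers a process, which in turn calls a decision service – can be sketched as a simple dispatcher. Everything here is hypothetical for illustration: the event type, the process, and the decision rule are not from any vendor’s product:

```python
# Sketch of event -> process -> decision service, per the talk's description.
# Event types, processes and the decision rule are illustrative only.

def credit_decision_service(payload: dict) -> str:
    """Decision service: returns a decision based on the current rules."""
    return "approve" if payload.get("score", 0) >= 700 else "refer"

def application_process(payload: dict) -> str:
    """Event-triggered process that delegates its decision to the service."""
    decision = credit_decision_service(payload)
    return f"application {payload['id']}: {decision}"

# Event bus: maps event types to the processes they trigger.
HANDLERS = {"application_received": application_process}

def on_event(event_type: str, payload: dict) -> str:
    return HANDLERS[event_type](payload)

result = on_event("application_received", {"id": "A-1", "score": 720})
```

Changing the rule inside `credit_decision_service` changes the decisions returned to every process that calls it, which is exactly the second trigger path described above.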

Having taken over the world of BPM, he moved on to BRM, and showed how CEP systems are better for managing automated rules (when all you have is a CEP system, everything looks like an event, I suppose 🙂 ) since all decisions are event-driven, and CEP systems monitor simple events and decisions to identify patterns in real time by combining rules, events and real-time data in the same system to allow organizations to react intelligently to business events. He walked through an example architecture for real-time event processing (which happens to be TIBCO’s CEP architecture): a distributed CEP framework including an event bus and data grid, plus rule maintenance and execution, and real-time analytics. This allows historic patterns to be detected in real time (which sounds like a contradiction), while providing the decision management interfaces, rule agents and real-time dashboards. Rather than having a listener application feeding a rules engine, events are fed directly to the event processing engine in an event-driven architecture. He walked through other aspects, such as rule definition and decision services, showing how EDA provides a simpler and more powerful environment than standard BRMS and SOA.

Business rules are used in sense and respond, track and trace, and situation awareness CEP applications; business users (or at least business analysts) need to be able to understand and model events independent of any particular infrastructure. I completely agree with this, since I find that business analysts focused on process are woefully unaware of how to identify and model events and how those events impact their business processes.

Comprehensive Decision Management: Leveraging Business Process, Case Management and CEP #brf

Steve Zisk of Pegasystems discussed decision management with a perspective on the combination of BPMS and BRMS (as you might expect from someone from Pega, whose product is a BPMS built on a BRMS): considering where change occurs and has to be managed, how rules and process interact, and what types of rules belong in which system.

In many cases, rules are integrated with process through loose integration: a process makes a call to a rules engine and passes over the information required to make a decision, and the rules engine sends back the decision or any resulting error conditions. This loose coupling makes for good separation of rules and process, and treats the rules engine as a black box from the point of view of the process, but doesn’t allow you to see how rules may interact. It also makes it difficult when the parameters that drive rules change: there may be a new parameter that has to be collected at the UI level, held within process instance parameters and passed through to the rules engine, not just a change to the rules. Pega believes that you have to have rules infused into the process in order to make process and rules work together, and to be completely agile.
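A minimal sketch of the loose-coupling pattern described above: the process marshals only the data the rules need, and the engine – a black box from the process’s point of view – returns either a decision or an error. All names here are illustrative, not any vendor’s actual API:

```python
# Loose integration between process and rules: the process passes over the
# required parameters; the engine returns either a decision or an error.
# This is a generic sketch, not Pega's (or anyone's) actual interface.

def rules_engine(params: dict) -> dict:
    """Black-box rules engine: validates inputs, then decides."""
    required = {"applicant_age", "policy_type"}
    missing = required - params.keys()
    if missing:
        # The pain point in the talk: a new required parameter means the
        # process (and UI) must change too, not just the rules.
        return {"error": f"missing parameters: {sorted(missing)}"}
    eligible = params["applicant_age"] >= 18 and params["policy_type"] == "standard"
    return {"decision": "eligible" if eligible else "ineligible"}

def process_step(instance_data: dict) -> dict:
    """Process task: pass over only what the rules need, take back the result."""
    params = {k: instance_data[k] for k in ("applicant_age", "policy_type")
              if k in instance_data}
    return rules_engine(params)
```

The error branch shows why loose coupling can be brittle under change: adding a required parameter to the rules breaks the contract until the process is updated to collect and pass it.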

Looking at an event-driven architecture, you can consider three classes of events: business, external and systems. We’re concerned primarily with business events here, since those are the ones that have to be displayed to a user, used to derive recommendations to display to a user, or used by users in order to make business decisions. Systems that need to involve human decisions across many complex events need to have a tighter integration between events, processes and rules.

Case management is about more than just collections of information: a case is the coordination of multiple units of work towards a concrete objective to meet a business need of an external customer, an internal customer, a partner or another agency. Cases respond to and generate events, both human events (such as phone calls) and automated events (such as followup reminders or fraud detection alerts).

Steve covered a number of case studies and use cases discussing the interaction between rules, processes and events, highlighting the need for close integration between these, as well as the need for rules versioning.

Business Rules Governance and Management #brf

Eric Charpentier, an independent rules consultant who has also been blogging from BRF, gave a presentation on business rules governance and management. He makes a distinction between governance and management, although that is primarily in the level: governance deals with the higher-level issues of establishing leadership, organizational structures, communication and processes, whereas management is tied to the day-to-day operational issues of creating and maintaining rules. He proposes a number of ingredients for rules management and governance:

  • Leadership and stakeholders, including identifying stakeholders, classifying them by attitude, power and interest, and identifying roles, responsibilities and skills
  • Communication plans for each stakeholder type
  • Identifying types of rules, particularly around complexity and dependencies
  • Understanding the lifecycle of rules within your organization, which shows the process of creating, reviewing, testing, deploying and retiring rules, and the roles associated with each step in that process
  • Rule management processes, with details on rule discovery, authoring and other management tasks, right down to rule retirement
  • Execution monitoring and failure management
  • Change management, including security and access control over different types of changes
  • Building a rules center of excellence; interestingly, Jim Sinur recommended a joint BRM-BPM CoE in his presentation this morning, although I’m not sure that I completely agree with that since the efforts are often quite disjoint within organizations (or maybe Jim’s point is to force them closer together)
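The rule lifecycle in the list above – creating, reviewing, testing, deploying and retiring rules – is essentially a state machine. A minimal sketch, with the allowed transitions invented for illustration (each organization would define its own):

```python
# Sketch of a rule lifecycle as a state machine, following the
# create -> review -> test -> deploy -> retire stages from the talk.
# The specific allowed transitions are an assumption for illustration.

ALLOWED = {
    "created":   {"in_review"},
    "in_review": {"in_test", "created"},   # review can bounce a rule back
    "in_test":   {"deployed", "in_review"},
    "deployed":  {"retired"},
    "retired":   set(),
}

class Rule:
    def __init__(self, name: str):
        self.name = name
        self.state = "created"

    def transition(self, new_state: str) -> str:
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"{self.name}: cannot go {self.state} -> {new_state}")
        self.state = new_state
        return self.state
```

Encoding the lifecycle this way gives you the change-management control point for free: any transition, and the role allowed to make it, can be checked in one place.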

Eric obviously has a huge amount of knowledge on organizing rules projects, but he also proved his practical experience at this by discussing two case studies where he is involved as a facilitator, rule author, rule administrator and developer: one in a Canadian government project, and the other with Accovia, a travel technology provider.

The government project is a multi-year renewal project where legacy AS/400 systems are being converted to service-oriented architecture, and rules were identified as a key technology. In order to gain an early win, they extracted rules from the legacy system, put them in a BRMS, exposed them as web services, and called them from the legacy system. In the future, they’ll be able to build new systems that call those same rules, now that they’re properly externalized from the applications. They’re using a fairly simple rule lifecycle (develop, test, deploy) that combines authoring and change management because they have a small team, but have specific timelines for some rules that change on an annual basis. They have processes (or procedures) for deployment, execution monitoring, failure management, testing and validation, and simulation, although they have no rule retirement process because of the current nature of their rules. The simulation, a new process, takes the data from the previous year and runs it through potential new rules in order to understand the impact on their costs; this then allows assessment of the new rules, and selection of the appropriate policy set that in turn selects the rules for production.
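The simulation process described above – replaying last year’s data through candidate rule sets and comparing the cost impact – can be sketched like this. The case data and fee rules are entirely invented for illustration:

```python
# Sketch of rule-change simulation: run historical cases through the
# current and proposed rule sets and compare the total cost impact.
# Case values and fee formulas are illustrative, not Alberta's actual rules.

historical_cases = [
    {"value": 250_000}, {"value": 400_000}, {"value": 150_000},
]

def current_fee(case: dict) -> float:
    return 50.0 + case["value"] * 0.0002

def proposed_fee(case: dict) -> float:
    return 75.0 + case["value"] * 0.0001

def simulate(rule, cases) -> float:
    """Total fees collected under a given rule for last year's cases."""
    return sum(rule(c) for c in cases)

delta = simulate(proposed_fee, historical_cases) - simulate(current_fee, historical_cases)
```

Because the rules are externalized as functions of the case data, swapping in a candidate rule set for a what-if run requires no change to the surrounding process.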

The Accovia project is focused on their off-the-shelf software product, where they are embedding rules as a key component of the software. They have some rules that are internal to the software, and others where they allow the client to customize the rules; this means that the typical rules project roles are split between Accovia and their clients. The clients won’t be able to change the basic rules models, so the challenges are around creating the environment that allows the clients to make changes that are meaningful to them, but are also resilient to upgrades in the core product. They haven’t solved all of these problems yet, but have identified six possible rule lifecycles that need to be managed.

Some key lessons learned about rules governance and management as a wrapup: this takes time, and needs good stakeholder analysis. You may also need to do some research and consult your technical team in order to understand all of the issues that might be involved. Very comprehensive presentation.

BRMS at a Crossroads #brf

Steve Hendrick of IDC gave the morning keynote with predictions for the next five years of business rules management systems. He sees BRMS as being at a crossroads, currently being used as passive decisioning systems with limited scope, but with changes coming in visibility, open source, decision management platforms and cloud services.

He took a look at the state of the BRMS market: in 2008, license and maintenance revenue (but not services) was $285M, representing 10.5% growth; significant in its own right, but just a rounding error in the general software market that is about 1000 times that size. He plotted out the growth rate versus total revenue for the top 10 BRMS vendors; no one is in the top right, IBM (ILOG), CA and FICO are in the lower right with high revenues but low growth, Pegasystems, SAP (Yasu rebranded as NetWeaver BRM), Object Connections, Oracle, ESI and Corticon are in the top left with lower revenues but higher growth, and IDS Scheer (?) is in the bottom left.

Forecasting growth of the BRMS market is based on a number of factors: drivers include market momentum, business awareness driven mostly by business process focus, changes in worldwide IT spending, and GRC initiatives, whereas the shift to open source and cloud services has a neutral impact on the growth. He put forward their forecast scenarios for now through 2013: the expected growth will rise from 5% in 2009 to just over 10% by 2013. The downside growth scenario is closer to 6% in 2013, while the upside is 15%.

Many of the large ISVs such as IBM and SAP have just started in the BRMS space through recent acquisitions, but IBM tops his list of leading BRMS vendors because of the ILOG acquisition; FICO, Oracle and Red Hat are also on that list. He also lists the BPMS vendors with a strong legacy in BRMS, including CA (?), Pegasystems and TIBCO. Open source vendors such as Red Hat are starting to gain a foothold – 5-10% of installed base – and are a significant threat to some of the large players (with equally large price tags) in the pure play BRMS space. Decision services and decision tables, which are the focus of much of the BRMS market today, can be easily commoditized, making it easier for many organizations to consider open source alternatives, although there are differences in the authoring and validation/verification functionality.

He spoke about moving towards a decision management platform, which includes looking at the relationships between rules and analytics: data can be used to inform rules definitions, as well as for decision refinement. CEP is an integral part of a decision management platform, with some overlapping functionality with a BRMS, but some complementary functions as well. He puts all of this together – data preparation, BRMS, decision refinement, CEP and state – into a core state machine, with links to an event server for receiving inbound events and a process server for initiating actions based on decisions, blending together an event, decision and process architecture. The benefits of this type of architecture:

  • Sense and respond provides granular support for all activities
  • Feedback allows immediate corrections to reduce risk
  • Decision models become complex but processes become simple
  • Model-driven with implicit impact analysis and change management
  • GRC and consistency are derivative benefits
  • Scalable and cloud-ready

This last point led to a discussion about decision management platforms as cloud services to handle scalability issues as well as reduce costs; aside from the usual fears about data security, this seems like a good fit.

His recommendations to vendors over the next five years:

  • Add analytics to complement the BRMS functionality
  • Bring BRMS, CEP, BPM, analytics and EDA together into a decision management platform
  • Watch out for the penetration of open source solutions
  • Offer cloud DMP services to cater to unpredictable resource requirements and scalability needs

Recommendations for user organizations:

  • Understand the value that analytics brings to BRMS, and ensure that you have the right people to manage this
  • Commit to leveraging real-time events and activities as part of your business processes and decisions
  • Watch the BRMS, CEP, BPM and analytics markets over the next 12 months to understand the changing landscape of products and capabilities

A good look forward, and some sound recommendations.

Business Rules Management: The Misunderstood Partner to Process #brf

The title of Jim Sinur’s breakfast session this morning is based on the “lack of respect” that rules have as a key component in business processes: as he pointed out, it’s very difficult to explain to a business executive what business rules do and their value (something to which I can personally attest). I’ve been talking about the value of externalizing rules from processes for a number of years, and Jim and I are definitely on the same page here. He has some numbers to back this up: a rules management system can expect to show an IRR of 15%, and in some industries that are very rules-intensive, it can be much higher.

Rules are everywhere: embodied in multiple systems, as well as in manual procedures and within employees’ heads; it goes without saying that there can be inconsistent versions of what should be the same rule in different places, leading to inconsistent business processes and outcomes. Extracting the rules out of the systems – and with more difficulty, from people’s heads – and managing them in a common rules system allows those rules to become more transparent (everyone can see what the rules are) and agile (a rule change can be made in one place but may impact multiple business processes and scenarios). Or as he said, rules are much easier to manage when they are managed. 🙂

Not all rules, however, are business rules and therefore a fit for externalization: the best fit are those that truly have a business focus and have some degree of volatility, such as regulatory compliance rules or the rules that you use for underwriting; those with a poor fit for BRMS are system rules that might be better left in code. Once the business rules have been identified, the next challenge is to figure out which of these should actually be managed directly by the business. IT will tell you that allowing the business to change any rule without a full regression testing is dangerous; they’re wrong, of course, since your initial testing of rules should test the envelope within which the business can make rule changes. However, Jim’s suggestion is to have business and IT each state which rules they want to manage, and deal with those that are claimed by both by examining the impact of changing rules within that area of overlap. Basically, if a change to a rule can’t result in any system meltdown or violation of business practices, there’s usually not a good reason not to allow the business to manage it directly.

As with the Gartner definition of BPM, BRM is defined as both a management discipline as well as the tools and technology: just as we have to get organizations to think in a process-centric manner in order to implement effective BPM systems, organizations have to think about rules management as a first-class member of their analysis and management tools. Compared to BPM, however, BRM is further back in the hype cycle: just approaching the peak of inflated expectations, whereas BPA is far out in the plateau of productivity and BPMS is dipping into the trough of disillusionment. Jim predicts that BRM will become important (especially in the context of BPM) in 5-10 years, unless some catastrophic event or legislation causes this to accelerate; this is expected to show high benefit, although not necessarily transformational as BPM is expected to be.

There’s been a lot of acquisition in the rules space, and the number of significant players has dropped from 40+ to about 15 in the past few years. There’s still quite a bit of variability in BRM offerings, however, ranging from the simple routing and branching available within BPMS, to inference engines where rules are fired based on terms and facts either through forward-chaining or backward-chaining, to event-based engines that fire based on the correlation of business and system events. Really, however, the first case is a BPMS, the second is a typical BRMS, and the third is complex event processing, but these boundaries are starting to shift. Rules technology is being seen in BPMS and CEP, but also within application development environments and packaged applications.
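The inference-engine case above – rules firing based on terms and facts through forward-chaining – repeatedly fires any rule whose conditions match the known facts until nothing new can be inferred. A toy illustration, with rules invented for the land-titles flavor of this conference:

```python
# Toy forward-chaining inference: each rule is (conditions, conclusion);
# keep firing rules whose conditions are all satisfied until the fact
# set stops growing. The rules themselves are illustrative.

RULES = [
    ({"is_resident", "owns_land"}, "pays_land_tax"),
    ({"pays_land_tax"}, "receives_assessment"),
]

def forward_chain(facts: set) -> set:
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Backward-chaining works the other direction: starting from a goal such as `receives_assessment` and working back to the facts that would support it.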

He did an overview of BRMS technology, starting with business rule representation: there’s a whole spectrum of rule representation, ranging from proto-natural languages through to the (as yet non-existent) natural language rules. In order to be considered a BRMS (as opposed to just a BRE), a product needs to include a rules repository, modeling and simulation, monitoring and analysis, management and administration, templates, and an integrated development environment, all centered around the rule execution engine.

Combining rules and process is really the sweet spot for both sides: allowing business rules to be externalized from processes so that they can be reused across processes (and other applications), and changed as required by the business, even for in-flight processes. Rules can be used as constraints for unstructured processes, where you don’t need to know ahead of time exactly what the process will look like, but the goals must be achieved – and validated by the rules – before the process instance is completed. The simple routing rules that exist within some BPMSs just aren’t sufficient for this, and most BPM vendors are starting to realize that they either need to build their own BRMS or learn to integrate well with one of the full-featured BRMSs.
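Rules as goal constraints on an unstructured process can be sketched as a completion check: the instance can do its work in any order, but can only finish once every goal rule validates. The goal rules here are invented for illustration:

```python
# Sketch: an unstructured (case-style) process instance may do work in any
# order, but rules act as constraints that must all hold before it can
# complete. The goal rules below are illustrative.

GOAL_RULES = [
    ("documents collected", lambda case: case.get("docs_received", False)),
    ("payment cleared",
     lambda case: case.get("amount_paid", 0) >= case.get("amount_due", 0)),
]

def can_complete(case: dict) -> tuple:
    """Return (ok, unmet_goals) -- the instance completes only if ok."""
    unmet = [name for name, rule in GOAL_RULES if not rule(case)]
    return (not unmet, unmet)
```

Because the constraints live outside the process definition, the business can tighten or relax the completion criteria without redrawing the process model – which is the agility argument made throughout the session.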

He wrapped up with some key takeaways and recommendations: focus on real business rules; learn how BRM can become part of your management practices as well as technology portfolio; marry BPM and BRM, potentially within the same CoE; and see rules and processes as metadata-driven assets.

Collecting, Connecting and Correcting the BPM Dots #brf

Roger Burlton, who organized the BPM track here, gave a presentation this afternoon on process discovery techniques that fit well with Kathy Long’s previous presentation on process notations. He looked at different levels of BPM (and therefore of models): enterprise, business process, and implementation. Most of the BPM models done at the enterprise level are for the purposes of enterprise architecture and high-level strategy; those at the business process level may be for documentation and optimization whether or not the processes are ever automated; and those at the implementation level are primarily for automation purposes. Some of the collect-connect-correct techniques can be reused across these levels, allowing for easier alignment between the different levels:

  • Collect:
    • Agree on our intent – get the same motivation
    • Find out who cares
    • Discover the truth
    • Measure real performance
  • Connect:
    • Draw pictures and communicate
    • Question why
  • Correct:
    • Make it better
    • Check it out
    • Get to yes
    • Launch and learn
    • Deal with worries

He went through each of these in detail, pointing out what information you need to gather at each point, and how this applies at each of the levels. Great presentation, tons of information, although I captured very little of it here due to end-of-day blogger burnout.

That’s it for the first day of Business Rules Forum; I’ll be here the next two days as well. Tomorrow, I can just sit in on presentations, but Thursday I’m back to work, facilitating a peer-to-peer workshop on BPM in the cloud over breakfast and sitting on a panel on emerging trends at the end of the day.