Andrew White presented a session at Gartner BPM 2013 on how process, applications and data work together, from his perspective as an analyst focused on master data management (MDM). He was quick to point out that process is more important than data, but put forward MDM as a business-led discipline for maintaining a single version of the truth for business data. The focus is on how that data is created and maintained in business applications, assuring the integrity of the processes that span those applications. Since his background is in ERP systems, his view is that processes are instantiated by applications, which are in turn underpinned by data; however, the reality that I see with BPMS is that data resides there as well, so it’s fair to say that processes can consume data directly, too.
Master data is the common set of attributes that are reused by a wide variety of systems, not application-specific data — his example of master data was the attributes of a specific inventoried product, such as size and weight — but there is also shared data: the grey area between common master data and application-specific data. Gartner’s pace layering identifies different tiers of systems, each with different data access: systems of record (e.g., ERP) tend to consume enterprise master data and transaction data; systems of differentiation (e.g., CRM) consume master data, analytic data and rich data; and systems of innovation (e.g., a Facebook app) consume analytic data, rich data and cloud-sourced data that might be someone else’s master data. End-to-end business processes may link all of these systems together, and be guided by different data sources along the way. It all makes my head hurt a little bit.
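The master-versus-application data split is easier to see in a concrete sketch. Here’s a minimal (and entirely hypothetical — the record names and attributes are mine, not from the session) illustration of the pattern: the shared attributes live in one master record, and the application-specific records in each tier reference it by key rather than copying it:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProductMaster:
    """Master data: attributes reused by many systems."""
    sku: str
    size_cm: float
    weight_kg: float

@dataclass
class ErpInventoryLine:
    """System of record: transaction data plus a master-data key."""
    sku: str                 # reference into master data, not a copy
    warehouse: str
    quantity_on_hand: int

@dataclass
class CrmOpportunityLine:
    """System of differentiation: its own data, same master key."""
    sku: str
    customer_id: str
    discount_pct: float

# One source of truth for the shared attributes:
masters = {"SKU-100": ProductMaster("SKU-100", size_cm=30.0, weight_kg=1.2)}

erp = ErpInventoryLine("SKU-100", warehouse="YYZ", quantity_on_hand=40)
crm = CrmOpportunityLine("SKU-100", customer_id="C-7", discount_pct=5.0)

# Both systems resolve size and weight through the master record,
# so there is a single version of the truth for those attributes.
print(masters[erp.sku].weight_kg)   # 1.2
print(masters[crm.sku].weight_kg)   # 1.2
```

The "shared data" grey area is exactly what this sketch glosses over: attributes that two systems both need but that nobody has promoted into the master record.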
MDM programs have some of the same challenges as BPM programs: they need to focus on specific business outcomes, and on which processes need improving. And like the Fight Club reference that I heard earlier today (“the first rule of process is that you don’t talk about process”), you want MDM to become transparent and embedded, not a silo of activity on its own. Also in common with some BPM initiatives, MDM is often seen as an IT initiative rather than a business initiative; however, just as with defining business processes, it’s up to the business to identify its master data. MDM isn’t about data storage and retention; it’s about how data is used (and abused) throughout the business lifecycle. In my opinion, we still need better ways to model the data lifecycle at the same time as we model business processes; BPMN 2.0 added some provisions for data modeling, but it’s woefully inadequate for a whole data lifecycle model.
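To make concrete what I mean by a data lifecycle model: one lightweight way to express it alongside a process model is as a state machine over the data object itself, something BPMN 2.0 data objects can’t really capture. A minimal sketch — the states and transitions here are hypothetical examples of mine, not anything from the session:

```python
# A data lifecycle as a simple state machine: each state maps
# allowed events to the next state. Illegal transitions fail loudly.
LIFECYCLE = {
    "draft":    {"approve": "active"},
    "active":   {"update": "active", "retire": "archived"},
    "archived": {"purge": "purged"},
    "purged":   {},
}

def apply(state: str, event: str) -> str:
    """Return the next lifecycle state, or raise if the transition is illegal."""
    try:
        return LIFECYCLE[state][event]
    except KeyError:
        raise ValueError(f"illegal transition {event!r} from state {state!r}")

state = "draft"
for event in ("approve", "update", "retire", "purge"):
    state = apply(state, event)
print(state)  # purged
```

A process model would then reference these states at each step that touches the data, which is precisely the create/read/update linkage that BPMN’s data objects leave implicit.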
White presented a number of things that we need to think about when creating an enterprise data model, and best practices for aligning BPM and MDM. The two initiatives can be dovetailed, so that BPM provides priority and scope for the MDM efforts. Business processes (and not just those implemented in a BPMS) create and consume data, and once a process is defined, the points where data is created, viewed and updated can be identified and used as input to the master data model. From an EA standpoint, the conceptual, logical and physical models for data and process (or column 1 and column 2, if you’re a Zachman follower) need to be aligned.
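That last point — identifying where a process creates, views and updates data, and feeding that into the master data model — can be sketched as a classic CRUD matrix. The process names, steps and entities below are hypothetical placeholders of mine; the idea is simply that entities touched by more than one process are candidates for the master data model:

```python
from collections import defaultdict

# Tag each process step with the entities it Creates, Reads or Updates,
# then invert into a CRUD matrix: entity -> {process: operations}.
steps = [
    ("order-to-cash",  "enter order",   {"Customer": "R", "Order": "C"}),
    ("order-to-cash",  "ship order",    {"Order": "U", "Product": "R"}),
    ("procure-to-pay", "receive goods", {"Product": "U", "Supplier": "R"}),
]

matrix = defaultdict(dict)
for process, step, touches in steps:
    for entity, op in touches.items():
        matrix[entity][process] = matrix[entity].get(process, "") + op

# Entities that cross process boundaries are master-data candidates.
shared = sorted(e for e, procs in matrix.items() if len(procs) > 1)
print(shared)  # ['Product']
```

Note that this only works once the processes are actually defined — which is the dovetailing White described, with BPM providing the priority and scope for the MDM effort.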