Category Archives: analytics

analytics, business intelligence and business activity monitoring

bpmNEXT 2014 Wrapup And Best In Show

I couldn’t force myself to write about the last two sessions of bpmNEXT: the first was a completely incomprehensible (to me) demo, and the second spent half of the time on slides and half on a demo that didn’t inspire me enough to actually put my hands on the keyboard. Maybe it’s just conference fatigue after two full days of this.

However, we did get a link to the Google Hangout recording of the BPMN model interchange demo from yesterday (be sure to set it to HD or you’ll miss a lot of the screen detail).

We had a final wrapup address from Bruce Silver, and he announced our vote for the best in show: Stefan Andreasen of Kapow – congrats!

I’m headed home soon to finish my month of travel; I’ll be Toronto-based until the end of April when IBM Impact rolls around.

Intelligent Business Processes Webinar Q&A

Earlier this week, I gave a webinar on intelligent business processes, sponsored by Software AG; the slides are embedded below, and you can get a related white paper that I wrote here.

There were a number of questions at the end that we didn’t have time to answer, and I promised to answer them here, so here goes. I have made wording clarifications and grammatical corrections where appropriate.

First of all, here are the questions that we did have time for, with a brief response to each – listen to the replay of the webinar to catch my full answers.

  • How do you profile people for collaboration so that you know when to connect them? [This was in response to me talking about automatically matching up people for collaboration as part of intelligent processes – some cool stuff going on here with mining information from enterprise social graphs as well as social scoring]
  • How complex is it to orchestrate a BPMS with in-house systems? [Depends on the interfaces available on the in-house systems, e.g., web services interfaces or other APIs]
  • Are intelligent business processes less dynamic than dynamic business processes? [No; although many intelligent processes rely on an a priori process model, a lot of the intelligence can be applied via rules rather than process, so that the process remains dynamic]
  • How do you quantify visibility for management? [I wasn’t completely sure of the intention of this one, but discussed the different granularities of visibility for different personas]
  • Where does real-time streaming fit within the Predictive Analytics model? [I see real-time streaming as how we get events from systems, devices or whatever as input to the analytics that, in turn, feed back to the intelligent process]

And here are the ones that we didn’t get to, with more complete responses. Note that I was not reading the questions as I was presenting (I was, after all, busy presenting), so some of them may be referring to a specific point in the webinar and may not make sense out of context. If you wrote the question, feel free to elaborate in the comments below. If something was purely a comment or completely off topic, I have probably removed it from this list, but ping me if you require a follow-up.

There were a number of questions about dynamic processes and case management:

We are treating exceptions more as a normal business activity pattern, called dynamic business processes, to reflect future business trends [not actually a question, but may have been taken out of context]

How does this work with case management?

I talked about dynamic processes in response to another question; although I primarily described intelligent processes through the concepts of modeling a process, then measuring and providing predictions/recommendations relative to that model, a predefined process model is not necessarily required for intelligent processes. Rules form a strong part of the intelligence in these processes, and even if you don’t have a predefined process, you can measure a process relative to the accomplishment of goals that are aligned with rules rather than with a detailed flow model. As long as you have some idea of your goals – whether those are expressed as completing a specific process, executing specific rules or other criteria – and can measure against those goals, then you can start to build intelligence into the processes.
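
To make that concrete, here is a minimal sketch (in Python, with entirely hypothetical goal names and case data) of what measuring a process against rule-aligned goals, rather than against a flow model, might look like:

```python
# Minimal sketch: measure an unstructured process against goals expressed
# as rules rather than as steps in a flow model. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class CaseState:
    """Whatever facts we can observe about a running case."""
    facts: dict = field(default_factory=dict)

@dataclass
class Goal:
    name: str
    is_met: Callable[[CaseState], bool]  # a rule, not a flow step

goals = [
    Goal("customer_contacted", lambda s: s.facts.get("contact_logged", False)),
    Goal("docs_complete", lambda s: not s.facts.get("missing_docs", ["unknown"])),
    Goal("decision_recorded", lambda s: "decision" in s.facts),
]

def progress(state: CaseState) -> float:
    """Fraction of goals met -- a measurement that needs no process model."""
    return sum(1 for g in goals if g.is_met(state)) / len(goals)

state = CaseState(facts={"contact_logged": True, "missing_docs": []})
print(f"{progress(state):.0%} of goals met")  # 67% of goals met
```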

Is process visibility about making the process visible (documented and communicated), or about visibility into operational activities through BPM adoption?

In my presentation, I was mainly addressing visibility of processes as they execute (operational activities), but not necessarily through BPM adoption. The activities may be occurring in any system that can be measured; hence my point about the importance of having instrumentation on systems and their activities in order to have them participate in an intelligent process. For example, your ERP system may generate events that can be consumed by the analytics that monitor your end-to-end process. The process into which we are attempting to gain visibility is that end-to-end process, which may include many different systems (one or more of which may be BPM systems, but that’s not required) as well as manual activities.
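
As an illustration of that instrumentation point, here is a hedged sketch of correlating events from several systems into a single end-to-end timeline; the system names, event shapes and correlation key are all assumptions made for the example:

```python
# Sketch: merge instrumented events from different systems (an ERP, a BPMS,
# a manual-step logger) into one end-to-end process view, grouped by a
# shared correlation key. Event shapes here are invented for illustration.
from collections import defaultdict

raw_events = [
    {"system": "ERP",    "order_id": "A-17", "event": "invoice_created",   "ts": 1},
    {"system": "BPMS",   "order_id": "A-17", "event": "approval_done",     "ts": 2},
    {"system": "manual", "order_id": "A-17", "event": "customer_notified", "ts": 3},
]

def end_to_end_view(events):
    """Group events by correlation key (order_id) into one ordered timeline,
    so the cross-system process can be monitored as a whole."""
    timeline = defaultdict(list)
    for e in sorted(events, key=lambda e: e["ts"]):
        timeline[e["order_id"]].append((e["ts"], e["system"], e["event"]))
    return dict(timeline)

for order, steps in end_to_end_view(raw_events).items():
    print(order, steps)
```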

Do we have real-world data to show how accurate the predictions from intelligent processes are?

There’s not a simple (or single) answer to this. In straightforward scenarios, predictions can be very accurate. For example, I have seen predictions that make recommendations about staff reallocation in order to handle the current workload within a certain time period; however, predictions such as that often don’t include “wild card” factors such as “we’re experiencing a hurricane right now”. The accuracy of the predictions is going to depend greatly on the complexity of the models used as well as the amount of historical information that can be used for analysis.
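
For a feel of the simple end of that spectrum, here is a toy version of the staffing prediction described above, with invented numbers; note that the “wild card” factors are exactly what a model like this omits:

```python
# Toy staffing prediction: given the current queue and average handling
# rate, predict completion time and flag a reallocation. Numbers invented.

def predict_hours(queue: int, staff: int, items_per_person_hour: float) -> float:
    return queue / (staff * items_per_person_hour)

queue, staff, rate, sla_hours = 900, 10, 6.0, 12.0
hours = predict_hours(queue, staff, rate)
print(f"Predicted completion: {hours:.1f}h against an SLA of {sla_hours}h")

if hours > sla_hours:
    needed = -(-queue // (sla_hours * rate))  # ceiling division
    print(f"Recommend adding {int(needed) - staff} staff")
# A 'wild card' like a hurricane is not in this model -- which is exactly
# the accuracy limitation discussed above.
```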

What is the best approach when dealing with a cultural shift?

I did the keynote last week at the APQC process conference on changing incentives for knowledge workers, which covers a variety of issues around dealing with cultural shifts. Check it out.

In terms of technology and methodology, how do you compare intelligent processes with the capabilities that process modeling and simulation solutions (e.g., ARIS business process simulator) provide?

Process modeling and simulation solutions provide part of the picture – as I discussed, modeling is an important first step to provide a baseline for predictions, and simulations are often used for temporal predictions – but they are primarily process analysis tools and techniques. Intelligent processes are operational, running processes.

What is the role of intelligent agents in intelligent processes?

Considering the standard definition of “intelligent agent” from artificial intelligence, I think that it’s fair to say that intelligent processes are (or at least border on being) intelligent agents. If you implement intelligent processes fully, they are goal-seeking and take autonomous actions in order to achieve those goals.
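
To show the parallel, here is a toy goal-seeking loop, assuming a trivially simple environment: it senses state, checks it against a goal, and acts autonomously until the goal is reached. An illustration only, not a template for a real intelligent process:

```python
# Toy goal-seeking agent: sense -> check goal -> act, repeated until done.
# The environment (a work backlog) and the action are invented for clarity.

def sense(env) -> int:
    return env["backlog"]  # observe current state

def act(env) -> None:
    env["backlog"] = max(0, env["backlog"] - env["throughput"])  # work it down

def run_agent(env, goal_backlog: int = 0, max_steps: int = 100) -> int:
    for step in range(max_steps):
        if sense(env) <= goal_backlog:  # goal reached: stop acting
            return step
        act(env)
    return max_steps

env = {"backlog": 25, "throughput": 7}
print("steps to reach goal:", run_agent(env))  # steps to reach goal: 4
```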

Can you please talk about the learning curve for intelligent business processes?

I assume that this is referring to the learning curve of the process itself – the “intelligent agent” – and not the people involved in the process. Similar to my response above regarding the accuracy of predictions, this depends on the complexity of the process and its goals, and the amount of historical data that you have available to analyze as part of the predictions. As with any automated decisioning system, it may be good practice to have it run in parallel with human decision-making for a while in order to ensure that the automated decisions are appropriate, and fine-tune the goals and goal-seeking behavior if not.
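
Here is a minimal sketch of that parallel-run idea, with fabricated decision records: the automated decisioner runs in shadow mode, and we measure its agreement with the human decisions before trusting it to act alone:

```python
# Shadow-mode comparison: pair each automated decision with the human
# decision on the same case, and measure agreement. Records are fabricated.

def agreement_rate(paired_decisions) -> float:
    """paired_decisions: list of (automated, human) outcomes per case."""
    matches = sum(1 for auto, human in paired_decisions if auto == human)
    return matches / len(paired_decisions)

shadow_log = [("approve", "approve"), ("deny", "approve"),
              ("approve", "approve"), ("refer", "refer")]

rate = agreement_rate(shadow_log)
print(f"Agreement with human decisions: {rate:.0%}")  # 75%
if rate < 0.95:  # an arbitrary cutover threshold
    print("Keep humans in the loop; tune the goals and rules first.")
```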

Any popular BPM tools from the industry, and any best practices?

Are ERP and CRM solution providers doing anything about it?

I grouped these together since they’re both dealing with products that can contribute to intelligent processes. It’s fair to say that any BPM system and most ERP and CRM systems could participate in intelligent processes, but they are likely not the entire solution. Intelligent processes combine processes and rules (including processes and rules from ERP and CRM systems), events, analytics and (optionally) goal-seeking algorithms. Software AG, the sponsor of the webinar and white paper, certainly has products that can be combined to create intelligent processes, but so do most of the “stack” software vendors that have BPM offerings, including IBM, TIBCO and SAP. It’s important to keep in mind that an intelligent process is almost never a single system: it’s an end-to-end process that may combine a variety of systems to achieve a specific business goal. You’re going to have BPM systems in there, but also decision management, complex event processing, analytics and integration with other enterprise systems. That is not to say that the smaller, non-stack BPM vendors can’t piece together intelligent processes, but the stack vendors have a bit of an edge, even if their internal product integration is lightweight.

How do you quantify the benefits of intelligent business processes in order to get funding?

I addressed some of the benefits on slide 11, as well as in the white paper. Some of the benefits are very familiar if you’ve done any sort of process improvement project: management visibility and workforce control, improved efficiency by providing information context for knowledge workers (who may be spending 10-15% of their day looking for information today), and standardized decisioning. However, the big bang from intelligent processes comes in the ability to predict the future, and avoid problems before they occur. Depending on your industry, this could mean higher customer satisfaction ratings, reduced risk/cost of compliance, or a competitive edge based on the ability for processes to dynamically adapt to changing conditions.

What services, products do you offer for intelligent business processes?

I don’t offer any products (although Software AG, the webinar sponsor, does). You can get a better idea of my services on my website or contact me directly if you think that I can add value to your process projects.

How are Enterprise Intelligent Processes related to Big Data?

If your intelligent process is consuming external events (e.g., Twitter messages, weather data), or events from devices, or anything else that generates a lot of events, then you’re probably having to deal with the intersection between intelligent processes and big data. Essentially, the inputs to the analytics that provide the intelligence in the process may be considered big data, and may require some specific data cleansing and aggregation on the way in. You don’t necessarily have big data with intelligent processes, but one or more of your inputs might be big data.
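
Here is a small, assumption-laden sketch of that cleansing and aggregation on the way in: malformed events are dropped and the survivors are rolled up into time windows before they reach the analytics:

```python
# Sketch: cleanse a noisy event stream, then aggregate it into time windows
# so the analytics see a manageable input. Field names are assumptions.

def clean(events):
    """Keep only events with the fields the downstream analytics expect."""
    return [e for e in events
            if "ts" in e and isinstance(e.get("value"), (int, float))]

def aggregate(events, window_seconds: int = 60):
    """Average the value per time window -- one row per window goes onward."""
    buckets = {}
    for e in clean(events):
        buckets.setdefault(e["ts"] // window_seconds, []).append(e["value"])
    return {window: sum(vals) / len(vals) for window, vals in buckets.items()}

stream = [{"ts": 5, "value": 10}, {"ts": 30, "value": 20},
          {"ts": 70, "value": 30}, {"ts": 75}]  # last event is malformed
print(aggregate(stream))  # {0: 15.0, 1: 30.0}
```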

And my personal favorite question from the webinar:

Humans have difficulty acting in an intelligent manner; isn’t it overreaching to claim processes can be “intelligent”?

I realize that you’re cracking a joke here (it did make me smile), but intelligence is just the ability to acquire and apply knowledge and skills, which is well within the capabilities of systems that combine process, rules, events and analytics. We’re not talking HAL 9000 here.

To the guy who didn’t ask a question, but just said “this is a GREAT GREAT webinar” – thanks, dude. :-)

Webinar And White Paper On Intelligent Business Processes

I recently wrote a white paper on intelligent business processes: making business processes smarter through the addition of visibility, automation and prediction. On Wednesday (October 30), I’ll be giving a webinar to discuss the concepts in more detail. You can sign up for the webinar here, which should cause a link to the replay to be sent to you even if you can’t make it to the webinar. Software AG sponsored the white paper and webinar (as well as another one coming up next month); you can download the white paper from Software AG directly or on CIO.com.

As with all of my vendor-sponsored white papers/webinars, these are my opinions in an educational/thought leadership format, not vendor promotional pieces.

SAP TechEd Day 1 Keynote With @vsikka

Vishal Sikka – head technology geek at SAP – started us off at TechEd with a keynote on the theme of how great technology always serves to augment and amplify us. He discussed examples such as the printing press, Nordic skis and the Rosetta Stone, and ended up with HANA (of course) and how a massively parallel, in-memory columnar database with built-in application services provides a platform for empowering people. All of SAP’s business applications – ERP, CRM, procurement, HR and others – are available on or moving to HANA, stripping out the complexity of the underlying databases and infrastructure without changing the business system functionality. The “HANA effect” also allows for new applications to be built on the platform with much less infrastructure work through the use of the application services built into HANA.
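
As a side note on why a columnar layout suits this kind of analytical workload, here is a toy contrast (emphatically not HANA’s implementation): an aggregate query over a column store scans one contiguous array rather than touching every record:

```python
# Toy contrast of row-oriented vs column-oriented layout for an aggregate
# query. Illustrative only; real columnar engines add compression,
# vectorized execution and parallel scans on top of this basic idea.

rows = [{"region": "EMEA", "revenue": 120},
        {"region": "APJ",  "revenue": 80},
        {"region": "EMEA", "revenue": 200}]

# Row store: SUM(revenue) must walk every record and pick out one field.
row_total = sum(r["revenue"] for r in rows)

# Column store: each column is its own array; the same query scans just one.
columns = {"region": ["EMEA", "APJ", "EMEA"], "revenue": [120, 80, 200]}
col_total = sum(columns["revenue"])

assert row_total == col_total == 400
```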

He also discussed their Fiori user interface paradigm and platform which can be used to create better UX on top of the existing ERP, CRM, procurement, HR and other business applications that have formed the core of their business. Sikka drew the architecture as he went along, which was a bit of fun:

SAP architecture with HANA and Fiori

He was joined live from Germany by Franz Faerber, who heads up HANA development, to discuss some of the advances in HANA and what is coming next month in SP7; then Sam Yen joined him on stage to demonstrate the HANA developer experience, the Operational Intelligence dashboard that was shown at SAPPHIRE earlier this year and is in use at DHL for tracking KPIs in real time, and the HANA Cloud platform developer tools for SuccessFactors. We heard about SAS running on HANA for serious data scientists, HANA on AWS, HANA and Hadoop, and much more.

There was a lot of information pushed out in the keynote: even if you’re not here, you can watch the keynotes live (and probably watch the recordings after the fact), and there will be some new information coming out at TechEd in Bangalore in six weeks. The Twitter stream is going by too fast to read, with lots of good insights in there, too.

Bernd Leukert came to the stage to highlight how SAP is running their own systems on HANA, and to talk more about building applications, focusing on Fiori for mobile and desktop user interfaces: not just a beautification of the existing screens, but new UX paradigms. Some of the examples that we saw are very tile-based (think Windows 8), but also things like fact sheets for business objects within SAP enterprise systems. He summed up by stating that HANA is for all types of businesses due to a range of platform offerings; my comments on Hasso Plattner’s keynote from SAPPHIRE earlier this year called it the new mainframe (in a good way). We also heard from Dmitri Krakovsky from the SuccessFactors team, and from Nayaki Nayyar about iFlows for connecting cloud solutions.

TechEd is inherently less sales and more education than their SAPPHIRE conference, but there’s a strong sense of selling the concepts of the new technologies to their existing customer and partner base here. At the heart of it, HANA (including HANA cloud) and Fiori are major technology platform refreshes, and the big question is how difficult – and expensive – it will be for an existing SAP customer to migrate to the new platforms. Many SAP implementations, especially the core business suite ERP, are highly customized; this is not a simple matter of upgrading a product and retraining users on new features: it’s a serious refactoring effort. However, it’s more than just a platform upgrade: having vastly faster business systems can radically change how businesses work, since “reporting” is replaced by near-realtime analytics that provide transparency and responsiveness; it also simplifies life for IT due to footprint reduction, new development paradigms and cloud support.

We finished up 30 minutes late and with my brain exploding from all the information. It will definitely take the next two days to absorb all of this and drill down into my points of interest.

Disclosure: SAP is a customer, and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

The Rise Of The Machines: @BillRuh_GE On The Industrial Internet

Last day of Software AG’s Innovation World, and the morning keynote is Bill Ruh, VP of GE’s Global Software and Analytics Center, on how GE is becoming a digital business. He points out that part of that is what you do internally, but part is also your products: GE is transforming both their products and their operations on their transformation path. For example, their previous aircraft jet engines provided only aggregate measurements about takeoff, cruise and landing; now they have the potential to collect 1TB of measurement data per day from a two-engine aircraft. That’s really big data. Unfortunately, most data is dark: only 0.5% of the world’s data is being analyzed. We don’t need to analyze and act upon all of it, but there’s a lot of missed potential here.

His second point was about the “industrial internet”, where 50 billion machines are interconnected. We saw a revolution in entertainment, social marketing, communications, IT architecture and retail when a billion people were connected, but the much larger number of interconnected machines has the potential to virtualize operational technology, and to enable predictive analytics, automated and self-healing machines, mobilized monitoring and maintenance, and even increased employee productivity. Industrial businesses are starting to change how they get things done, in the same way as retail and other consumer businesses have been transformed over the past decade.

This flood of data is pushing big changes to IT architecture: industrial software now needs real-time predictive analytics, big data, mobile, cloud, end-to-end security, distributed computation, and a consistent and meaningful experience. Analytics is key to all of this, and he pointed out that data scientist is becoming the hardest position to fill in many companies. Behavioral changes around using the analytics are also important: if the analytics are being used to advise, rather than control, then the people being advised have to accept that advice.

Bill Ruh (GE) presentation at Innovation World - architecture for digital industry

The digital enterprise needs to focus on their customers’ outcomes – in the case of their engines, reducing fuel consumption and downtime, while improving efficiency of the operations around that machine – because at this scale, a tiny percentage improvement can have a huge impact: a 1% savings translates to huge numbers in different industries, from $27B saved by increasing rail freight utilization to $63B saved by improving process efficiency in predictive maintenance in healthcare.

He had some great examples (speaking as a member of a two-engineer household, you can be sure that many of these will be talked about at my dinner table in the future), such as how wind turbines are not just generating data for remote monitoring, but are self-optimizing as well as actually talking to each other in order to optimize within and between wind farms. Smart machines and big data are disrupting manufacturing and related industries, and require a change in mindset from analog to digital thinking. If you think that it can’t happen because we’re talking about physical things, you’re wrong: think of how Amazon changed how physical books are sold. As Ruh pointed out, software coupled with new processing architectures is the enabler for digital industry.

Bill Ruh (GE) presentation at Innovation World - smart wind turbines

It’s early days for digital industry, and there needs to be a focus on changing processes to take advantage of the big data and connectivity of machines. His advice is to get started and try things out, or you’ll be left far behind leaders like GE.

Conference Within A Conference, Part Two: Big Fast Data World

John Bates, who I know from his days at Progress Software, actually holds the title of SVP of Big Fast Data at Software AG. Love it. He led off the Big Fast Data World sub-conference at Innovation World to talk about real-time decisioning based on events, whether that is financial data such as trades, or device events from an oil rig. This isn’t just the simple “if this event occurs, then trigger this action” sort of decision-making, but real-time adaptive intelligence that might include social media, internal business systems, market information and more. It’s where events, data, analytics and process all come together.

The goal is to use all of the data and events possible to appear to be reading your customer’s mind and offering them the most likely thing that they want right then (without being too creepy about it), using historical patterns, current context and location information. For example, a customer is in the process of buying something, and their credit card company or retail partner uses that opportunity to upsell them on a related product or a payment plan, directly to their mobile phone and before they have finished making their purchase. Or, a customer is entering a mall, they are subscribed to a sports information service, there are available tables at a sports bar in the mall, so they are pushed a coupon to have lunch at the sports bar right then. Even recommendation engines, such as we see every time that we visit Amazon or Netflix, are examples of this. Completely context sensitive, and completely personalized.
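
The sports-bar example boils down to several independent facts lining up at once; a real CEP engine would express this as a continuous query over event streams, but the underlying logic is simple, as in this hypothetical sketch:

```python
# Hypothetical version of the sports-bar offer: push a coupon only when
# location, subscription, availability and a frequency cap all line up.

def should_push_offer(customer: dict, context: dict) -> bool:
    return (context["location"] == "mall"
            and "sports_news" in customer["subscriptions"]
            and context["sports_bar_tables_free"] > 0
            and not customer["recently_messaged"])  # don't be creepy

customer = {"subscriptions": {"sports_news"}, "recently_messaged": False}
context = {"location": "mall", "sports_bar_tables_free": 3}

if should_push_offer(customer, context):
    print("Push coupon: lunch at the sports bar, right now")
```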

On the flip side, companies have to use continuous monitoring of social media channels for proactive customer care: real-time event and data analysis for responding to unhappy customers before situations blow up on them. People like Dave Carroll and Heather Armstrong (and sometimes even me, on a much smaller scale) can strike fear in the hearts of customer service organizations that are unable to respond appropriately and quickly, but can generate big wins for those companies that do the right things to resolve issues quickly for their customers.

What do you need to do to make this happen? Not much, just low-latency universal messaging, in-memory unstructured data, real-time predictive analytics, intelligent actions via real-time integration to operational systems, and real-time visual analytics. If you’re a Software AG customer, they’re bringing together Terracotta, Apama and JackBe into a unified platform for this sort of adaptive intelligence, producing intelligent actions from big data, in real time, to/from anywhere.

Software AG Big Fast Data

We then got a bit of a lesson on big data from Nathaniel Rowe, a research analyst at Aberdeen Group: how big is big, what’s the nature of that data, and some of the problems with it. The upshot: the fact that there’s a lot of data is important, but it’s the unstructured nature of it that presents many of the difficult analytical problems. It’s about volume, but also variety and velocity: the data could be coming from anywhere, and you don’t have control over a lot of it, such as the social media data or that from business partners. You have to have a clear picture of what you want out of big data, such as better customer insights or operational visibility; Rowe had a number of use cases from e-commerce to healthcare to counterterrorism. The ability to effectively use unstructured data is key: those companies that are best in class are doing this better than average, and it translates directly to measures such as sales, customer satisfaction and net promoter score.

He finished up with some of the tools required – automatic data capture, data compression, and data cleansing – and how those translate directly to employees’ ability to find data, particularly from multiple sources at once. Real-time analytics and in-memory analytics are the two high-speed technologies that result in the largest measurable benefits when working with big data, making the difference between seconds (or even sub-second) to see a result or take an action, versus minutes or hours. He ended with the correlation between investing in big data and various customer experience measures (15-18% increases) as well as revenue measures (12-17% increases). Great presentation, although I’m pretty sure that I missed 75% of it since he is a serious speed-talker and zipped through slides at the speed of light.

And we’re done for the day: back tomorrow for another full day of Innovation World. I’m off to the drinks reception then a customer party event; as always, everything is off the record as soon as the bar opens. :)

SAPPHIRENOW Vishal Sikka Keynote – HANA For Speed, Fiori For Usability

Vishal Sikka, who leads technology and innovation at SAP, followed Hasso Plattner onto the keynote stage; I decided to break the post and publish just Plattner’s portion separately since my commentary was getting a bit long.

Sikka also started his part of the keynote with HANA, and highlighted some customer case studies from their “10,000 Club”, where operations are more than 10,000 times faster when moved to HANA, plus one customer with an operation that runs 1 million times faster on HANA. He talked about how the imperatives for innovation are equal parts math and design: it has to be fast, but it also has to solve business problems. HANA provides the speed and some amount of the problem-solving, but really good user experience design has to be part of the equation. To that end, SAP is launching Fiori, a collection of 25 easy-to-use applications for the most common SAP ERP and data warehouse functions, supported on phone, tablet and desktop platforms with a single code base. Although this doesn’t replace the thousands of existing screens, it can likely replace the old screens for many user personas. As part of the development of Fiori, they partnered with Google and optimized the applications for Chrome, which is a pretty bold move. They’ve also introduced a lot of new forms of data visualization, replacing mundane list-style reports with more fluid forms that are more common on specialized data visualization platforms such as Spotfire.

Fiori doesn’t depend on HANA (although you can imagine the potential for HANA analytics with Fiori visualization), but can be purchased directly from the HANA Marketplace. You can find out more about SAP’s UX development, including Fiori, on their user experience community site.

Returning to HANA, and to highlight that HANA is also a platform for non-SAP applications, Sikka showed some of the third-party analytics applications developed by other companies on the HANA platform, including eBay and Adobe. There are over 300 companies developing applications on HANA, many addressing specific vertical industries.

That’s it for me from SAPPHIRE NOW 2013 — there’s a press Q&A with Plattner and Sikka coming up, but I need to head for the airport so I will catch it online. As a reminder, you can see all of the recorded video (as well as some remaining live streams today) from the conference here.