TIBCO TUCON2012 Day 1 Keynotes, Part 1

The keynotes started with TIBCO’s CEO, Vivek Ranadivé, talking about the forces driving change: a massive explosion of data (big data), the emergence of mobility, the emergence of platforms, the rise of Asia (he referenced the Gangnam Style video, although he did not actually do the dance), and how math is trumping science (e.g., the detection and exploitation of patterns). The ability to harness these forces and produce extreme value is a competitive differentiator, and is working for companies like Apple and Amazon.

Raj Verma, TIBCO’s CMO, was up next, continuing the message of how fast things are changing: more iPhones were sold over the past few days than babies were born worldwide, and Amazon added more computing capacity last night than they had in total in 2001. He (re)introduced their concept of the two-second advantage – the right information a little bit before an event is worth infinitely more than any amount of information after the event – enabled by an event-enabled enterprise (or E3, supported by, of course, TIBCO infrastructure). Regardless of whether or not you use TIBCO products, this is a key point: if you’re going to exploit the massive amounts of data being generated today in order to produce extreme value, you’re going to need to be an event-enabled enterprise, responding to events rather than just measuring outcomes after the fact.

He discussed the intersection of four forces: cloud, big data, social collaboration and mobility. This is not a unique message – every vendor, analyst and consultant is talking about this – but he dug into some of these in detail: mobile, for example, is no longer discretionary, even (or maybe especially) in countries where food and resources are scarce. All four overlap in the consumerization of IT, and together they are reshaping enterprise IT. A key corporate change driven by these is customer experience management: becoming the brand that customers think of first when the product class is mentioned, and turning customers into fans. Digital marketing, properly done, turns your business into a social network, and turns customer management into fan management.

Matt Quinn, CTO, continued the idea of turning customers into fans, and solidifying customer loyalty. To do this, he introduced TIBCO’s “billion dollar backend” with its platform components of automation, event processing, analytics, cloud and social, and hosted a series of speakers on the subject of customer experience management.

We then heard from a customer, Chris Nordling, EVP of Operations and CIO of MGM Resorts and CityCenter, who use TIBCO for their MLife customer experience management/loyalty program. Their vision is to track everything about you from your gambling wins/losses to your preferences in restaurants and entertainment, and use that to build personalized experiences on the fly. By capturing the flow of big data and responding to events in realtime, the technology provides their marketing team with the ability to provide a zero-friction offer to each customer individually before they even know that they want something: offering reduced entertainment tickets just as you’re finishing a big losing streak at the blackjack tables, for example. It’s a bit creepy, but at the same time, has the potential to provide a better customer experience. Just a bit of insight into what they’re spending that outrageous $25/day resort fee on.

Quinn came back to have a discussion with one of their “loyalty scientists” (really??) about Loyalty Lab, TIBCO’s platform/service for loyalty management, which is all about analyzing events and data in realtime, and providing “audience of one” service and offerings. Traditional loyalty programs were transaction-based, but today’s loyalty programs are much more about providing a holistic view of the customer. This can include not just events that happen in a company’s own systems, but also external social media information, such as the customer’s tweets. I know all about that.

Another customer, Rick Welts of the Golden State Warriors (who, ironically, play at Oracle Arena) talked about not just customer loyalty management, but the Moneyball-style analytics that they apply to players on a very granular scale: each play of each game is captured and analyzed to maximize performance. They’re also using their mobile app for a variety of customer service initiatives, from on-premise seat upgrades to ordering food directly from your seat in the stadium.

Mid-morning break, and I’ll continue afterwards.

As an aside, I’m not usually wide awake enough to get much out of the breakfast-in-the-showcase walkabout, but this morning prior to the opening sessions, I did have a chance to see the new TIBCO decision services integrated into BPM, also available as standalone services. Looked cool, more on that later.

IBM Impact Day 2: Engage. Extend. Succeed.

Phil Gilbert spoke at the main tent session this morning, summarizing how they announced IBM BPM as a unified offering at last year’s Impact, and since then they’ve combined Business Events and ILOG to form IBM ODM (operational decision management). Business process and decision management provide visibility and governance, forming a conduit to provide information about transactions and data to people who need to access it. IBM claims to have the broadest, most integrated process portfolio, having taken a few dozen products and turned them into two products; Phil was quick to shoot down the idea that this is a disjointed, non-integrated collection of tools, referring to it instead as a “loosely coupled integration architecture”. Whatever.

Around those two core products (or product assemblies) are links to other enterprise tools – Tivoli, MDM, ECM and SAP – forming the heart of business processes and system orchestration. In version 8 of BPM and ODM, they’ve added collaboration, which is the third key imperative for business alongside visibility and governance.

We saw a demo of the new capabilities, most of which I talked about in yesterday’s post. For ODM, that included the new decision console (social activity stream, rules timeline) and global rules search. For BPM, there’s the new socially-aware process portal, which has been created on their publicly-available APIs so that you can roll your own portal with the same level of functionality. There’s searching in the process portal to find tasks easily. The new coach (UI form) designer allows you to create very rich task interfaces more easily, including the sidebar of task/instance details, instance-specific activity stream, and experts available for collaboration. They’ve incorporated the real-time collaboration capabilities of Blueworks Live into the BPM coaches to allow someone to request and receive help from an expert, with the user and the expert seeing each other’s inputs synchronously on the form in question. Lastly, Approve/Reject type tasks can be completed in-line directly in the task list, making it much faster to move through a long set of tasks that require only simple responses. He wrapped up with the obligatory iPad demo (have to give him credit for doing that part of the live demo himself, which most VPs wouldn’t consider).

The general session also included presentations of some innovative uses of BPM and ODM by IBM’s customers: Ottawa General Hospital, which has put patient information and processes on an iPad in the doctors’ pockets, and BodyMedia, which captures, analyzes and visualizes a flood of biometric data points gathered by an armband device to assist with a weight loss program.

What Analysts Need to Understand About Business Events

Paul Vincent, CTO of Business Rules and CEP at TIBCO (and possibly the only person at Building Business Capability sporting a bow tie), presented a less technical view of events than you would normally see in one of his presentations, intended to help the business analysts here at Building Business Capability understand what events are, how they impact business processes, and how to model them. He started with a basic definition of events – an observation, a change in state, or a message – and why we should care about them. I cover events in the context of processes in many of the presentations that I give (including the BPM in EA tutorial that I did here on Monday), and his message is the same: life is event-driven, and our business processes need to learn to deal with that fact. Events are one of the fundamentals of business and business systems, but many systems do not handle external events well. Furthermore, many process analysts don’t understand events or how to model them, and can end up creating massive spaghetti process models to try to capture the result of events since they don’t understand how to model events explicitly.

He went through several different model types that allow for events to be captured and modeled explicitly, and compared the pros and cons of each: state models, event process chain models, resources events agents (REA) models, and BPMN models. The BPMN model is the only one that really models events in the context of business processes, and relates events as drivers of process tasks, but is really only appropriate for fairly structured processes. It does, however, allow for modeling 63 different types of events, meaning that there’s probably nothing that can happen that can’t be modeled by a BPMN event. The heavy use of events in BPMN models can make sense for heavily automated processes, and can make the process models much more succinct. Once the event notation is understood, it’s fairly easy to trace through them, but events are the one thing in BPMN that probably won’t be immediately obvious to the novice process analyst.

In many cases, individual events are not the interesting part, but rather a correlation between many events; for example, fraud events may be detected only after many small related transactions have occurred. This is the heart of complex event processing (CEP), which can be applied to a wide variety of business situations that rely on large volumes of events, and which goes beyond the simple process patterns and business rules that can be applied to individual transactions.
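As a rough illustration of that kind of correlation, here is a minimal sketch of detecting a “many small related transactions” pattern with a sliding time window; the thresholds, field names and function names are all invented for illustration, not taken from any TIBCO product:

```python
from collections import defaultdict, deque

# Hypothetical sketch: raise a possible-fraud "complex event" when one
# account generates more than THRESHOLD small transactions inside a
# sliding time window. All limits here are illustrative.
WINDOW_SECONDS = 60
THRESHOLD = 5
SMALL_AMOUNT = 20.0

windows = defaultdict(deque)  # account -> deque of recent timestamps

def on_transaction(account, amount, ts):
    """Return True if this event completes the suspicious pattern."""
    if amount >= SMALL_AMOUNT:
        return False  # only small transactions participate in the pattern
    q = windows[account]
    q.append(ts)
    while q and ts - q[0] > WINDOW_SECONDS:
        q.popleft()  # expire events that fell outside the window
    return len(q) > THRESHOLD

# Six small transactions in quick succession: only the sixth trips it
events = [("acct-1", 5.0, t) for t in range(6)]
flags = [on_transaction(a, amt, t) for a, amt, t in events]
print(flags)  # [False, False, False, False, False, True]
```

No individual transaction is suspicious here; only the correlated set is, which is the distinction the CEP argument rests on.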

Looking at events from an analyst’s view, it’s necessary to identify actors and roles, just as in most use cases, then identify what they do and (more importantly) when they do it in order to drive out the events, their sources and destinations. Events can be classified as positive (e.g., something that you are expecting to happen actually happened), negative (e.g., something that you are expecting to happen didn’t happen within a specific time interval) or sets (e.g., the percentage of a particular type of event is exceeding an SLA). In many cases, the more complex events that we start to see in sets are the ones that you’re really interested in from a business standpoint: fraud, missed SLAs, gradual equipment failure, or customer churn.
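The negative case is worth a sketch of its own, since “the event that didn’t happen” is the hardest one for analysts to model. The following is a minimal, invented example (event kinds, field names and the deadline are all assumptions) of flagging an expected follow-up event that never arrived in time:

```python
# Hypothetical sketch of "negative event" detection: flag orders whose
# expected "shipped" event never followed the "ordered" event within a
# deadline. Event kinds and fields are illustrative only.
def find_missed_followups(events, deadline):
    """events: list of (ts, kind, order_id) tuples, sorted by ts.
    Returns order_ids whose 'shipped' never arrived within `deadline`
    time units of 'ordered'."""
    pending = {}
    for ts, kind, oid in events:
        if kind == "ordered":
            pending[oid] = ts
        elif kind == "shipped":
            pending.pop(oid, None)  # expectation met, stop tracking
    # anything still pending past the deadline is a negative event
    last_ts = events[-1][0] if events else 0
    return sorted(oid for oid, ts in pending.items() if last_ts - ts > deadline)

evts = [(0, "ordered", "A"), (1, "ordered", "B"),
        (2, "shipped", "A"), (18, "ordered", "C"), (20, "tick", None)]
print(find_missed_followups(evts, deadline=5))  # ['B']
```

Order A shipped on time, order C is still within its window, and only order B is overdue; a real CEP engine would do this with timers rather than a batch scan, but the classification is the same.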

He presented the EPTS event reference architecture for complex events, then discussed how the different components are developed during analysis:

  • Event production and consumption, namely, where events come from and where they go
  • Event preparation, or what selection operations need to be performed to extract the events, such as monitoring, identification and filtering
  • Event analysis, or the computations that need to be performed on the individual events
  • Complex event detection, that is, the event correlations and patterns that need to be performed in order to determine if the complex event of interest has occurred
  • Event reaction, or what event actions need to be performed in reaction to the detected complex event; this can overlap to some degree with predictive analytics in order to predict and learn the appropriate reactions
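To make the stages concrete, here is a minimal pipeline that mirrors the list above – produce, prepare, analyze, detect, react – with entirely invented event shapes and thresholds (a device temperature feed), not anything from the EPTS architecture itself:

```python
# Hypothetical pipeline mirroring the EPTS stages above: production ->
# preparation (filter) -> analysis (per-event computation) -> complex
# event detection (correlation) -> reaction. All data is illustrative.

def produce():
    # Event production: raw readings from some device feed
    return [{"device": "pump-1", "temp": t} for t in (60, 72, 95, 99, 97)]

def prepare(events):
    # Event preparation: select only the readings worth analyzing
    return [e for e in events if e["temp"] > 70]

def analyze(events):
    # Event analysis: per-event computation (flag overheating)
    return [dict(e, hot=e["temp"] > 90) for e in events]

def detect(events, run_length=3):
    # Complex event detection: N consecutive hot readings = failure risk
    streak = 0
    for e in events:
        streak = streak + 1 if e["hot"] else 0
        if streak >= run_length:
            return {"complex_event": "overheat-trend", "device": e["device"]}
    return None

def react(complex_event):
    # Event reaction: here just an alert; could start a process instead
    if complex_event:
        return f"ALERT {complex_event['device']}: {complex_event['complex_event']}"
    return "no action"

result = react(detect(analyze(prepare(produce()))))
print(result)  # ALERT pump-1: overheat-trend
```

The point of the architecture is that each stage can be developed and reasoned about separately during analysis, even though they compose into one flow at runtime.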

He discussed event dependencies models, which show event orderings, and relate events together as meaningful facts that can then be used in rules. Although not a common practice, this model type does show relationships between events as well as linking to business rules.

He finished with some customer case studies that include CEP and event decision-making: FedEx achieving zero latency in determining where a package is right now; and Allstate using CEP to adjust their rules on a daily basis, resulting in a 15% increase in closing rates.

A final thought that he left us with: we want agile processes and agile decisions; process changes and rule changes are just events. Analyzing business events is good, but exploiting business events is even better.

TIBCO Acquisitions With Tom Laffey: OpenSpirit, Loyalty Lab and Nimbus

Tom Laffey, EVP of products and technology, moderated a session highlighting three of TIBCO’s recent acquisitions: OpenSpirit, Loyalty Lab and Nimbus.

Clay Harter, CTO of OpenSpirit (which was acquired by TIBCO a year ago), discussed their focus on delivering data and integration applications to the oil and gas industry. Their runtime framework provided a canonical data model over a heterogeneous set of data stores, and their desktop applications integrated with spatial data products such as ESRI’s ArcGIS and Schlumberger’s remote sensing. Due to their knowledge of the specialized data sources, they have a huge penetration into 330+ oil companies and relationships with industry-specific ISVs. In October, they will release a BusinessWorks plugin for OpenSpirit to make oil and gas technical data available through the TIBCO ESB. They are also prototyping a Spotfire extension for OpenSpirit for visualizing and analyzing this data, which is pretty cool – I worked as a field engineer in oil and gas in the early ’80s, and the sensing and visualization of data was a whole different ball game then, mostly black magic. OpenSpirit’s focus is on reducing exploration costs and increasing safety through better analysis of the petrotechnical data, particularly through interdisciplinary collaboration. From TIBCO’s standpoint, they were building their energy vertical, and the acquisition of OpenSpirit brings them expertise and credibility in that domain.

Keith Rose, formerly president of Loyalty Lab and now leading the sales efforts in that area since their acquisition by TIBCO, presented on their event-driven view of managing customer loyalty, particularly loyalty programs such as those used by airlines and retailers. They have a suite of products that support marketers in terms of visualizing and analyzing loyalty-related data, and building loyalty programs that can leverage that information. Their focus on events – the core of real-time and one-to-one loyalty marketing programs – was likely the big reason for the TIBCO acquisition, since TIBCO’s event and messaging infrastructure seems like a natural fit to feed into Loyalty Lab’s analysis and programs. Spotfire for visualization and analysis of data also makes a lot of sense here, if they can work out how to integrate that with their existing offerings. With 99% of their customers on a hosted cloud solution, they may also want to consider how a move to TIBCO’s cloud platform can benefit them and integrate with other initiatives that their customers may have.

Less than a month ago, Nimbus was acquired by TIBCO, and Mark Cotgrove, a founder and EVP, gave us a briefing on their product and why it made sense for TIBCO to acquire them. Nimbus provides tools for process discovery and analysis, including the 80% (or so) of an organization’s activities that are manual and are likely to remain manual. Currently, the automated activities are handled with enterprise applications and automated BPM (such as AMX/BPM), but the manual ones are managed with a mix of office productivity software (Word, PowerPoint, Visio) and business process analysis tools. Furthermore, end-to-end processes range back and forth between manual and automated activities as they progress through their lifecycle, such that often a single process instance ends up being managed by a variety of different tools. Nimbus provides what are essentially storyboards or guided walkthroughs for business processes: like procedures manuals, but more interactive. These “intelligent operations manuals” can include steps that will instruct the user to interact with a system of some sort – for example, an ERP system, or a BPMS such as AMX/BPM – but they document all of the steps including paper handling and other manual activities. Just as a BPMS can be an orchestration of multiple integrated systems, Nimbus Control can be an orchestration of human activities, including manual steps and interaction with systems. There are several potential integration points between Nimbus and different TIBCO products: metrics in the context of a process using Spotfire; exporting discovered processes from Nimbus to BusinessStudio; instantiating an AMX/BPM process from Nimbus; a worker accessing a Nimbus operations manual for instructions at a step in an AMX/BPM process; collaborative process discovery using tibbr; and tibbr collaboration as part of a manual process execution. Some or all of these may not happen exactly like this, but there is some interesting potential here.
There’s also potential within an organization for finding opportunities for AMX/BPM implementation through process discovery using Nimbus.

An interesting view of three different acquisitions, based on three very different rationales: industry vertical; horizontal application platform; and expansion of core product functionality. TIBCO is definitely moving from their pure technology focus to one that includes verticals and business applications.

TIBCO Product Strategy With Matt Quinn

Matt Quinn, CTO, gave us the product strategy presentation that will be seen in the general session tomorrow. He repeated the “capture many events, store few transactions” message as well as the five key components of a 21st century platform that we heard from Murray Rode in the previous session; this is obviously a big part of the new messaging. He drilled into their four broad areas of interest from a product technology standpoint: event platform innovation, big data and analytics, social networking, and cloud enablement.

In the event platform innovation, they released BusinessEvents 5.0 in April this year, including the embedded TIBCO Datagrid technology, temporal pattern matching, stream processing and rules integration, and some performance and big data optimizations. One result is that application developers are now using BusinessEvents to build applications from the ground up, which is a change in usage patterns. For the future, they’re looking at supporting other models, such as BPMN and rule models, integrating statistical models, improving queries, improving the web design environment, and providing ActiveMatrix deployment options.

In ActiveMatrix, they’ve released a fully integrated stack of BusinessWorks, BPM and ServiceGrid with broader .Net and C++ support, optimized for large deployments and with better high-availability support and hot deployment capabilities. AMX/BPM has a number of new enhancements, mostly around the platform (such as the aforementioned HA and hot deployment), with their upcoming 1.2 release providing some functional enhancements such as custom forms and business rules based on BusinessEvents. We’ll see some Nimbus functionality integration before too much longer, although we didn’t see that roadmap; as Quinn pointed out, they need to be cautious about positioning which tools are for business users versus technical users. When asked about case management, he said that “case management brings us into areas where we haven’t yet gone as a company and aren’t sure that we want to go”. Interesting comment, given the rather wild bandwagon-leaping that has been going on in the ACM market by BPM and ECM vendors.

The MDM suite has also seen some enhancements, with ActiveSpaces integration and collaborative analytics with Spotfire, allowing MDM to become a hub for reference data from the other products. I’m very excited to see that one-click integration between MDM and AMX/BPM is on the roadmap; I think that MDM integration is going to be a huge productivity boost for overall process modeling, and when I reviewed AMX/BPM last year, I liked their process data modeling but stated that “the link between MDM and process instance data needs to be firmly established so that you don’t end up with data definitions within your BPMS that don’t match up with the other data sources in your organization”. In fact, the design-time tool for MDM is now the same as that used for business object data models that I saw in AMX/BPM, which will make it easier for those who move across the data and process domains.

TIBCO is trying to build out vertical solutions in certain industries, particularly those where they have acquired or built expertise. This not only changes what they can package and offer as products, but also changes who at the customer they can have a relationship with: it’s now a VP of loyalty, for example, rather than (or in addition to) someone in IT.

Moving on to big data and analytics technology advances, they have released FTL 2.0 (low-latency messaging) to reduce inter-host latency below 2.2 microseconds as well as provide some user interface enhancements to make it easier to set up the message exchanges. They’re introducing TIBCO Web Messaging to integrate consumer mobile devices with TIBCO messaging. They’ve also introduced a new version of ActiveSpaces in-memory data grid, providing big data handling at in-memory speeds by easing the integration with other tools such as event processing and Spotfire.

They’ve also released Spotfire 4.0 visual analytics, with a big focus on ease of use and dashboarding, plus tibbr integration for social collaboration. In fact, tibbr is being used as a cornerstone for collaboration, with many of the TIBCO products integrating with tibbr for that purpose. In the future, tibbr will include collaborative calendars and events, contextual notifications, and other functionality, plus better usability and speed. Formvine has been integrated with tibbr for forms-based routing, and Nimbus Control integrates with tibbr for lightweight processes.

Quinn finished up discussing their Silver Fabric cloud platform to be announced tomorrow (today, if you count telling a group of tweet-happy industry analysts) for public, private and hybrid cloud deployments.

Obviously, there was a lot more information here than I could possibly capture (or that he could even cover; some of the slides just flew past), and I may have to get out of bed in time for his keynote tomorrow morning, since we didn’t even get to a lot of the forward-looking strategy. With a product suite as large as TIBCO’s, we need much more than an hour to get through an analyst briefing.

TIBCO Corporate Strategy Session with Murray Rode

I’m in Vegas this week at TUCON, TIBCO’s user conference, and this afternoon I’m at the analyst event. For the corporate strategy session, they put the industry analysts and financial analysts together, meaning that there were way too many dark suits in the room for my taste (and my wardrobe).

Murray Rode, COO, gave us a good overview presentation on the corporate strategy, touching on market factors, their suite of products, and their growth in terms of products, geographies and verticals. Definitely, event-driven processes are a driving force behind businesses these days – matching with the “responsive business” message I saw at the Progress conference last week – and TIBCO sees their product suite as being ideally positioned to serve those needs.

Rode defined the key components of a 21st century platform as:

  • Automation (SOA, messaging, BPM) as core infrastructure
  • Event processing
  • Social collaboration
  • Analytics
  • Cloud

Their vision is to be the 21st century middleware company, continuing to redefine the scope and purpose of middleware, and to provide their customers with the “2-second advantage” based on event processing, real-time analytics and process management. They see the middleware market as taking a bite out of the application development platforms and out of the box suites by providing higher-functioning, more agile capabilities, and plan to continue their pure-play leadership in middleware.

Looking at their performance in verticals, financial services is now only 25% of their business as they diversify into telecom, government, energy, retail and other market segments. This is an interesting point, since many middleware (including many BPM) vendors grew primarily in financial services, and have struggled to break out of that sector in a significant way.

From a product standpoint, their highest growth is happening in CEP, analytics and MDM, while core stable growth continues in BPM and SOA. They are starting to see new growth in cloud, tibbr, low-latency messaging and Nimbus to drive their future innovation.

They see their key competitors as IBM and Oracle, and realize that they’re the small fish in that pond; however, they see themselves as being more innovative and in touch with current trends, and having a better pure-play focus on infrastructure. Their strategy is to:

  • keep defining the platform through a culture of continuous innovation, so as not to become a one-hit wonder like many other now-defunct (or acquired) middleware vendors of the past
  • maximize sales execution strengths for growth by setting vertical go-to-market strategies across their product suite
  • organize for innovation, particularly through cross-selling the newer products into mature opportunities
  • cultivate their brand
  • manage for growth and continued profitability, in part by branching beyond their direct sales force – a significant strength for them in the past – to invest in partner and SI relationships that broaden their sales further

Rode spoke briefly about acquisitions (we’re slated for a longer session on this later today), and positioned Nimbus as having applicability to core infrastructure in terms of analytics and events, not just BPM. It will be interesting to see how that plays out. In general, their focus is on smaller acquisitions to complement and enhance their core offering, rather than big ones that would be much harder to align with their current offerings.

Progress Analyst Day: Industry and Product View

Rick Reidy, Progress CEO, opened their one-day analyst event in New York by talking about operational responsiveness: how your enterprise needs to be able to respond to events that happen that are outside your control. You can’t control crises, but you can control your organization’s response to those crises. Supporting operational responsiveness are four technology trends – cloud, mobile, social media and collaboration – with the key being to extend the use of technology to the business user. If you remember Progress’ roots in 4GL development, this isn’t exactly a new idea to them; 4GLs were supposed to be for business users, although it didn’t really work out that way. Today’s tools have a better chance at achieving that goal.

Today, they’re announcing the upcoming releases of Responsive Process Management Suite 2.0 and Control Tower 2.0, plus their cloud platform, Arcade. Interestingly, much like IBM’s Lombardi acquisition turned BPM into the biggest story at this year’s Impact conference, Progress’ Savvion acquisition is having the same effect here: RPM is the top-line story, albeit supported by event processing. It’s not that the entire product suite is the former Savvion BPMS, but rather that BPM is providing the face for the product.

Reidy turned the stage over to “the two Johns” (his words, not mine): Dr. John Bates, CTO, and John Goodson, Chief Product Officer. Bates dug further into the ideas of operational responsiveness, and how the combination of analytics, event sense and respond, and process management help to achieve that. As he put it, BPM alone won’t achieve the responsive business; businesses are event-driven. They’re really trying to define a new “responsive process management” (RPM) market, at the overlap between BPM, business event processing, business transaction management, and business intelligence and analytics. Cue Venn diagram, with RPM at the intersection, then fade to another Venn diagram between custom applications and packaged applications, again with RPM at the intersection. Their estimates put the current market at $2.5B, with a rise to $6.5B by 2014.

Bates talked about the value of BPM, and how that’s often not enough because businesses are very event-driven: events flow in and out of your business every day – from applications, devices and external feeds – and how you respond to them can define your competitive advantage. Patterns in the relationships between events can identify threats and opportunities, and are especially important when those events are traditionally held in separate silos that typically don’t interact. He gave a great example around the FAA fines for airlines who hold passengers captive on planes on the ground for more than 3 hours: by looking at the events related to crew, maintenance, weather and flight operations, it’s possible to anticipate and avoid those situations, and therefore the fines (and bad press) that go along with them. You don’t need to rework existing legacy systems in order to have this sort of operational responsiveness: automated agents trap the events generated by existing systems, which can then be analyzed and correlated, and processes kicked off in a BPMS to respond to specific patterns.
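The cross-silo correlation in that airline example can be sketched very simply; the feeds, field names and two-signal rule below are my own invention for illustration, not anything from Progress’ product:

```python
# Illustrative sketch of correlating events held in separate silos
# (crew, maintenance, weather) by flight, to anticipate a ground-delay
# risk before the FAA's 3-hour clock becomes a problem. All data,
# fields and the rule itself are invented.
crew = {"UA101": "short"}           # crew feed: staffing status by flight
maintenance = {"UA101": "pending"}  # maintenance feed: open work orders
weather = {"ORD": "storm"}          # weather feed: conditions by airport
flights = [{"id": "UA101", "origin": "ORD"},
           {"id": "UA202", "origin": "SFO"}]

def at_risk(flight):
    """A flight is at risk when multiple silos each report a problem
    for it at the same time - no single feed sees the whole pattern."""
    signals = [
        crew.get(flight["id"]) == "short",
        maintenance.get(flight["id"]) == "pending",
        weather.get(flight["origin"]) == "storm",
    ]
    return sum(signals) >= 2  # two independent problem signals

risky = [f["id"] for f in flights if at_risk(f)]
print(risky)  # ['UA101']
```

Each silo on its own shows a routine problem; only the join across silos reveals the pattern worth reacting to, which is exactly the argument Bates was making.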

Progress presents all this through their Control Tower, which brings together a business view of analytics and visualization, event sense and respond, and process management. I’m sure that I’ve written about this previously and would love to link to it, but the wifi in here is so crap that I can’t get a solid connection, so can’t look up what I’ve written previously on my own blog. #fail

Goodson took over to discuss the product in more detail, showing how Savvion, Actional, Apama and other components make up their RPM suite and the event management layer below that. Control Tower is a new product (or was new in 1.0, last year) that overlays all of this, putting a consistent business-facing interface on it and providing collaborative process design and other social media features. We saw a pre-recorded demo of Control Tower, showing how a dashboard can be created quickly by dragging analytics widgets onto the canvas. The key thing is that the widgets are pre-wired to specific analytics and processes, allowing drill-downs into the details and even into the process models. Process changes and simulations can be done interactively.

As the wifi flickers to life occasionally, it’s interesting to see the Twitter stream for the event (obviously being updated by people with their own connectivity solutions): Anne Thomas Manes says “Claims to be unique in the ‘responsiveness’ market, but the marketing story sounds very much like Tibco’s story”. Mike Gualtieri thinks that “Control Tower…looks good, but it would be cool to hold up an iPad and pass it into the audience”. Personally, I’m pretty much over the iPad gee-whiz demo phase.

We came back after the morning break to a continuation of the John and John show, with Bates initially responding to some of the tweets that had happened during the earlier session, then discussing their solution accelerators that include business processes, rules, analytics, alerts, interceptors and adapters. They have created accelerators for several verticals including capital markets front office, and communications and media order management, in part because of their history with those verticals and in part because of the suitability of RPM to those applications. Savvion had been going down this road long before the acquisition, and this makes a lot of sense for Progress in terms of competitive differentiation: a combination of industry-specific foundation services and adapters that are mostly productized (hence supported and maintained like product), and customizable solution accelerators that rest on top of that foundation. This makes them far more useful than the type of templates that are offered by other vendors, which are not upgradable after they’re customized; although not confirmed from the stage, my assumption is that the customization (and hence forking from the Progress-maintained code base) all happens in a thin layer at the top, not in the bulk of the foundation services.

They’re currently shipping six different accelerators, and are adding several more this year. These range across industry verticals including banking, capital markets, communications and media, insurance, supply chain, and travel and leisure. They’ve worked with partners and customers to develop these, as well as building up internal industry expertise. We saw a couple of canned demos, although it’s impossible to tell from these just how much effort is required to fit an accelerator to any particular company’s business. Parts of the demos showed a business user updating the event handling and creating a new rule; I don’t think that this will be done by business users, although a trained business analyst could certainly handle it.

The solution accelerators form a big part of their go-to-market strategy, and they see these as taking share away from packaged applications. They see the accelerators, along with their focus on responsive process management, as a differentiator for Progress. They haven’t forgotten their OpenEdge agile development environment, however: they’re announcing OpenEdge BPM to bring that development platform to BPM applications. I don’t know enough about OpenEdge to understand the implications here, but it will be interesting to see what synergies are possible as they bring together the entire Progress product suite.

To finish the industry and product section of the day, we heard about their cloud strategy, focused on applications in the cloud (rather than just infrastructure or a platform): creating vertical ecosystems for their various industry foci, incorporating both Progress solutions and partners to create composite solutions based on RPM, with Control Tower for end-to-end visibility and improvement. Progress Arcade is their cloud application platform for allowing these vertical ecosystems to be easily created and deployed across a variety of public and private cloud environments. It reminds me a bit of TIBCO’s Silver BPM environment, where you can do all the provisioning and setup right from their environment rather than having to hop around between different configuration tools. They stated that this is targeted at small and medium businesses who want to be able to leverage technology to be competitive: this is definitely an echo of a conversation that I had about BPM in the cloud with Neil Ward-Dutton earlier this morning, where I stated that most of the growth would be in SMB since this is the only way that most of them can afford to consider this technology.

Progress does a combo analyst day for both industry and financial analysts; this morning was more for the industry analysts while this afternoon is more for the financial analysts, although we’re all invited to be here all day. Since I don’t cover a lot of the financial side, I likely won’t be writing a lot about it, although I may be tweeting since the wifi seems to be a bit better behaved now.

SAP Run Better Tour: Business Analytics Overview

Dan Kearnan, senior director of marketing for business analytics, provided an overview of SAP’s business analytics in the short breakout sessions following the keynote. Their “run smarter” strategy is based on three pillars of knowing your business, deciding with confidence and acting boldly; his discussion of the “act boldly” part seemed to indicate that the round-tripping from data to events back to processes is more prevalent than I would have thought based on my previous observations.

We covered a lot of this material in the bloggers briefing a couple of weeks ago with Steve Lucas; he delved into the strategy for specific customers, that is, whether you’re starting with SAP ERP, SAP NetWeaver BW or non-SAP applications as input into your analytics.

He briefly addressed the events/process side of things – I think that they finally realized that when they bought Sybase, they picked up Aleri CEP with it – and their Event Insight solution is how they’re starting to deliver on this. They could do such a kick-ass demo using all of their own products here: data generated from SAP ERP, analyzed with BusinessObjects, events generated with Event Insight, and exception processes instantiated in NetWeaver BPM. NW BPM, however, seems to be completely absent from any of the discussions today.

He went through a number of the improvements in the new BI releases, including a common (and easier to use) user interface across all of the analytics products, and deep integration with the ERP and BW environments; there is a more detailed session this afternoon to drill into some of these.

I’m going to stick around to chat with a few people, but won’t be staying for the afternoon, so my coverage of the SAP Run Better Tour ends here. Watch the Twitter stream for information from others onsite today and at the RBT events in other cities in the days to come, although expect Twitter to crash spectacularly today at 1pm ET/10am PT when the iPad announcement starts.

SAP Analytics Update

A group of bloggers had an update today from Steve Lucas, GM of the SAP business analytics group, covering what happened in 2010 and some outlook and strategy for 2011.

No surprise, they saw an explosion in growth in 2010: analytics has been identified as a key competitive differentiator for a couple of years now due to the huge growth in the amount of information and events being generated by every business; every organization is at least looking at business analytics, if not actually implementing it. SAP has approached analytics across several categories: analytic applications, performance management, business intelligence, information management, data warehousing, and governance/risk/compliance. In other words, it’s not just about the pretty data visualizations, but about all the data gathering, integration, cleanup, validation and storage that needs to go along with it. They’ve also released an analytics appliance, HANA, for sub-second data analysis and visualization on a massive scale. Add it all up, and you’ve got the right data, instantly available.

SAP Analytics products

New features in the recent product releases include an event processing/management component that allows real-time event insight for high-volume transactional systems: this seems like a perfect match for monitoring events from, for example, an SAP ERP system. There has also been some deep integration into their ERP suite using the Business Intelligence Consumer Services (BICS) connector, although all of the new functionality in their analytics suite really pertains to Business Objects customers who are not SAP ERP customers; interestingly, he refers to customers who have an SAP analytics product but not their ERP suite as “non-SAP customers” – some things never change.
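To make the monitoring idea concrete, here’s a generic sketch of the kind of rule such a component might evaluate – this is illustrative Python, not SAP’s Event Insight API, and the function and parameter names are my own: watch a stream of transaction events and raise an alert when recent amounts for an account exceed a threshold.

```python
from collections import defaultdict, deque

def threshold_monitor(events, window=3, limit=10000):
    """Raise an alert whenever the sum of the last `window` transaction
    amounts for an account exceeds `limit`.  Each event is an
    (account, amount) pair; returns the list of alerts raised."""
    recent = defaultdict(lambda: deque(maxlen=window))  # per-account sliding window
    alerts = []
    for account, amount in events:
        recent[account].append(amount)
        total = sum(recent[account])
        if total > limit:
            alerts.append((account, total))
    return alerts
```

The point of a real CEP engine, of course, is doing this continuously over high-volume streams with declarative rules rather than hand-written loops, but the pattern – stateful windows over an event stream, with rules firing alerts – is the same.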

In a move that will be cheered by every SAP analytics user, they’ve finally standardized the user interface so that all of their analytics products share a common (or similar, it wasn’t clear) user experience – this is a bit of catch-up on their part, since they’ve brought together a number of different analytics acquisitions to form their analytics suites.

They’ve been addressing the mobile market as well as the desktop market, and are committing to all mainstream mobile platforms, including RIM’s Playbook. They’re developing their own apps, which will hurt partners such as Roambi who have made use of the open APIs to build apps that access SAP analytics data; there will be more information about the SAP apps in some product announcements coming up on the 23rd. Mobile information consumption is good, and possibly sufficient for some users, but I still think that most people need the ability to take action on the analytics, not just view them. That tied into a question about social BI; Lucas responded that there would be more announcements on the 23rd, but also pointed us towards their StreamWork product, which provides more of the sort of event streaming and collaboration environment that I wrote about earlier in Appian’s Tempo. In other words, maybe the main app on a mobile device will be StreamWork, so that actions and collaboration can be done, rather than the analytics apps directly. It will be interesting to see how well they integrate analytics with StreamWork so that a user doesn’t have to hop around from app to app in order to view and take action on information.

TIBCO Now Roadshow: Toronto Edition (Part 2)

We started after the break with Jeremy Westerman, head of BPM product marketing for TIBCO, presenting on AMX BPM. The crowd is a bit slow returning, which I suspect is due more to the availability of Wii Hockey down the hall than to the subject matter. Most telling, Westerman has the longest timeslot of the day, 45 minutes, which shows the importance that TIBCO is placing on marketing efforts for this new generation of their BPM platform. As I mentioned earlier, I’ve had 3+ hours of briefing on AMX BPM recently and think that they’ve done a good job of rearchitecting – not just refactoring – their BPM product to a modern architecture that puts them in a good competitive position, assuming that they can get the customer adoption. He started by talking about managing business processes as strategic assets, and the basics of what it means to move processes into a BPMS, then moved on to the TIBCO BPM products: Business Studio for modeling, the on-premise AMX BPM process execution environment, and the cloud-based Silver BPM process execution environment. This built well on their earlier messages about integration and SOA, since many business processes – especially for the finance-heavy audience here today – are dependent on integrating data and messaging with other enterprise systems. Business-friendly is definitely important for any BPM system, but the processes also have to be able to punch at enterprise weight.

His explanation of work management also covered optimizing people within the process: maximizing utilization while still meeting business commitments through intelligent routing, unified work lists and process/work management visibility. A BPM system allows a geographically distributed group of resources to be treated as a single pool for dynamic tunable work management, so that the actual organizational model can be used rather than an artificial model imposed by location or other factors. This led into a discussion of workflow patterns, such as separation of duties, which they are starting to build into AMX BPM as I noted in my recent review. He walked through other functionality such as UI creation, analytics and event processing; although I’ve seen most of this before, it was almost certainly new to everyone except the few people in the room who had attended TUCON back in May. The BPM booth was also the busiest one during the break, indicating a strong audience interest; I’m sure that most BPM vendors are seeing this same level of interest as organizations still recovering from the recession look to optimize their processes to cut costs and provide competitive advantage.
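Separation of duties, one of the workflow patterns mentioned above, is simple to illustrate: a user who performed one step in a case must not be assigned a later, conflicting step in the same case. A minimal sketch in Python – not TIBCO’s API; the class and method names here are hypothetical:

```python
class WorkItem:
    """A task within a case, tracking who performed it."""
    def __init__(self, case_id, step):
        self.case_id = case_id
        self.step = step
        self.performer = None

class SeparationOfDuties:
    """Rejects assigning a user to a step if that user already
    performed a conflicting earlier step in the same case."""
    def __init__(self, conflicting_steps):
        # e.g. {("request", "approve")}: a requester may not approve
        self.conflicts = conflicting_steps
        self.history = {}  # (case_id, step) -> performer

    def record(self, item, user):
        item.performer = user
        self.history[(item.case_id, item.step)] = user

    def can_assign(self, item, user):
        for earlier, later in self.conflicts:
            if item.step == later and \
               self.history.get((item.case_id, earlier)) == user:
                return False
        return True
```

In a real work-management engine this check would sit inside the routing logic, filtering the pool of eligible users before a work item lands in anyone’s work list.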

Ivan Casanova, director of cloud marketing for TIBCO, started with some pretty simple Cloud 101 stuff, then outlined their Silver line of cloud platforms: Silver CAP for developing cloud services, Silver Fabric for migrating existing applications, Silver BPM for process management, and Silver Spotfire for analytics. Some portion of the IT-heavy audience was probably thinking “not in my data centre, dude!”, but eventually every organization is going to have to think about what a cloud platform brings in terms of speed of deployment, scalability, cost and ability to collaborate outside the enterprise. Although he did talk about using Fabric for “private cloud” deployments that leverage cloud utility computing principles for on-premise systems, he didn’t mention the most likely baby step for organizations who are nervous about putting production data in the cloud, which is to use the cloud for development and testing, then deploy on premise. He finished with a valid point about how they have a lot of trust from their customers, and how they’ve built cloud services that suit their enterprise customers’ privacy needs; IBM uses much the same argument about why you want to use a large, established, trusted vendor for your cloud requirements rather than some young upstart.

We then heard from Greg Shevchik, a TIBCO MDM specialist, for a quick review of the discipline of master data management and TIBCO’s Collaborative Information Manager (CIM). CIM manages the master data repositories shared by multiple enterprise systems, and allows other systems – such as AMX BPM – to use data from that single source. It includes a central data repository; governance tools for validation and de-duplication; workflow for managing the data repository; synchronization of data between systems; and reporting on MDM.
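To make the de-duplication idea concrete, here’s a toy sketch of collapsing duplicate records into a single “golden record” – generic Python, nothing to do with CIM’s actual implementation, and the field names are invented for illustration: group records by a match key, then let the most recently updated record win for each field.

```python
from collections import defaultdict

def golden_records(records, key="email"):
    """Collapse duplicate records (dicts sharing the same match key)
    into one record per key, preferring the value from the most
    recently updated record for each non-empty field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)
    merged = {}
    for k, recs in groups.items():
        recs.sort(key=lambda r: r["updated"])  # oldest first
        golden = {}
        for rec in recs:  # later records overwrite earlier values
            for field, value in rec.items():
                if value is not None:
                    golden[field] = value
        merged[k] = golden
    return merged
```

Real MDM products layer much more on top of this – fuzzy matching rather than an exact key, survivorship rules per field, and governance workflow for the ambiguous cases – but the golden-record merge is the core idea.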

Last up for the Toronto TIBCO Now was Al Harrington (who was the mystery man who opened the day), giving us a quick view of the new generation of TIBCO’s CEP product, BusinessEvents. There’s a lot to see here, and I probably need to get a real briefing to do it justice; events are at the heart of so many business processes that CEP and BPM are becoming ever more intertwined.

My battery just hit 7% and we’re after 5pm, so I’ll wrap up here. The TIBCO Now roadshow provides a good overview of their updated technology portfolio and the benefits for customers; check for one coming your way.