Closing the loop with analytics: TIBCONOW 2016 day 2 keynote

Yesterday at TIBCO NOW 2016, we heard about the first half of TIBCO’s theme — interconnect everything — and today, Matt Quinn introduced the second half — augment intelligence — before turning the stage over to Mark Palmer, SVP engineering for streaming analytics.

Palmer talked about the role of analytics over history, and how today’s smart visual analytics allow you to be first to insight, then first to action. We then had a quick switch to Brad Hopper, VP strategy for analytics, for a demo of Spotfire visual analytics while wearing a long blond wig (attempting to make a point about the importance of beauty, I think). He built an analytics dashboard while he talked, showing how easy it is to create visual analytics and trigger smart actions. He went on to talk about data preparation and cleansing, which can often take as much as 50% of an analyst’s time, and demonstrated importing a CSV file and using quick visualizations to expose and correct potential problems in the underlying data. As always, the Spotfire demos are very impressive; I don’t follow Spotfire closely enough to know what’s new, but it all looks pretty slick.
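
The data preparation step Hopper demonstrated, profiling a freshly imported CSV to expose missing values, inconsistent labels and outliers before correcting them, can be sketched in a few lines. The data and column names below are invented for illustration, with plain Python standing in for Spotfire’s interactive tooling:

```python
import csv
import io

# Hypothetical sales extract with the kinds of problems a quick visual
# profile surfaces: a missing value, inconsistent category labels, and
# an implausible outlier.
raw = io.StringIO(
    "region,units,price\n"
    "East,100,4.50\n"
    "east,120,4.75\n"
    "West,,5.10\n"
    "West,95000,5.00\n"
)
rows = list(csv.DictReader(raw))

# Profile: count missing values and check the spread, the same checks a
# Spotfire histogram or bar chart makes visible at a glance.
missing_units = sum(1 for r in rows if r["units"] == "")
units = [int(r["units"]) for r in rows if r["units"] != ""]
spread = max(units) - min(units)  # a huge spread hints at an outlier

# Correct: normalize label case, then drop empty or implausible rows.
for r in rows:
    r["region"] = r["region"].title()
clean = [r for r in rows if r["units"] != "" and int(r["units"]) < 10_000]
```

The point of the demo was that these checks happen visually and interactively rather than in code, but the underlying profiling logic is the same.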

Michael O’Connell, TIBCO’s chief analytics officer, came up to demonstrate a set of analytics applications for a fictitious coffee company: sales figures and drilldowns, with what-if predictions for planning promotions; and supply chain management and smart routing of product deliveries.

Palmer came back to talk about TIBCO Jaspersoft, the other side of their analytics portfolio that provides business intelligence capabilities built into applications, but it was a pretty quick mention with no demo. A Jaspersoft demo would look pretty mundane after seeing all of the sexy Spotfire features, but it undoubtedly is a workhorse for analytics with many customers. He moved on to ways that TIBCO is helping customers to roll analytics out, from accelerators and sample source code to engagement in the community.

He continued on with streaming analytics (Palmer was the CEO of Streambase before it was acquired by TIBCO), and O’Connell came back to show an oil industry application that leverages sensor analytics to maximize equipment productivity by initiating preventative maintenance when the events emitted by the device indicate that failure may be imminent. He showed a more comprehensive interface that would be used in the head office for real-time monitoring and analysis, and a simpler tablet interface for field service personnel to receive information about wells requiring service. Palmer finished the analytics segment with a brief look at LiveView Web, a zero-code environment for building operational intelligence dashboards.

Quinn returned to talk about their B-tree-based Graph Database, which is in preview mode now with an open API, and other areas where they are looking to provide innovative solutions. He went through a history of how they’ve grown as a technology organization, and got quite verklempt when thanking his team for how awesome they’ve continued to be over the past 18 months since the acquisition, which was really touching.

After the break, Adam Steltzner, NASA’s lead engineer on the Mars Rover and author of The Right Kind of Crazy: A True Story of Teamwork, Leadership, and High-Stakes Innovation, talked about innovation, collaboration and decision-making under pressure. Check out the replay of the keynote for his talk, a fascinating story of the team that built and landed the Mars landing vehicles, along with some practical tips for leaders to foster exploration and innovation in teams.

Murray Rode returned to close out the keynote by announcing the winners of their Trailblazer customer awards:

  • Norfolk Southern (Pioneer) for implementing a real-time view of their railway operations
  • CargoSmart (Innovator) for incorporating real-time optimization of shipping logistics into their cargo management software
  • First Citizens Bank (Impact) for simplifying IT structure to allow for quick creation and delivery of new branch services
  • University of Chicago Medicine (Visionary) for optimizing operating room turnover to save costs and improve service
  • TUI Group (Transformer) for transforming their platforms through integration to enable new customer-facing tourism applications

That’s it for the morning keynote; I’m off to catch some of the breakout sessions for most of the rest of the day before we come back for the customer panel and closing keynote.

Software AG Analyst Day: The Enterprise Gets Digital

After the DST Advance conference in Phoenix two weeks ago, I headed north for a few days vacation at the Grand Canyon. Yes, there was snow, but it was lovely.

Back at work, I spent a day last week in Boston for the first-ever North American Software AG analyst event, attended by a collection of industry and financial analysts. It was a long-ish half day followed by lunch and opportunities for one-on-one meetings with executives: worth the short trip, especially considering that I managed to fly in and out between the snow storms that have been plaguing Boston this year. I didn’t live-blog this since there was a lot of material spread over the day, so I had a chance to see some of the other analysts’ coverage published after the event, such as this summary from Peter Krensky of Aberdeen Group.

The focus of the event was squarely on the digital enterprise, a trend that I’m seeing at many other vendors but not so many customers yet. Software AG’s CEO, Karl-Heinz Streibich, kicked off the day talking about how everywhere you turn, you hear about the digital enterprise: not just using digital technology, but having enough real-time data and devices integrated into our work and lives that they can be said to be truly digital. Streibich feels that companies with a basis in integration middleware – like Software AG with webMethods and other products – are in a good position to enable digital enterprises by integrating data, devices and systems of all types.

Although Software AG is not a household consumer name, its software is in 70% of the Fortune 1000, with a community of over 2M developers; it’s fair to say that you will likely interact with a company that uses Software AG products at least once per day: banks, airports and airlines, manufacturing, telecommunications, energy and more. Their revenues are split fairly evenly between Europe and the Americas, with a small amount in Asia Pacific. License revenues are 32% of the total, with maintenance and consulting splitting the remainder; this relatively low proportion of license revenue is an indicator of a mature software company, and not unexpected from a company more than 40 years old. I found a different representation of their revenues more interesting: they had 66% of their business in the “digital business” segment in 2014, expected to climb to 75% this year, which includes their portfolio minus the legacy ADABAS/NATURAL mainframe development tools. Impressive, considering that it was about a 50:50 split in 2010. Part of this increase is likely due to their several acquisitions over that period, but also because they are repositioning their portfolio as the Digital Business Platform, a necessary shift towards the systems of engagement where more of the customer spend is happening. Based on the marketecture diagram, this platform forms a cut-out layer between back office core operational systems and front office customer engagement systems. Middleware, by any other name; but according to Streibich, more business logic is moving to the middleware layer, although this is what middleware vendors have been telling us for decades.

There are definitely a lot of capable products in the portfolio that form this “development platform for digital business” – webMethods (integration and BPM), ARIS (BPA), Terracotta (in-memory big data), Longjump (application PaaS), Metaquark (mobility), Alfabet, Apama, JackBe and more – but the key will be to see how well they can make them all work together to be a true platform rather than just a collection of Software AG-branded tools.

We had an in-depth presentation on their Digital Business Platform from Wolfram Jost, Software AG’s CTO; you can read the long version on their site, so I’ll just hit the high points. He started with some industry quotes, such as “every company will become a software company”, and one analyst firm’s laughable brainstorm for 2014, “Big Change”, but moved on to define digital business as having the following characteristics:

  • Blurring the digital and physical world
  • More influence of customers (on business direction as well as external perceptions)
  • Combining people, business and physical things
  • Agility, speed, scale, responsiveness
  • “Supermaneuverable” business processes
  • Disrupting existing business models

The problem with this shift in business models is that conventional business applications don’t support the way that the new breed of business applications are designed, developed, used and operated. Current applications and development techniques are still valuable, but are being pushed behind the scenes as core operational systems and packaged applications.

Software AG’s Digital Business Platform, then, is based on the premise that few packaged applications are useful in the face of business transformation and the required agility. We need tools to create adaptive applications – built to change, not to last – especially in front office customer engagement applications, replacing or augmenting packaged CRM and other applications. This is not fundamentally different from the message about any agile/adaptive/mashup/model-driven application development environment over the past few years, including BPMS; it’s interesting to see how a large vendor such as Software AG positions their entire portfolio around that message. In fact, one of their slides refers to the adaptive application platform as iBPMS, since the definition of iBPMS has expanded to include everything related to model-driven application development.

The core capabilities of their platform include intelligent business operations (webMethods Operational Intelligence, Apama Streaming Analytics); agile processes (webMethods BPM and AgileApps); integration (webMethods Integration and API Management); in-memory data fabric (Terracotta); and business and IT transformation (ARIS BPA and GRC, Alfabet IT Portfolio Management and EA Management). In a detailed slide overlaying their products, they also added a transaction processing capability to allow the inclusion of ADABAS-NATURAL, as well as the cloud offerings that they’ve released over the past year.

Jost dug further into definitions of business application layers and architectural requirements. They provide the structure and linkages for event routing and event persistence frameworks, using relatively loose event-based coupling between their own products to allow them to be deployed selectively, but also (I imagine) to reduce the amount of refactoring of the products that would be required for tighter coupling. Their cloud IoT offering plays an interesting role by ingesting events from smart devices – developed via co-innovation with device companies such as Bosch and Siemens – for integration with on-premise business applications.

We then heard two shorter presentations, each followed by a panel. First was Eric Duffaut, the Chief Customer Officer, presenting their go-to-market strategy then moderating a panel with two partners, Audi Lucas of Wipro and Chris Brinton of Mosaic Data Science. Their GTM plan was fairly standard for a large enterprise software vendor, although they are improving effectiveness by having a single marketing team across all products as well as improving the sales productivity processes. Their partners are critical for scalability in this plan, and provide the necessary industry experience and solutions; both of the partner panelists talked about co-innovation with Software AG, rather than just providing resources trained on the products.

The second presentation and panel was led by John Bates, CMO and head of industry solutions; he was joined by a customer panel including Bryan Zigler of Boeing, Mark DuBrock of Standard & Poor’s, and Greg James of Outerwall. Bates discussed the role of industry solutions and solution accelerators, built by Software AG and/or partners, that provide a pre-built, customizable and adaptive application for fast deployment. They’re not using the Smart Process Application terminology that other vendors adopted from the Forrester trend from a couple of years ago, but it’s a very similar concept, and Bates announced the solution marketplace that they are launching to allow these to be easily discovered and purchased by customers.

My issue with solution accelerators and industry solutions in general is that many of these solutions are tied to a specific version of the underlying technology, and are templates rather than frameworks in that you change the solution itself during implementation: upgrades to the platform may not be easily performed, and upgrades to the actual solution likely require re-customizing each deployed instance. I didn’t get a chance to ask Bates how SAG helps partners and customers to create and deploy more upgradable solutions, e.g., recommended technology guardrails; this is a sticky problem that every technology vendor needs to deal with.

Bates also discussed the patterns of digital disruption that can be seen in the marketplace, and how these are manifesting in three specific areas that they can help to address with their Digital Business Platform:

  • Connected customers, providing opportunities for location-based marketing and offers, automated concierge service, customer location tracking, demographic marketing
  • Internet of Things/Machine-to-Machine (IoT/M2M), with real-time monitoring and diagnostics, and predictive maintenance
  • Proactive risk and compliance, including proactive financial trade surveillance for unusual/rogue behavior

After a wrapup by Streibich, we received copies of his latest book, The Digital Enterprise, plus Thingalytics by Bates; ironically, these were paper rather than digital copies.

Disclosure: Software AG paid my airfare and hotel to attend this event, plus gave me a nice lunch and two books, but did not otherwise compensate me for my time nor for anything that I have written here.

This week, I’m in Las Vegas for Kofax Transform, although just as an attendee this year rather than a speaker; expect to see a few notes from here over the two days of the conference.

Event Analytics in Oil and Gas at TIBCONOW

Michael O’Connell, TIBCO’s chief data scientist, and Hayden Schultz, a TIBCO architect, discussed and demonstrated an event-handling example using remote sensor data with Spotfire and Streambase. One oil company may have thousands of submersible pumps moving oil up from their wells, and these modern pumps include sensors and telemetry to allow them to be monitored and controlled remotely. One of their oil and gas customers said that through active monitoring and control such as this, they are avoiding downtime worth $1,000/day/well, adding up to an additional $100M in revenue each year. In addition to production monitoring, they can also use remote monitoring in drilling operations to detect conditions that might be a physical risk. They use standards for sensor data format, and a variety of data sources including SAP HANA.

For the production monitoring, the submersible pumps emit a lot of data about their current state: monitoring for changes to temperature, pressure and current shows patterns that can be correlated with specific pre-failure conditions. By developing models of these pre-failure patterns using Spotfire’s data discovery capabilities on historical failure data, data pushed into Streambase can be monitored for the patterns, then Spotfire used to trigger a notification and allow visualization and analytics by someone monitoring the pumps.

We saw a demonstration of how the pre-failure patterns are modeled in Spotfire, then how the rules are implemented in Streambase for real-time monitoring and response using visual modeling and some XML snippets generated by Spotfire. We saw the result in Streambase LiveView, which provides visualization of streaming data and highlights those data points that are exhibiting the pre-failure condition. The engineers monitoring the pumps can change some of the configuration of the failure conditions, allowing them to fine-tune to reduce false positives without missing actual failure events. Events can kick off notification emails, generate Spotfire root cause analysis reports, or invoke other applications such as instantiating a BPM process.
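
As a rough sketch of the kind of pre-failure rule described above: flag a pump when a rolling average of one sensor crosses a threshold while another trends the wrong way. The field names, window size and thresholds here are all invented for illustration, and the real Streambase rules are built visually rather than hand-coded:

```python
from collections import deque

WINDOW = 5          # number of recent readings to average (assumed)
TEMP_LIMIT = 90.0   # degrees C, an invented pre-failure threshold

def make_monitor(window=WINDOW, temp_limit=TEMP_LIMIT):
    """Return a closure that consumes one sensor event at a time."""
    temps = deque(maxlen=window)
    pressures = deque(maxlen=window)

    def on_event(event):
        # Accumulate the sliding window of recent readings.
        temps.append(event["temp"])
        pressures.append(event["pressure"])
        if len(temps) < window:
            return False  # not enough history yet
        # Pre-failure pattern: high average temperature while intake
        # pressure is falling over the window.
        avg_temp = sum(temps) / window
        pressure_falling = pressures[-1] < pressures[0]
        return avg_temp > temp_limit and pressure_falling

    return on_event

monitor = make_monitor()
stream = [
    {"temp": 80, "pressure": 200},
    {"temp": 85, "pressure": 198},
    {"temp": 92, "pressure": 195},
    {"temp": 96, "pressure": 190},
    {"temp": 99, "pressure": 185},
]
alerts = [i for i, ev in enumerate(stream) if monitor(ev)]
```

In the demo, a match like this would raise a LiveView alert, email a notification, or kick off a BPM process; tuning `WINDOW` and `TEMP_LIMIT` is the false-positive trade-off the engineers were shown adjusting.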

There are a number of similar industrial applications, such as in mining: wherever there are a large number of remote devices that require monitoring and control.

TIBCONOW 2014 Day 2 Keynote: Product Direction

Yesterday’s keynote was less about TIBCO products and customers, and more about discussions with industry thought leaders about disruptive innovation. This morning’s keynote continued that theme with a pre-recorded interview with Vivek Ranadive and Microsoft CEO Satya Nadella talking about cloud, mobile, big data and the transformational effects on individual and business productivity. Nadella took this as an opportunity to plug Microsoft products such as Office 365, Cortana and Azure; eventually he moved on to talk about the role of leadership in providing a meaningful environment for people to work and thrive. Through the use of Microsoft products, of course.

Thankfully, we then moved on to actual TIBCO products.

We had a live demo of TIBCO Engage, their real-time customer engagement marketing product, showing how a store can recognize a customer and create a context-sensitive offer that can be immediately consumed via their mobile app. From the marketer’s side, they can define and monitor engagement flows — almost like mini-campaigns, such as social sharing in exchange for points, or enrolling in their VIP program — that are defined by their target, trigger and response. The target audience can be filtered by past interests or demographics; triggers can be a combination of geolocation (via their app), social media interactions, shopping cart contents and time of day; and responses may be an award such as loyalty points or a discount coupon, a message or both, with a follow link customized to the customer. A date range can then be set for each engagement flow, and set to be live/scheduled to start, or in a draft or review mode. Analytics are gathered as the flows execute, and the effectiveness can be measured in real time.
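
As an illustration of the target/trigger/response structure described above, here is a hypothetical model of an engagement flow; the field names and lifecycle states are my own invention, not TIBCO Engage’s actual API:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EngagementFlow:
    """A mini-campaign: who to target, what fires it, what they get."""
    name: str
    target: dict           # audience filter, e.g. past interests, demographics
    triggers: list         # geolocation, social interaction, cart contents...
    response: dict         # award and/or message, with a customized follow link
    starts: date
    ends: date
    status: str = "draft"  # draft -> review -> scheduled -> live

    def is_active(self, today):
        # A flow only fires when live and within its date range.
        return self.status == "live" and self.starts <= today <= self.ends

flow = EngagementFlow(
    name="Social share for points",
    target={"segment": "loyalty-members"},
    triggers=["social:share", "geofence:store-12"],
    response={"award": "100 points", "message": "Thanks for sharing!"},
    starts=date(2014, 12, 1),
    ends=date(2014, 12, 31),
    status="live",
)
```

The analytics side of the demo would then aggregate, per flow, how often each trigger fired and how many responses were redeemed, measured in real time as the flows execute.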

Matt Quinn, TIBCO’s CTO, spoke about the challenges of fast data: volume, speed and complexity. We saw the three blocks of the TIBCO Fast Data platform — analytics, event processing, and integration — in a bit more detail, with him describing how these three layers work together. Their strategy for the past 12 months, and going forward, has three prongs: evolution of the Fast Data platform; improved ease of use; and delivery of the Fast Data platform including cloud and mobile support. The Fast Data platform appears to be a rebranding of their large portfolio of products as if it were a single integrated product; that’s a bit of marketing-speak, although they do appear to be doing a better job of providing integrations and use cases of how the different products within the platform can be combined.

In the first part of the strategy, evolution of the platform (that is, product enhancements and new releases), they continue to make improvements to their messaging infrastructure. Fast, secure message transactions are where they started, and they continue to do this really well, in software and on their FTL appliances. Their ActiveSpaces in-memory data grid has improved monitoring and management, as well as multi-site replication, and is now more easily consumed via Node.js and other lighter-weight development protocols. BusinessWorks 6, their integration IDE, now provides more integrated development tooling with greatly improved user interfaces to more easily create and deploy integration applications. They’ve provided plug-ins for SaaS integrations such as Salesforce, and made it easier to create your own plug-ins for integration sources that they don’t yet support directly. On the event processing side, they’ve brought together some related products to more easily combine stream processing, rules and live data marts for real-time aggregation and visualization. And to serve the internet of things (IoT), they are providing connectivity to devices and sensors.

User experience is a big challenge with any enterprise software company, especially one that grows through acquisition: in general, user interfaces end up as a hodge-podge of inconsistent interfaces. TIBCO is certainly making some headway at refactoring these into a more consistent and easier to use suite of interfaces. They’ve improved the tooling in the BusinessWorks IDE, but also in the administration and management of integrations during development, deployment and runtime. They’ve provided a graphical UI designer for master data management (MDM). Presented as part of the ease of use initiative, he discussed the case management functions added to AMX BPM, including manual and automatic ad hoc tasks, case folder and documents with CMIS/ECMS access, and support for elastic organization structures (branch model). BPM reporting has also been improved through the integration of Jaspersoft (acquired by TIBCO earlier this year) with out of the box and customizable reports, and Jaspersoft also has been enhanced to more easily embed analytics in any application. They still need to do some work on interoperability between Jaspersoft and Spotfire: having two analytics platforms is not good for the customers who can’t figure out when to use which, and how to move between them.

The third prong of the strategy, delivery of the platform, is being addressed by offering on-premise, cloud, Silver Fabric platform-as-a-service, TIBCO Cloud Bus for hybrid cloud/on premise configurations, consumable apps and more; it’s not clear that you can get everything on every delivery platform, and I suspect that customers will have challenges here as TIBCO continues to build out their capabilities. In the near future, they will launch Simplr for non-technical integration (similar to IFTTT), and Expresso for consuming APIs. They are also releasing TIBCO Clarity for cleansing cloud data, providing cleaner input for these situational consumable apps. For TIBCO Engage, which we saw demonstrated earlier, they will be adding next best engagement optimization and support for third-party mobile wallets, which should improve the hit rate on their customer engagement flows.

He discussed some of the trends that they are seeing impacting business, and which they have on the drawing board for TIBCO products: socialization and gamification of everything; cloud requirements becoming hybrid to combine public cloud, private cloud and on premise; the rise of micro-services from a wide variety of sources that can be combined into apps; and HTML5/web-based developer tooling rather than the heavier Eclipse environments. They are working on Project Athena, a triplestore database that includes context to allow for faster decisioning; this will start to show up in some of the future product development.

Good review of the last year of product development and what to expect in the next year.

The keynote finished with Raj Verma, EVP of sales, presenting “trailblazer” awards to their customers that are using TIBCO technologies as part of their transformative innovation: Softrek for their ClearView CRM that embeds Jaspersoft; General Mills for their internal use of Spotfire for product and brand management; jetBlue for their use of TIBCO integration and eventing for operations and customer-facing services; and Three (UK telecom) for their use of TIBCO integration and eventing for customer engagement.

Thankfully shorter than yesterday’s 3-hour marathon keynote, and lots of good product updates.

TIBCONOW 2014 Opening Keynote: @Gladwell and More. Much More.

San Francisco! Finally, a large vendor figured out that they really can do a 2,500-person conference here rather than Las Vegas, it just means that attendees are spread out in a number of local hotels rather than in one monster location. Feels like home.

It seems impossible that I haven’t blogged about TIBCO in so long: I know that I was at last year’s conference but was a speaker (as I am this year) so may have been distracted by that. Also, they somehow missed giving me a briefing about the upcoming ActiveMatrix BPM release, which was supposed to be relatively minor but ended up a bit bigger — I’ll be at the breakout session on that later today.

We started the first day with a marathon keynote, with TIBCO CEO Vivek Ranadive welcoming San Francisco’s mayor, Ed Lee, for a brief address about how technology is fueling San Francisco’s growth and employment, as well as helping the city government to run more effectively. The city actually has a chief data officer responsible for their open data initiatives.

Ranadive addressed the private equity buy-out of TIBCO head-on: 15 years ago, they took the company public, and by the end of this year, they will be a private company again. I think that this is a good thing, since it removes them from the pressures of quarterly public filings, which artificially impact product announcements and sales. It allows them to make any necessary organizational restructuring or divestiture without being punished on the stock market. Also, way better than being absorbed by one of the bigger tech companies, where the product lines would have to be realigned with incumbent technologies. He talked about key changes in the past years: the explosion of data; the rise of mobility; the emergence of social platforms; Asian economies; and how math is trumping science by making the “how” more important than the “why”. Wicked problems, but some wicked solutions, too. He claims that every industry will have an “Uberization”: controversies aside, companies such as Uber and Airbnb are letting service businesses flourish on a small scale using technology and social networks.

We then heard from Malcolm Gladwell — he featured Ranadive in one of his books — on technology-driven transformation, and the kinds of attitudes that make this possible. He told the story of Malcolm McLean, who created the first feasible intermodal containerized shipping in the 1950s because of his frustration with how long it took to unload his truck fleet at seaports, and how that innovation transformed the physical goods economy. In order to do this, McLean had to overcome the popular opinion that containerized shipping would fail (based on earlier failed attempts by others): as Gladwell put it, he had the three necessary characteristics of successful entrepreneurs: he was open/imaginative with creative ideas; he was conscientious and had the discipline to bring ideas to fruition including a transformation of the supply chain and sales model; and he was “disagreeable”, that is, had the resolve to pursue an idea in the face of his peers’ disapproval and ridicule. Every transformative innovation must be driven by someone with these three traits, who has the imagination to reframe the incumbent business to address unmet needs, and kill the sacred cows. Great talk.

Ranadive then invited Marc Andreessen on stage for a conversation (Andreessen thanked him for letting him “follow Malcolm freaking Gladwell on the stage”) about innovation, which Andreessen says is currently driven by mobile devices: businesses now must assume that every customer is connected 24×7 with a mobile device. This provides incredible opportunities — allowing customers to order products/services on the go — but also threats for businesses behind the curve, who will see customers comparing them to their competitors in real-time before making a purchasing decision. They discussed the future of work; Andreessen sees this as leveraging small teams, but that things need to change to make that successful, including incentives (a particular interest of mine, since I’ve been looking at incentives for collaboration amongst knowledge workers). Diversity is becoming a competitive advantage since it draws talent from a larger pool. He talked about the success rates of typical venture-funded companies, such as those that they fund: of 4,000 companies, about 15 will make it to being big companies, that is, with a revenue of $100M or more that would position them to go public; most of their profits as a VC come from those 15 companies. They fund good ideas that look like terrible ideas, because if everyone thought that these were great ideas, the big companies would already be doing them; the trick is filtering out all of the ideas that look terrible because they actually are. More important is the team: a bad team can ruin a good idea, but a great team with a bad idea can find their way to a good idea.

Next up was TIBCO’s CTO Matt Quinn talking with Box CEO Aaron Levie: Box has been innovating in the enterprise by taking the consumer cloud storage that we were accustomed to, and bringing it into the enterprise space. This not only enables internal innovation because of the drastically lower cost and simpler user experience than enterprise content solutions such as SharePoint, but also has the ability to transform the interface between businesses and their customers. Removing storage constraints is critical to supporting that explosion of data that Ranadive talked about earlier, enabling the internet of everything.

We saw a pre-recorded interview that Ranadive did with PepsiCo CEO Indra Nooyi: she discussed the requirement to perform while transforming, and the increase in transparency (and loss of privacy) as companies seek to engage with customers. She characterized a leader’s role as that of not just envisioning the future, but making that vision visible and attainable.

Mitch Barns, CEO of Nielsen (the company that measures and analyzes what people watch on TV), talked about how their business of measurement has changed as people went from watching broadcast TV at times determined by the broadcasters, to time-shifting with DVRs and consuming TV content on mobile devices on demand. They have had to shift their methods and business to accommodate this change in viewing models, and deal with a flood of data about how, when and where that consumption is occurring.

I have to confess, by this point, 2.5 hours into the keynote without a break, my attention span was not what it could have been. Or maybe these later speakers just didn’t inspire me as much as Gladwell and Andreessen.

Martin Taylor from Vista Equity Partners, the soon-to-be owners of TIBCO, spoke next about what they do and their vision for TIBCO. Taylor was at Microsoft for 14 years before joining Vista, and helps to support their focus on applying their best practices and operating platform to technology companies that they acquire. Since their start in 2000, they have spent over $14B on 140 transactions in enterprise software. He showed some of their companies; since most of these are vertical industry solutions, TIBCO is the only name on that slide that I recognized. They attempt to foster collaboration between their portfolio companies: not just sharing best practices, but doing business together where possible; I assume that this could be very good for TIBCO as a horizontal platform provider that could be leveraged by their sibling companies. The technology best practices that they apply to their companies include improved product management roadmaps that address the needs of their customers, and improved R&D practices to speed product release cycles and improve quality. They’re still working through the paperwork and regulatory issues, but are starting to work with the TIBCO internal teams to ensure a smooth transition. It doesn’t sound as if there will be any big technology leadership changes, but a continued drive into new technologies including cloud, IoT, big data and more.

Murray Rode, TIBCO’s COO, finished up the keynote talking about their Fast Data positioning: organizations are collecting a massive volume of data, but that data has a definite shelf life and degrades in value over time. In order to take advantage of short-lived opportunities where timing is everything, you have to be able to analyze and take actions on that data quickly. As he put it, big data lets you understand what’s already happened, but fast data lets you influence what’s about to happen. To do this, you need to combine analytics to define situations of interest and decisions; event processing to understand and act on real-time information; and integration (including BPM) to unify your transactional and big data sources. Rode outlined the four themes of their positioning: expanded reach, ease of consumption, compelling user journey, and faster time to value; I expect that we will see more along these themes throughout the conference.
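Rode's fast-data idea — detect a situation of interest in a stream while the data is still fresh, then act — can be sketched in a few lines. This is a hypothetical illustration, not any TIBCO API: the window length, the "shelf life" expiry, and the threshold rule are all invented for the example.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    ts: float     # seconds since epoch
    value: float

class SlidingWindowDetector:
    """Keep only events still within their 'shelf life', and flag a
    situation of interest (here: window average over a threshold)."""

    def __init__(self, shelf_life_s: float, threshold: float):
        self.shelf_life_s = shelf_life_s
        self.threshold = threshold
        self.window = deque()

    def on_event(self, event: Event) -> bool:
        self.window.append(event)
        # Expire events whose value has degraded past the shelf life
        while self.window and event.ts - self.window[0].ts > self.shelf_life_s:
            self.window.popleft()
        avg = sum(e.value for e in self.window) / len(self.window)
        return avg > self.threshold  # True => act now, before the moment passes

detector = SlidingWindowDetector(shelf_life_s=60.0, threshold=100.0)
fired = [detector.on_event(Event(ts=t, value=v))
         for t, v in [(0, 50), (10, 90), (20, 180)]]
```

The point of the sketch is the contrast Rode drew: a batch query over data at rest would report the average after the fact, while the streaming detector fires on the event that pushes the window over the threshold.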

All in all, a great keynote, even though it stretched to an ass-numbing three hours.

Disclosure: TIBCO is paying my expenses to be at TIBCO NOW and a speaking fee for me to be on a panel tomorrow. What I write here is my own opinion, and I am not compensated in any way for blogging.

Conference Within A Conference, Part Two: Big Fast Data World

John Bates, who I know from his days at Progress Software, actually holds the title SVP of Big Fast Data at Software AG. Love it. He led off the Big Fast Data World sub-conference at Innovation World to talk about real-time decisioning based on events, whether that is financial data such as trades, or device events from an oil rig. This isn’t just simple “if this event occurs, then trigger this action” sort of decisions, but real-time adaptive intelligence that might include social media, internal business systems, market information and more. It’s where events, data, analytics and process all come together.

The goal is to use all of the data and events possible to appear to be reading your customer’s mind and offering them the most likely thing that they want right then (without being too creepy about it), using historical patterns, current context and location information. For example, a customer is in the process of buying something, and their credit card company or retail partner uses that opportunity to upsell them on a related product or a payment plan, directly to their mobile phone and before they have finished making their purchase. Or, a customer is entering a mall, they are subscribed to a sports information service, there are available tables at a sports bar in the mall, so they are pushed a coupon to have lunch at the sports bar right then. Even recommendation engines, such as we see every time that we visit Amazon or Netflix, are examples of this. Completely context sensitive, and completely personalized.
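The sports-bar example above is really just a context-driven decision: combine subscription data, location and real-time availability into a single next-best-offer choice. A minimal sketch, with entirely invented rules and field names:

```python
def pick_offer(customer: dict, context: dict):
    """Combine historical data (subscriptions), current context (location,
    checkout state) and real-time availability into one offer decision.
    Illustrative rules only -- not any vendor's recommendation engine."""
    if (context.get("location") == "mall"
            and "sports" in customer.get("subscriptions", [])
            and context.get("sports_bar_tables_free", 0) > 0):
        return "sports-bar lunch coupon"
    if context.get("in_checkout") and customer.get("basket_total", 0) > 500:
        return "payment-plan upsell"
    return None  # no offer beats a creepy offer

offer = pick_offer(
    {"subscriptions": ["sports"], "basket_total": 0},
    {"location": "mall", "sports_bar_tables_free": 3},
)
```

The real systems Bates described layer predictive models and event correlation on top of rules like these, but the shape of the decision — context in, personalized action out — is the same.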

On the flip side, companies have to continuously monitor social media channels for proactive customer care: real-time event and data analysis for responding to unhappy customers before situations blow up. People like Dave Carroll and Heather Armstrong (and sometimes even me, on a much smaller scale) can strike fear in the hearts of customer service organizations that are unable to respond appropriately and quickly, but can deliver big wins for the companies that move fast to fix things for their customers.

What do you need to do to make this happen? Not much, just low-latency universal messaging, in-memory unstructured data, real-time predictive analytics, intelligent actions via real-time integration to operational systems, and real-time visual analytics. If you’re a Software AG customer, they’re bringing together Terracotta, Apama and JackBe into a unified platform for this sort of adaptive intelligence, producing intelligent actions from big data, in real time, to/from anywhere.

Software AG Big Fast Data

We then got a bit of a lesson on big data from Nathaniel Rowe, a research analyst at Aberdeen Group: how big is big, what’s the nature of that data, and some of the problems with it. The upshot: the fact that there’s a lot of data is important, but it’s the unstructured nature of it that presents many of the difficult analytical problems. It’s about volume, but also variety and velocity: the data could be coming from anywhere, and you don’t have control over a lot of it, such as social media data or data from business partners. You have to have a clear picture of what you want out of big data, such as better customer insights or operational visibility; Rowe had a number of use cases from e-commerce to healthcare to counterterrorism. The ability to effectively use unstructured data is key: those companies that are best in class are doing this better than average, and it translates directly to measures such as sales, customer satisfaction and net promoter score. He finished up with some of the tools required – automatic data capture, data compression, and data cleansing – and how those translate directly to employees’ ability to find data, particularly from multiple sources at once. Real-time analytics and in-memory analytics are the two high-speed technologies that result in the largest measurable benefits when working with big data, making the difference between seconds (or even sub-seconds) and minutes or hours to see a result or take an action. He closed with the correlation between investing in big data and various customer experience measures (15-18% increases) as well as revenue measures (12-17% increases). Great presentation, although I’m pretty sure that I missed 75% of it since he is a serious speed-talker and zipped through slides at the speed of light.

And we’re done for the day: back tomorrow for another full day of Innovation World. I’m off to the drinks reception then a customer party event; as always, everything is off the record as soon as the bar opens. Smile

Kicking Off @SoftwareAG @InnovationWorld

For the first time in a few years, I’m at Software AG’s Innovation World conference in San Francisco (I think that the last time I was here, it was still the webMethods Integration World), and the focus is on the Digital Enterprise. At the press panel that I attended just prior to this evening’s opening keynote, one journalist made the point that “digital enterprise” is kind of a dumb term (I paraphrase here) because everything is digital now: we need a more specific term for what Software AG is getting at with this. Clay Richardson of Forrester, who I dragged along to the press session, said that his colleagues are talking about the post-digital age, which I take to be based on the assumption that all business is digital so that term is a bit meaningless, although “post-digital” isn’t exactly descriptive either.

Terminology aside, Software AG’s heart is in the right place: CEO Karl-Heinz Streibich took the stage at the opening keynote to talk about how enterprises need to leverage this digital footprint by integrating systems in ways that enable transformation through alignment and agility. You can still detect the schisms in the Software AG product portfolio, however: many of the customer case studies were single-product (e.g., ARIS or webMethods), although we did hear about the growing synergy between Apama (CEP and analytics) and webMethods for operational visibility, as well as Apama and Terracotta (in-memory big data number crunching). As with many of the other large vendors that grow through acquisitions, fully integrating the acquired products into a unified portfolio is still a work in progress.

We heard briefly from Ivo Totev, Software AG’s CMO; saw presentations of two of their customer innovation awards; then had a lengthier talk on the power of mobile and social from Erik Qualman, author of Socialnomics and Digital Leader. Unlike the usual pop culture keynote speaker, Qualman’s stuff is right on for this audience: looking at how successful companies are leveraging online social relationships, data and influence to further their success through engagement: listening, interacting and reacting (and then selling). He points out that trying to sell first before engaging doesn’t work online because it doesn’t work offline; the methods of engagement are different online and offline, but the principles from a sales lead standpoint are the same. You can’t start the conversation by saying “hey, I’m great, buy this thing that I’m selling” (something that a lot of people/companies just starting with Twitter and/or blogging haven’t learned yet).

Qualman took Dave Carroll’s popular “United Breaks Guitars” example from a couple of years ago, and talked about not just how United changed their policies on damage as a result of this, but the other people who leveraged the situation into increased sales: Taylor Guitars; a company that created a “Dave Carroll” travelling guitar case; and Carroll himself through sales of the song and his subsequent book on the power of one voice in the age of social media. He looked at companies that have transformed their customer experience through mobile (e.g., Starbucks mobile app, which has personally changed my café loyalty) by giving the customer a way to do what they want to do – which hopefully involves buying your product – in the easiest possible way; and how a fast and slightly cheeky social media presence can give you an incredible boost for very little investment (e.g., Oreo’s “dunk in the dark” tweet when the lights went out during the Superbowl). I gave a presentation last year on creating your own process revolution that talked about some of these issues and the new business models that are emerging because of it.

Great to see John Bates here, who I know from his tenure at Progress Software and came on at Software AG with the Apama acquisition, as well as finally meet Theo Priestley face to face after years of tweeting at each other.

Disclosure: Software AG is a customer (I’m in the middle of creating some white papers and webinars for them), and they paid my travel expenses to be at this conference. However, what I write here is my own opinion and I have not been financially compensated for it.

Can BPM Save Lives? Siemens Thinks So

My last session at Gartner BPM 2013 is a discussion between Ian Gotts of TIBCO and their customer Tommy Richardson, CTO of Siemens Medical Solutions. I spoke with Siemens last year at Gartner and TUCON and was very interested in their transition from the old iProcess BPM platform (which originally came from TIBCO’s Staffware acquisition) to the newly-engineered AMX platform, which includes BPM and several other stack components such as CEP. Siemens isn’t an end-user, however: they OEM the TIBCO products into their own Soarian software, which is then sold to medical organizations for what Richardson refers to as “ERP for hospitals”. If you go to a hospital that uses their software, a case (process instance) is created for you at check-in, and is maintained for the length of your stay, tracking all of the activity that happens while you’re there.

With about 150 customers around the world, Siemens offers both hosted and on-premise versions of their software. Standard processes are built into the platform, and the hospitals can use the process modeler to create or modify the models to match their own business processes. These processes can then guide the healthcare professionals as they administer treatment (without forcing them to follow a flow), and capture the actions that did occur so that analytics can determine how to refine the processes to better support patient diagnosis and treatment. This is especially important for complex treatment regimes such as when an unusual infectious disease is diagnosed, which requires both treatment and isolation actions that may not be completely familiar to the hospital staff. Data is fed to and from other hospital systems as part of the processes, so the processes are not executing in isolation from all of the other information about the patient and their care.

For Siemens, BPM is a silver bullet for software development: they can make changes quickly since little is hard-coded, allowing treatment processes to be modified as research and clinical results indicate new treatment methods. In fact, the people who maintain the flows (both at Siemens and their customers) are not developers: they have clinical backgrounds so that they are actually subject matter experts, although they are trained on the tools and work in a process analyst role rather than a medical practitioner role. If more technical integration is required, then developers do get involved, but not for process model changes.

The Siemens product does a significant amount of integration between the executing processes and other systems, such as waiting for and responding to test results, and monitoring when medications are administered or the patient is moved to another location in the hospital. This is where the move to AMX is helping them, since there’s a more direct link to data modeling, organizational models, analytics, event handling from other systems via the ESB, and other functionality in the TIBCO stack, replacing some amount of custom software that they had developed as part of the previous generations of the system. As I’ve mentioned previously, there is no true upgrade from iProcess to AMX/BPM since it’s a completely new platform, so Siemens actually did a vendor evaluation to see if this was an opportunity to switch which product OEMed into their product, and decided to stay with TIBCO. When they roll out the AMX-based version in the months ahead, they will keep the existing iProcess-based system in place for each existing client for a year, with new patient cases being entered on the new system while allowing the existing cases to be worked in place on the old system. Since a case completes when a patient is discharged, there will be very few cases remaining on the iProcess system after a year, which can then be transferred manually to the new system. This migration strategy is far beyond what most companies do when switching BPM platforms, but necessary for Siemens because of the potentially life-threatening (or life-saving) nature of their customers’ processes. This also highlights how the BPMS is used for managing the processes, but not as a final repository for the persistent patient case information: once a case/process instance completes on patient check-out, the necessary information has been pushed to other systems that maintain the permanent record.

Modernizing the healthcare information systems such as what Siemens is doing also opens up the potential for better sharing of medical information (subject to privacy regulations, of course): the existence of an ESB as a basic component means that trusted systems can exchange information, regardless of whether they’re in the same or different organizations. With their hosted software, there’s also the potential to use the Siemens platform as a way for organizations to collaborate; although this isn’t happening now (as far as I can tell), it may be only a matter of time before Siemens is hosting end-to-end healthcare processes with participants from hospitals, speciality clinics and even independent healthcare professionals in a single case to provide the best possible care for a patient.

TIBCO Corporate and Technology Analyst Briefing at TUCON2012

Murray Rode, COO of TIBCO, started the analyst briefings with an overview of technology trends (as we heard this morning, mobile, cloud, social, events) and business trends (loyalty and cross-selling, cost reduction and efficiency gains, risk management and compliance, metrics and analytics) to create the four themes that they’re discussing at this conference: digital customer experience, big data, social collaboration, and consumerization of IT. TIBCO provides a platform of integrated products and functionality in five main areas:

  • Automation, including messaging, SOA, BPM, MDM, and other middleware
  • Event processing, including events/CEP, rules, in-memory data grid and log management
  • Analytics, including visual analysis, data discovery, and statistics
  • Cloud, including private/hybrid model, cloud platform apps, and deployment options
  • Social, including enterprise social media, and collaboration

A bit disappointing to see BPM relegated to being just a piece of the automation middleware, but important to remember that TIBCO is an integration technology company at heart, and that’s ultimately what BPM is to them.

Taking a look at their corporate performance, they have almost $1B in revenue for FY2011, showing growth of 44% over the past two years, with 4,000 customers and 3,500 employees. They continue to invest 14% of revenue into R&D with a 20% increase in headcount, and significant increases in investment in sales and marketing, which is pushing this growth. Their top verticals are financial services and telecom; and while they still do 50% of their business in the Americas, EMEA is at 40%, with APJ making up the other 10% and showing the largest growth. They have a broad core sales force, but have dedicated sales forces for a few specialized products, including Spotfire, tibbr and Nimbus, as well as for vertical industries.

They continue to extend their technology platform through acquisitions and organic growth across all five areas of the platform functionality. They see the automation components as being “large and stable”, meaning we can’t expect to see a lot of new investment here, while the other four areas are all “increasing”. Not too surprising considering that AMX BPM was a fairly recent and major overhaul of their BPM platform and (hopefully) won’t need major rework for a while, and the other areas all include components that would integrate as part of a BPM deployment.

Matt Quinn then reviewed the technology strategy: extending the number of components in the platform as well as deepening the functionality. We heard about some of this earlier, such as the new messaging appliances and Spotfire 5 release, some recent releases of existing platforms such as ActiveSpaces, ActiveMatrix and Business Events, plus some cloud, mobile and social enhancements that will be announced tomorrow so I can’t tell you about them yet.

We also heard a bit more on the rules modeling that I saw before the sessions this morning: it’s their new BPMN modeling for rules. This uses BPMN 1.2 notation to chain together decision tables and other rule components into decision services, which can then be called directly as tasks within a BPMN process model, or exposed as web services (SOAP only for now, but since ActiveMatrix is now supporting REST/JSON, I’m hopeful for this). Sounds a bit weird, but it actually makes sense when you think about how rules are formed into composite decision services.
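The idea of chaining decision tables into a decision service is easier to see in code than to describe. A minimal sketch — not TIBCO's implementation, and all table contents and names are invented — where one table's output feeds the next, and the chain is exposed as a single callable service:

```python
def credit_band(score: int) -> str:
    """First decision table: map a credit score to a band."""
    for floor, band in [(720, "A"), (650, "B"), (0, "C")]:
        if score >= floor:
            return band

def loan_decision(band: str, amount: float) -> str:
    """Second decision table: band + amount rule -> outcome."""
    table = {("A", True): "approve", ("A", False): "approve",
             ("B", True): "approve", ("B", False): "refer",
             ("C", True): "refer",   ("C", False): "decline"}
    return table[(band, amount <= 10_000)]

def decision_service(score: int, amount: float) -> str:
    """The chained tables, exposed as one decision service --
    callable from a process task or wrapped as a web service."""
    return loan_decision(credit_band(score), amount)
```

Modeling this chain in BPMN notation, as TIBCO is doing, makes the same composition visible to analysts rather than burying it in code.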

There was a lot more information about a lot more products, and then my head exploded.

Like others in the audience, I started getting product fatigue, and just picking out details of products that are relevant to me. This really drove home that the TIBCO product portfolio is big and complex, and this might benefit from having a few separate analyst sessions with some sort of product grouping, although there is so much overlap and integration in product areas that I’m not sure how they would sensibly split it up. Even for my area of coverage, there was just too much information to capture, much less absorb.

We finished up with a panel of the top-level TIBCO execs, the first question of which was about how the sales force can even start to comprehend the entire breadth of the product portfolio in order to be successful selling it. This isn’t a problem unique to TIBCO: any broad-based platform vendor, such as IBM or Oracle, has the same issue. TIBCO’s answer: specialized sales force overlays for specific products and industry verticals, and selling solutions rather than individual products. Both of those work to a certain extent, but often solutions end up being no more than glorified templates developed as sales tools rather than actual solutions, and can lead to more rather than less legacy code.

Because of the broad portfolio, there’s also confusion in the customer base, many of whom see one TIBCO product and have no idea of everything else that TIBCO does. Since TIBCO is not quite the household name like IBM or Oracle, companies don’t necessarily know that TIBCO has other things to offer. One of my banking clients, on hearing that I am at the TIBCO conference this week, emailed “Heard of them as a player in the Cloud Computing space.  What’s different or unique about them vs others?” Yes, they play in the cloud. But that’s hardly what you would expect a bank (that uses very little cloud infrastructure, and likely does have some TIBCO products installed somewhere) to think of first when you mention TIBCO.

TIBCO TUCON2012 Day 1 Keynotes, Part 2: Big Honking Data

Back from the mid-morning break, CMO Raj Verma shifted gears from customer experience management to look at one of the other factors introduced in the first part of the session: big data.

Matt Quinn was back to talk about big data: in some ways, this isn’t new, since there has been a lot of data within enterprises for many years. What’s changed is that we now have the tools to deal with it, both in place and in motion, to find the patterns hiding within it through cleansing and transformation. He made a sports analogy, saying that a game is not just about the final score, but about all of the events that happen to make up the entire game; similarly, it is not sufficient any more to just measure outcomes in business transactions: you have to monitor patterns in the event streams and combine that with historical data to make the best possible decisions about what is happening right now. He referred to this combination of event processing and analytics as closing the loop between data in motion and data at rest. TIBCO provides a number of products that combine to handle big data: not just CEP, but ActiveSpaces (the in-memory data grid) to enable realtime processing, Spotfire for visual analytics, and integration with Hadoop.

We saw a demo of LogLogic, recently acquired by TIBCO, which provides analytics and event detection on server logs. This might sound like a bit of a boring topic, but I’m totally on board with this: too many companies just turn off logging on their servers because it generates too many events that they just can’t do anything with, and it impacts performance since logging is done on the operational server. LogLogic’s appliance can collect enormous amounts of log data, detect unusual events based on various rules, and integrate with Spotfire for visualization of potential security threats.
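The kind of rule-based detection described here is conceptually simple, even if doing it at appliance scale is not. A toy illustration — not LogLogic's actual rules or API — that flags source IPs with repeated failed logins in raw server logs:

```python
import re
from collections import Counter

# Pattern for a typical sshd failed-login line (illustrative)
FAILED_LOGIN = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")

def suspicious_ips(log_lines, threshold=3):
    """Rule-based detection over raw logs: flag any source IP with
    more failed logins than the threshold. A real appliance would do
    this continuously, at volume, across many log formats."""
    counts = Counter()
    for line in log_lines:
        m = FAILED_LOGIN.search(line)
        if m:
            counts[m.group(1)] += 1
    return {ip for ip, n in counts.items() if n > threshold}

logs = [
    "sshd[101]: Failed password for root from 10.0.0.5 port 22",
    "sshd[102]: Failed password for root from 10.0.0.5 port 22",
    "sshd[103]: Failed password for admin from 10.0.0.5 port 22",
    "sshd[104]: Failed password for root from 10.0.0.5 port 22",
    "sshd[105]: Failed password for root from 192.168.1.9 port 22",
]
flagged = suspicious_ips(logs)
```

The output of a rule like this is exactly the kind of event stream that feeds a visualization layer such as Spotfire for investigating potential threats.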

Mark Lorion, CMO for TIBCO Spotfire, came up to announce Spotfire 5, with a complete overhaul to the analytics engine, and including the industry’s first enterprise runtime for the R statistical language, providing 10 times the performance of the open source R project for predictive analytics. Self-service predictive analytics, ftw. They are also going beyond in-memory, integrating with Teradata, Oracle and Microsoft SQL Server for in-database analysis. With Teradata horsepower behind it – today’s announcement of Spotfire being optimized for in-database computation on Teradata – you can now do near-realtime exploration and visualization of some shocking amounts of data. Brad Hopper gave us a great Spotfire demo, not something that most TUCON attendees are used to seeing on the main stage.

Rob Friel, CEO of PerkinElmer, took the stage to talk about how they are using big data and analytics in their scientific innovations in life sciences: screening patient data, environmental samples, human genomes, and drug trials to detect patterns that can improve quality of life in some way. They screened 31 million babies born last year (one in four around the globe) through the standard heel-prick blood test, and detected 18,000 with otherwise undiagnosed disorders that could be cured or treated. Their instrumentation is key in acquiring all the data, but once it’s there, tools such as Spotfire empower their scientists to discover and act on what they find in the data. Just as MGM Grand is delivering unique experiences to each customer, PerkinElmer is trying to enable personalized health monitoring and care for each patient.

To wrap up the big data section, Denny Page, TIBCO’s VP of Engineering, came on stage with his new hardware babies: an FTL message switch and an EMS appliance, both to be available by the end of November 2012.

For the final part of the day 1 keynotes, we heard from an innovators’ panel of Scott McNealy (founder of Sun Microsystems, now chairman of Wayin), Tom Siebel (founder of Siebel Systems, now at C3 Energy where they are using TIBCO for energy usage analytics), Vivek Ranadivé, and KR Sridhar (CEO of Bloom Energy), chaired by David Kirkpatrick. Interesting and wide-ranging discussion about big data, analytics, sentiment analysis, enterprise social media, making data actionable, the internet of things and how a low barrier to platform exit drives innovation. The panel thinks that the best things in tech are yet to come, and I’m in agreement, although those who are paranoid about the impact of big data on their privacy should be very, very afraid.

I’ll be blogging from the analyst event for the rest of the day: we have corporate and technology briefings from the TIBCO execs plus some 1:1 sessions. No pool time for me today!