Monthly Archives: May 2013

SAPPHIRENOW Vishal Sikka Keynote – HANA For Speed, Fiori For Usability

Vishal Sikka, who leads technology and innovation at SAP, followed Hasso Plattner onto the keynote stage; I decided to split the post and publish just Plattner’s portion first, since my commentary was getting a bit long.

Sikka also started his part of the keynote with HANA, and highlighted some customer case studies from their “10,000 Club”, where operations are more than 10,000 times faster when moved to HANA, plus one customer with an operation that runs 1 million times faster on HANA. He talked about how the imperatives for innovation are equal parts math and design: it has to be fast, but it also has to solve business problems. HANA provides the speed and some amount of the problem-solving, but really good user experience design has to be part of the equation. To that end, SAP is launching Fiori, a collection of 25 easy-to-use applications for the most common SAP ERP and data warehouse functions, supported on phone, tablet and desktop platforms with a single code base. Although this doesn’t replace the thousands of existing screens, it can likely replace the old screens for many user personas. As part of the development of Fiori, they partnered with Google and optimized the applications for Chrome, which is a pretty bold move. They’ve also introduced a lot of new forms of data visualization, replacing mundane list-style reports with more fluid forms that are more common on specialized data visualization platforms such as Spotfire.

Fiori doesn’t depend on HANA (although you can imagine the potential for HANA analytics with Fiori visualization), but can be purchased directly from the HANA Marketplace. You can find out more about SAP’s UX development, including Fiori, on their user experience community site.

Returning to HANA, and to highlight that HANA is also a platform for non-SAP applications, Sikka showed some of the third-party analytics applications developed by other companies on the HANA platform, including eBay and Adobe. There are over 300 companies developing applications on HANA, many addressing specific vertical industries.

That’s it for me from SAPPHIRE NOW 2013 — there’s a press Q&A with Plattner and Sikka coming up, but I need to head for the airport so I will catch it online. As a reminder, you can see all of the recorded video (as well as some remaining live streams today) from the conference here.

SAPPHIRENOW Hasso Plattner Keynote – Is HANA The New Mainframe (In A Good Way)?

It’s the last day of SAP’s enormous SAPPHIRE NOW 2013 conference here in Orlando, and the day opens with Hasso Plattner, one of the founders of SAP, who still holds a role in defining technology strategy. As expected, he starts with HANA and cloud. He got a good laugh from the audience when talking about how HANA is there to radically speed up some of the very slow bits in SAP’s ERP software, such as overnight batch processes; he stated apologetically, “I had no idea that we had software that took longer than 24 hours to run. You should have sent me an email.” He also discussed cloud architectures, specifically multi-tenancy versus dedicated instances, and said that although many large businesses didn’t want to share instances with anyone else for privacy and competitive reasons, multi-tenancy becomes less important when everything is in memory. They have three different cloud architectures to deal with all scenarios: HANA One on Amazon AWS, which is a fully public multi-tenant cloud currently used by about 600 companies; their own managed cloud, using virtualization to provide a private instance for medium to large companies; and dedicated servers without virtualization in their managed cloud (really a hosted server configuration) for huge companies where the size warrants it.

He spent much of his keynote rebutting myths about HANA — obviously, SAP has been a bit stung by the press and competitors calling their baby ugly — including the compression factor between how much data is on disk versus in memory at any given time, the relative efficiency of HANA’s columnar storage over classic relational row storage, support on non-proprietary hardware, continued support of other database platforms for their Business Suite, HANA stability, and the use of HANA for non-SAP applications. I’m not sure that was the right message: it seemed very defensive rather than talking about the future of SAP technology, although maybe the standard SAP user sitting in the audience needed to hear this directly from Plattner. He did end with some words on how customers can move forward: even if they don’t want to change databases or platforms, moving to the current version of the suite will provide some performance and functionality improvements, while putting them in the position to move to Business Suite on HANA (either on-premise or on the Enterprise Cloud) in the future for a much bigger performance boost.

HANA is more than just a database: it’s a database, application server, analytics and portals bundled together for greater performance. It’s like the new mainframe, except that it runs on industry-standard x86-based hardware and is in-memory, so it lacks the lengthy batch operations that we associate with old-school mainframe applications. It’s OLTP and OLAP all in one, so there’s no separation between operational data stores and data warehouses. As long as all of the platform components remain (relatively) innovative, this is great, for the same reason that mainframes were great in their day. HANA provides a great degree of openness, allowing code written in Java and a number of other common languages to be deployed in a JVM environment and use HANA as just a database and application server, but the real differentiating benefits will come with using the HANA-specific analytics and other functionality. Therein lies the risk: if SAP can keep HANA innovative, then it will be a great platform for application development; if they hark back to their somewhat conservative roots and the innovations are slow to roll out, HANA developers will become frustrated, and less likely to create applications that fully exploit (and therefore depend upon) the HANA platform.
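
For readers less familiar with the column-versus-row distinction that keeps coming up in these keynotes, here is a minimal, purely illustrative Python sketch of why an analytic aggregate is cheaper over a columnar layout: it scans only the fields it needs. This is a generic illustration with invented data, not HANA code or its actual storage engine.

```python
# Generic sketch of row vs. columnar layouts for an analytic aggregate.
# Purely illustrative -- this is not how HANA is implemented.

from collections import defaultdict

# Row-oriented layout: each record is stored (and scanned) as a whole.
rows = [
    {"order_id": 1, "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "region": "APJ",  "amount": 75.5},
    {"order_id": 3, "region": "EMEA", "amount": 200.0},
]

def total_by_region_rows(records):
    totals = defaultdict(float)
    for record in records:          # every full record is visited
        totals[record["region"]] += record["amount"]
    return dict(totals)

# Column-oriented layout: each field is a contiguous array; the aggregate
# reads only the two columns it needs, which also compress well.
columns = {
    "order_id": [1, 2, 3],
    "region":   ["EMEA", "APJ", "EMEA"],
    "amount":   [120.0, 75.5, 200.0],
}

def total_by_region_columns(cols):
    totals = defaultdict(float)
    for region, amount in zip(cols["region"], cols["amount"]):
        totals[region] += amount    # untouched columns are never scanned
    return dict(totals)

print(total_by_region_rows(rows))       # {'EMEA': 320.0, 'APJ': 75.5}
print(total_by_region_columns(columns)) # same result, narrower scan
```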

SAP HANA Enterprise Cloud

Ingrid Van Den Hoogen and Kevin Ichhpurani gave a press briefing on what’s coming for HANA Enterprise Cloud following the launch last week. Now that the cloud offering is available, existing customers can move any of their HANA-based applications — Business Suite, CRM, Business Warehouse, and custom applications — to the cloud platform. There’s also a gateway that allows interaction between the cloud-based applications and other applications left on premise. Customers can bring their own HANA licences, and use SAP services to onboard and migrate their existing systems to the cloud.

HANA Enterprise Cloud is the enterprise-strength, managed cloud version of HANA; there’s also HANA One, which uses the Amazon public cloud as a lower-end entry point at $0.99/hour and a maximum of 30GB of data. Combined with HANA on premise (using gear from a certified hardware partner) and OEM versions of the HANA cloud that hosting partners (e.g., IBM or telcos) repackage and run in their own environments, this provides a range of HANA deployment options. HANA functionality is the same whether on AWS, on premise or on SAP’s managed cloud; moving between environments (such as moving an application from development/test on HANA One to production on HANA Enterprise Cloud) is a simple “lift and shift”: export from one environment and import into the target environment. The CIO of Florida Crystals was in the audience to talk about their experience moving to HANA in the cloud; they moved their SAP ERP environment from an outsourced data center to HANA Enterprise Cloud in 180 hours (that’s the migration time, not the assessment and planning time).

SAP is in the process of baking some of the HANA extensions into the base HANA platform; currently, there’s some amount of confusion about what “HANA” will actually provide in the future, although I’m sure that we’ll hear more about this as the updates are released.

SAPPHIRENOW Day 2 Keynote

This morning, our opening keynote was from SAP’s other co-CEO, Jim Snabe. He started with a bit about competitive advantage and adaptation to changing conditions, illustrated with the fact that Sumatran tigers have evolved webbed feet so that they can chase their prey into water: evolution, and even extinction, in business is not much different from that in the natural world, it just happens at a much faster pace. In business, we have both gradual evolution through continuous improvement, and quantum leaps caused primarily by the introduction of disruptive technology. Snabe positioned HANA as one of those disruptive technologies.

Ron Dennis, chairman of McLaren Group, joined Snabe to talk about how they’re using HANA to gather, analyze and visualize data from their cars during Formula 1 races: 6.5 billion data points per car per race. We saw a prototype of their racing dashboard for visualizing that data, and heard how the data is used to make predictions and optimize performance during the race. Your processes probably don’t generate 6.5B events per instance, but in-flight optimization is something that’s beyond the capabilities of many organizations unless they use big data and predictive analytics. Integrating this functionality into process management may well be what allows the large vendors such as SAP and IBM to regain the BPM innovation advantage over some of the smaller and more nimble vendors. Survival of the fittest, indeed.

Snabe talked about other applications for HANA, such as in healthcare, where big data allows for comprehensive individual DNA analysis and disease prevention, before returning to the idea of using it for realtime business optimization that allows organizations to adapt and thrive. SAP is pushing all of their products onto HANA as the database platform: first their data warehousing capabilities, then SuccessFactors, and now their Business Suite on HANA, for greatly improved performance due to in-memory processing. They’ve opened up the platform so that other companies can develop applications on HANA, which will help to drive it into vertical industries. Interestingly, Snabe made the point that realtime in-memory processing not only makes things faster, it also makes applications less complex, since some of the complexity in code exists only to work around disk and processing latency. They have 1,500 customers on HANA now, and that number is growing fast.

HANA and in-memory processing was just one of the three “quantum leaps” that SAP has been effecting over the last three years; the second is having everything available in the cloud. Just as in-memory processing increases speed and reduces the complexity of applications, cloud increases speed and reduces the complexity of IT implementations. In the three years that they’ve been at it, and including their SuccessFactors and Ariba acquisitions, they’ve gained 29 million users in the cloud. He was joined by executives from PepsiCo, Timken and Nespresso to talk about their transitions to the cloud, which included SuccessFactors for cloud-based performance management and HR across their global operations, and CRM in the cloud.

Combining their HANA and cloud initiatives, SAP launched HANA Enterprise Cloud last week, with HANA running on SAP’s infrastructure, which will allow organizations to run all of their SAP applications in the cloud, with the resulting benefits of elasticity and availability. I have a more detailed briefing on HANA Enterprise Cloud this afternoon.

Their third quantum leap in the past three years is user experience, culminating in today’s launch of Fiori, a new user interface that brings the aesthetic of consumer UI — including mobile interfaces — to enterprise software. We’ll be hearing more about this in tomorrow’s keynote with Vishal Sikka.

By the way, you can watch the keynotes live and replays of many sessions here; I confess to having watched this morning’s keynote online from my hotel room in order to have reliable wifi for research while I watched and wrote this post.

Process Intelligence With @alanrick

I met up with the NetWeaver BPM product management team and sat in on a session given by Alan Rickayzen of SAP and King Tantivejkul of SAP customer Colgate-Palmolive on putting intelligence into processes. This wasn’t about process automation — it was assumed that you already have some sort of process automation in some system, which constitutes the instrumentation on the processes — but rather about taking all of the process events from a heterogeneous collection of systems and analyzing them in the aggregate in order to drive and support decision-making.

Colgate funnels all of their data from their global operations through a master data hub to their SAP back-end, including financials, materials, customer and reference data. SAP’s Business Suite ERP software is great at crunching data but not so great at visualizing it — Colgate was using some hard-coded monthly reports that showed a few metrics, but little about the process itself — so Colgate signed up for the operational process intelligence (OPINT) ramp-up (first customer release) to help them identify potential issues and bottlenecks in the process. They don’t have anything to show yet, but seem pretty excited about what they can get out of it.

OPINT, built on HANA, provides a more responsive and flexible view of process metrics. Without writing any Java or ABAP code, you can put together a dashboard that shows metrics from multiple systems, since HANA acts as a process event warehouse for Business Workflow and NetWeaver BPM process events as well as custom processes made visible via Process Observer. In the future, they’ll be adding other data sources, so you can pull in process models and event data from other systems. The HANA Studio design environment allows these processes to be imported from the back-end systems and represented as BPMN; events in these processes can then be mapped to different phases of a business scenario in order to generate the dashboard.
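
To make the event-to-phase idea concrete, here is a hypothetical Python sketch — the event sources are named after the systems mentioned above, but the activities, phase names and rollup logic are all invented for illustration and are not OPINT’s actual model or API — showing how events from multiple systems could be mapped onto scenario phases and rolled up for a dashboard.

```python
# Hypothetical sketch: map process events from several systems onto the
# named phases of a business scenario, then roll up per-phase status.

from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class ProcessEvent:
    instance_id: str
    source: str        # e.g. "Business Workflow", "NetWeaver BPM", "custom"
    activity: str
    timestamp: datetime

# Invented mapping from source activities to scenario phases.
PHASE_MAP = {
    "order_received": "Capture",
    "credit_checked": "Validate",
    "goods_shipped":  "Fulfil",
    "invoice_posted": "Bill",
}

def phase_rollup(events):
    """Per phase: how many instances have reached it, and when it was last seen."""
    rollup = defaultdict(lambda: {"instances": set(), "last_seen": None})
    for e in events:
        phase = PHASE_MAP.get(e.activity)
        if phase is None:
            continue                     # event not part of this scenario
        entry = rollup[phase]
        entry["instances"].add(e.instance_id)
        if entry["last_seen"] is None or e.timestamp > entry["last_seen"]:
            entry["last_seen"] = e.timestamp
    return {p: {"instances": len(v["instances"]), "last_seen": v["last_seen"]}
            for p, v in rollup.items()}

events = [
    ProcessEvent("A-1", "Business Workflow", "order_received", datetime(2013, 5, 14, 9, 0)),
    ProcessEvent("A-1", "NetWeaver BPM", "credit_checked", datetime(2013, 5, 14, 9, 30)),
    ProcessEvent("A-2", "custom", "order_received", datetime(2013, 5, 14, 10, 0)),
]
print(phase_rollup(events))
```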

Predictive analytics are built in, as you might expect given the capabilities of HANA, allowing for forecasts of whether specific KPIs and milestones will be missed. As we saw at IBM Impact a couple of weeks ago, predictive process analytics are becoming big for high-value process instances: it’s not enough to know whether you’re meeting a specific KPI right now, you need to know how the process is going to play out over its entire lifecycle.
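
As a rough illustration of what such a forecast amounts to — this is emphatically not HANA’s predictive engine, and all of the phase names, durations and the 30-hour KPI below are invented — here is a simple sketch that extrapolates an in-flight instance’s total cycle time from historical phase durations and flags a likely KPI miss before it happens.

```python
# Simple illustrative forecast: elapsed time so far, plus the mean historical
# duration of each remaining phase, compared against a cycle-time KPI.

from statistics import mean

# Invented historical data: hours spent in each phase on past instances.
HISTORY = {
    "Validate": [2.0, 3.5, 2.5],
    "Fulfil":   [20.0, 26.0, 23.0],
    "Bill":     [4.0, 5.0, 4.5],
}

def predicted_total_hours(elapsed_hours, remaining_phases):
    """Elapsed time plus the mean historical duration of each remaining phase."""
    return elapsed_hours + sum(mean(HISTORY[p]) for p in remaining_phases)

def will_miss_kpi(elapsed_hours, remaining_phases, kpi_hours):
    return predicted_total_hours(elapsed_hours, remaining_phases) > kpi_hours

# An instance six hours in, with three phases left, against a 30-hour KPI:
print(predicted_total_hours(6.0, ["Validate", "Fulfil", "Bill"]))  # ~36.2
print(will_miss_kpi(6.0, ["Validate", "Fulfil", "Bill"], 30.0))    # True
```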

The dashboard widgets that we saw in a short video clip look completely adequate: different data visualizations, colors to denote states, KPIs and drilldowns. No big UI innovations, but the real gold here is in the HANA analytics going on behind the scenes, and the ease with which a solution developer can create a dashboard view of the HANA data. Furthermore, this runs completely on HANA: HANA is the database, the analytics engine and the app server, making it a bit easier to deploy than some other analytics solutions. This is big data applied to process, and it’s fair to say that this combination is going to be significant for the future of BPM.

Back At SAPPHIRENOW – Day 1 Keynote

It’s been a couple of years since I last attended SAP’s huge SAPPHIRE NOW conference, but this week I’m here with my 20,000 closest friends at the Orlando Convention Center (plus another 80,000 watching online) to get caught up. The conference kicked off with a keynote from Bill McDermott, SAP’s co-CEO, and it’s all about HANA and cloud: everything from SAP now runs on HANA, which, combined with their cloud platforms, realizes the dream of realtime, predictive supply chains. HANA is also at the heart of how SAP is addressing social enterprise functionality, allowing a company to analyze a flood of consumer social data to find what’s relevant.

They highlighted some of their sports-related customers’ applications — which definitely allowed for some good lead-in video — with executives from Under Armour, the San Francisco 49ers and the NBA. In part, sports applications are about helping teams play better and manage their talent through play/player data analysis (think Moneyball), but they are also about customer engagement online and in the stadium. The most traditional use of SAP on the panel is at Under Armour, which manufactures sportswear and sports-related biometrics devices, but whose incredible growth means that they needed enterprise systems that they won’t outgrow. An interesting new industry vertical focus for SAP.

The keynote finished with Bob Calderoni, CEO of Ariba (recently acquired by SAP) talking about how cloud — in the form of private business networks, of course — drives productivity. Good focus, since too often the current technology buzzwords (social, mobile, cloud) are discussed purely as the end, not the means, and we can lose sight of how these can make us more productive and efficient, as well as fully buzzword-enabled.

As usual, wifi in the keynote area is impossible, and since I’m tablet-only, I couldn’t even plug into the hard-wired internet that they provided for us guests of Global Communications. I’m not the only one in this section with a tablet rather than a laptop, so I imagine that they’ll have to do something in the future to allow the media to consume and publish during the keynote. T-Mobile’s iPhone coverage is resolutely stuck at EDGE in this area, so I can’t even reliably set up a hotspot, although that would just contribute to the wifi problems. The WordPress Android app works fine offline, however, so I was able to take notes and publish later.

OpenText EIMDay Toronto, Financial Services Session

After lunch at the Toronto OpenText EIM Day, Catharine MacKenzie of the Mutual Fund Dealers Association talked about how they’re using OpenText MBPM (from the Metastorm acquisition). She spoke on an OpenText webinar last year, and I was interested in how they’ve progressed since then.

The MFDA is very process-based, since they’re a regulatory body, and although their policies don’t change that often, the processes used to deal with members and policies are constantly being improved. There was no packaged solution for their regulatory processes, and the need for process flexibility without a full-on custom solution (which was beyond their budget and IT capabilities) led them to BPM. As I described in the post about the webinar (linked above), they started with four processes, including compliance and enforcement, and sped through the implementation of several other processes through 2012. Although she stated during the webinar that they would be implementing five new processes in 2012, most of that has been pushed to 2013, in part (it appears) because of a platform upgrade to MBPM 9.

She pointed out that everyone in the MFDA is using BPM for internal administrative processes, such as booking time off, as well as for the member-facing processes; for many of these processes, the users don’t even know that they’re using BPM. They’re also an OpenText eDocs customer, so they can present content within processes, although apparently they have had to do a lot of that integration work themselves.

As for benefits, they’re seeing a huge decrease in development and deployment time compared to the custom applications that they build in Visual Studio, with process versioning and auditing built in. They’ve had challenges around having the business own the processes, rather than IT, while maintaining good process design and disciplined testing; the MBPM upgrade and migration is also taking longer than expected, which is delaying some of their planned process implementations. This is an interesting result against the backdrop of this morning’s customer keynote about major system upgrades: an upgrade that requires data migration and custom application refactoring is almost always going to cause delays in a previously-defined schedule of roll-outs, but it is necessary for setting the stage for future functionality.

I’m skipping out for the rest of the afternoon to get back to my desk, but this has been a good opportunity to get caught up on the entire OpenText product suite and talk to some of their local customers.

Disclosure: OpenText is a customer, for whom I recently did a webinar and related white paper, but I am not paid to be here today, nor for writing any of these blog posts.