SAP HANA’s Real Time Challenge to the Oracle Empire

Real-Time In-Memory Platform Presents a Groundbreaking Approach

When the character Maverick in the movie Top Gun exclaimed, "I feel the need, the need for speed," you'd be forgiven for mistaking it for a sound bite from a CIO discussing their transactional databases. Whether it's a financial organization predicting share prices, a bank deciding whether it can approve a loan or a marketing organization reaching consumers with a compelling promotional offer, the need to access, store, process and analyze data as quickly as possible is an imperative for any business looking to gain a competitive edge. Hence, when SAP announced its new in-memory platform HANA for enterprise applications in 2011, everyone took note, as it promised the advantage of real-time analytics. SAP HANA promised not just to make databases dramatically faster, as traditional business warehouse accelerator systems do, but to speed up the front end, enabling companies to run arbitrary, complex queries on billions of records in a matter of seconds as opposed to hours. The vendors of old legacy databases were facing a major challenge, most notably the king of them all...Oracle.

The Birth and Emergence of Big Data
Back in the days of the mainframe, you'd find the application, its transactional data and the reporting databases physically stored on the same system. This was because applications, operating systems and databases were designed to maximize their hardware resources, which consequently meant you couldn't process transactions and generate reports simultaneously. The bottleneck here was cost: if you wanted to scale, you needed another mainframe.

After the advent of client/server architecture, where applications ran against a centralized database server via multiple cost-effective servers, scalability was achieved by simply adding application servers. Regardless of this, a new bottleneck quickly emerged: systems relied on a single database server, and requests from an ever-increasing number of application servers caused I/O stagnation. This problem was exacerbated by OLTP (online transaction processing), where report creation required the system to concurrently read multiple tables in the database. Added to this, servers and processors kept getting faster, while disks (despite the emergence of SSD) were quickly becoming the bottleneck for automated processes that were producing large amounts of data and, in turn, generating more report requests.

The net effect was a downward spiral: an increase in users requiring more reports from the databases meant huge amounts of data being requested from disks that simply weren't up to the job. When you then factored in the data proliferation of external users caused by the Internet, and pressure-inducing laws such as Sarbanes-Oxley, the demand to analyze even more data even quicker reached fever pitch. With data and user volumes increasing by a factor of thousands compared to the I/O capability of databases, the transaction-based industry faced a challenge that required a dramatic shift and change. Cue the 2011 emergence of SAP's HANA.

Real-Time In Memory Platform Presents a Groundbreaking Approach
One of the major advantages of SAP HANA's ability to run in real time is that it eliminates the need for data redundancy, as it's built to run as a single database. With clusters of affordable and scalable servers, transactional and analytical data run on the same database, eliminating different types of databases for different application needs. Oracle, on the other hand, has built an empire on exactly the opposite.
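
To make that single-database point concrete, here is a minimal sketch of an analytical aggregation issued directly against a live transactional table, with no extract into a separate warehouse. It assumes SAP's hdbcli Python driver; the host, credentials and the sales_orders table are invented placeholders, not taken from any real deployment.

```python
# A minimal sketch, assuming SAP's hdbcli Python driver; the host,
# credentials and the sales_orders table are invented placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="SYSTEM", password="secret")
cur = conn.cursor()

# An analytical aggregation run directly on the live transactional table:
# no ETL step into a separate warehouse, no pre-built aggregates.
cur.execute("""
    SELECT region, product, SUM(amount) AS revenue
    FROM sales_orders
    WHERE order_date >= ADD_DAYS(CURRENT_DATE, -30)
    GROUP BY region, product
    ORDER BY revenue DESC
""")
for region, product, revenue in cur.fetchall():
    print(region, product, revenue)

cur.close()
conn.close()
```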

Oracle has thrived on a model where companies generally start with a simple database used for checking sales orders and ensuring product delivery to customers, but as the business grows they need more databases with different and more demanding functions. Functions such as managing customer relationships, complex reporting and analysis drive a need for new databases that are separate from the actual business, requiring data to be moved from one system to another. Eventually you have a sprawl of databases, as existing ones are unable to handle the workloads, making it almost impossible to track data movements, let alone attain real-time updates. So while the Oracle marketing machine is also pitching the benefits of in-memory via its Exalytics appliance and its in-memory database, TimesTen, Oracle is certainly in no rush to break this traditional model of database sprawl and the money-spinning licenses that come with it.

Looking closely at the Oracle Exalytics / TimesTen package, despite the hype it is merely an add-on product, meaning that an end user will still need a license for the transactional database, another license for the data warehouse database and yet another license for TimesTen for Oracle Exalytics.

Moreover, the Oracle bolt-on approach serves to sell more of Oracle's commodity hardware and in some ways perversely justify its acquisition of Sun Microsystems, all at the expense of the customer. Because the Exalytics approach continues the traditional requirement for transactional data to be duplicated from the application to the warehouse and once again to Exalytics, the end user not only ends up with three copies of the data, they also need three tiers of storage and servers. In contrast, SAP HANA is designed to be a single database that runs both transactional applications and Business Warehouse deployments. Not only does SAP HANA's one copy of data replace the two or three required for Oracle, it also eliminates the need for materialized views, redundant aggregates and indexes, leaving a significantly reduced data footprint.

Comparing HANA to Oracle's TimesTen and Exalytics
As expected, Oracle has already unleashed its FUD team with bogus claims and untruths against HANA, even pushing TimesTen as a like-for-like comparison. Where this is hugely flawed is that it fails to acknowledge that SAP HANA is a completely groundbreaking design as opposed to a bolt-on approach. With SAP HANA, data is managed and accessed entirely in RAM, doing away with the MOLAP structures, multiple indexes and other tuning features that Oracle prides itself on.

Furthermore, despite the Oracle FUD, SAP HANA does indeed handle both unstructured and structured data, and utilises parallel queries to scale out across server nodes. Here Oracle is trying hard to create confusion and distract the market from realizing that the TimesTen with Exalytics package still can't scale out beyond its 1TB RAM limit, unlike SAP HANA, where each container can store up to 500TB of data, all executable at high speed.
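
As a rough illustration of that scale-out, the sketch below creates a hash-partitioned column table so that rows, and the parallel query work over them, can be spread across nodes. The table definition and connection details are invented for illustration and are not drawn from any SAP reference.

```python
# A hedged sketch of scale-out via hash partitioning; host, credentials
# and the table definition are invented for illustration.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="SYSTEM", password="secret")
conn.cursor().execute("""
    CREATE COLUMN TABLE sales_orders (
        order_id   BIGINT PRIMARY KEY,
        region     NVARCHAR(32),
        product    NVARCHAR(64),
        amount     DECIMAL(15, 2),
        order_date DATE
    )
    -- Hash partitioning spreads rows (and parallel query work) across nodes.
    PARTITION BY HASH (order_id) PARTITIONS 4
""")
conn.close()
```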

With an aggressive TCO and ROI model compared to a traditional Oracle deployment, SAP HANA also proves a lot more cost-effective. Pricing is based on increments of 64GB of RAM and the total amount of data held in memory, and licenses are fully inclusive of production and test/development requirements as well as the necessary tools.
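
Taken at face value, that pricing model reduces to simple arithmetic: count how many 64GB increments your in-memory data occupies. The sketch below illustrates the calculation only; no real SAP price is quoted or assumed.

```python
import math

# Illustrative only: the article describes pricing in 64GB-of-RAM
# increments based on the total data held in memory. No real SAP
# list price is assumed here.
INCREMENT_GB = 64

def license_increments(data_in_memory_gb: float) -> int:
    """Number of 64GB increments needed to cover the in-memory data."""
    return math.ceil(data_in_memory_gb / INCREMENT_GB)

print(license_increments(500))   # 500GB in memory -> 8 increments
print(license_increments(2048))  # 2TB in memory   -> 32 increments
```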

SAP HANA's Embracing of VMware
Furthermore, where Oracle has taken a belligerent stance towards VMware and the cost savings it brings to end users, SAP has embraced it. The recent announcement that SAP HANA supports VMware vSphere will provide a vast competitive advantage, as it will enable customers to provision instances of SAP HANA in minutes from VM templates, as well as gain benefits such as Dynamic Resource Scheduling and vSphere vMotion. By virtualizing SAP HANA with VMware, end users can quickly run several smaller HANA instances on a single physical server, leading to better utilization of existing resources. With certified, preconfigured and optimised converged infrastructures such as the Vblock around the corner, SAP HANA appliances could be shipped with vSphere 5 and SAP HANA pre-installed within days, enabling rapid deployment for businesses.
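
As a sketch of what provisioning from a VM template might look like, the snippet below clones a hypothetical "hana-template" using VMware's pyVmomi SDK. The vCenter host, credentials, cluster and template names are all placeholders, and error handling is omitted; this is an illustration of the approach, not SAP's or VMware's documented procedure.

```python
# A hypothetical sketch using VMware's pyVmomi SDK; the vCenter host,
# credentials, cluster and "hana-template" names are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab use only; verify certs in production
si = SmartConnect(host="vcenter.example.com", user="administrator",
                  pwd="secret", sslContext=ctx)
content = si.RetrieveContent()

def find_by_name(vim_type, name):
    # Walk the inventory for the first object of the given type and name.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim_type], True)
    try:
        return next(obj for obj in view.view if obj.name == name)
    finally:
        view.Destroy()

template = find_by_name(vim.VirtualMachine, "hana-template")
cluster = find_by_name(vim.ClusterComputeResource, "hana-cluster")

# Clone the template into the cluster's root resource pool and power it on.
spec = vim.vm.CloneSpec(
    location=vim.vm.RelocateSpec(pool=cluster.resourcePool),
    powerOn=True,
)
template.Clone(folder=template.parent, name="hana-dev-01", spec=spec)
Disconnect(si)
```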

The Business Benefits of Real-Time
With business and transactions being done in real time, SAP HANA ensures that the data and the analytics that come with them are also in real time. Manually polling data from multiple systems and sorting through it is inadequate at a time when businesses face unpredictable economic conditions, volatile demand and complex supply chains. The need is for real-time metrics aligned to supply and demand, where a retailer's shelves can be stocked accurately and immediately, eliminating unnecessary inventory costs, lost sales opportunities and failed product launches. Being able to instantly analyze data at any level of granularity enables a business to quickly respond to these market insights and take decisive actions, such as transferring inventory between distribution centers based on expected sales or altering the prices of promotions based on customer demand. Instead of waiting for processes that take hours, days or even weeks, SAP HANA's real-time capabilities enable businesses to react to incidents as they happen.
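
A hypothetical query in the spirit of that inventory example: compare expected sales against on-hand stock per distribution center and flag the shortfalls that would trigger a transfer. The stock and sales_forecast tables and all column names are invented for illustration.

```python
# A hypothetical real-time query; the stock and sales_forecast tables
# and their columns are invented for illustration.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="SYSTEM", password="secret")
cur = conn.cursor()
cur.execute("""
    SELECT s.center_id, s.product, s.on_hand, f.expected_units,
           f.expected_units - s.on_hand AS shortfall
    FROM stock s
    JOIN sales_forecast f
      ON f.center_id = s.center_id AND f.product = s.product
    WHERE f.expected_units > s.on_hand
    ORDER BY shortfall DESC
""")
for center, product, on_hand, expected, shortfall in cur.fetchall():
    print(f"{center}/{product}: short {shortfall} units")

cur.close()
conn.close()
```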

Ultimately, SAP HANA is a revolutionary step forward that will empower organizations to focus more on the business and less on the infrastructure that supports it. With the promise of new applications built by SAP to support real-time decision making, as well as the ability to run existing applications, SAP HANA presents the opportunity to transform not only a business but also the underlying technology that supports it.

About the Author

Archie Hendryx is a SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
