From OLAP Cubes to ElastiCubes – The Natural Evolution of BI

The convergence of disk-based and in-memory technologies

OLAP (Online Analytical Processing) technology is the most prevalent technology used in corporate BI solutions today. And while it does what it’s supposed to do very well, it has a well-deserved reputation for being very expensive and difficult to implement, as well as extremely challenging to maintain. This has prevented OLAP technology from gaining wide popularity outside of Fortune 500-scale companies, which are the only ones with the budgets for company-wide, OLAP-based BI implementations.


Since the inception of BI and the consequent entrance of OLAP technology into the space, the need for BI has been growing rapidly. Recognizing that OLAP-based solutions were (and still are) hard to introduce into a wider market, thought leaders and visionaries in the space have since been trying to bring BI to the masses through technological and conceptual innovation.

The most recently recognized innovation (even though it’s been around for quite a while) was in-memory technology, whose main advantage is cutting implementation time and simplifying the process as a whole (a definite step in the right direction). However, as described in my recent article, In-Memory BI is Not the Future, It's the Past, using in-memory technology for speedy BI implementation introduces significant compromises, especially in terms of scalability (both for data volumes and for support of many concurrent users). Now, after in-memory technology has been on the market for some time, it is clear that it is not really a replacement for OLAP technology, but that it did in fact expand the BI market to a wider audience. In fact, it is probably more accurate to say that in-memory technology and OLAP technology complement each other, each with its own advantages and tradeoffs.

In that article I also briefly mentioned the new disk-based ElastiCube technology (invented by SiSense). ElastiCube technology essentially eliminates the inherent tradeoffs of in-memory databases (IMDBs) by providing unlimited scalability on off-the-shelf hardware while delivering both implementation and query response times as fast as (or faster than) pure in-memory solutions. This claim was the subject of many of the emails and inquiries I received following the article’s publication. I was repeatedly asked how ElastiCube technology had achieved what OLAP technology had failed to do for so many years, and what role in-memory technology played in its conception.

Thus, in this article I will describe how ElastiCube technology came to be, what inspired it, what made it possible and how it has already become a game-changer in the BI space, both in large corporations and small startups.

A Brief History of BI and OLAP
OLAP technology started gaining popularity in the late 1990s, and that had a lot to do with Microsoft’s first release of their OLAP Services product (now Analysis Services), based on technology acquired from Panorama Software. At that point in time, computer hardware wasn’t nearly as powerful as it is today; given the circumstances at the time, OLAP was groundbreaking. It introduced a spectacular way for business users (typically analysts) to easily perform multidimensional analysis of large volumes of business data. When Microsoft’s Multidimensional Expressions language (MDX) came closer to becoming a standard, more and more client tools (e.g., Panorama NovaView, ProClarity) started popping up to provide even more power to these users.
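
To give a flavor of what multidimensional analysis looks like in practice, here is a minimal sketch in Python (pandas) rather than MDX; the table, dimensions and figures are invented purely for illustration and are not tied to any particular OLAP product.

```python
# Illustrative only: a tiny "cube" of sales facts sliced along two dimensions.
# The data and column names are invented; an OLAP tool would answer the same
# question with an MDX query against a pre-built cube.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South"],
    "product": ["Bikes", "Helmets", "Bikes", "Helmets", "Bikes"],
    "year":    [2009, 2009, 2009, 2010, 2010],
    "revenue": [1200, 300, 950, 280, 1100],
})

# "Slice and dice": total revenue broken down by region and product.
print(sales.pivot_table(index="region", columns="product",
                        values="revenue", aggfunc="sum"))
```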

While Microsoft was not the first BI vendor around, their OLAP Services product was unique and significantly helped increase overall awareness of the possibilities offered by BI. Microsoft started gaining market share fairly quickly, as more and more companies started investing in BI solutions.

But as the years passed, it became very apparent that while the kind of multidimensional BI enabled by OLAP technology was a valuable asset to any organization, it was being used mainly by large corporations. OLAP is simply too complex and requires too much time and money to implement and maintain, which eliminates it as a viable option for the majority of the market.

See: Microsoft (SSAS), IBM (Cognos)

The Visualization Front-End Craze
As more companies began investing in BI solutions, many vendors recognized the great opportunity in bringing BI to the mass market of companies with less money to spend than Fortune 500 firms. This is where visualization front-end vendors started popping up like mushrooms after the rain, each of them promising advanced business analytics to the end user, with minimal or no IT projects involved. Their appeal was based on radically reducing the infamous total cost of ownership (TCO) of typical BI solutions. These products, many of which are still available today, are full of useful and advanced visualization features.

However, after years of selling these products, it became very clear that they were incapable of providing a true alternative to OLAP-based solutions. Because they fail to provide comparable centralized data integration and management capabilities, they found themselves competing mainly with Excel, and were used only for analysis and reporting over limited data sets by individuals or small workgroups.

In order to work around these limitations (and increase revenues), these tools introduced connectivity to OLAP sources in addition to the tabular (e.g., spreadsheet) data they had supported until then. By doing that, these products basically negated the purpose for which they were initially designed – to provide an alternative to expensive OLAP-based BI solutions.

See: Tableau Software, Tibco SpotFire, Panorama Software

The In-Memory Opportunity
The proliferation of cheap and widely available 64-bit PCs over the past few years has somewhat changed the rules of the game. More RAM could be installed in a PC, a boon for those visualization front-end vendors struggling to gain more market share. More RAM on a PC means that more data can be queried quickly. If crunching a million rows of data on a machine with only 2GB of RAM was a drag, users could now add more gigabytes of RAM to their PCs and instantly solve the problem. But still, without centralized data integration and management, this was not a true alternative to the OLAP-based solutions that remain prominent in massive organization-wide (or even inter-departmental) implementations.
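
A rough back-of-envelope calculation shows why extra RAM helps an individual analyst yet does not stretch to organization-wide data volumes; the row counts and column widths below are arbitrary assumptions, not figures from any vendor.

```python
# Back-of-envelope sizing for a fully in-memory dataset. All numbers are
# illustrative assumptions, not measurements of any particular product.
rows            = 1_000_000   # the "million rows" scenario mentioned above
columns         = 50          # assumed width of a typical analytical table
bytes_per_value = 8           # assume 8-byte numeric values, no compression

dataset_bytes = rows * columns * bytes_per_value
print(f"One copy of the data: {dataset_bytes / 2**30:.2f} GiB")  # ~0.37 GiB

# Scale the same table to a billion rows and it no longer fits in a PC's RAM,
# before even counting per-user working memory for concurrent queries.
print(f"At 1,000x the rows: {dataset_bytes * 1000 / 2**30:.0f} GiB")  # ~373 GiB
```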

Strangely enough, out of all the in-memory technology vendors out there, only one realized that using in-memory technology to empower individual users wasn't enough and that the way to gain more significant market share was to provide an end-to-end solution, from ETL to centralized data sharing to a front-end development environment. This vendor is QlikTech and it is no wonder that the company is flying high above the rest of the non-OLAP BI players. QlikTech used in-memory technology to cover a much wider range of BI solutions than any single front-end visualization tool could ever do.

By providing data integration and centralized data access capabilities, QlikTech was able to deliver solutions that, for other vendors (in-memory or otherwise), would require at least a lengthy data warehouse project, if not a full-blown OLAP implementation. By combining in-memory technology with 64-bit computing, QlikTech solutions work even on substantial amounts of data (significantly more than their traditional disk-based competitors could handle).

However, QlikTech has not yet been able to make a case for replacing OLAP. I believe this is not only because of the scalability issues and hardware requirements that arise when large amounts of data and/or many users are involved, but also because their products do not inherently support dimensional modeling the way OLAP does. Apart from making life simpler for IT when maintaining multiple applications, OLAP’s implementation of a dimensional model also gives end users, via supporting front-end tools, a broader range of flexibility in creating their own BI applications.
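
For readers less familiar with dimensional modeling, the sketch below shows the idea in miniature: a fact table of measures joined to dimension tables of descriptive attributes. The tables and fields are invented for illustration; a dimensional engine exposes this model natively, so end users never write the joins themselves.

```python
# A miniature star schema: one fact table plus two dimension tables.
# Names and data are invented purely to illustrate the dimensional model.
import pandas as pd

fact_sales = pd.DataFrame({
    "date_key":     [20100101, 20100101, 20100102],
    "customer_key": [1, 2, 1],
    "amount":       [250.0, 99.0, 410.0],
})
dim_date = pd.DataFrame({"date_key": [20100101, 20100102],
                         "quarter":  ["Q1", "Q1"]})
dim_customer = pd.DataFrame({"customer_key": [1, 2],
                             "segment":      ["Enterprise", "SMB"]})

# Users think in dimensions ("revenue by quarter and segment"), not in joins;
# the engine resolves the star schema underneath.
report = (fact_sales
          .merge(dim_date, on="date_key")
          .merge(dim_customer, on="customer_key")
          .groupby(["quarter", "segment"])["amount"].sum())
print(report)
```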

Microsoft, the newest entrant in the in-memory BI game, has also started marketing its in-memory PowerPivot solution as an alternative to OLAP, basically admitting that it has given up on Analysis Services as a viable solution for the wider mid-market.

See: QlikTech (QlikView), Microsoft (PowerPivot)

The SaaS/Cloud BI Hype
The SaaS/cloud hype hasn’t skipped over the BI space, though running BI in the cloud does not dramatically change anything with respect to implementation time or complexity. In fact, cloud BI vendors use the same technologies that are widely used on-premises. There are several startup companies in this space, competing for niche markets. It’s still hard to tell what impact the cloud will have on the BI space as a whole, as none of these companies has yet proven that there is even a viable business in hosting BI in the cloud. One thing is certain, though: these companies cannot rely on in-memory technology to grow significantly. The costs of hardware and the amount of work required to support the number of customers they would need to thrive are prohibitive, to say the least. For more on the problem with cloud BI, see my earlier post, Would I Use Cloud Business Intelligence?

See: GoodData, YouCalc, Birst, PivotLink, Indicee

ElastiCube: Convergent Technologies for an Optimum Solution
ElastiCube technology was officially introduced to the market in late 2009, after more than five years of research and development conducted in complete secrecy. After the technology was proven practical and effective in the real world (through successful implementations at over 100 paying customers in numerous industries, from startups to multinational corporations), SiSense secured a $4 million investment to continue developing ElastiCube technology and to expand awareness of the Prism Business Intelligence product, which is based on it.

ElastiCube is the result of thoroughly analyzing the strengths and weaknesses of both OLAP and in-memory technologies, while taking into consideration the off-the-shelf hardware of today and tomorrow. The vision was to provide a true alternative to OLAP technology, without compromising on the speediness of the development cycle and query response times for which in-memory technologies are lauded. This would allow a single technology to be used in BI solutions of any scale, in any industry.

Here are the 10 main goals on which SiSense focused when designing the ElastiCube technology:

  1. A data warehouse must not be assumed to exist for effectively querying multiple sources.
  2. A star schema must not be assumed to exist for effectively querying large amounts of data.
  3. The solution must provide unlimited scalability, both in terms of number of rows and number of fields, within a finite and reasonable amount of RAM.
  4. The solution must be able to operate using off-the-shelf hardware, even for extreme data/user scenarios.
  5. The solution must provide high-speed, out-of-the-box query performance, without requiring pre-calculations.
  6. There must be a separation between the application layer and the physical data layer via a virtual metadata layer.
  7. There must be support for a dimensional model and multidimensional analysis.
  8. The same application must be able to support a single user with a laptop to thousands of users via a central, server-based data repository.
  9. Without running an SQL database, an SQL layer must be available to conform to industry standards.
  10. The solution must offer the ability to incorporate additional/changed data (e.g., new rows, new fields) on the fly, without reprocessing the entire data model (a rough illustration of this idea follows the list).
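
The article does not reveal how ElastiCube meets goal 10, so the following is a generic illustration only: one way a column-oriented store can absorb new rows by appending to per-column files instead of rebuilding the whole model. The file layout and names are my own assumptions, not SiSense’s format.

```python
# Hypothetical illustration of goal 10: appending new rows to a column-oriented
# store without reprocessing existing data. This is NOT SiSense's actual
# format, just a sketch of the general idea using one binary file per column.
import struct
from pathlib import Path

STORE = Path("column_store")
STORE.mkdir(exist_ok=True)

def append_rows(rows):
    """Append new rows column by column; previously stored data is untouched."""
    for column in rows[0]:
        with open(STORE / f"{column}.f64", "ab") as f:  # append-only
            for row in rows:
                f.write(struct.pack("<d", float(row[column])))

# Initial load, then an incremental batch later; neither rewrites prior bytes.
append_rows([{"qty": 3, "price": 9.5}, {"qty": 1, "price": 20.0}])
append_rows([{"qty": 7, "price": 4.25}])
```
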
The presently available version of Prism, based on ElastiCube technology, delivers on every one of these requirements. Even though it would be a lot of fun for me, I unfortunately can’t delve into the nuts and bolts of how these goals are technologically achieved. What I can say is that ElastiCube utilizes columnar storage concepts as well as just-in-time in-memory query processing technology. If you want to read a little about it, you can take a look at SiSense’s ElastiCube technology page.
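
The paragraph above names two ingredients, columnar storage and just-in-time in-memory query processing, without disclosing how they are combined. As a generic illustration (again, not SiSense’s implementation), the sketch below reuses the hypothetical one-file-per-column layout from the previous example: only the columns a query touches are read into RAM, and only at query time.

```python
# Generic illustration: answer a query by loading only the columns it needs
# from disk, just in time, then aggregating them in memory. Assumes the
# one-file-per-column layout written by the previous sketch.
import struct
from pathlib import Path

STORE = Path("column_store")

def load_column(name):
    """Read a single column into RAM; columns the query ignores stay on disk."""
    data = (STORE / f"{name}.f64").read_bytes()
    return [value for (value,) in struct.iter_unpack("<d", data)]

# Query: total revenue = sum(qty * price). Only two columns enter memory.
qty, price = load_column("qty"), load_column("price")
print(sum(q * p for q, p in zip(qty, price)))
```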

I can add that the feasibility of ElastiCube owes a great deal to the impressive CPU and disk technologies that now come with any run-of-the-mill personal computer.

ElastiCube is an extremely powerful technology that enables speedy implementation of individual, workgroup and corporate-wide BI. As a solution that delivers the promise of OLAP-style BI without the cost, time and IT overhead of OLAP, it is no surprise that Prism is rapidly gaining popularity in the market. Businesses that use ElastiCube technology include household names such as Target, Yahoo, Cisco, Samsung, Philips and Caterpillar. But a significant portion of the businesses that use ElastiCube are much smaller, such as Wix and other startup companies that otherwise could not afford BI at all.

See: SiSense (Prism)

About the Author

Elad Israeli is co-founder of business intelligence software company SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s unique BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
