From OLAP Cubes to ElastiCubes – The Natural Evolution of BI

The convergence of disk-based and in-memory technologies

OLAP (Online Analytical Processing) technology is the most prevalent technology used in corporate BI solutions today. And while it does what it’s supposed to do very well, it has a bad (and accurate) reputation for being very expensive and difficult to implement, as well as extremely challenging to maintain. These drawbacks have kept OLAP technology from gaining wide popularity outside Fortune 500-scale companies, the only ones with the budgets for company-wide, OLAP-based BI implementations.

Since the inception of BI and the subsequent entrance of OLAP technology into the space, the need for BI has grown rapidly. Recognizing that OLAP-based solutions were (and still are) hard to introduce to a wider market, thought leaders and visionaries in the space have since been trying to bring BI to the masses through technological and conceptual innovation.

The most recently recognized innovation (even though it’s been around for quite a while) was in-memory technology, whose main advantage was cutting implementation time and simplifying the process as a whole (a definite step in the right direction). However, as described in my recent article, In-Memory BI is Not the Future, It's the Past, using in-memory technology for speedy BI implementation introduces significant compromises, especially in terms of scalability (both for data volumes and support for many concurrent users). Now that in-memory technology has been on the market for some time, it is clear that it is not really a replacement for OLAP technology, though it has expanded the BI market to a wider audience. In fact, it is probably more accurate to say that in-memory technology and OLAP technology complement each other, each with its own advantages and tradeoffs.

In that article I also briefly mentioned the new disk-based ElastiCube technology (invented by SiSense). ElastiCube technology essentially eliminates the inherent in-memory database (IMDB) tradeoffs by providing unlimited scalability using off-the-shelf hardware, while delivering implementation and query response times as fast as (or faster than) pure in-memory solutions. This claim was the subject of many of the emails and inquiries I received following the article’s publication. I was repeatedly asked how ElastiCube technology had achieved what OLAP technology had failed to do for so many years, and what role in-memory technology played in its conception.

Thus, in this article I will describe how ElastiCube technology came to be, what inspired it, what made it possible and how it has already become a game-changer in the BI space, both in large corporations and small startups.

A Brief History of BI and OLAP
OLAP technology started gaining popularity in the late 1990s, and that had a lot to do with Microsoft’s first release of their OLAP Services product (now Analysis Services), based on technology acquired from Panorama Software. At that point in time, computer hardware wasn’t nearly as powerful as it is today; given the circumstances at the time, OLAP was groundbreaking. It introduced a spectacular way for business users (typically analysts) to easily perform multidimensional analysis of large volumes of business data. When Microsoft’s Multidimensional Expressions language (MDX) came closer to becoming a standard, more and more client tools (e.g., Panorama NovaView, ProClarity) started popping up to provide even more power to these users.
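
For readers who have never seen multidimensional analysis in action, here is a minimal sketch (in Python with pandas, purely as an illustration; the data and field names are invented, and this is of course not how OLAP Services or MDX work internally) of the kind of slice-and-dice aggregation an OLAP cube makes interactive:

```python
# Illustration only: a tiny table of sales "facts" aggregated along two
# dimensions (region and year). An OLAP engine pre-organizes data so that
# analysts can pivot such views interactively instead of writing queries.
import pandas as pd

facts = pd.DataFrame({
    "region": ["EMEA", "EMEA", "APAC", "APAC", "AMER", "AMER"],
    "year":   [2009,   2010,   2009,   2010,   2009,   2010],
    "sales":  [120.0,  150.0,  80.0,   95.0,   200.0,  230.0],
})

# "Slice and dice": total sales by region (rows) and year (columns).
print(facts.pivot_table(values="sales", index="region",
                        columns="year", aggfunc="sum"))
```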

While Microsoft was not the first BI vendor around, their OLAP Services product was unique and significantly helped increase overall awareness of the possibilities offered by BI. Microsoft started gaining market share fairly quickly, as more and more companies started investing in BI solutions.

But as the years passed, it became apparent that while the multidimensional BI empowered by OLAP technology was a valuable asset to any organization, it was being used mainly by large corporations. OLAP is simply too complex and requires too much time and money to implement and maintain, eliminating it as a viable option for the majority of the market.

See: Microsoft (SSAS), IBM (Cognos)

The Visualization Front-End Craze
As more companies began investing in BI solutions, many vendors recognized the great opportunity in bringing BI to the mass market of companies with less money to spend than Fortune 500 firms. This is where visualization front-end vendors started popping up like mushrooms after the rain, each of them promising advanced business analytics to the end user, with minimal or no IT projects involved. Their appeal was based on radically reducing the infamous total cost of ownership (TCO) of typical BI solutions. These products, many of which are still available today, are full of useful and advanced visualization features.

However, after years of selling these products, it became very clear that they were incapable of providing a true alternative to OLAP-based solutions. Because they failed to provide comparable centralized data integration and management capabilities, they found themselves competing mainly with Excel, and were used only for analysis and reporting of limited data sets by individuals or small workgroups.

To work around these limitations (and increase revenues), these tools introduced connectivity to OLAP sources in addition to the tabular (e.g., spreadsheet) data they had supported until then. In doing so, these products essentially negated the purpose for which they were initially designed – to provide an alternative to expensive OLAP-based BI solutions.

See: Tableau Software, Tibco SpotFire, Panorama Software

The In-Memory Opportunity
The proliferation of cheap and widely available 64-bit PCs during the past few years has somewhat changed the rules of the game. More RAM could be installed in a PC, a boon for those visualization front-end vendors struggling to get more market share. More RAM on a PC means that more data can be quickly queried. If crunching a million rows of data on a machine with only 2GB of RAM was a drag, users could now add more gigabytes of RAM to their PCs and instantly solve the problem. But still, without providing centralized data integration and management, this was not a true alternative to OLAP-based solutions that are still prominent in massive organization-wide (or even inter-departmental) implementations.
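
To put some rough numbers behind the RAM constraint, here is a back-of-the-envelope sketch; every figure in it is an assumption chosen for illustration, not a measurement of any particular product:

```python
# Back-of-the-envelope: uncompressed footprint of one table held entirely
# in RAM. All numbers are illustrative assumptions.
rows = 100_000_000        # 100 million fact rows
columns = 20              # fields per row
bytes_per_value = 8       # e.g., one 64-bit number per field

footprint_gb = rows * columns * bytes_per_value / (1024 ** 3)
print(f"~{footprint_gb:.0f} GB")   # roughly 15 GB for a single copy

# That is before indexes, intermediate query results, or multiple concurrent
# users, which is why purely in-memory deployments get expensive at scale.
```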

Strangely enough, out of all the in-memory technology vendors out there, only one realized that using in-memory technology to empower individual users wasn't enough and that the way to gain more significant market share was to provide an end-to-end solution, from ETL to centralized data sharing to a front-end development environment. This vendor is QlikTech and it is no wonder that the company is flying high above the rest of the non-OLAP BI players. QlikTech used in-memory technology to cover a much wider range of BI solutions than any single front-end visualization tool could ever do.

By providing data integration and centralized data access capabilities, QlikTech was able to deliver solutions that, for other vendors (in-memory or otherwise), would require at least a lengthy data warehouse project, if not a full-blown OLAP implementation. By utilizing in-memory technology in conjunction with 64-bit computing, QlikTech solutions work even on substantial amounts of data (significantly more than their traditional disk-based competitors could handle).

However, QlikTech has not yet been able to make a case for replacing OLAP. I believe this is not only because of the scalability issues and hardware requirements that arise when large amounts of data and/or many users are involved, but also because their products do not inherently support dimensional modeling the way OLAP does. Apart from making life simpler for IT when maintaining multiple applications, OLAP’s implementation of a dimensional model also gives end users, via supporting front-end tools, a broader range of flexibility in creating their own BI applications.
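
To make the dimensional-modeling point concrete, here is a minimal, hypothetical star-schema sketch (plain Python with pandas, invented table and column names): a central fact table keyed to dimension tables, which is what lets OLAP front ends offer users ready-made dimensions to pivot on rather than one hard-coded report per question:

```python
# Illustrative star schema: one fact table keyed to two dimension tables.
# Dimensional modeling lets a front end expose "Product" and "Date" as
# reusable dimensions instead of hard-coding each report's query.
import pandas as pd

dim_product = pd.DataFrame({"product_id": [1, 2],
                            "category":   ["Hardware", "Software"]})
dim_date    = pd.DataFrame({"date_id":    [20100101, 20100201],
                            "quarter":    ["Q1", "Q1"]})
fact_sales  = pd.DataFrame({"product_id": [1, 1, 2],
                            "date_id":    [20100101, 20100201, 20100101],
                            "amount":     [500.0, 700.0, 300.0]})

# Join facts to dimensions, then aggregate along any dimension attribute.
joined = (fact_sales.merge(dim_product, on="product_id")
                    .merge(dim_date, on="date_id"))
print(joined.groupby(["category", "quarter"])["amount"].sum())
```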

Microsoft, the newest entrant in the in-memory BI game, has also started marketing its in-memory PowerPivot solution as an alternative to OLAP, effectively admitting that it has given up on its Analysis Services as a viable solution for the wider mid-market.

See: QlikTech (QlikView), Microsoft (PowerPivot)

The SaaS/Cloud BI Hype
The SaaS/Cloud hype hasn’t skipped over the BI space, though running BI in the cloud does not dramatically change anything with respect to implementation time and/or complexity. In fact, cloud BI vendors use the same technologies that are widely used on-premises. There are several startup companies in this space, competing for niche markets. It’s still hard to tell what impact the cloud will have on the BI space as a whole, as none of these companies has yet proven there’s even a viable business in hosting BI in the cloud. One thing is certain, though: these companies cannot rely on in-memory technology to grow significantly. The costs of hardware and the amount of work required to support the number of customers they would need to thrive are prohibitive, to say the least. For more on the problem with cloud BI, see my earlier post, Would I Use Cloud Business Intelligence?

See: GoodData, YouCalc, Birst, PivotLink, Indicee

ElastiCube: Convergent Technologies for an Optimum Solution
ElastiCube technology was officially introduced to the market in late 2009, after more than five years of research and development conducted in complete secrecy. After being proven practical and effective in the real world (through successful implementations at over 100 paying customers across numerous industries, from startups to multinational corporations), SiSense secured a $4 million investment to continue developing the ElastiCube technology and to expand awareness of the Prism Business Intelligence product that is based on it.

ElastiCube is the result of thoroughly analyzing the strengths and weaknesses of both OLAP and in-memory technologies, while taking into consideration the off-the-shelf hardware of today and tomorrow. The vision was to provide a true alternative to OLAP technology without compromising on the fast development cycles and query response times for which in-memory technologies are lauded. This would allow a single technology to be used in BI solutions of any scale, in any industry.

Here are the 10 main goals on which SiSense focused when designing the ElastiCube technology:

  1. A data warehouse must not be assumed to exist for effectively querying multiple sources.
  2. A star schema must not be assumed to exist for effectively querying large amounts of data.
  3. The solution must provide unlimited scalability, both in terms of number of rows and number of fields, within a finite and reasonable amount of RAM.
  4. The solution must be able to operate using off-the-shelf hardware, even for extreme data/user scenarios.
  5. The solution must provide high-speed, out-of-the-box query performance, without requiring pre-calculations.
  6. There must be a separation between the application layer and the physical data layer via a virtual metadata layer.
  7. There must be support for a dimensional model and multidimensional analysis.
  8. The same application must be able to support a single user with a laptop to thousands of users via a central, server-based data repository.
  9. An SQL layer must be available to conform to industry standards, without requiring an SQL database to run underneath.
  10. The solution must offer the ability to incorporate additional/changed data (e.g., new rows, new fields) on the fly, without reprocessing the entire data model.

The currently available version of Prism, based on ElastiCube technology, delivers on every one of these requirements. Even though it would be a lot of fun for me, I unfortunately can’t delve into the nuts and bolts of how these goals are technologically achieved. What I can say is that ElastiCube utilizes columnar storage concepts as well as just-in-time in-memory query processing technology. If you want to read a little more about it, take a look at SiSense’s ElastiCube technology page.
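
Since SiSense does not publish the internals, the following is only a conceptual sketch of the two ideas named above (columnar storage combined with pulling columns into memory just in time for a query), built on my own simplifying assumptions; it is not a description of how ElastiCube is actually implemented:

```python
# Conceptual sketch only: store each column separately "on disk" and load
# into RAM just the columns a query touches, when it touches them.
# This is NOT ElastiCube's implementation, only an illustration of the
# columnar + just-in-time in-memory idea described in the text.
import array

class ColumnStore:
    def __init__(self):
        self._on_disk = {}   # column name -> serialized bytes (stand-in for a file)
        self._in_ram = {}    # column name -> array loaded on demand

    def write_column(self, name, values):
        # Each column is stored separately, so queries read only what they need.
        self._on_disk[name] = array.array("d", values).tobytes()

    def _load(self, name):
        # Just-in-time: a column enters RAM only when a query touches it.
        if name not in self._in_ram:
            col = array.array("d")
            col.frombytes(self._on_disk[name])
            self._in_ram[name] = col
        return self._in_ram[name]

    def total(self, name):
        return sum(self._load(name))

store = ColumnStore()
store.write_column("sales", [120.0, 150.0, 80.0])
store.write_column("cost", [60.0, 70.0, 40.0])   # never loaded below
print(store.total("sales"))                      # only "sales" enters RAM
```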

I can add that the feasibility of ElastiCube was greatly aided by the amazing CPU and disk technologies that now come with any run-of-the-mill personal computer.

ElastiCube is an extremely powerful technology that enables speedy implementation of individual, workgroup and corporate-wide BI. As a solution that delivers the promise of OLAP-style BI without the cost, time and IT overhead of OLAP, it is no surprise that Prism is rapidly gaining popularity in the market. Businesses that use ElastiCube technology include household names such as Target, Yahoo, Cisco, Samsung, Philips and Caterpillar. But a large portion of the businesses that use ElastiCube are much smaller, such as Wix and other startup companies that otherwise could not afford BI at all.

See: SiSense (Prism)

More Stories By Elad Israeli

Elad Israeli is co-founder of business intelligence software company, SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own, unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s unique BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
