
Microservices Expo: Blog Feed Post

The Catch 22 of Traditional Business Intelligence

Business intelligence vendors have been constantly rolling out new functionality and technology through the years

So much has already been said about how much of a pain business intelligence is. The complexity, the constant IT bottlenecks, the crazy cost of software, hardware, consultants and whatnot. Gil Dibner of Gemini Venture Funds (formerly of Genesis Partners) described it very eloquently and in great detail in his blog post about the SiSense investment round.

Since business intelligence imposes so many challenges, every BI vendor picks its favorite ones and positions itself as the best at addressing them. Some focus on providing easy-to-use front-end tools for the business user, some on handling complex ETL scenarios and large data sets, others on open source software that removes licensing costs, and so on.



Business intelligence vendors have been constantly rolling out new functionality and technology through the years. Yet business intelligence seems to be standing still. No progress has been made in expanding it to the wider market that can't afford long and costly development/customization cycles. In fact, most of the BI vendors that do not sell enterprise-class solutions like SAP, IBM or Microsoft haven't been able to grow much and remain focused on niche markets.

Well, my friends, it's time somebody told you the truth.

Business intelligence can deliver on its promise, but the entire idea needs a complete overhaul. As long as vendors keep improving specific junctions within the traditional BI paradigm, no progress will be made. The traditional business intelligence paradigm needs to be scrapped and replaced by something that is humanly manageable.

Why? Because the traditional paradigm contains an inherent flaw that prevents it from taking BI to the next level, where ROI is indisputable and business users in companies of all (or most) sizes gain another powerful tool in their arsenal.

The Inherent Flaw in the Traditional BI Paradigm
If you search "why business intelligence projects fail" on Google, you will find an abundance of white papers and articles (mostly written by BI vendors themselves) giving their two cents' worth. When BI vendors pick their top reasons, they usually pick issues their offerings deal with and the competition's don't. Marketing 101. Fair enough.

But the one top reason they all seem to agree on for a BI project's failure is a lack of up-front planning. That is to say, for a business intelligence project to succeed, you must compile your requirements ahead of time, coordinate with all the relevant parties (IT, business departments and executives) and plan the project in accordance with those requirements. Otherwise, you are destined to fail.

In other words, they blame you - the consumer - for a failed BI project. Had you planned ahead, the project would have been a success and you wouldn't have flushed hundreds of thousands of dollars worth of software licenses, hardware and personnel time down the toilet.

Sadly, they have a point. Since traditional BI solutions aren't very forgiving of unplanned changes from an architectural point of view, anything you don't think of in advance is a real pain to introduce later. So you'd better think long and hard about what you need; the requirements you miss could mean the difference between a successful project and a complete mess.

But herein lies the catch.

It doesn't matter who you are or how much experience you have: it is utterly impossible to know in advance what your requirements are when it comes to BI. BI is highly dynamic, and requirements change all the time because the business changes all the time. A report you need now is not the report you need later; an analysis you do now is only relevant for a short period of time and meaningless shortly after.

Most importantly, if you are a department or company seeking BI but have no BI development experience, you have no way of knowing how a particular requirement will affect the architecture of your solution. Thus, you could easily skip testing some particular capability up front because it seems trivial, only to discover later that the entire solution comes tumbling down when you actually try to use it, and that without it the system is useless.

You cannot imagine how often this happens, especially when a solution calls for OLAP cubes built over a data warehouse (bleh).

The traditional BI vendors made up the rules for this game over 10 years ago. They are the ones who've been aggressively promoting a paradigm where everything needs to be thought of in advance or you are sure to fail. It makes sense, because these vendors focus on enterprise-wide BI for Fortune 500s, where the complexity of a BI project is masked by the complexity of the corporation's own business processes. These organizations are used to things taking years to reach perfection, because pretty much every other process they have takes just as long.

But applying the same concepts to even slightly smaller companies is exactly why most BI projects fail.

Don't get me wrong. It's always good to plan ahead. But know this: business intelligence requirements are impossible to predict, and nearly impossible to validate until end users work with the system on real data, in real-life scenarios, over time.

You cannot do this with traditional BI without investing a TON beforehand, and even then you have no guarantees. When you go for BI as advocated by the traditional platform players, you are basically throwing hundred dollar bills down a wishing well and hoping for the best.

Learn from the thousands and thousands of companies who have already learned this harsh lesson in blood and tears. Don't do it. There are ways to change the rules of the game while still getting the same class of business intelligence, without compromising on capability or functionality. But you cannot expect to find them by turning to the traditional BI players, who have an oversized BI developer ecosystem they need to keep busy. This can only be done by younger, innovative BI companies armed with new technologies, fresh ideas and sensible pricing models.


More Stories By Elad Israeli

Elad Israeli is co-founder of business intelligence software company, SiSense. SiSense has developed Prism, a next-generation business intelligence platform based on its own, unique ElastiCube BI technology. Elad is responsible for driving the vision and strategy of SiSense’s unique BI products. Before co-founding SiSense, Elad served as a Product Manager at global IT services firm Ness Technologies (NASDAQ: NSTC). Previously, Elad was a Product Manager at Anysoft and, before that, he co-founded and led technology development at BiSense, a BI technology company.
