Case Study: Open Source + Business Intelligence

A marriage made for data-driven businesses

Data-driven businesses face tough challenges in today's rapidly changing information landscape. As decision cycles continue to shrink, companies need to act on information within minutes or hours rather than days or weeks. At the same time, the volume of data that needs to be analyzed is growing exponentially. Business intelligence (BI) approaches that made sense a decade or even five years ago may no longer fit organizations that must quickly and affordably make sense of terabytes of incoming data whose growth shows no sign of slowing.

For my company, MX Force, speedy data analysis is not simply a "nice to have"; it's critical to our business. As a cloud-based provider of email security for organizations of all sizes, we need to identify the origins of spam, viruses and other potential threats for our clients, fast. But as our business has grown, so has the volume of email log data that we must store, filter, search, analyze and report on. Recently, we were challenged to find a database that could reliably support quick and efficient ad-hoc queries on up to a year's worth of email log data. Our staff uses this data to analyze and report on statistical information, and we also give our clients the ability to query their own logs to diagnose mail delivery issues. It was important to find a database that could deliver the high performance we required, but affordability and ease of administration were also vital concerns. These considerations prompted us to seek an open source solution.

Open Source Meets Business Intelligence
MX Force uses a number of open source tools within our organization. The low cost of open source is one reason for this, but flexibility is another important driver. Because open source projects are community-driven, users can tweak, customize and tinker with the software as much as they like. This is a big advantage when it comes to business intelligence, as data analysis requirements can change quickly, and you don't want to wait weeks or months to get a new query set up or to change the parameters of one that is already running. MX Force was already using MySQL in our business, so we decided to try Infobright's open source analytic database, ICE (Infobright Community Edition). ICE combines a columnar database with innovative compression and self-tuning capabilities that eliminate the need to create indexes, partition data or perform any other manual tuning to achieve fast response for queries and reports. The software is built on MySQL, so for us the implementation and training curve was very small: ICE uses the same familiar MySQL interface (a brief sketch of a typical table definition and load appears after the list below). The fact that ICE is an open source analytic solution presented us with several key benefits:

  1. Deployment speed: The time from download and installation to first production use was just three weeks.
  2. Affordability: Many of the proprietary commercial BI solutions available today require custom configuration, expensive licensing agreements and equally expensive hardware to support and run them. Not only was ICE free to install, but we could also run the software on inexpensive commodity servers, eliminating the need to invest in high-performance servers and storage arrays. (Our entire workload is supported by a single quad-core server.)
  3. Simplicity and flexibility: Because ICE is open and standards-based, we can quickly make changes as needed without requiring extensive IT assistance. In addition, it's often a lot simpler to make fixes or upgrade an open source solution because an entire community contributes their expertise to fixing bugs and making improvements. With proprietary software, users have to wait for issues to be addressed by the vendor, which can take much longer.
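
For readers who want a concrete picture, below is a minimal sketch of what setting up such a table can look like through the MySQL interface. The mail_log table, its columns and the file path are purely illustrative (not our actual schema), and BRIGHTHOUSE is the storage engine name ICE has historically registered with MySQL; note that no indexes or partitions are declared.

    -- Illustrative email-log table on ICE; table name and columns are hypothetical.
    -- Data is stored column-by-column and compressed automatically, so there
    -- are no indexes or partitions to define.
    CREATE TABLE mail_log (
        received_at  DATETIME,
        client_id    INT,
        source_ip    VARCHAR(45),
        direction    VARCHAR(10),    -- inbound / outbound
        verdict      VARCHAR(20),    -- clean / spam / virus
        message_size INT
    ) ENGINE=BRIGHTHOUSE;

    -- ICE is loaded in bulk from flat files; the path below is a placeholder.
    LOAD DATA INFILE '/var/log/mx/maillog-2011-05-01.csv'
    INTO TABLE mail_log
    FIELDS TERMINATED BY ',';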

MX Force is currently using ICE to quickly isolate mail flow problems and trends. In our experience, using a free, open source product has not meant compromising on performance or capabilities. We are achieving 10:1 data compression, which saves on storage costs and boosts performance. Most statistical queries return results in less than five seconds. Ongoing administration is simple. The net result is that the product delivers the fast query performance and reporting functionality we needed, at an incredibly low cost for hardware and ongoing maintenance.
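
To give a sense of what those statistical queries look like, here is a hypothetical ad-hoc query against the illustrative mail_log schema sketched earlier (not one of our production reports). Because a columnar store such as ICE reads only the columns a query touches, a scan like this stays fast across months of log data without any indexes.

    -- Hypothetical query: top spam-sending source IPs for one client over the
    -- last 30 days. Only the referenced columns are read from disk.
    SELECT source_ip, COUNT(*) AS spam_messages
    FROM mail_log
    WHERE client_id = 1042
      AND verdict = 'spam'
      AND received_at >= NOW() - INTERVAL 30 DAY
    GROUP BY source_ip
    ORDER BY spam_messages DESC
    LIMIT 20;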

Look, Then Leap
Interested in giving open source a try for your BI and analytics efforts? There are a number of compelling benefits to doing so, but as with any type of software, it's also important to look before you leap. Evaluation and testing considerations are no different from those for licensed software: you want to make sure the solution has the features and capabilities most relevant to your business. Also, there's a difference between open source projects that are at a very early and experimental stage and software that is well established and has a vibrant and involved community behind it, strong vendor support, or both. Investigate the support offered for the solution under consideration. How often are new features added? Are bug fixes made in a timely manner? Is there useful and accurate supporting documentation?

With ICE, we were certainly attracted by the many resources and the significant participation of both Infobright and the user community. We also knew there was a commercial version available if we decided we needed the additional functionality it offered or a formal support contract. For companies just jumping into the open source arena, it's best to avoid tools that haven't yet cultivated a strong following. But even if you do make a mistake, the low (and often zero) cost of open source means there's minimal risk.

The BI requirements of today's data-driven businesses demand speed, simplicity and affordability. As open source solutions continue to mature, it's worth looking at projects that are focused on analytics, BI and other data management activities. The more nimble and flexible approach embodied by open source may just be the best fit for addressing the many information management challenges driven by data growth and complexity.

More Stories By Mike Makowski

Mike Makowski is CTO of MX Force, a leading provider of email security in the cloud, and a member of Infobright’s Customer Advisory Council. More information about MX Force can be found at http://www.mxforce.com/
