Case Study: Open Source + Business Intelligence

A marriage made for data-driven businesses

Data-driven businesses face some tough challenges in today's rapidly changing information landscape. As decision cycles continue to shrink, companies need to act on information within hours or minutes rather than days or weeks. At the same time, the volume of data that needs to be analyzed is growing exponentially. Business intelligence (BI) approaches that made sense a decade or even five years ago may no longer be the best fit for organizations that must quickly and affordably make sense of terabytes of incoming data, with no sign of that growth slowing down.

For my company, MX Force, speedy data analysis is not simply a "nice to have"; it's critical to our business. As a cloud-based provider of email security for organizations of all sizes, we need to identify the origins of spam, viruses and other potential threats for our clients, fast. But as our business has grown, so has the volume of email log data that we must store, filter, search, analyze and report on. Recently, we were challenged to find a database that could reliably support quick and efficient ad-hoc queries on up to a year's worth of email log data. Our staff uses this data to analyze and report on statistical information, and we also give our clients the ability to query their own logs to diagnose mail delivery issues. It was important to find a database that could deliver the high performance we required, but affordability and ease of administration were also vital concerns. These considerations prompted us to seek an open source solution.

Open Source Meets Business Intelligence
MX Force uses a number of open source tools within our organization. The low cost of open source is one reason, but flexibility is another important driver. Because open source projects are community-driven, users can tweak, customize and tinker with the software as much as they like. This is a big advantage when it comes to business intelligence, as data analysis requirements can change quickly, and you don't want to wait weeks or months to get a new query set up or to change the parameters of those that are already running.

MX Force was already using MySQL in our business, so we decided to try Infobright's open source analytic database, ICE (Infobright Community Edition). ICE combines a columnar database with innovative compression and self-tuning capabilities that eliminate the need to create indexes, partition data or perform any other manual tuning to achieve fast response for queries and reports. The software is built on MySQL, so for us the implementation effort and learning curve were very small - ICE uses the same familiar MySQL interface (a brief loading sketch follows the list below). The fact that ICE is an open source analytic solution presented us with several key benefits:

  1. Deployment speed: The time from download and installation to first production use was just three weeks.
  2. Affordability: Many of the proprietary commercial BI solutions available today require custom configuration, expensive licensing agreements and equally expensive hardware to support and run them. Not only was ICE free to install, we could also run the software on inexpensive commodity servers, eliminating the need to invest in high-performance servers and storage arrays. (Our entire workload is supported by a single quad-core server.)
  3. Simplicity and flexibility: Because ICE is open and standards-based, we can quickly make changes as needed without extensive IT assistance. In addition, it's often much simpler to fix or upgrade an open source solution because an entire community contributes its expertise to fixing bugs and making improvements. With proprietary software, users have to wait for issues to be addressed by the vendor, which can take much longer.
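
To make the "familiar MySQL interface" point concrete, here is a minimal loading sketch. The schema, credentials, file path and even the BRIGHTHOUSE engine name are illustrative assumptions (the engine name matches ICE releases of that era; check your installed version's documentation), not our actual setup, and any standard MySQL client - here, Python with mysql-connector - can issue the same statements.

```python
# Illustrative sketch: creating and bulk-loading an ICE table through the
# standard MySQL interface. All names, paths and credentials are hypothetical.
import mysql.connector

conn = mysql.connector.connect(
    host="127.0.0.1",
    user="analytics",
    password="secret",
    database="maillogs",
)
cur = conn.cursor()

# Columnar table for raw email log events; no indexes or partitions are
# declared because ICE tunes itself as data is loaded.
cur.execute("""
    CREATE TABLE IF NOT EXISTS mail_events (
        event_time  DATETIME,
        client_id   INT,
        sender      VARCHAR(255),
        recipient   VARCHAR(255),
        source_ip   VARCHAR(45),
        verdict     VARCHAR(16),   -- e.g. 'clean', 'spam', 'virus'
        size_bytes  INT
    ) ENGINE=BRIGHTHOUSE
""")

# Bulk-load one day's worth of delimited log data from a server-side file.
cur.execute("""
    LOAD DATA INFILE '/var/log/mx/mail_events_2011-06-01.csv'
    INTO TABLE mail_events
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
""")
conn.commit()
cur.close()
conn.close()
```

ICE is built around bulk loads like this rather than row-by-row inserts, which suits append-only log data well.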

MX Force is currently using ICE to quickly isolate mail flow problems and trends. In our experience, using a free, open source product has not in any way involved a compromise on performance or capabilities. We are achieving 10:1 data compression, which saves on storage costs and boosts performance. Most statistical queries render results in less than five seconds. Ongoing administration is simple. The net result is that the product delivers the fast query performance and reporting functionality we needed, at an incredibly low cost for hardware and ongoing maintenance.
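
For illustration only, here is the general shape of the statistical and diagnostic queries described above, run against the hypothetical mail_events table sketched earlier: the top spam-originating IPs for one client over the past 30 days. The table, column names and client ID are assumptions, not our production schema.

```python
# Illustrative ad-hoc aggregate query against the hypothetical mail_events
# table, issued through the ordinary MySQL interface that ICE exposes.
import mysql.connector

conn = mysql.connector.connect(
    host="127.0.0.1", user="analytics", password="secret", database="maillogs"
)
cur = conn.cursor()
cur.execute("""
    SELECT source_ip,
           COUNT(*)        AS spam_messages,
           SUM(size_bytes) AS total_bytes
    FROM mail_events
    WHERE client_id = %s
      AND verdict = 'spam'
      AND event_time >= NOW() - INTERVAL 30 DAY
    GROUP BY source_ip
    ORDER BY spam_messages DESC
    LIMIT 20
""", (42,))  # 42 is a made-up client ID

for source_ip, spam_messages, total_bytes in cur.fetchall():
    print(f"{source_ip}: {spam_messages} spam messages, {total_bytes} bytes")

cur.close()
conn.close()
```

Because a columnar, compressed store only has to read the handful of columns an aggregate like this touches, sub-five-second responses on large log tables are plausible without any indexes having been defined.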

Look, then Leap
Interested in giving open source a try for your BI and analytic efforts? There are a number of compelling benefits to doing so, but as with any type of software, it's also important to look before you leap. Evaluation and testing considerations are no different than they would be for licensed software - you want to make sure the solution has the features and capabilities most relevant to your business. Also, there's a difference between open source projects that are at a very early and experimental stage and software that is well established and has a vibrant and involved community behind it, strong vendor support, or both. Investigate the support offered for the solution under consideration. How often are new features added? Are bug fixes made in a timely manner? Is there useful and accurate supporting documentation?

With ICE, we were certainly attracted by the many resources and the significant participation of both Infobright and the user community. We also knew there was a commercial version available if we decided we needed the additional functionality it offered or a formal support contract. For companies just jumping into the open source arena, it's best to avoid tools that haven't yet cultivated a strong following. But even if you do make a mistake, the low (often zero) cost of open source means there's minimal risk.

The BI requirements of today's data-driven businesses demand speed, simplicity and affordability. As open source solutions continue to mature, it's worth looking at projects that are focused on analytics, BI and other data management activities. The more nimble and flexible approach embodied by open source may just be the best fit for addressing the many information management challenges driven by data growth and complexity.

About the Author

Mike Makowski is CTO of MX Force, a leading provider of cloud-based email security, and a member of Infobright's Customer Advisory Council. More information about MX Force can be found at http://www.mxforce.com/
