Why Your Analytics Should Be Hosted

It’s become increasingly clear that Big Data is transforming the business landscape

It's become increasingly clear that Big Data, and the tools for manipulating, visualizing and analyzing it, are transforming the business landscape. McKinsey released a report in 2011 projecting 40 percent annual growth in the data generated globally. This is all well and good, but more and more companies are finding that their toolbox for dealing with all of this data is antiquated and confusing.

Indeed, 58 percent of enterprise decision makers surveyed in March 2012 by DataXu felt they lacked the skills and technology required for marketing analytics. Marketers should be champing at the bit to put the data they have to productive use. Successful marketing requires proper segmentation of the customer base to create more targeted campaigns. Real-time insight into the performance of existing campaigns, along with a clear grasp of where to redirect efforts, can also turn a campaign that would have failed into a success. These are the promises made by the drivers of the current "data movement." The unfortunate reality, however, is that the accumulation of data just adds to an organization's costs as it struggles merely to store the incoming torrent of data, let alone harness it and allow non-technical individuals to explore and understand it.

Luckily, this isn't the first time an industry has experienced this type of problem. The data movement follows the same arc as many before it: it starts out as a niche interest for a select few and eventually grows into a commoditized marketplace that competes on usability and ease of access.

Of all metaphors to pick for this process, the restaurant is an apt one. Cooking is something everyone can do. Mix up some batter, put it on a hot skillet, and you'll get pancakes. Add some eggs and a glass of orange juice and you've either got your brain on drugs or a complete breakfast. You can also go to your local IHOP and order the same thing. If you make it yourself, you know everything that's in it and can control the various aspects of the meal. But you also have to deal with acquiring the ingredients, having the facilities to cook, and doing the cleanup. If you go to a restaurant, all you have to do is show up, tell them what you want, and pay.

Similarly, the analytics space has two types of offerings. You can choose to do it yourself or you can use a hosted service to take care of things for you. As with cooking versus going to a restaurant, there are costs and benefits associated with both, but my biased opinion is that a hosted solution is the best choice for tackling the current influx of data.

Economies of Scale
Restaurants provide the benefits of economies of scale to their patrons, allowing customers to consume and enjoy foods that they normally wouldn't be able to at home. High-quality tuna is rather expensive and generally comes in quantities that no individual person could ever consume before it goes bad. Yet, you can go to a sushi restaurant and get various parts of the fish. This is economies of scale in action. The restaurant can afford to put down a significant sum of money to acquire the whole tuna and resell it in pieces to its patrons.

Hosted analytics presents a similar case. A hosted analytics provider is able to pay more money upfront for hardware than any one of its customers would. The reality of data processing is that there are physical limits to how much data a computer can process in a given amount of time, and that limit can only be raised with more and better hardware.

Because it serves multiple users, a hosted system is incentivized to provision enough machines to answer questions quickly. The compute resources are only required for the duration of a query against the system: the faster a query gets answered, the sooner those resources are freed up to answer someone else's query. Responding to queries fast enough to free up resources for the next one is, in fact, the only way to achieve high levels of concurrency. Because the hosted provider is building its business on the idea that multiple customers will share the same infrastructure, it has to support more than one query at a time and is thus naturally forced to give its users a faster querying experience. Economies of scale work to the users' advantage.
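To put a rough number on that reasoning, here is a back-of-the-envelope sketch in Python. The machine counts and query times are invented for illustration, and the relationship it uses is simply Little's Law (in-flight queries equal arrival rate times average query duration), not any particular provider's sizing model.

    # Back-of-the-envelope sketch of why faster queries buy more concurrency.
    # All numbers are illustrative assumptions, not measurements of any real system.

    machines = 100                            # fixed hardware pool the provider has paid for
    slots_per_machine = 8                     # queries one machine can work on at once
    capacity = machines * slots_per_machine   # queries the pool can hold "in flight"

    for avg_query_seconds in (30.0, 5.0, 1.0):
        # Little's Law: in-flight queries = arrival rate * query duration,
        # so the highest sustainable arrival rate is capacity / duration.
        max_rate = capacity / avg_query_seconds
        print(f"{avg_query_seconds:>4.0f}s per query -> "
              f"~{max_rate:,.0f} queries/sec before the pool saturates")

Under these made-up numbers, shaving a query from 30 seconds to 1 second lets the same hardware absorb roughly 30 times the query traffic, which is exactly the incentive a multi-tenant provider has and a single-tenant installation does not.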

Integration of Diverse Data Streams
Another benefit of hosted analytics systems is that they can provide overnight integration with other data sets, both public and private. Taking this back to the restaurant analogy, restaurants add new items to their menu on a regular basis. If they find a supplier that will give them Alaskan king crab for the same price as a lesser form of crab, patrons will all of a sudden start eating better crab without having even known it was coming. The hosted analytics case is similar in that users can take advantage of new data sets that the provider has integrated.

Consider the following scenario. A marketer might normally have access to customer profile and engagement information through their analytics system. Companies like Amazon Web Services offer up public data sets covering the human genome, the U.S. Census Bureau, and Wikipedia. If a hosted analytics company integrates a public data set like one of these, it can then expose it to all of its clients. This means that if the hosted offering has 1,000 customers and only one of them asks for the integration, the other 999 get that same integration overnight without ever asking for it. All of the participants reap the benefits of having more data sets available. By overlaying these various data streams, marketers can learn more about their customers and their behavior in order to better target their campaigns. This is just one more way hosted offerings help companies get the most value out of their data.
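As a sketch of what overlaying a public data set on private data might look like, the snippet below joins a hypothetical customer engagement table against an equally hypothetical extract of census data by ZIP code. The file names, column names, and use of pandas are assumptions made for illustration, not the workflow of any specific hosted provider.

    # Illustrative sketch: enriching private engagement data with a public data set.
    # File and column names are hypothetical.
    import pandas as pd

    customers = pd.read_csv("customer_engagement.csv")  # e.g., customer_id, zip_code, visits, revenue
    census = pd.read_csv("census_by_zip.csv")           # e.g., zip_code, median_income, population

    # Overlay the public data stream on the private one.
    enriched = customers.merge(census, on="zip_code", how="left")

    # One possible targeting question: how does revenue vary with median income?
    by_income = enriched.groupby(pd.cut(enriched["median_income"], bins=5))["revenue"].mean()
    print(by_income)

The point is less the three lines of code than the fact that, with a hosted provider, the census table on the right side of that join simply shows up one morning, already cleaned and keyed, for every customer at once.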

Useful Analytics
Analytics are only good if they are understandable and actionable, just as restaurants are only good if their food is edible and delicious. There are thousands of ingredients that could be mixed in with fried eggs, but some will taste delicious and some will just result in an inedible concoction. As patrons of many restaurants, we often come to a consensus on what various restaurants do well, personal taste notwithstanding. This knowledge can be employed to eat only the best meals. The same mechanism of collective understanding will play itself out in the hosted analytics space.

Any company that provides hosted analytics to a variety of businesses wants to give its customers only the most useful analytical metrics and functionality. Marketers may not have the specific training to pinpoint exactly which analysis methods to leverage for maximal effect. That's where the multi-tenant nature of hosted analytics works to your benefit. The hosted analytics provider will be sensitive to which of its tools are providing the most value across its entire customer base. In other words, the individual customers together form a collaborative filter that casts aside the less useful analytics features in favor of those that yield valuable insights. As with the integration of public data sets, this filtering mechanism ensures that benefits cascade throughout the entire system of analytics users. Even features that do not seem immediately relevant to your company's success work this way: by the time your company turns that corner in its business growth, the hosted provider already knows the kinds of analysis you'll find yourself needing and has the tools available. Newcomers to the platform are thus quickly able to reap the benefits of an analytical toolset that has been vetted by the crowd.
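A minimal sketch of that collaborative-filter idea, assuming the provider keeps a log of which features each tenant actually uses (the event format and feature names below are invented), might simply rank features by how many distinct customers rely on them:

    # Toy sketch of cross-tenant feature vetting; the usage log format is invented.
    from collections import defaultdict

    usage_events = [
        ("tenant_a", "cohort_analysis"),
        ("tenant_a", "funnel_report"),
        ("tenant_b", "cohort_analysis"),
        ("tenant_c", "cohort_analysis"),
        ("tenant_c", "geo_heatmap"),
    ]

    # Count how many distinct tenants touch each feature.
    tenants_per_feature = defaultdict(set)
    for tenant, feature in usage_events:
        tenants_per_feature[feature].add(tenant)

    ranked = sorted(tenants_per_feature.items(), key=lambda kv: len(kv[1]), reverse=True)
    for feature, tenants in ranked:
        print(f"{feature}: used by {len(tenants)} of 3 tenants")

A self-hosted tool only ever sees one tenant's usage, so it has no equivalent signal for deciding which features to invest in.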

In the past few years, Big Data has exploded in importance. Marketers must learn how to draw useful, actionable insights from the mass of data at hand in order to create a competitive advantage for their companies. Hosted analytics systems will prove themselves a staple choice for deciphering the ever-growing amounts of data that companies have to deal with, just as restaurants are a ubiquitous presence in our daily lives.

In closing, we can stretch the restaurant metaphor just a little bit more. In both a restaurant and a home kitchen, there's an able cook who knows how to turn raw ingredients into a delicious meal. Similarly, the future still includes analysts who understand the intricacies of your business. You will, however, achieve much more efficient use of your analyst's time by leveraging the benefits of a hosted analytics provider: improved performance, "free" integration of external data sets, and collaborative vetting of the analytical feature set.

More Stories By Eric Tschetter

Eric Tschetter is the lead architect at Metamarkets, a leader in big data analytics for web-scale companies. Follow Metamarkets on Twitter @Metamarkets and learn more at www.metamarkets.com.
