Innovative Analytics - Changing the IT Landscape

How log analysis is transforming how products are monitored

A common theme heard in the IT marketplace is that innovative technologies are enabling new insights to be harvested from data. Cloud computing provides the infrastructure that allows even smaller organizations to benefit from these insights. As cloud computing becomes mainstream, new partnerships are forming to deliver analytics driven by these new data sources. GE and Google recently announced a partnership for the utility business that integrates Google map functions with GE's geospatial analytical tools to improve visualization. Any way you look at it, integrating consumer and industrial assets with speed and ease is game changing.

Recently, I attended the Strata and IBM Pulse 2013 conferences, two quite different events that nonetheless shared common analytics themes. The Strata Conference is independently organized and focused on data analytics, while IBM Pulse, run by the company's Tivoli organization, primarily targets systems management.

Pulse 2013 focused more attention on analytics than last year's event. While the Strata Conference featured start-ups demonstrating interesting analytic products, IBM is looking to make analytics a core competency by embedding analytics into its current solutions. Speaking at IBM's recent investor meeting, CEO Ginni Rometty projected $20B in analytics and big data revenues by 2015. To meet this goal, all IBM brands seem to be adding analytic capabilities.

One analytics area I found interesting at both Strata and IBM Pulse was solutions focused on system logs. Computer data logging is a common practice that keeps a history of administrator activities and supports problem diagnosis. After an outage, technicians thoroughly analyze the logs, scour them for the root cause, and then take appropriate steps to fix the malfunctioning components.
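To make that concrete, here is a minimal, purely illustrative sketch of the first step in that kind of root-cause hunt: filtering a log for error entries that fall inside a suspected outage window. The file name, timestamp format, and log layout are assumptions for illustration, not details of any product mentioned in this article.

    # Minimal illustration: pull ERROR entries from a plain-text log
    # that fall inside a suspected outage window.
    from datetime import datetime

    LOG_FILE = "app.log"                       # hypothetical log file
    TS_FORMAT = "%Y-%m-%d %H:%M:%S"            # assumed timestamp format
    window_start = datetime(2013, 3, 4, 2, 0)  # start of suspected outage
    window_end = datetime(2013, 3, 4, 3, 0)    # end of suspected outage

    with open(LOG_FILE) as log:
        for line in log:
            # Assumed layout: "2013-03-04 02:17:45 ERROR component message..."
            try:
                stamp = datetime.strptime(line[:19], TS_FORMAT)
            except ValueError:
                continue                       # skip lines without a leading timestamp
            if window_start <= stamp <= window_end and " ERROR " in line:
                print(line.rstrip())

This manual approach stops scaling once thousands of components emit logs in different formats, which is exactly the gap the products discussed below aim to fill.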

The number of interconnected components in a cloud computing infrastructure significantly increases the volume and variety of log data produced, as well as its importance. Traditional log file solutions leveraged legacy database technologies that cannot process data arriving at today's variety and velocity in a timely and cost-effective manner. At the Strata Conference, Glassbeam and Splunk offered interesting solutions; and at Pulse, IBM released a new product capability in its SmartCloud portfolio that analyzes logs, events and metrics in an integrated manner. Let's discuss each in turn.

  • Glassbeam's goal is to provide a platform to help companies get intelligence from machine data. Their partnerships with LogiXML, OpSource (now part of Dimension Data) and Vertica (now part of HP) give them the capability to quickly assemble an innovative solution. Capabilities that focus on customer intelligence, product engineering and service revenue will be of interest to any organization looking to extract intelligence and value from log data.
  • Splunk surprised a lot of folks when their stock price doubled on the first day of trading after they went public, offering insight into investors' thinking about Splunk's capabilities and the value of log analytics to customers. While Glassbeam is a niche, vertical solution, Splunk is a broad solution that covers the gamut of logs, from call centers to click streams.
  • IBM is a late entrant in developing a core log-analytics product that leverages new technologies. Their new product is promising, as it integrates competencies from multiple software divisions as well as the recent Vivisimo acquisition. This combination brings the challenge of integration and installation, while allowing the new product to draw on the strengths of multiple best-of-breed products. By integrating systems manuals as an additional source for identifying specific problems, this new workload analytics capability will be a good addition for existing and new IBM Tivoli customers.

Application logs have offered ways to diagnose system problems for a long time. Analyzing logs from multiple operational processes is a newer challenge, and technologies like Hadoop provide an excellent way to act proactively on the intelligence gained. As these technologies mature, technology giants will be forced to introduce new products to meet customer needs.
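As a rough sketch of how Hadoop fits into this picture, the following hypothetical Hadoop Streaming-style mapper and reducer count error events per component across a large body of logs. The log layout and field positions are assumptions chosen for illustration; Hadoop Streaming itself simply pipes lines through scripts like these and sorts the mapper output by key before the reduce phase.

    #!/usr/bin/env python
    # Hypothetical Hadoop Streaming-style job: count ERROR events per component.
    # Both phases read stdin and emit tab-separated key/value pairs, which is
    # how Hadoop Streaming wires arbitrary scripts into a MapReduce job.
    import sys

    def mapper():
        for line in sys.stdin:
            parts = line.split()
            # Assumed layout: date time LEVEL component message...
            if len(parts) >= 4 and parts[2] == "ERROR":
                print(f"{parts[3]}\t1")

    def reducer():
        # Hadoop sorts mapper output by key, so equal keys arrive contiguously.
        current, total = None, 0
        for line in sys.stdin:
            key, _, value = line.partition("\t")
            if key != current:
                if current is not None:
                    print(f"{current}\t{total}")
                current, total = key, 0
            total += int(value)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        mapper() if "map" in sys.argv[1:] else reducer()

The same counting logic could just as easily run inside Splunk or IBM's tooling; the point is simply that commodity frameworks now make this kind of cross-component aggregation routine.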

The ability to create new products rapidly, demonstrated by newcomers like Glassbeam and Splunk, is changing the IT landscape. Pulse 2013 exhibited legacy product improvements along with new products that bundle multiple capabilities from a diverse portfolio to address customer demand. Existing technology vendors need to quickly build their own products or acquire companies that will allow them to fill this void in their portfolios. Predictive analytics from log data is a good example of how businesses can add significant value for their customers.

This post was first published on Robustcloud.com. Republished with permission.

More Stories By Larry Carvalho

Larry Carvalho runs Robust Cloud LLC, an advisory services company helping various ecosystem players develop a strategy to take advantage of cloud computing. As the 2010-12 Instructor of Cloud Expo's popular Cloud Computing Bootcamp, he has led the bootcamp in New York, Silicon Valley, and Prague, receiving strong positive feedback from attendees about the value gained at these events. Carvalho has facilitated all-day sessions at customer locations to set a clear roadmap and gain consensus among attendees on strategy and product direction. He has participated in multiple discussion panels focused on cloud computing trends at information technology events, and he has delivered all-day cloud computing training to customers in conjunction with CloudCamps. To date, his role has taken him to clients on three continents.
