Data Optimization in Cloud Storage

The crucial technology that will provide service providers with the greatest cost reduction is data optimization

The promise of "the cloud" is that cloud storage delivers seamless, "just in time" scalability to handle growth and respond quickly to peak loads. The economics of cloud storage also make a compelling financial case in today's budget-constrained IT environments. For the IT consumer, shifting what was a fixed capital expense to a variable operating expense is financially attractive. The ability to consume storage "just in time," rather than buying it in advance against a predictive model, further reduces CAPEX and strengthens an already strong value proposition for adopting cloud storage.

IDC forecasts that cloud-based storage will represent a $15.6B market by 2015, with a compound annual growth rate (CAGR) of 32%. IDC also predicts that 10,000 service providers will focus on cloud storage with an emphasis on data protection. The economics of this market transition will continue to evolve and accelerate as service providers optimize the cost of delivering cloud services and become more efficient, ever mindful of plant/facility costs, operating expenses and business margins.

The crucial technology that will provide service providers with the greatest cost reduction is data optimization, which combines deduplication, compression and thin provisioning to maximize storage efficiency. Data optimization is among the most significant innovations in the storage industry, enabling organizations to store more information in a smaller physical footprint. Nowhere does the fundamental concept of storing less data change the economics more dramatically than for providers of cloud storage infrastructure.
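
To make the concept concrete, here is a minimal, hypothetical sketch of two of the three techniques, block-level deduplication and compression: data is split into fixed-size blocks, each block is identified by a content hash so duplicates are stored only once, and unique blocks are compressed before they are written. The block size, hash choice and in-memory store are illustrative assumptions, not any particular vendor's implementation.

    import hashlib
    import zlib

    BLOCK_SIZE = 4096  # illustrative fixed block size
    store = {}         # content hash -> compressed block (stand-in for disk)

    def write(data: bytes) -> list:
        """Deduplicate and compress data; return its block 'recipe'."""
        recipe = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            if digest not in store:            # duplicate blocks stored once
                store[digest] = zlib.compress(block)
            recipe.append(digest)
        return recipe

    def read(recipe) -> bytes:
        """Rebuild the original data from its block recipe."""
        return b"".join(zlib.decompress(store[h]) for h in recipe)

    # Ten identical 4 KB files consume roughly one compressed block of space.
    recipes = [write(b"A" * BLOCK_SIZE) for _ in range(10)]
    logical = 10 * BLOCK_SIZE
    physical = sum(len(b) for b in store.values())
    print(f"logical {logical} bytes -> physical {physical} bytes")

Thin provisioning, the third technique, is complementary: capacity is allocated only when data is actually written rather than reserved up front.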

When applied to the cloud, data optimization is a disruptive technology, increasing storage efficiency by 5X to 35X. In cloud storage environments the impact is simple and compelling: for a data mix that yields a 5X boost in storage efficiency, media costs drop to 20% of what they previously were. In addition, reductions in floor space, power, cooling and manpower all significantly improve operating efficiency. With a clearly differentiated value proposition, the disruptive nature of data optimization enables a service provider to grow its business by gaining market share at competitors' expense.
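
The arithmetic behind that claim is straightforward. The sketch below, using an assumed and purely illustrative media price, shows how physical capacity and cost fall as the data reduction ratio rises:

    RAW_COST_PER_TB = 100.0  # assumed media cost in $/TB; illustrative only
    LOGICAL_TB = 1000        # logical data the provider must serve

    for ratio in (1, 5, 20, 35):
        physical_tb = LOGICAL_TB / ratio
        cost = physical_tb * RAW_COST_PER_TB
        print(f"{ratio:>2}X reduction: {physical_tb:7.1f} TB physical, "
              f"${cost:,.0f} ({100 / ratio:.1f}% of unoptimized cost)")

At 5X the provider stores 200 TB instead of 1,000 TB and pays 20% of the unoptimized media cost; at 35X the figure falls below 3%.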

Data optimization's impact on storage consumption in the cloud or in the data center yields operating efficiency and critical business advantages:

  • Reduced capital expense - in any storage environment, media is a substantial expense. Disk drives, and now SSDs, are a significant cost because they must be available in anticipation of demand. Reducing the cost to store data has a direct impact on the IT expense budget's bottom line; applied here, data optimization drives down capital storage costs by 80% or more.
  • Data center operating efficiency - with a 5X increase in storage efficiency, the cloud or data center's requirements decrease proportionately: the same data fits on 80% less physical storage and is delivered with roughly 20% of the previous floor space, power and cooling costs.
  • Additional benefits are realized in network bandwidth consumption, manpower requirements, the operational systems needed to support the infrastructure, and the overall management of the cloud service provider or data center.

Market Drivers
Worldwide data growth is consistently in the range of 50% per year. The storage necessary to house this data dramatically impacts enterprise IT operations and capital budgets. Enterprises are evaluating and planning to use cloud storage to gain business agility and manage capital cost variations. Cloud service providers compete on location, cost, reputation and SLAs, and they must increasingly differentiate their offerings either by developing technology in house or by leveraging third-party solutions from major storage vendors.
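
A quick projection shows why that growth rate matters: at 50% annual growth, data roughly doubles every 20 months and grows more than 7X in five years. The sketch below, an illustrative calculation rather than a forecast, compares raw capacity needs with and without an assumed 5X data reduction ratio:

    GROWTH = 0.50   # 50% annual data growth, per the market figures above
    REDUCTION = 5   # assumed 5X data optimization ratio
    start_tb = 100  # illustrative starting capacity

    for year in range(6):
        raw = start_tb * (1 + GROWTH) ** year
        print(f"year {year}: {raw:8.1f} TB raw, "
              f"{raw / REDUCTION:7.1f} TB with 5X optimization")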

Industry analysts, well aware of the impact of continued data growth on IT budgets and of the rapid adoption of cloud storage, continue to observe how significantly data optimization increases IT efficiency, and they recommend data optimization technologies for all phases of data storage, from primary storage through to the cloud.

At a recent Gartner conference, one of the keynote speakers was adamant that throwing hardware at the problem of rampant data growth will not work. His view was that IT needs to optimize storage capacity consumption with virtualization, data deduplication and bandwidth optimization. I found this to be quite a contrarian view, since the platform vendors have dominated the dialogue in the IT industry. Here was a keynote speaker saying hardware is not the answer to managing the data glut...it's a software fix that we need. Later in the event, in another session, another speaker described data deduplication as "emerging as a top opportunity to positively impact data growth because of its ability to reduce the amount of data consuming costly storage." The discussion also evolved to include the "green impact" of reducing the amount of data stored, which in turn reduces floor space, power and cooling consumption. It seems that deduplication and data optimization are a win-win.

Summary
Disruptive technologies deliver competitive advantage. Data optimization is one of those technologies: it can make market leaders by clearly differentiating what they offer from what their competitors offer. Sometimes the differentiation is economic; other times it is technical leadership. Where it delivers both, a market-disruptive technology such as data optimization will drive market share gains and enable rapid business growth at the expense of competitors.

In the case of cloud storage, the first storage systems vendor to bring cloud storage products with integrated data optimization technology to market will gain a huge advantage in the rapidly growing cloud service provider space. In addition, its products will become deeply entrenched in cloud provider infrastructure because they reduce storage costs and increase overall operational efficiency.

More Stories By Wayne Salpietro

Wayne Salpietro is the director of product and social media marketing at data storage and cloud backup services provider Permabit Technology Corp. He has served in this capacity for the past six years, prior to which he held product marketing and managerial roles at CA, HP, and IBM.
