|By Wayne Salpietro||
|February 19, 2013 09:45 AM EST||
The promise of "the cloud" is that cloud storage delivers seamless, "just in time" scalability to handle growth and respond quickly to peak loads. The economics of cloud storage also make a compelling financial proposition in today's budget-constrained IT environments. For the IT consumer, shifting what was a fixed capital expense to a variable operating expense is financially compelling. Additionally, the ability to consume storage in a "just in time" mode rather than a "predictive" model further reduces CAPEX, strengthening an already strong value proposition for adopting cloud storage.
IDC forecasts that cloud-based storage will represent a $15.6B market by 2015, with a compound annual growth rate (CAGR) of 32%. IDC also predicts that 10,000 service providers will focus on cloud storage with an emphasis on data protection. The economics of this market transition will continue to evolve and accelerate as service providers optimize the cost of delivering cloud services - becoming more efficient and ever mindful of plant/facility costs, operating expenses and business margins.
The technology that offers service providers the greatest cost reduction is data optimization, which combines deduplication, compression and thin provisioning to maximize storage efficiency. Data optimization is one of the most significant innovations in the storage industry, enabling organizations to store more information in a smaller physical footprint. Nowhere does the fundamental concept of storing less data change the economics more dramatically than for providers of cloud storage infrastructure.
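To make the deduplication piece of data optimization concrete, here is a minimal sketch of a content-addressed chunk store in which identical chunks are stored only once. This is a toy illustration of the general technique, not any vendor's implementation; all names and the fixed 4 KB chunk size are assumptions.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}        # sha256 digest -> chunk bytes (stored once)
        self.logical_bytes = 0  # total bytes written by clients

    def write(self, data):
        """Split data into fixed-size chunks; return their fingerprints."""
        fingerprints = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # dedup: store only new chunks
            fingerprints.append(digest)
        self.logical_bytes += len(data)
        return fingerprints

    def read(self, fingerprints):
        """Reassemble the original data from its chunk fingerprints."""
        return b"".join(self.chunks[d] for d in fingerprints)

    def dedup_ratio(self):
        physical = sum(len(c) for c in self.chunks.values())
        return self.logical_bytes / physical if physical else 0.0

store = DedupStore()
block = b"x" * 4096
refs = store.write(block * 10)       # ten identical chunks, one physical copy
assert store.read(refs) == block * 10
print(f"{store.dedup_ratio():.0f}X")  # → 10X
```

Production systems typically add variable-size (content-defined) chunking and compression of the stored chunks, but the core idea is the same: fingerprint the data and store each unique chunk once.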
When applied to the cloud, data optimization is a disruptive technology: it increases storage efficiency by a factor of 5X to 35X. In cloud storage environments the impact is simple and compelling. For a data mix that yields a 5X boost in storage efficiency, media costs drop to 20% of what they previously were. In addition, reductions in floor space, power, cooling and manpower all significantly improve operating efficiency. With a clearly differentiated value proposition, a service provider that applies data optimization can grow its business by gaining market share at competitors' expense.
Data optimization's impact on storage consumption, in the cloud or in the data center, yields operating efficiency and critical business advantages:
- Reduced capital expense - in any storage environment, media is a substantial expense. Disk drives, and now SSDs, are a significant cost because capacity must be available in anticipation of demand. Reducing the cost to store data has a direct impact on the IT expense budget's bottom line; data optimization applied here can drive down capital storage costs by 80% or more.
- Data center operating efficiency - with a 5X increase in storage efficiency, capacity requirements for the cloud or data center decrease proportionately. The same data now occupies 80% less physical storage and is delivered with 20% of the previous floor space, power and cooling costs.
- Additional benefits - data optimization also reduces network bandwidth consumption, manpower requirements, the operational systems needed to support the infrastructure, and the overall management burden on the cloud service provider or data center.
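The arithmetic behind the figures above is straightforward. A quick sketch, using purely illustrative numbers (the $300/TB media cost is a hypothetical, not pricing data from the article):

```python
def storage_savings(logical_tb, dedup_ratio, cost_per_tb):
    """Physical capacity, media cost and % saved for a given optimization ratio."""
    physical_tb = logical_tb / dedup_ratio
    savings_pct = (1 - 1 / dedup_ratio) * 100
    return physical_tb, physical_tb * cost_per_tb, savings_pct

# 500 TB of logical data at a 5X ratio and a hypothetical $300/TB media cost
physical_tb, cost, pct = storage_savings(500, 5, 300)
print(f"{physical_tb:.0f} TB physical, ${cost:,.0f}, {pct:.0f}% saved")
# → 100 TB physical, $30,000, 80% saved
```

At 5X, 80% of the media spend disappears; at the high end of the 35X range the same calculation yields a physical footprint of under 3% of the logical capacity.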
Worldwide data growth runs consistently at around 50% per year. The storage necessary to house this data dramatically impacts enterprise IT operations and capital budgets. Enterprises are evaluating and planning to use cloud storage to gain business agility and smooth out capital costs. Cloud service providers compete on location, cost, reputation and SLAs, and must increasingly differentiate their offerings either by developing technology in house or by leveraging third-party solutions from major storage vendors.
Industry analysts, well aware of the impact of continued data growth on IT budgets and of the rapid adoption of cloud storage, continue to observe how significantly data optimization increases IT efficiency, and they recommend data optimization technologies for all phases of data storage, from primary through to the cloud.
At a recent Gartner conference, one of the keynote speakers was adamant that throwing hardware at the problem of rampant data growth will not work. His view was that IT needs to optimize storage capacity consumption with virtualization, data deduplication and bandwidth optimization. I found this quite a contrarian view, since platform vendors have dominated the dialogue in IT. Here was a keynote speaker saying hardware is not the answer to managing the data glut - it's a software fix that we need. Later in the event, in another session, another speaker raised data deduplication as "emerging as a top opportunity to positively impact data growth because of its ability to reduce the amount of data consuming costly storage." The discussion also evolved to include the "green impact" of reducing data volumes, which in turn reduces floor space, power and cooling consumption. It seems that deduplication and data optimization are a win-win.
Disruptive technologies deliver competitive advantage. Data optimization is one of those technologies that can make market leaders by clearly differentiating their offerings from those of their competitors. Sometimes the leadership is economic; other times it is technical. When a disruptive technology such as data optimization delivers both, it will drive market share growth and enable rapid business growth at the expense of competitors.
In the case of cloud storage, the first storage systems vendor to bring to market cloud storage products with integrated data optimization technology will gain a huge advantage in the rapidly growing cloud service provider space. In addition, those products will become deeply entrenched in the cloud provider's infrastructure because they reduce storage costs and increase overall operational efficiency.