A New Era of Post-Production Data Management Software

Solving legacy problems of backup to tape

By Jerome M. Wendt

Nearly every small, medium and large organization is heading down the path of adopting disk-based data protection as a way to solve the legacy problems of backup to tape. But what many of these organizations have yet to recognize is that as they adopt disk to store these post-production copies of data, a new opportunity is presenting itself: they can now manage and leverage post-production data in ways that were never possible on tape, even though they still lack the tools to do so.

Nearly every organization is somewhere on the journey toward implementing disk-based backup. But that does not mean every organization is implementing it in the same way or using the same technologies. Some are introducing disk in lieu of tape as a backup target. Others are using disk-based snapshots, while still others are using some form of replication. Some are even using a combination of all of these forms of disk-based data protection.

Then, as they store this post-production data on disk, they look to optimize how it is stored. This is leading most organizations to deduplicate this data whenever possible and then use replication to move it off-site.
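
To make the "deduplicate, then replicate" step concrete, here is a minimal Python sketch of how such a pipeline could work: split a backup stream into fixed-size blocks, store each unique block only once, and ship off-site only the blocks the remote store has not already seen. The block size, the in-memory stores and the function names are illustrative assumptions, not any particular product's implementation.

    # Minimal sketch of deduplication followed by off-site replication.
    # Illustrative only; real products use variable-size chunking,
    # persistent indexes and network transports.
    import hashlib

    BLOCK_SIZE = 4096  # assumed block size for illustration

    def chunk(data: bytes):
        """Yield fixed-size blocks of the backup stream."""
        for i in range(0, len(data), BLOCK_SIZE):
            yield data[i:i + BLOCK_SIZE]

    def deduplicate(data: bytes, local_store: dict) -> list:
        """Store each unique block once; return the recipe of block hashes."""
        recipe = []
        for block in chunk(data):
            digest = hashlib.sha256(block).hexdigest()
            local_store.setdefault(digest, block)   # keep only one copy per hash
            recipe.append(digest)
        return recipe

    def replicate_offsite(local_store: dict, remote_store: dict) -> int:
        """Send only the blocks the off-site copy is missing."""
        sent = 0
        for digest, block in local_store.items():
            if digest not in remote_store:
                remote_store[digest] = block
                sent += 1
        return sent

    if __name__ == "__main__":
        local, remote = {}, {}
        backup = b"ABCD" * 2048 + b"unique trailer"
        recipe = deduplicate(backup, local)
        print(len(recipe), "blocks referenced,", len(local), "stored,",
              replicate_offsite(local, remote), "replicated off-site")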

But now that organizations are storing all of this data on disk, they have both a new opportunity and a new challenge in front of them. The opportunity is that they now have near-real-time copies of production data on disk.

These post-production copies of data are at most 24 hours old and may be only hours, minutes or seconds removed from the production copy of data from which they were derived. As such, they are excellent candidates for testing, development and disaster recovery.

At the same time, the challenge is finding software that can do this "post-production data management." After all, it is one thing to create a backup, a snapshot or a replication job for a particular set of production data. But it is quite another to manage the tens, hundreds or even thousands of these copies of post-production data in such a way that they are usable and easily managed by an organization.

These requirements are making it evident that a new category of software, which I refer to as "post-production data management software," needs to emerge. This software needs to give organizations the ability to manage this post-production data in such a way that they may perform tasks like:

  • Minimizing or eliminating backup and recovery windows by ensuring that data is on the right tier of disk at the right time
  • Ensuring that these post-production copies of data (backups, snapshots, replicated) are deduplicated whenever possible to conserve storage capacity
  • Replicating data locally or remotely so that business continuity and disaster recovery can occur automatically with minimal need for setup or ongoing management (a simple policy sketch of these tasks follows this list)
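
As a rough illustration of the kind of automation this software would provide, the following hypothetical Python sketch decides, for each post-production copy, which tier of disk it should live on, whether it should be deduplicated and where it should be replicated. The tiers, age thresholds and names are assumptions made purely for illustration, not any product's actual behavior.

    # Hypothetical placement policy for post-production copies.
    from dataclasses import dataclass

    @dataclass
    class CopyPolicy:
        tier: str          # which class of disk the copy should live on
        deduplicate: bool  # whether to dedupe it to conserve capacity
        replicate_to: str  # local or remote target for BC/DR

    def plan(copy: dict) -> CopyPolicy:
        """Decide placement for one post-production copy (backup, snapshot or replica)."""
        age = copy["age_hours"]
        # Fresh copies stay on fast disk so recovery windows shrink;
        # older copies move to capacity disk and are always deduplicated.
        if age <= 1:
            return CopyPolicy(tier="fast-disk", deduplicate=False, replicate_to="local")
        if age <= 24:
            return CopyPolicy(tier="capacity-disk", deduplicate=True, replicate_to="remote-site")
        return CopyPolicy(tier="archive-disk", deduplicate=True, replicate_to="remote-site")

    if __name__ == "__main__":
        copies = [
            {"kind": "snapshot", "age_hours": 0.25},
            {"kind": "backup", "age_hours": 12},
            {"kind": "replica", "age_hours": 72},
        ]
        for c in copies:
            print(c["kind"], plan(c))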

Already we are seeing these next-generation post-production data management capabilities begin to emerge. This past week FalconStor announced its new RecoverTrac technology, which is now part of its Continuous Data Protector (CDP) software. RecoverTrac's purpose is to help CDP's snapshot and replication features deliver real-world, turnkey business continuity and disaster recovery functionality without requiring organizations to spend tens or hundreds of thousands of dollars and countless man-hours to deploy, test, implement and manage it.

InMage is another company that has been doing something similar for some time. In a series of two blog entries I wrote a couple of years ago, I covered how Dr. James Tu, an information security officer at a real estate company, was using InMage Scout's replication capabilities to replicate data between two sites. Then, within Scout, he created virtual mount points so he could use them to perform recoveries at the secondary site.
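
To illustrate the virtual mount point idea, here is a rough Python sketch of a secondary site that receives replicated point-in-time copies and presents any one of them at a recovery path on demand. It is a conceptual sketch under assumed names and does not reflect InMage Scout's actual interfaces.

    # Conceptual sketch of presenting replicated copies as recovery mount points.
    from datetime import datetime

    class SecondarySite:
        def __init__(self):
            self.copies = {}   # point-in-time label -> replicated image (stand-in)
            self.mounts = {}   # mount path -> point-in-time label

        def receive_replica(self, label: str, image: bytes) -> None:
            """Store a replicated point-in-time image at the DR site."""
            self.copies[label] = image

        def mount(self, label: str) -> str:
            """Present a stored image as a virtual mount point for recovery."""
            path = f"/recover/{label}"
            self.mounts[path] = label
            return path

    if __name__ == "__main__":
        site = SecondarySite()
        label = datetime(2011, 6, 1, 9, 30).isoformat()
        site.receive_replica(label, b"...database files...")
        print("recover from", site.mount(label))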

Yet possibly the most interesting entrant in this emerging space is a new company called Actifio. It has built its entire platform around this concept of post-production data management and has gone beyond just acting as a backup target, replicating production data or creating snapshots of production data. While it can perform all three of these functions (it can act as either a backup target or a production file system that replicates or takes snapshots of production data), its ultimate objective is to optimally manage the post-production data under its control.

Once it has these post-production copies of data in whatever form they take (backup, snapshot or replica), Actifio can then deduplicate this data, place it on the appropriate tier of disk for recoveries, or replicate it offsite for DR.

What specifically makes Actifio unique is its virtualization technology, which re-purposes a single copy of post-production data for each of the different operational requirements. Instead of the complexity and cost of multiple tools for backup, snapshots, deduplication, disaster recovery or business continuity, Actifio is a single solution that manages data through its entire life cycle.
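
The following simplified copy-on-write sketch shows the general idea of re-purposing one stored copy: several virtual copies are presented for test/dev, DR rehearsal or restores, each recording only its own changed blocks while the underlying copy stays untouched. This is a conceptual illustration in Python, not Actifio's actual implementation.

    # Conceptual copy-on-write sketch: one physical copy, many virtual uses.
    class GoldenCopy:
        def __init__(self, blocks: list):
            self.blocks = blocks          # the one physical copy of the data

    class VirtualCopy:
        def __init__(self, base: GoldenCopy, purpose: str):
            self.base = base
            self.purpose = purpose        # e.g. "test/dev", "DR rehearsal"
            self.overrides = {}           # copy-on-write: only changed blocks stored

        def read(self, index: int):
            return self.overrides.get(index, self.base.blocks[index])

        def write(self, index: int, data):
            self.overrides[index] = data  # the golden copy itself is never modified

    if __name__ == "__main__":
        golden = GoldenCopy(["blk0", "blk1", "blk2"])
        dev = VirtualCopy(golden, "test/dev")
        dr = VirtualCopy(golden, "DR rehearsal")
        dev.write(1, "patched")
        print(dev.read(1), dr.read(1), golden.blocks[1])  # patched blk1 blk1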

In the next few years I expect companies of almost every size to complete the transition from tape-based backup to some form of disk-based data protection. However, once that transition is complete, these organizations are going to recognize that merely having all of their post-production data stored on disk adds insufficient value to their organization.

Instead, they will want the option to easily and centrally manage where this post-production data is placed in order to facilitate business continuity, disaster recovery and even operational testing and development of production applications. It is for this reason that I believe companies will rapidly move beyond just implementing disk-based data protection, and that a new era is dawning in which organizations want and need software that automates the management and placement of their post-production data.
