Take Big Advantage of Your Data

A fresh look at data virtualization

Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles including use cases, business benefits, and technology.

Since then, with the continued rapid expansion of big data and analytics, as well as advances in data virtualization technology, my 360-degree view of data virtualization has evolved.

Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge it best addresses is helping enterprises take full advantage of their data.

In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party, and big data. But they remain information poor.

In this context, let's consider the role of data virtualization with ten back-to-the-basics questions and answers.

What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.

Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.

With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
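
To make the idea concrete, here is a minimal sketch in Python of a virtual "business view": the data stays in its source systems and is combined only at query time, so no extra copy is created. The connector functions, field names, and data are illustrative assumptions, not any vendor's actual API.

```python
# Minimal, hypothetical sketch of a virtual business view: data stays in the
# source systems and is combined only at query time, with no persisted copy.
# The connector functions and field names are illustrative, not a vendor API.

def crm_customers():
    # Stand-in for a live query against a CRM source
    return [
        {"customer_id": 1, "name": "Acme Corp"},
        {"customer_id": 2, "name": "Globex"},
    ]

def erp_orders():
    # Stand-in for a live query against an ERP source
    return [
        {"customer_id": 1, "amount": 1200.0},
        {"customer_id": 1, "amount": 300.0},
        {"customer_id": 2, "amount": 875.0},
    ]

def customer_revenue_view():
    """Virtual view: joins CRM and ERP data on demand; nothing is replicated."""
    totals = {}
    for order in erp_orders():
        totals[order["customer_id"]] = totals.get(order["customer_id"], 0.0) + order["amount"]
    return [
        {"name": c["name"], "revenue": totals.get(c["customer_id"], 0.0)}
        for c in crm_customers()
    ]

if __name__ == "__main__":
    for row in customer_revenue_view():
        print(row)  # e.g. {'name': 'Acme Corp', 'revenue': 1500.0}
```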

Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.

Data virtualization provides instant access to all the data you want, the way you want it.

Enterprise, cloud, Big Data, and more, no problem!

What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.

  • Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
  • Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
  • Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
  • Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.

Who Uses Data Virtualization?
Data virtualization is used across both your business and IT organizations.

  • Business Leaders - Data virtualization helps you drive business advantage from your data.
  • Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
  • CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs and do it for less.
  • Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
  • Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.

How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.

  • Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
  • Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
  • Manage - Data virtualization's management, monitoring, security, and governance functions ensure security, reliability, and scalability.

Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
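
To illustrate the develop, run, and manage steps above, here is a small, hypothetical Python sketch. The class, role names, and view registration interface are assumptions made for illustration only; they do not represent the Composite platform's actual interfaces.

```python
# Illustrative sketch of the develop / run / manage cycle described above.
# The class, roles, and view names are hypothetical, not a vendor API.
import logging

logging.basicConfig(level=logging.INFO)

class VirtualizationLayer:
    def __init__(self):
        self.views = {}          # develop: registered business views
        self.permissions = {}    # manage: which roles may use which views

    def define_view(self, name, query_fn, allowed_roles):
        """Develop: IT registers a business view backed by a query function."""
        self.views[name] = query_fn
        self.permissions[name] = set(allowed_roles)

    def run(self, name, role, **filters):
        """Run: a report or dashboard requests exactly the data it needs."""
        if role not in self.permissions.get(name, set()):
            raise PermissionError(f"role {role!r} may not query {name!r}")
        # manage: every request is logged for monitoring and governance
        logging.info("query view=%s role=%s filters=%s", name, role, filters)
        return self.views[name](**filters)

# The query function pushes the filter down to the (stand-in) source.
def orders_by_region(region=None):
    source = [{"region": "EMEA", "amount": 10}, {"region": "APAC", "amount": 7}]
    return [r for r in source if region is None or r["region"] == region]

layer = VirtualizationLayer()
layer.define_view("orders", orders_by_region, allowed_roles={"analyst"})
print(layer.run("orders", role="analyst", region="EMEA"))
```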

When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions.

When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, along with ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.

You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation, or a hybrid combination.
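
As a rough illustration of the trade-offs such a decision tool weighs, here is a hypothetical Python sketch. The inputs and thresholds are assumptions for illustration only, not the logic of any actual tool.

```python
# Hypothetical rule-of-thumb logic, not an actual Data Integration Strategy
# Decision Tool; it only illustrates the kind of trade-offs such a tool weighs.

def suggest_integration_style(needs_history, data_volume_gb,
                              freshness_minutes, sources_change_often):
    """Return 'consolidation', 'virtualization', or 'hybrid' for a use case."""
    wants_consolidation = needs_history or data_volume_gb > 10_000
    wants_virtualization = freshness_minutes <= 15 or sources_change_often
    if wants_consolidation and wants_virtualization:
        return "hybrid"          # e.g., virtual views layered over a warehouse
    if wants_consolidation:
        return "consolidation"   # warehouse or mart loaded via ETL/ELT
    return "virtualization"      # federate the sources, no extra copy

# Example: a compliance report needing deep history vs. an operational dashboard
print(suggest_integration_style(True, 50_000, 1440, False))   # consolidation
print(suggest_integration_style(False, 200, 5, True))         # virtualization
```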

What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.

  • Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
  • Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus, data virtualization's rapid development and quick iterations lower your IT project risk.
  • Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
  • Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
  • Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.

How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.

You can also deploy data virtualization in a more enterprise-wide manner, with common semantics, shared objects and architecture, and an Integration Competency Center.

Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?

Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code, and millions of hours of operational deployment.

Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.

Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you.

Conclusion
With so many new opportunities from big data, analytics, and more, today's challenge is how to take big advantage of them. This article suggests that data virtualization can be that path, and provides answers to key questions about it. The time is now.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
