By Robert Eve
February 15, 2013 08:00 AM EST
Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles including use cases, business benefits, and technology.
Since then, with the continued rapid expansion of big data and analytics, as well as advances in data virtualization technology, my 360-degree view of data virtualization has evolved.
Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge that data virtualization best addresses is helping enterprises take advantage of their data.
In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party and Big Data. But they remain information poor.
In this context, let's consider the role of data virtualization with ten back-to-basics questions and answers.
What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.
Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.
With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.
Data virtualization provides instant access to all the data you want, the way you want it.
Enterprise, cloud, Big Data, and more, no problem!
What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.
- Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
- Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
- Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
- Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.
Who Uses Data Virtualization?
Data virtualization is used by both business and IT organizations.
- Business Leaders - Data virtualization helps you drive business advantage from your data.
- Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
- CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs and do it for less.
- CIOs and Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
- Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.
How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.
- Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
- Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
- Manage - Data virtualization's management, monitoring, security and governance functions ensure reliable, secure and scalable operations.
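The Develop and Run steps above can be sketched in miniature. The sources, table name and the `customer_revenue_view` function below are invented for illustration; a real data virtualization platform pushes such joins into an optimized query engine rather than application code, but the principle is the same: a business view that federates sources at query time, without copying data into a new store.

```python
import sqlite3

# Source 1: a relational "warehouse" (here an in-memory SQLite table).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
warehouse.executemany("INSERT INTO orders VALUES (?, ?)",
                      [(1, 250.0), (2, 99.5), (1, 40.0)])

# Source 2: a "cloud CRM" feed (here a plain list of records).
crm = [{"customer_id": 1, "name": "Acme Corp"},
       {"customer_id": 2, "name": "Globex"}]

def customer_revenue_view():
    """A 'business view': joins the two sources at query time.

    Each source is read on demand and the combined result is
    assembled in memory - no extra persistent copy is created.
    """
    totals = dict(warehouse.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
    return [{"name": rec["name"],
             "revenue": totals.get(rec["customer_id"], 0.0)}
            for rec in crm]

print(customer_revenue_view())
# [{'name': 'Acme Corp', 'revenue': 290.0}, {'name': 'Globex', 'revenue': 99.5}]
```

When a consumer "runs a report," it simply calls the view; the freshest data from each source is fetched at that moment, which is what distinguishes this approach from replication or consolidation.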
Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions including:
- Agile Analytics and BI Solutions
- Data Warehouse Extension Solutions
- Logical Data Warehouse Solutions
- Data Virtualization Architecture Solutions
- Data Integration and Management Solutions
- Business Solutions
- Industry Solutions
When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, along with ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.
You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.
- Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
- Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus data virtualization's rapid development and quick iterations lower your IT project risk.
- Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
- Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
- Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.
How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.
Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?
Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code and millions of hours of operational deployment.
Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.
Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you. These include:
- The first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
- Data virtualization's foremost microsite, the DV Café
- The Data Virtualization Leadership Series of analyst reports on data virtualization
- Data virtualization's only dedicated blog, the Data Virtualization Leadership Blog
- The Data Virtualization Channel on YouTube with users, analysts, chalk talks and more
- The Data Virtualization Leadership Awards honoring users
- Data Virtualization Day Resources, assets from the premier events in data virtualization
- Data virtualization's longest running newsletter, Enterprise Information Insight
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take full advantage. This article suggests that data virtualization can be that path and provides answers to key questions about it. The time is now.