By Robert Eve
February 15, 2013 08:00 AM EST
Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles, including use cases, business benefits, and technology.
Since then, with the continued rapid expansion of big data and analytics, as well as advances in data virtualization technology, my 360-degree view of data virtualization has evolved.
Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge that data virtualization best addresses is helping enterprises take advantage of their data.
In other words, enterprises today are data rich, holding loads of enterprise, cloud, third-party and Big Data. But they remain information poor.
In this context, let's consider the role of data virtualization with ten back-to-the-basics questions and answers.
What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.
Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.
With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
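The "no extra copies" point is the heart of the approach. A minimal sketch, using illustrative in-memory data rather than any vendor's API, of why a replicated copy goes stale while a virtualized view stays current:

```python
# Hypothetical illustration: replication copies data up front,
# while virtualization resolves it on demand at query time.

source = {"orders": [{"id": 1, "total": 120.0}, {"id": 2, "total": 80.0}]}

# Replication approach: an extra managed copy, frozen at load time.
replica = {"orders": list(source["orders"])}

# The source changes after the replica was loaded.
source["orders"].append({"id": 3, "total": 55.0})

def virtual_orders():
    """Virtualized view: no copy; reads the live source when queried."""
    return source["orders"]

print(len(replica["orders"]))   # 2 -> the replica is stale
print(len(virtual_orders()))    # 3 -> the virtual view sees current data
```

The copy is not just a freshness problem; it is also the extra storage and management overhead the savings claim above refers to.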
Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.
Data virtualization provides instant access to all the data you want, the way you want it.
Enterprise, cloud, Big Data, and more, no problem!
What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.
- Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
- Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
- Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
- Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.
Who Uses Data Virtualization?
Data virtualization is used by both business and IT organizations.
- Business Leaders - Data virtualization helps you drive business advantage from your data.
- Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
- CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs, and do it for less.
- CTOs and Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
- Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.
How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.
- Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
- Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
- Manage - Data virtualization's management, monitoring, security and governance functions ensure security, reliability and scalability.
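The Develop and Run steps above can be sketched in a few lines. This is a hedged toy model, not a vendor product: the sources (`customers_db`, `billing_api`) and the view name are hypothetical, and a real query engine would optimize and push work down to the sources.

```python
# Two live "sources" a business view might span: a customer database
# and a billing feed. Both names and shapes are illustrative assumptions.
customers_db = {101: {"name": "Acme Corp"}, 102: {"name": "Globex"}}
billing_api = [{"cust_id": 101, "amount": 250.0},
               {"cust_id": 101, "amount": 100.0},
               {"cust_id": 102, "amount": 75.0}]

def customer_revenue_view():
    """Develop: a business view joining two sources, defined as logic only."""
    for cust_id, cust in customers_db.items():
        total = sum(b["amount"] for b in billing_api if b["cust_id"] == cust_id)
        yield {"customer": cust["name"], "revenue": total}

# Run: when a report requests the view, it executes against the live
# sources and returns exactly the information requested; nothing is copied.
report = list(customer_revenue_view())
print(report)
# [{'customer': 'Acme Corp', 'revenue': 350.0}, {'customer': 'Globex', 'revenue': 75.0}]
```

The business user sees only the joined, business-friendly result; the join logic and the two underlying systems stay hidden behind the view, which is the "shielding from IT's complexity" described above.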
Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions including:
- Agile Analytics and BI Solutions
- Data Warehouse Extension Solutions
- Logical Data Warehouse Solutions
- Data Virtualization Architecture Solutions
- Data Integration and Management Solutions
- Business Solutions
- Industry Solutions
When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, populated via ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.
You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
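In the spirit of such a decision tool, the trade-off can be sketched as a simple heuristic. The criteria and thresholds below are illustrative assumptions, not the logic of any published tool: virtualization tends to win when freshness matters and per-query volumes are modest; consolidation tends to win when heavy transformation or very large scans dominate.

```python
# Hypothetical decision heuristic: virtualization vs. consolidation vs. hybrid.
# Criteria and the 500 GB threshold are illustrative, not from any vendor tool.

def integration_approach(needs_fresh_data: bool,
                         heavy_transformation: bool,
                         scan_gb_per_query: float) -> str:
    if heavy_transformation and needs_fresh_data:
        return "hybrid (consolidate the transformed core, virtualize the rest)"
    if heavy_transformation or scan_gb_per_query > 500:
        return "consolidation (ETL/ELT into a warehouse or mart)"
    return "virtualization"

print(integration_approach(True, False, 20))    # virtualization
print(integration_approach(False, True, 900))   # consolidation (...)
print(integration_approach(True, True, 300))    # hybrid (...)
```

A real decision tool weighs more factors (source stability, security constraints, query concurrency), but the shape of the choice is the same: match the integration style to the use case rather than defaulting to one approach.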
What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.
- Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
- Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus data virtualization's rapid development and quick iterations lower your IT project risk.
- Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
- Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
- Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.
How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.
Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?
Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code and millions of hours of operational deployment.
Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.
Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you. These include:
- The first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
- Data virtualization's foremost microsite, the DV Café
- The Data Virtualization Leadership Series of analyst reports on data virtualization
- Data virtualization's only dedicated blog, the Data Virtualization Leadership Blog
- The Data Virtualization Channel on YouTube with users, analysts, chalk talks and more
- The Data Virtualization Leadership Awards honoring users
- Data Virtualization Day Resources, assets from the premier events in data virtualization
- Data virtualization's longest running newsletter, Enterprise Information Insight
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take full advantage. This article suggests that data virtualization can be that path, and provides answers to key questions about data virtualization. The time is now.