By Robert Eve
February 15, 2013 08:00 AM EST
Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles including use cases, business benefits, and technology.
Since then, with the continued rapid expansion of big data and analytics, as well as data virtualization technology advances, my 360-degree view of data virtualization has evolved.
Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge it best addresses is helping enterprises take full advantage of their data.
In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party and Big Data. But they remain information poor.
In this context, let's consider the role of data virtualization with ten back-to-basics questions and answers.
What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.
Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.
With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
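To make the contrast with replication concrete, here is a minimal Python sketch of the core idea: a "business view" that joins two live sources at query time, so no second copy of the data is ever created or refreshed. The source names and record shapes are invented for illustration; real platforms express such views in SQL, not Python.

```python
# Two "live" sources, standing in for an enterprise CRM and a cloud order system.
# In a real deployment these would be remote systems; here they are in-memory stand-ins.
crm_source = {
    101: {"name": "Acme Corp", "region": "EMEA"},
    102: {"name": "Globex", "region": "AMER"},
}
orders_source = [
    {"customer_id": 101, "amount": 5000},
    {"customer_id": 102, "amount": 1200},
    {"customer_id": 101, "amount": 700},
]

def customer_revenue_view():
    """A 'business view': computed on demand from the sources, nothing copied or stored."""
    totals = {}
    for order in orders_source:
        totals[order["customer_id"]] = totals.get(order["customer_id"], 0) + order["amount"]
    return [
        {"name": crm_source[cid]["name"], "revenue": total}
        for cid, total in sorted(totals.items())
    ]

print(customer_revenue_view())
```

Each call reflects the sources as they are right now: update an order and the next call sees it, with no ETL refresh cycle in between. That is the "no costly extra copies" point in miniature.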
Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.
Data virtualization provides instant access to all the data you want, the way you want it.
Enterprise, cloud, Big Data, and more, no problem!
What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.
- Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
- Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
- Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
- Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.
Who Uses Data Virtualization?
Data virtualization is used by your business and IT organizations.
- Business Leaders - Data virtualization helps you drive business advantage from your data.
- Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
- CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs, and do it for less.
- Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
- Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.
How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.
- Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
- Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
- Manage - Data virtualization's management, monitoring, security and governance functions ensure secure, reliable and scalable operations.
Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
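The "Run" step above depends on the query engine fetching only the data each request actually needs. One common technique for this is predicate pushdown: filters are applied at each source before any rows travel back to the engine. The sketch below is a toy Python illustration of that idea (the source names and the `source_scan` helper are invented; production engines do this through each source's native query language):

```python
def source_scan(rows, predicate):
    """Stand-in for a remote source that can filter on its own side.
    Only matching rows cross the 'network' back to the query engine."""
    return [r for r in rows if predicate(r)]

# Source systems (normally remote databases or SaaS APIs).
inventory = [{"sku": "A1", "units": 40}, {"sku": "B2", "units": 5}]
pricing = [{"sku": "A1", "price": 9.5}, {"sku": "B2", "price": 3.0}]

def low_stock_report(threshold):
    """Query engine: push the filter down to the source, then join only the survivors."""
    low = source_scan(inventory, lambda r: r["units"] < threshold)
    prices = {p["sku"]: p["price"] for p in pricing}
    return [
        {"sku": r["sku"], "units": r["units"], "price": prices[r["sku"]]}
        for r in low
    ]

print(low_stock_report(10))  # only the low-stock row is fetched and joined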
When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions including:
- Agile Analytics and BI Solutions
- Data Warehouse Extension Solutions
- Logical Data Warehouse Solutions
- Data Virtualization Architecture Solutions
- Data Integration and Management Solutions
- Business Solutions
- Industry Solutions
When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, via ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.
You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
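The decision tool itself is a vendor asset, but the underlying trade-off can be sketched as a simple rule of thumb. The criteria and scoring below are illustrative assumptions of my own, not the actual tool's logic:

```python
def integration_approach(needs_history, heavy_transformation,
                         freshness_critical, many_volatile_sources):
    """Toy heuristic for choosing an integration style.
    Inputs are booleans describing the use case; criteria are illustrative only."""
    consolidate_score = needs_history + heavy_transformation
    virtualize_score = freshness_critical + many_volatile_sources
    if consolidate_score and virtualize_score:
        return "hybrid"            # both pressures present: combine approaches
    if consolidate_score:
        return "consolidation"     # ETL/ELT into a warehouse or mart
    return "virtualization"        # federate the live sources

print(integration_approach(True, True, False, False))   # classic warehouse case
print(integration_approach(False, False, True, True))   # agile BI case
print(integration_approach(True, False, True, False))   # mixed requirements
```

The real decision involves more dimensions (data volumes, source query capacity, governance), but the shape is the same: score the case against each approach's strengths and mix when both score.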
What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.
- Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
- Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus data virtualization's rapid development and quick iterations lower your IT project risk.
- Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
- Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
- Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.
How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.
Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?
Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code and millions of hours of operational deployment.
Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.
Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise it can bring to bear for you. These include:
- The first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
- Data virtualization's foremost microsite, the DV Café
- The Data Virtualization Leadership Series of analyst reports on data virtualization
- Data virtualization's only dedicated blog, the Data Virtualization Leadership Blog
- The Data Virtualization Channel on YouTube with users, analysts, chalk talks and more
- The Data Virtualization Leadership Awards honoring users
- Data Virtualization Day Resources, assets from the premier events in data virtualization
- Data virtualization's longest running newsletter, Enterprise Information Insight
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take full advantage of them. This article suggests that data virtualization can be that path, and it provides answers to the key questions about data virtualization. The time is now.