Can Virtualization Help with Governance?

Five ways data virtualization improves data governance

As with motherhood and apple pie, who can argue with data governance?

Business users like it because it assures critical business decisions are made based on sound data.

IT likes data governance because, as the organization's data stewards, it demonstrates they are doing their job well.

Compliance officers and risk managers like data governance because it lets them sleep at night.

Data Governance Is Challenging
Liking it is one thing.  Doing it is another.

Enterprises are struggling to turn the concept of data governance into a reality due to significantly growing data volumes, variety and variability, along with onerous new compliance requirements.

Effective data virtualization can improve data governance in numerous ways.

Five Requirements for More Effective Data Governance
Many articles and white papers define data governance, so it does not make sense to include a lengthy treatment here.  However, it is helpful to identify data governance's most critical requirements.

Data governance is a set of well-defined policies and practices designed to ensure that data is:

  • Accessible - Can the people who need it access the data they need? Does the data match the format the user requires?
  • Secure - Are authorized people the only ones who can access the data? Are non-authorized users prevented from accessing it?
  • Consistent - When two users seek the "same" piece of data, is it actually the same data? Have multiple versions been rationalized?
  • High Quality - Is the data accurate? Has it been conformed to meet agreed standards?
  • Auditable - Where did the data come from? Is the lineage clear? Does IT know who is using it and for what purpose?

Data Virtualization Helps Five Ways
Enterprises cannot buy data governance solutions off-the-shelf because effective data governance requires complex policies and practices, supported by software technology, integrated across the wider enterprise IT architecture.

As such, enterprises are turning to enabling technologies such as data virtualization to support the accessibility, security, consistency, quality, and auditability required for effective data governance.

Data Accessibility
It is generally agreed that as much as 80 percent of any new development effort is spent on data integration, making data access, rather than application development, the most time-consuming and expensive activity.

Most users access their data via business intelligence (BI) and reporting applications.  These applications typically rely on data integration middleware to access and format the data, before the application displays it.  So, ensuring proper governance falls on the data integration middleware.

By eliminating the need for the physical builds and testing that replication and consolidation approaches require, data virtualization is a more agile and cost-effective method to access, integrate, and deliver data.  This agility lets enterprises provide data access faster and more easily.
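
To make the idea concrete, here is a minimal Python sketch of a federated "virtual view."  The customer and order records below are hypothetical stand-ins for two live source systems; the view joins them on demand at query time rather than copying them into a new store.

# Minimal sketch of a federated "virtual view": data is fetched from each
# source at query time and joined in flight, with no persisted copy.
# The source data here is hypothetical and stands in for live systems.

CRM = [  # e.g., a customer master system of record
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex"},
]

ORDERS = [  # e.g., an order-management system
    {"order_id": 101, "customer_id": 1, "amount": 2500.0},
    {"order_id": 102, "customer_id": 2, "amount": 990.0},
]

def customer_orders_view():
    """Virtual view: joins CRM and order data on demand; nothing is copied."""
    names = {c["customer_id"]: c["name"] for c in CRM}
    for order in ORDERS:
        yield {"order_id": order["order_id"],
               "customer": names.get(order["customer_id"], "UNKNOWN"),
               "amount": order["amount"]}

if __name__ == "__main__":
    for row in customer_orders_view():
        print(row)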

Data Security
Ensuring that only authorized users can see appropriate data and nothing more is a critical data governance requirement.  This is a straightforward task for single systems and small user counts, but becomes more complex and difficult in larger enterprises with hundreds of systems and thousands of users.

As a first step, many enterprises have implemented single sign-on technologies that allow individuals to be uniquely authenticated across many diverse systems. However, implementing security policies (i.e., authorization to see or use certain data) in individual source systems alone is often insufficient to ensure appropriate enterprise-wide data security.  For some hyper-sensitive data, encryption as it moves through the network is a further requirement.

Data virtualization not only leverages single sign-on capabilities to authenticate and authorize individuals, it can also encrypt any and all data in transit.  As such, data virtualization becomes the data governance focal point for implementing security policies across multiple data sources and consumers.
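
As an illustration only, the following Python sketch shows the kind of centralized policy enforcement a virtualization layer can provide.  The roles, masking rules, and records are hypothetical and stand in for a platform's actual security engine.

# Minimal sketch of policy enforcement in one virtualization layer:
# the same view applies role-based row filtering and column masking for
# every consumer, instead of re-implementing rules in each source system.

RECORDS = [
    {"region": "EMEA", "account": "A-100", "ssn": "123-45-6789", "balance": 5000},
    {"region": "AMER", "account": "A-200", "ssn": "987-65-4321", "balance": 7200},
]

POLICIES = {  # hypothetical role-based authorization rules
    "emea_analyst": {"row_filter": lambda r: r["region"] == "EMEA",
                     "masked_columns": {"ssn"}},
    "auditor":      {"row_filter": lambda r: True,
                     "masked_columns": set()},
}

def secured_view(user_role):
    policy = POLICIES[user_role]  # authorization happens once, centrally
    for record in filter(policy["row_filter"], RECORDS):
        yield {col: ("***MASKED***" if col in policy["masked_columns"] else val)
               for col, val in record.items()}

if __name__ == "__main__":
    print(list(secured_view("emea_analyst")))
    print(list(secured_view("auditor")))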

Data Consistency
Consider the following commonplace scenario:  Two people attend a meeting with reports or graphs generated from the "same" data, but they show different numbers or results. Likely, they believed they were using the same data.  In reality, they were each using their own replicated, consolidated, aggregated version of the data.

Data virtualization allows enterprises to prevent this scenario by establishing consistent, complete canonical data models that apply across all aspects of business use.
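
The Python sketch below, using hypothetical field names, illustrates what a canonical mapping does: two source systems hold the same customer under different field names and formats, and one mapping per source delivers a single agreed shape to every consumer.

# Minimal sketch of a canonical view: each source maps its own fields and
# formats into one canonical record definition shared by all consumers.

BILLING = {"cust_no": "C-1", "cust_nm": "ACME CORP", "rev_usd": 125000}
CRM     = {"id": "C-1", "name": "Acme Corporation", "revenue": "125,000"}

def to_canonical(record, mapping):
    """Apply a source-to-canonical field mapping, including type conversion."""
    return {canonical: convert(record[source])
            for canonical, (source, convert) in mapping.items()}

BILLING_MAP = {"customer_id": ("cust_no", str),
               "customer_name": ("cust_nm", str.title),
               "annual_revenue": ("rev_usd", float)}

CRM_MAP = {"customer_id": ("id", str),
           "customer_name": ("name", str.title),
           "annual_revenue": ("revenue", lambda v: float(v.replace(",", "")))}

if __name__ == "__main__":
    # Both sources resolve to the same canonical shape, ID, and revenue figure.
    print(to_canonical(BILLING, BILLING_MAP))
    print(to_canonical(CRM, CRM_MAP))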

Data Quality
Correct and complete data is a critical data governance requirement.  However, data quality is often implemented as an afterthought to data creation and modification, and it is usually performed during data consolidation.  This approach impedes the achievement of good data quality across the enterprise.

The modern trend in data quality and governance, however, is to push the practices of ensuring quality data back toward the source systems, so that data is of the highest quality right from the start.

Data virtualization leverages these "systems of record" when delivering data to the consumer, so it naturally delivers high-quality data. In addition, data virtualization allows data quality practices like enrichment and standardization to occur inline, giving the data stewards more options for ensuring data is of the highest quality when it reaches the consumer.
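
A minimal Python sketch of inline standardization and enrichment follows.  The phone-formatting rule and country lookup are hypothetical, but they show quality practices running as the data flows to the consumer rather than in a later consolidation batch.

# Minimal sketch of inline data quality: standardization and enrichment run
# as the record passes through the virtualization layer on its way to the
# consumer, not in a separate step after the data has been copied.

import re

COUNTRY_LOOKUP = {"US": "United States", "DE": "Germany"}  # enrichment source

def standardize_phone(raw):
    """Normalize a phone number to a single agreed format (hypothetical rule)."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) < 10:
        return raw
    return f"+1-{digits[-10:-7]}-{digits[-7:-4]}-{digits[-4:]}"

def quality_pipeline(record):
    record = dict(record)
    record["phone"] = standardize_phone(record["phone"])
    record["country_name"] = COUNTRY_LOOKUP.get(record["country_code"], "Unknown")
    return record

if __name__ == "__main__":
    source_row = {"name": "Acme Corp", "phone": "(415) 555 0100", "country_code": "US"}
    print(quality_pipeline(source_row))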

Data Auditability
On the data source side, good data governance policy requires that IT can explain where data comes from, and prove its source. On the data consumer side, good data governance policy requires that IT show who used the data, and how it was used.

Traditional data integration copies data from one place to another.  As a result, the copied data becomes "disconnected" from the source, making it difficult to establish a complete source-to-consumer audit trail.

Data virtualization integrates data directly from the original source and delivers it directly to the consumer.  This end-to-end flow, without a disconnected copy of the data in the middle, simplifies and strengthens data governance. When auditing is required, full lineage is readily available at any time within the data virtualization metadata and transaction histories.
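
To illustrate, here is a simple Python sketch of lineage capture at query time.  The view names, source names, and audit-log structure are hypothetical, not any particular product's metadata model.

# Minimal sketch of auditability in the virtual layer: every request records
# who asked, when, which view was used, and which underlying sources fed it,
# so a source-to-consumer trail exists without reconstructing copied data.

import datetime

VIEW_LINEAGE = {"customer_orders": ["crm.customers", "erp.orders"]}  # view -> sources
AUDIT_LOG = []

def query_view(view_name, user):
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "user": user,
        "view": view_name,
        "sources": VIEW_LINEAGE[view_name],  # lineage captured from metadata
    })
    return f"results of {view_name}"  # stand-in for actual federated execution

if __name__ == "__main__":
    query_view("customer_orders", user="analyst@example.com")
    for entry in AUDIT_LOG:
        print(entry)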

Bottom-line
As data governance becomes increasingly prevalent in enterprise information management strategies, forward-looking organizations are deploying methods that simplify data governance.  Data virtualization platforms such as Composite 6 not only make data governance easier in practice, but also shorten the time to achieve the data governance benefits of consistent, secure, high-quality data for more intelligent business decision-making.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
