By Robert Eve
August 13, 2011 12:00 PM EDT
As with motherhood and apple pie, who can argue with data governance?
Business users like it because it assures critical business decisions are made based on sound data.
IT likes data governance because it demonstrates that, as the organization's data stewards, they are doing a good job.
Compliance officers and risk managers like data governance because it lets them sleep at night.
Data Governance Is Challenging
Liking it is one thing. Doing it is another.
Enterprises are struggling to turn the concept of data governance into a reality due to rapidly growing data volumes, variety, and variability, along with onerous new compliance requirements.
Effective data virtualization can improve data governance in numerous ways.
Five Requirements for More Effective Data Governance
Many articles and white papers define data governance, so it does not make sense to include a lengthy treatment here. However, it is helpful to identify data governance's most critical requirements.
Data governance is a set of well-defined policies and practices designed to ensure that data is:
- Accessible - Can the people who need it access the data they need? Does the data match the format the user requires?
- Secure - Are authorized people the only ones who can access the data? Are non-authorized users prevented from accessing it?
- Consistent - When two users seek the "same" piece of data, is it actually the same data? Have multiple versions been rationalized?
- High Quality - Is the data accurate? Has it been conformed to meet agreed standards?
- Auditable - Where did the data come from? Is the lineage clear? Does IT know who is using it and for what purpose?
Data Virtualization Helps Five Ways
Enterprises cannot buy data governance solutions off-the-shelf because effective data governance requires complex policies and practices, supported by software technology, integrated across the wider enterprise IT architecture.
As such, enterprises are turning to enabling technologies such as data virtualization to support the accessibility, security, consistency, quality, and auditability capabilities required for effective data governance.
It is generally agreed that as much as 80 percent of any new development effort is spent on data integration, making data access--rather than developing the application--the most time-consuming and expensive activity.
Most users access their data via business intelligence (BI) and reporting applications. These applications typically rely on data integration middleware to access and format the data, before the application displays it. So, ensuring proper governance falls on the data integration middleware.
By eliminating the need for the physical builds and testing that replication and consolidation approaches require, data virtualization is a more agile and cost-effective method to access, integrate, and deliver data. This agility lets enterprises provide data access faster and more easily.
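To make the "virtual view" idea concrete, here is a minimal, hypothetical sketch in Python. It is not Composite's actual API; the source names and fields are invented. The point is simply that the integrated result is assembled on demand, with no physical copy or staging build in between.

```python
# Minimal sketch of a "virtual view": data from two hypothetical sources is
# joined on demand at query time, with no physical copy or staging step.
# Source names and fields are illustrative only.

CRM_ORDERS = [  # hypothetical source 1 (e.g., a CRM system)
    {"customer_id": 1, "order_total": 250.0},
    {"customer_id": 2, "order_total": 75.5},
]

ERP_CUSTOMERS = [  # hypothetical source 2 (e.g., an ERP system)
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex Inc"},
]

def customer_order_view():
    """Federate the two sources on the fly; nothing is replicated."""
    names = {c["customer_id"]: c["name"] for c in ERP_CUSTOMERS}
    for order in CRM_ORDERS:
        yield {"customer": names[order["customer_id"]],
               "order_total": order["order_total"]}

if __name__ == "__main__":
    for row in customer_order_view():
        print(row)  # consumers see one integrated view, built at request time
```

Because the view is only a definition, changing it is an edit and a retest, not a rebuild of a physical data store.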
Ensuring that only authorized users can see appropriate data and nothing more is a critical data governance requirement. This is a straightforward task for single systems and small user counts, but becomes more complex and difficult in larger enterprises with hundreds of systems and thousands of users.
As a first step, many enterprises have implemented single-sign-on technologies that allow individuals to be uniquely authenticated in many diverse systems. However, implementing security policies (i.e., authorization to see or use certain data) in individual source systems alone is often insufficient to ensure the appropriate enterprise-wide data security. For some hyper-sensitive data, encryption as it moves through the network is a further requirement.
Data virtualization not only leverages single sign-on capabilities to authenticate and authorize individuals, it can also encrypt any and all data. As such, data virtualization becomes the data governance focal point for implementing security policies across multiple data sources and consumers.
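The sketch below illustrates, under assumed role names and rules, what a single enforcement point can look like: one policy check applies row-level and column-level filtering for every consumer, regardless of which source the data comes from. It is a hypothetical illustration, not a real product's policy model.

```python
# Illustrative sketch of authorization enforced at the virtualization layer:
# one policy filters rows and columns for every consumer.
# Role names and rules are hypothetical.

ROLE_POLICIES = {
    "analyst": {"regions": {"EMEA"},
                "columns": {"region", "revenue"}},
    "auditor": {"regions": {"EMEA", "AMER"},
                "columns": {"region", "revenue", "account_id"}},
}

SALES_ROWS = [
    {"account_id": "A-1", "region": "EMEA", "revenue": 120000},
    {"account_id": "A-2", "region": "AMER", "revenue": 98000},
]

def query_sales(user_role):
    policy = ROLE_POLICIES.get(user_role)
    if policy is None:
        raise PermissionError(f"role '{user_role}' is not authorized")
    for row in SALES_ROWS:
        if row["region"] in policy["regions"]:        # row-level filter
            yield {k: v for k, v in row.items()       # column-level filter
                   if k in policy["columns"]}

print(list(query_sales("analyst")))  # EMEA rows only, no account_id column
print(list(query_sales("auditor")))  # both regions, all allowed columns
```

The design point is that the rules live in one place; individual source systems no longer each need their own, potentially inconsistent, copies of the policy.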
Consider the following commonplace scenario: Two people attend a meeting with reports or graphs generated from the "same" data, but the reports show different numbers or results. Each person likely believed they were using the same data. In reality, each was using their own replicated, consolidated, or aggregated version of it.
Data virtualization allows enterprises to prevent this scenario from occurring by establishing consistent and complete data canonicals applicable across all aspects of business use.
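A small sketch of the canonical-view idea, with made-up figures and names: both consumers call the same shared definition, so their numbers cannot drift apart the way two privately maintained copies can.

```python
# Sketch of a single canonical view: every report uses the same definition,
# so the numbers cannot diverge. Data and names are made up for illustration.

BOOKINGS = [
    {"quarter": "Q1", "amount": 100.0},
    {"quarter": "Q1", "amount": 40.0},
    {"quarter": "Q2", "amount": 90.0},
]

def canonical_revenue_by_quarter():
    """One agreed definition of 'revenue', shared by every consumer."""
    totals = {}
    for booking in BOOKINGS:
        totals[booking["quarter"]] = (
            totals.get(booking["quarter"], 0.0) + booking["amount"])
    return totals

finance_report = canonical_revenue_by_quarter()
sales_dashboard = canonical_revenue_by_quarter()
assert finance_report == sales_dashboard  # same source, same logic, same numbers
```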
Correct and complete data is a critical data governance requirement. However, data quality is often implemented as an afterthought to data creation and modification, and it is usually performed during data consolidation. This approach impedes the achievement of good data quality across the enterprise.
The modern trend in data quality and governance, however, is to push the practices of ensuring quality data back toward the source systems, so that data is of the highest quality right from the start.
Data virtualization leverages these "systems of record" when delivering data to the consumer, so it naturally delivers high-quality data. In addition, data virtualization allows data quality practices like enrichment and standardization to occur inline, giving the data stewards more options for ensuring data is of the highest quality when it reaches the consumer.
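The following sketch shows what "inline" standardization can look like under assumed rules: values are cleaned as they flow from the system of record to the consumer, rather than in a separate, after-the-fact cleansing job. The mapping table and field names are hypothetical.

```python
# Sketch of inline data-quality rules applied as data is delivered:
# standardization happens in transit, not in a later batch cleanup.
# The rules and fields shown are hypothetical.

COUNTRY_STANDARD = {"usa": "US", "u.s.": "US", "united states": "US", "uk": "GB"}

def standardize(record):
    record = dict(record)  # never mutate the source's copy
    record["country"] = COUNTRY_STANDARD.get(
        record["country"].strip().lower(), record["country"].upper())
    record["email"] = record["email"].strip().lower()
    return record

source_rows = [
    {"customer": "Acme", "country": "USA", "email": " Sales@Acme.COM "},
    {"customer": "Initech", "country": "uk", "email": "ops@initech.io"},
]

delivered = [standardize(r) for r in source_rows]  # quality applied inline
print(delivered)
```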
On the data source side, good data governance policy requires that IT can explain where data comes from, and prove its source. On the data consumer side, good data governance policy requires that IT show who used the data, and how it was used.
Traditional data integration copies data from one place to another. As a result, the copied data becomes "disconnected" from the source, making it difficult to establish a complete source-to-consumer audit trail.
Data virtualization integrates data directly from the original source and delivers it directly to the consumer. This end-to-end flow, without creating a disconnected copy of the data in the middle, simplifies and strengthens data governance. When auditing is required, full lineage is readily available at any time within the data virtualization metadata and transaction histories.
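As a rough illustration of why the middle tier is a natural place to capture lineage, the sketch below records who asked, which sources were touched, and what was delivered alongside each result. The field names are illustrative, not a real product's metadata model.

```python
# Sketch of capturing lineage alongside each delivered result: the layer
# between source and consumer logs the full source-to-consumer trail.
# Field names are illustrative only.

import datetime

AUDIT_LOG = []

def deliver(view_name, sources, consumer, rows):
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "view": view_name,
        "sources": sources,    # where the data came from
        "consumer": consumer,  # who used it
        "row_count": len(rows),  # what was delivered
    })
    return rows

result = deliver(
    "customer_order_view",
    ["CRM.orders", "ERP.customers"],
    "finance_dashboard",
    [{"customer": "Acme Corp", "order_total": 250.0}],
)
print(AUDIT_LOG[-1])  # complete source-to-consumer trail, no disconnected copy
```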
As data governance becomes increasingly prevalent in enterprise information management strategies, forward-looking organizations are deploying methods that simplify data governance. Data virtualization platforms such as Composite 6 not only make data governance easier in practice, but also shorten the time to begin achieving the data governance benefits of consistent, secure, high-quality data for more intelligent business decision-making.