By Robert Eve
August 13, 2011 12:00 PM EDT
As with motherhood and apple pie, who can argue with data governance?
Business users like it because it assures critical business decisions are made based on sound data.
IT likes data governance because, as the organization's data stewards, it demonstrates they are doing their job well.
Compliance officers and risk managers like data governance because it lets them sleep at night.
Data Governance Is Challenging
Liking it is one thing. Doing it is another.
Enterprises are struggling to turn the concept of data governance into a reality due to significantly growing data volumes, variety and variability, along with onerous new compliance requirements.
Effective data virtualization can improve data governance in numerous ways.
Five Requirements for More Effective Data Governance
Many articles and white papers define data governance, so it does not make sense to include a lengthy treatment here. However, it is helpful to identify data governance's most critical requirements.
Data governance is a set of well-defined policies and practices designed to ensure that data is:
- Accessible - Can the people who need it access the data they need? Does the data match the format the user requires?
- Secure - Are authorized people the only ones who can access the data? Are non-authorized users prevented from accessing it?
- Consistent - When two users seek the "same" piece of data, is it actually the same data? Have multiple versions been rationalized?
- High Quality - Is the data accurate? Has it been conformed to meet agreed standards?
- Auditable - Where did the data come from? Is the lineage clear? Does IT know who is using it and for what purpose?
Data Virtualization Helps Five Ways
Enterprises cannot buy data governance solutions off-the-shelf because effective data governance requires complex policies and practices, supported by software technology, integrated across the wider enterprise IT architecture.
As such, enterprises are turning to enabling technologies such as data virtualization to support the accessibility, security, consistency, quality and auditability capabilities required for effective data governance.
It is generally agreed that as much as 80 percent of any new development effort is spent on data integration, making data access--rather than developing the application--the most time-consuming and expensive activity.
Most users access their data via business intelligence (BI) and reporting applications. These applications typically rely on data integration middleware to access and format the data, before the application displays it. So, ensuring proper governance falls on the data integration middleware.
By eliminating the need for the physical builds and testing that replication and consolidation approaches require, data virtualization is a more agile and cost-effective method for accessing, integrating, and delivering data. This agility lets enterprises provide data access faster and more easily.
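To make the contrast concrete, here is a minimal sketch of the virtual-access pattern: data stays in the source systems and is joined on demand, with no replicated copy built in between. The source names and fields are hypothetical, not any vendor's API.

```python
# Minimal sketch of a federated "virtual view": the data remains in the
# sources and is joined at request time, with no persisted copy.
# All source names and fields here are hypothetical.

def crm_customers():
    # Stand-in for a live query against a CRM system.
    return [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]

def erp_orders():
    # Stand-in for a live query against an ERP system.
    return [{"cust_id": 1, "total": 250.0}, {"cust_id": 1, "total": 100.0}]

def virtual_customer_orders():
    """Join CRM and ERP data on demand -- nothing is replicated."""
    totals = {}
    for order in erp_orders():
        totals[order["cust_id"]] = totals.get(order["cust_id"], 0.0) + order["total"]
    return [
        {"name": c["name"], "order_total": totals.get(c["cust_id"], 0.0)}
        for c in crm_customers()
    ]

print(virtual_customer_orders())
```

Because the view is defined declaratively rather than built physically, changing it is a metadata edit, not a new load job, which is where the agility claim comes from.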
Ensuring that only authorized users can see appropriate data and nothing more is a critical data governance requirement. This is a straightforward task for single systems and small user counts, but becomes more complex and difficult in larger enterprises with hundreds of systems and thousands of users.
As a first step, many enterprises have implemented single-sign-on technologies that allow individuals to be uniquely authenticated in many diverse systems. However, implementing security policies (i.e., authorization to see or use certain data) in individual source systems alone is often insufficient to ensure the appropriate enterprise-wide data security. For some hyper-sensitive data, encryption as it moves through the network is a further requirement.
Data virtualization not only leverages single-sign-on capabilities to authorize and authenticate individuals, it can also encrypt any and all data. As such, data virtualization becomes the data governance focal point for implementing security policies across multiple data sources and consumers.
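The value of a single enforcement point can be sketched as follows: every request passes through one policy check before any rows are delivered, regardless of which source the data lives in. The roles, rows, and policy rules below are illustrative assumptions.

```python
# Sketch: authorization enforced at one central point before delivery.
# Roles, rows, and policy rules are hypothetical.

POLICIES = {
    "analyst": lambda row: row["region"] == "EMEA",  # analysts see EMEA only
    "admin":   lambda row: True,                     # admins see everything
}

ROWS = [
    {"region": "EMEA", "revenue": 10},
    {"region": "APAC", "revenue": 20},
]

def fetch(role):
    """Apply the role's row-level policy; unknown roles get nothing."""
    policy = POLICIES.get(role, lambda row: False)
    return [row for row in ROWS if policy(row)]

print(fetch("analyst"))
```

Centralizing the check means a policy change is made once, in one place, instead of being re-implemented in every source system.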
Consider the following commonplace scenario: Two people attend a meeting with reports or graphs generated from the "same" data, but they show different numbers or results. Likely, they believed they were using the same data. In reality, they were each using their own replicated, consolidated, aggregated version of the data.
Data virtualization allows enterprises to prevent this scenario from occurring by establishing consistent and complete data canonicals applicable across all aspects of business use.
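A canonical model works by mapping each system's local representation into one agreed shape, so two consumers asking for the "same" record get byte-for-byte the same answer. The field names below are invented for illustration.

```python
# Sketch: two systems store the "same" customer differently; mapping
# both into one canonical shape guarantees consistent answers.
# Field names are hypothetical.

def to_canonical_from_crm(rec):
    return {"customer_id": rec["id"], "country": rec["country_code"].upper()}

def to_canonical_from_billing(rec):
    return {"customer_id": rec["cust"], "country": rec["ctry"].upper()}

crm_rec = {"id": 42, "country_code": "de"}
billing_rec = {"cust": 42, "ctry": "DE"}

# Both sources resolve to the identical canonical record.
assert to_canonical_from_crm(crm_rec) == to_canonical_from_billing(billing_rec)
```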
Correct and complete data is a critical data governance requirement. However, data quality is often implemented as an afterthought to data creation and modification, and it is usually performed during data consolidation. This approach impedes the achievement of good data quality across the enterprise.
The modern trend in data quality and governance, however, is to push the practices of ensuring quality data back toward the source systems, so that data is of the highest quality right from the start.
Data virtualization leverages these "systems of record" when delivering data to the consumer, so it naturally delivers high-quality data. In addition, data virtualization allows data quality practices like enrichment and standardization to occur inline, giving the data stewards more options for ensuring data is of the highest quality when it reaches the consumer.
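Inline standardization means the cleanup rule runs as the data flows to the consumer, leaving the source untouched. The rules below (country-code and phone normalization) are simple illustrative assumptions, not a complete quality regime.

```python
# Sketch: inline standardization applied in flight, on the way to the
# consumer, rather than in a later batch consolidation step.
# The normalization rules are illustrative.

def standardize(record):
    """Normalize country and phone fields; the source record is untouched."""
    out = dict(record)
    out["country"] = {"USA": "US", "U.S.": "US"}.get(out["country"], out["country"])
    out["phone"] = "".join(ch for ch in out["phone"] if ch.isdigit())
    return out

raw = {"country": "U.S.", "phone": "(212) 555-0100"}
print(standardize(raw))  # {'country': 'US', 'phone': '2125550100'}
```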
On the data source side, good data governance policy requires that IT can explain where data comes from, and prove its source. On the data consumer side, good data governance policy requires that IT show who used the data, and how it was used.
Traditional data integration copies data from one place to another. As a result, the copied data becomes "disconnected" from the source, making it difficult to establish a complete source-to-consumer audit trail.
Data virtualization integrates data directly from the original source and delivers it directly to the consumer. This end-to-end flow, without creating a disconnected copy of the data in the middle, simplifies and strengthens data governance. When auditing is required, full lineage is readily available at any time within the data virtualization metadata and transaction histories.
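Because the virtualization layer sits on the one path between source and consumer, it is a natural place to record lineage at the moment of delivery. Here is a minimal sketch of that idea; the audit-log shape is a hypothetical illustration.

```python
# Sketch: recording source-to-consumer lineage at the point of delivery,
# so any result can be traced back to its sources and its consumer.
# The audit-log shape is hypothetical.

import datetime

AUDIT_LOG = []

def deliver(view, sources, consumer):
    """Serve a view and log who consumed it and where the data came from."""
    AUDIT_LOG.append({
        "view": view,
        "sources": sources,
        "consumer": consumer,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return f"result set for {view}"  # stand-in for the actual data

deliver("customer_orders", ["crm.customers", "erp.orders"], "alice")
print(AUDIT_LOG[0]["sources"])
```

With copies, the equivalent trail must be stitched together across ETL jobs and staging databases; here it falls out of the delivery path itself.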
As data governance becomes increasingly prevalent in enterprise information management strategies, forward-looking organizations are deploying methods that simplify it. Data virtualization platforms such as Composite 6 not only make data governance easier in practice, but also shorten the time to achieve its benefits: consistent, secure, high-quality data for more intelligent business decision-making.