By Robert Eve
August 13, 2011 12:00 PM EDT
As with motherhood and apple pie, who can argue with data governance?
Business users like it because it assures critical business decisions are made based on sound data.
IT likes data governance because it shows that, as the organization's data stewards, they are doing a good job.
Compliance officers and risk managers like data governance because it lets them sleep at night.
Data Governance Is Challenging
Liking it is one thing. Doing it is another.
Enterprises are struggling to turn the concept of data governance into a reality due to significantly growing data volumes, variety and variability, along with onerous new compliance requirements.
Effective data virtualization can improve data governance in numerous ways.
Five Requirements for More Effective Data Governance
Many articles and white papers define data governance, so it does not make sense to include a lengthy treatment here. However, it is helpful to identify data governance's most critical requirements.
Data governance is a set of well-defined policies and practices designed to ensure that data is:
- Accessible - Can the people who need it access the data they need? Does the data match the format the user requires?
- Secure - Are authorized people the only ones who can access the data? Are non-authorized users prevented from accessing it?
- Consistent - When two users seek the "same" piece of data, is it actually the same data? Have multiple versions been rationalized?
- High Quality - Is the data accurate? Has it been conformed to meet agreed standards?
- Auditable - Where did the data come from? Is the lineage clear? Does IT know who is using it and for what purpose?
Data Virtualization Helps Five Ways
Enterprises cannot buy data governance solutions off-the-shelf because effective data governance requires complex policies and practices, supported by software technology, integrated across the wider enterprise IT architecture.
As such, enterprises are turning to enabling technologies such as data virtualization to support the accessibility, security, consistency, quality and auditability capabilities required for effective data governance.
It is generally agreed that as much as 80 percent of any new development effort is spent on data integration, making data access--rather than developing the application--the most time-consuming and expensive activity.
Most users access their data via business intelligence (BI) and reporting applications. These applications typically rely on data integration middleware to access and format the data, before the application displays it. So, ensuring proper governance falls on the data integration middleware.
By eliminating the need for the physical builds and testing that replication and consolidation approaches require, data virtualization is a more agile and cost-effective method for accessing, integrating, and delivering data. This agility lets enterprises provide data access faster and more easily.
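To make the federation idea concrete, here is a minimal sketch in plain Python. The source data and function names are hypothetical, not any particular product's API; the point is that two "source systems" are joined only when a consumer requests the view, so no physical, replicated copy of the integrated data is ever built or maintained.

```python
# Minimal sketch of on-demand federation (hypothetical data and names):
# two "source systems" are joined only when a consumer asks for the view,
# so no physical, replicated copy of the integrated data is created.

# Source system A: customer master records
crm_customers = [
    {"customer_id": 1, "name": "Acme Corp", "region": "EMEA"},
    {"customer_id": 2, "name": "Globex", "region": "AMER"},
]

# Source system B: order history
erp_orders = [
    {"order_id": 100, "customer_id": 1, "amount": 2500.00},
    {"order_id": 101, "customer_id": 2, "amount": 740.00},
]

def virtual_customer_orders():
    """Federate the two sources at request time (a lazy, virtual view)."""
    orders_by_customer = {}
    for order in erp_orders:
        orders_by_customer.setdefault(order["customer_id"], []).append(order)
    for customer in crm_customers:
        for order in orders_by_customer.get(customer["customer_id"], []):
            # Yield rows on demand; nothing is persisted in the middle.
            yield {
                "customer": customer["name"],
                "region": customer["region"],
                "order_id": order["order_id"],
                "amount": order["amount"],
            }

for row in virtual_customer_orders():
    print(row)
```

Because the integrated result exists only at request time, there is no second copy to build, test, secure, or keep in sync.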
Ensuring that only authorized users can see appropriate data and nothing more is a critical data governance requirement. This is a straightforward task for single systems and small user counts, but becomes more complex and difficult in larger enterprises with hundreds of systems and thousands of users.
As a first step, many enterprises have implemented single-sign-on technologies that allow individuals to be uniquely authenticated in many diverse systems. However, implementing security policies (i.e., authorization to see or use certain data) in individual source systems alone is often insufficient to ensure the appropriate enterprise-wide data security. For some hyper-sensitive data, encryption as it moves through the network is a further requirement.
Data virtualization not only leverages single-sign-on capabilities to authorize and authenticate individuals, it can also encrypt any and all data. As such, data virtualization becomes the data governance focal point for implementing security policies across multiple data sources and consumers.
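As a rough illustration of that single enforcement point, the sketch below applies one authorization policy at the virtualization layer. The roles, policy rules, and masking choices are assumptions made up for the example, not a real product's security model; the takeaway is that rows are filtered and sensitive columns masked for the authenticated user regardless of which source the data came from.

```python
# Sketch of a single enforcement point for data security (hypothetical roles
# and policy rules): the virtualization layer filters rows and masks columns
# per authenticated user, so every source behind it is governed the same way.

ROW_POLICY = {
    "emea_analyst": lambda row: row["region"] == "EMEA",   # row-level filter
    "global_auditor": lambda row: True,                    # sees all rows
}
MASKED_COLUMNS = {
    "emea_analyst": {"amount"},   # analysts may not see order amounts
    "global_auditor": set(),
}

def secure_view(rows, role):
    """Apply row filtering and column masking for the given role."""
    if role not in ROW_POLICY:
        raise PermissionError(f"role {role!r} is not authorized for this view")
    for row in rows:
        if not ROW_POLICY[role](row):
            continue
        yield {col: ("***" if col in MASKED_COLUMNS[role] else value)
               for col, value in row.items()}

sample_rows = [
    {"customer": "Acme Corp", "region": "EMEA", "amount": 2500.00},
    {"customer": "Globex",    "region": "AMER", "amount": 740.00},
]
for row in secure_view(sample_rows, "emea_analyst"):
    print(row)   # only the EMEA row is returned, with the amount masked
```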
Consider the following commonplace scenario: Two people attend a meeting with reports or graphs generated from the "same" data, but they show different numbers or results. Likely, they believed they were using the same data. In reality, each was using their own replicated, consolidated, or aggregated version of the data.
Data virtualization allows enterprises to prevent this scenario from occurring by establishing consistent and complete data canonicals applicable across all aspects of business use.
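A small sketch of what a canonical model buys you, using hypothetical field mappings: each source's variant of "customer" is translated into one agreed-upon shape, so two consumers asking for the "same" data really do receive the same data.

```python
# Sketch of a canonical data model (hypothetical field mappings): every
# source-specific record shape is mapped onto a single agreed-upon form.

CANONICAL_FIELDS = ("customer_id", "name", "country")

SOURCE_MAPPINGS = {
    "crm": {"customer_id": "id",      "name": "account_name", "country": "country"},
    "erp": {"customer_id": "cust_no", "name": "cust_name",    "country": "ctry_code"},
}

def to_canonical(record, source):
    """Translate a source-specific record into the shared canonical form."""
    mapping = SOURCE_MAPPINGS[source]
    return {field: record.get(mapping[field]) for field in CANONICAL_FIELDS}

# Two differently shaped source records yield the identical canonical record.
print(to_canonical({"id": 7, "account_name": "Initech", "country": "US"}, "crm"))
print(to_canonical({"cust_no": 7, "cust_name": "Initech", "ctry_code": "US"}, "erp"))
```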
Correct and complete data is a critical data governance requirement. However, data quality is often implemented as an afterthought to data creation and modification, and it is usually performed during data consolidation. This approach impedes the achievement of good data quality across the enterprise.
The modern trend in data quality and governance, however, is to push the practices of ensuring quality data back toward the source systems, so that data is of the highest quality right from the start.
Data virtualization leverages these "systems of record" when delivering data to the consumer, so it naturally delivers high-quality data. In addition, data virtualization allows data quality practices like enrichment and standardization to occur inline, giving the data stewards more options for ensuring data is of the highest quality when it reaches the consumer.
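The sketch below illustrates what "inline" quality means, using made-up standardization rules rather than any vendor's quality functions: conforming and enrichment run as the data streams from the system of record to the consumer, instead of in a separate, after-the-fact consolidation step.

```python
# Sketch of inline data quality (hypothetical rules): standardization runs as
# data flows from the system of record to the consumer, with no intermediate
# persisted copy to clean up later.

COUNTRY_STANDARD = {"usa": "US", "u.s.": "US", "united states": "US"}

def standardize(record):
    """Conform a record to agreed standards on its way to the consumer."""
    cleaned = dict(record)
    cleaned["name"] = record["name"].strip().title()
    raw_country = record.get("country", "").strip().lower()
    cleaned["country"] = COUNTRY_STANDARD.get(raw_country, raw_country.upper())
    return cleaned

def quality_pipeline(rows):
    for row in rows:
        yield standardize(row)   # applied inline, row by row

source_rows = [
    {"name": "  acme corp ", "country": "usa"},
    {"name": "globex",       "country": "United States"},
]
for row in quality_pipeline(source_rows):
    print(row)
```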
On the data source side, good data governance policy requires that IT can explain where data comes from, and prove its source. On the data consumer side, good data governance policy requires that IT show who used the data, and how it was used.
Traditional data integration copies data from one place to another. As a result, the copied data becomes "disconnected" from the source, making it difficult to establish a complete source-to-consumer audit trail.
Data virtualization integrates data directly from the original source and delivers it directly to the consumer. This end-to-end flow, without creating a disconnected copy of the data in the middle, simplifies and strengthens data governance. When auditing is required, full lineage is readily available at any time within the data virtualization metadata and transaction histories.
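As a final sketch, here is what that single audit point could look like. The log format, view names, and consumer identity are hypothetical; the idea is simply that because every request passes through one layer, each delivery can be recorded with its sources, its consumer, and a timestamp, giving source-to-consumer lineage in one place.

```python
# Sketch of source-to-consumer auditability (hypothetical log format): each
# request served by the virtualization layer is recorded with its sources,
# the consumer, and a timestamp.

from datetime import datetime, timezone

audit_log = []

def serve(view_name, sources, consumer, fetch):
    """Deliver data and record who used it, what it was, and where it came from."""
    rows = list(fetch())
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "view": view_name,
        "sources": sources,
        "consumer": consumer,
        "rows_delivered": len(rows),
    })
    return rows

rows = serve(
    view_name="customer_orders",
    sources=["crm.customers", "erp.orders"],
    consumer="jane.doe@example.com",
    fetch=lambda: [{"customer": "Acme Corp", "order_id": 100}],
)
print(audit_log[-1])   # the lineage and usage record for this delivery
```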
As data governance becomes increasingly prevalent in enterprise information management strategies, forward-looking organizations are deploying methods that simplify data governance. Data virtualization platforms such as Composite 6 not only make data governance easier in practice, but also shorten the time to begin realizing the data governance benefits of consistent, secure, high-quality data for more intelligent business decision-making.