In the Rush to the Cloud, Don’t Forget About Data Governance

Migrations to the cloud are an extension of the operational perimeter of the business

Cloud computing has become an integral part of IT strategy for companies in every sector of the economy. IDC predicts that by 2012, IT spending on cloud services will grow almost threefold, to $42 billion. So it's no surprise that decision makers no longer wonder "if" they can benefit from cloud computing; the question being asked now is "how" best to leverage the cloud while keeping data and systems secure.

Data governance and compliance issues are typically the same whether information is in a private or public cloud environment or on-premise. That said, when organizations are considering moving business data to the cloud, a sound data governance approach must be in place to avoid costly data protection mistakes. At the heart of a sound data governance strategy is ensuring that only the right users have access to the right data at all times.

While the economic advantages of the cloud are compelling (the ability to quickly expand infrastructure to meet demand, low usage-based pricing, and near-infinite scalability), many organizations have yet to master data governance of their existing, in-house infrastructure. It's a bit like putting the cart before the horse. Against that backdrop, it should come as no surprise that cloud services can actually exacerbate existing data management and protection issues, adding a host of new concerns:

  • How do I enforce existing security policies and procedures when my data is in the cloud?
  • If my cloud provider is sued, can the suing party get access to my data?
  • How do I get access to full reporting that I need for my IT governance and compliance responsibilities?
  • How do I know what other data is in my cloud?
  • How do I know if my cloud is secure?
  • How do I automate access rights management in the cloud?

The Data Deluge Dilemma
Organizations have more digital data than ever before that must be continuously managed and protected in order for it to remain safe and retain its value. While data governance is often thought of more as a discipline than a technology, software can help companies implement data governance policies through automation, without disrupting existing business processes.

Concern about data governance has increased substantially over the past two decades, driven by the rapid growth in digital collaboration and an exponential increase in the amount of data that is created, shared, streamed and stored. Organizations now possess increasingly more information about their customers and partners - whether it's stored in a cloud environment or not - and failure to protect this data can be damaging. Partners and customers expect their information will be consistently protected before conducting business with a company. Therein lies the need for comprehensive data governance to manage and protect critical data, which has become a key issue for the cloud.

For years, IT has worked at capacity to manage and protect data manually as best it could - responding to authorization requests, migrating data, and cleaning up excessive access. Yet, despite this effort, IT has been falling further and further behind for the past 15 years. There is simply too much data being created too quickly to manage, protect and realize its full value without continuous, up-to-date information about the data: metadata.

Put simply, metadata is data about the data you hold in your organization. Use and analysis of metadata is already more common than we realize, and automated collection, storage, analysis and presentation of metadata will become a necessity not only for in-house data stores, but for cloud infrastructure as well.

Metadata frameworks for data governance from companies like Varonis non-intrusively collect critical information, generate metadata where existing metadata is lacking (e.g., file system filters and content inspection technologies), pre-process it, normalize it, analyze it, store it, and present it to IT administrators in an interactive, dynamic interface. Once data owners are identified, they are empowered to make informed authorization and permissions maintenance decisions through a web-based interface. In addition, data owners can do all of this on their own without IT overhead or manual back-end processes.
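
Two of the analyses described above, identifying a likely data owner and surfacing excess access for review, can be sketched from access-event metadata alone. The heuristics and field names below are hypothetical illustrations (not Varonis's actual implementation): the most active user is proposed as owner, and entitled users with no recent activity are flagged as revocation candidates.

```python
from collections import Counter
from datetime import datetime, timedelta

def infer_owner(events):
    """Propose the most active user in a data set's access history as
    its likely business owner; a human confirms the suggestion."""
    counts = Counter(e["user"] for e in events)
    return counts.most_common(1)[0][0] if counts else None

def stale_entitlements(acl, events, days=90, now=None):
    """Flag users who hold access rights but have not touched the data
    within `days` -- candidates for revocation by the data owner."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=days)
    active = {e["user"] for e in events if e["when"] >= cutoff}
    return sorted(set(acl) - active)
```

For example, if "alice" accessed a folder daily while "bob" and "carol" hold permissions they last used a year ago, `infer_owner` would nominate alice and `stale_entitlements` would queue bob and carol for her review.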

Organizations that have learned to harness metadata to underpin their data governance practices will have a far greater chance of extending those management and protection capabilities to the cloud, assuming their cloud providers are equally metadata-capable.

Due Diligence in the Cloud
To quote John Walker, Professor of Science & Technology, School of Computing & Informatics and member of the ISACA Security Advisory Group: "You are not merely buying a cloud, you are choosing a partner, and that choice has to be based on thorough due diligence. This process is essential. The most important barrier to the adoption of cloud computing is assurance - 'how do I know if it's safe to trust the cloud provider?' With today's complex IT architectures and heavy reliance upon third-party providers, there has never been a greater demand for transparency and objective metrics for attestation."

Migrations to the cloud are an extension of the operational perimeter of the business. It is a partnership that joins on-premise business objects with those located in the extended perimeter of the cloud. Both are subject to the same access controls and policies. Any approach to utilize the cloud must be achieved in tandem with organizational controls to create a robust, contractually obligated partnership between client and provider - nothing short of this should be considered secure.

There is an urgent need to address security and compliance challenges associated with an organization's cloud initiatives. IDC research has found that security and compliance are among the top three challenges to cloud computing. Without adequate information on the security and compliance profile of the data, including its ownership, access controls, audits and classification, cloud initiatives can fall short of expectations and put sensitive data at risk.

Understanding the data owners, authorized users and user activity is critical to garnering organizational input, which in turn is critical to defining the security and compliance profile of the data for your internal datacenter and the cloud. CFOs and CIOs are hesitant, IDC says, to move critical data and processes into the cloud when there is still little visibility on access and ownership, traceability and data segregation. It is vital that organizations have data governance in order to provide secure collaboration and data protection for their customers, partners and employees. Without it, companies will find it virtually impossible to manage and protect digital information in the cloud or anywhere else.

More Stories By Wendy Yale

Wendy Yale leads marketing and brand development for Varonis’ global growth efforts. She is a veteran brand strategist with 16 years of marketing experience. Prior to Varonis, Wendy successfully managed the global integrated marketing communications team at Symantec. She joined Symantec from VERITAS, where she led the interactive media marketing team. Beginning her career as a freelance producer and writer, she has developed projects for organizations such as the University of Hawaii at Manoa, Film and Video Magazine, Aloha Airlines, the International Teleproduction Society and Unitel Video. Wendy has held senior posts at DMEC and ReplayTV, and holds a B.A. degree in Geography from Cal State Northridge. You can contact Wendy at [email protected]
