Ten Things IT Should Be Doing to Manage Unstructured Data – But Isn’t

‘To do’ list reduces the risk of unstructured data loss

When it comes to protecting unstructured data, such as spreadsheets, documents, images and other data on file servers, most organizations acknowledge that their existing processes and risk profiles are less than ideal. Unfortunately, IT personnel - rather than data owners - are typically the ones making many of the decisions about permissions, acceptable use, and access review. And because IT personnel lack adequate business context around the growing volumes of unstructured data, they can only make a best-effort guess as to how to manage and protect each data set.

Until organizations shift decision-making responsibility to business data owners, IT carries the burden of enforcing rules for who can access what on shared file systems, and of keeping those structures current through data growth and user role changes. IT needs to determine who can access unstructured data, who should be and actually is accessing it, and which of it is likely to be sensitive.

To help streamline this process, here are 10 must-do actions for IT teams to carry out as part of a daily data management routine to maximize unstructured data protection:

1. Identify data owners
IT should keep a current list of business data owners and the folders and SharePoint sites that are their responsibility. With this list at the ready, IT can expedite a number of the tasks identified here, including verifying permissions revocation and review, and identifying data for archival. The net effect is a marked increase in the accuracy of data entitlement permissions and, therefore, data protection.

2. Remove global group access control lists (ACLs) like ‘Everyone'
It is not uncommon for folders on file shares to have access control permissions allowing ‘everyone,' or all ‘domain users' (nearly everyone) to access the data contained. This creates a significant security risk, for any data placed in that folder will inherit those exposed permissions, and those who place data in these wide-open folders may not be aware of the lax access settings. Global access to folders should be removed and replaced with rules that give access to explicit groups that need it.
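As a sketch of what this check looks like in practice, the following scans an ACL export for folders exposed to global groups. The folder names, group names, and data layout are illustrative assumptions; a real input would come from an ACL report (e.g. `icacls` on Windows or `getfacl` on Unix).

```python
# Sketch: flag folders whose ACLs grant access to global principals.
# All names and the ACL data structure are illustrative.

GLOBAL_PRINCIPALS = {"Everyone", "Domain Users", "Authenticated Users"}

def find_open_folders(acls):
    """Return folders whose ACL includes at least one global principal."""
    return sorted(
        folder for folder, entries in acls.items()
        if GLOBAL_PRINCIPALS & {e["principal"] for e in entries}
    )

acls = {
    "/shares/finance": [
        {"principal": "Finance-RW", "rights": "modify"},
        {"principal": "Everyone", "rights": "read"},  # over-exposed entry
    ],
    "/shares/hr": [
        {"principal": "HR-RW", "rights": "modify"},
    ],
}

print(find_open_folders(acls))  # ['/shares/finance']
```

Folders flagged this way are candidates for replacing the global entry with explicit groups that actually need access.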

3. Perform data entitlement (ACL) reviews
Every file and folder on a Windows or Unix file system has access controls assigned to it that determine which users can access the data and how, i.e., read, write, execute, and list. These controls need to be reviewed on a regular basis and the settings documented so that they can be verified as accurate by data business owners and security policy auditors.
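One lightweight way to document these settings for sign-off is to flatten the ACL data into review rows, one per folder/principal pair. This is a minimal sketch under assumed data shapes, not a real review tool; the entries would normally come from an ACL-scanning utility.

```python
# Sketch: flatten an ACL export into rows a data owner can review and
# sign off on. The folder names and rights strings are illustrative.

import csv
import io

def review_rows(acls):
    """Yield (folder, principal, rights) rows for an entitlement review."""
    for folder in sorted(acls):
        for entry in acls[folder]:
            yield folder, entry["principal"], entry["rights"]

acls = {
    "/shares/finance": [
        {"principal": "Finance-RW", "rights": "rwx"},
        {"principal": "Auditors-RO", "rights": "r-x"},
    ],
}

# Emit the review as CSV so it can be circulated to owners and auditors.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["folder", "principal", "rights"])
writer.writerows(review_rows(acls))
print(buf.getvalue())
```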

4. Revoke unused and unwarranted permissions
Users with access to data that is not material to their jobs constitute a security risk for organizations. Most users only need access to a small fraction of the data that resides on file servers. It is important to review and then remove or revoke permissions that are unused.
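Finding unused permissions amounts to a set difference between what is granted and what the audit log shows was actually used. A minimal sketch, assuming granted entitlements have been expanded from ACLs and groups, and usage comes from access audit records:

```python
# Sketch: compare granted permissions against observed access to find
# revocation candidates. Users and folders are illustrative.

def unused_grants(granted, used):
    """Return sorted (user, folder) pairs granted but never observed in use."""
    return sorted(granted - used)

granted = {("alice", "/shares/finance"),
           ("bob", "/shares/finance"),
           ("bob", "/shares/engineering")}
used = {("alice", "/shares/finance")}  # from the observation window's audit log

for user, folder in unused_grants(granted, used):
    print(f"candidate for revocation: {user} on {folder}")
```

In practice the observation window matters: a grant unused over a quarter is a stronger revocation candidate than one unused over a week.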

5. Audit permissions changes
Access Control Lists are the fundamental preventive control mechanism that's in place to protect data from loss, tampering, and exposure. IT requires the ability to capture and report on access control changes to data, especially for highly sensitive folders. If access is incorrectly assigned or changed to a more permissive state without a good business reason, IT and the data business owner must be alerted quickly and be able to remediate the situation.
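The alerting described above can be sketched as a diff between ACL snapshots that flags entries becoming more permissive. The ordered rights model here is a deliberate simplification for illustration; real Windows or POSIX rights are richer.

```python
# Sketch: compare two ACL snapshots of a sensitive folder and flag
# principals whose access level increased. Rights levels are simplified.

LEVELS = {"none": 0, "read": 1, "write": 2, "full": 3}

def permissive_changes(before, after):
    """Return (principal, old_level, new_level) where access increased."""
    return sorted(
        (p, before.get(p, "none"), new)
        for p, new in after.items()
        if LEVELS[new] > LEVELS[before.get(p, "none")]
    )

before = {"Finance-RW": "write", "Auditors-RO": "read"}
after = {"Finance-RW": "write", "Auditors-RO": "full", "Everyone": "read"}

for principal, old, new in permissive_changes(before, after):
    print(f"ALERT: {principal} went from {old} to {new}")
```

Each alert would go to both IT and the data's business owner so an unwarranted change can be remediated quickly.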

6. Audit group membership changes
Directory groups (in Active Directory, LDAP, NIS, etc.) are the primary entities on access control lists, with membership granting access to unstructured data as well as to many applications, VPN gateways, and other resources. Users are added to existing and newly created groups on a daily basis. Without an audit trail of who is being added to and removed from these groups, enforcing access control processes is impossible. Ideally, group membership should be authorized and reviewed by the owner of the data or resource to which the group provides access.
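An audit trail of this kind can be built by diffing periodic snapshots of group membership. The group and user names below are illustrative; snapshots would normally be pulled from the directory on a schedule.

```python
# Sketch: diff two snapshots of directory group membership to produce
# an audit trail of additions and removals. Names are illustrative.

def membership_changes(before, after):
    """Return (group, user, 'added'|'removed') for each change."""
    changes = []
    for group in sorted(set(before) | set(after)):
        old = before.get(group, set())
        new = after.get(group, set())
        changes += [(group, u, "added") for u in sorted(new - old)]
        changes += [(group, u, "removed") for u in sorted(old - new)]
    return changes

before = {"Finance-RW": {"alice", "bob"}}
after = {"Finance-RW": {"alice", "carol"}, "HR-RO": {"dave"}}

for group, user, action in membership_changes(before, after):
    print(f"{action}: {user} in {group}")
```

Each `added` record is what the data or resource owner would be asked to confirm during their review.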

7. Audit data access
Effective management of any data set is impossible without a record of access. Unless you can reliably observe data use, you cannot observe its misuse, abuse, or non-use. Even if IT could ask the organization's users whether they used each data set, the end users would not be able to answer accurately - the scope of a typical user's access activity is far beyond what humans can recall. And without a record of data usage, you cannot determine the proper organizational owner for a data set, so neither the as-yet-unidentified owner nor IT can make informed decisions about protecting it, archiving it, or deleting it.
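One practical use of such an access record is suggesting a likely business owner for orphaned data: the heaviest accessor is often the owner or knows who is. The log records below are illustrative stand-ins for real file-activity audit data.

```python
# Sketch: suggest a likely business owner for a folder from access
# frequency in an audit log. Log entries are illustrative.

from collections import Counter

def suggest_owner(access_log, folder):
    """Return the most frequent accessor of a folder, or None if unseen."""
    counts = Counter(user for user, f in access_log if f == folder)
    return counts.most_common(1)[0][0] if counts else None

access_log = [
    ("alice", "/shares/finance"),
    ("alice", "/shares/finance"),
    ("bob", "/shares/finance"),
    ("alice", "/shares/hr"),
]

print(suggest_owner(access_log, "/shares/finance"))  # alice
```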

8. Prioritize data
While all data should be protected, some data needs to be protected much more urgently than others. Using data owners, data access patterns, and data classification technology, data that is considered sensitive, confidential, or internal should be tagged accordingly, and protected and reviewed frequently.
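As a toy illustration of tagging, the following classifies text against ordered keyword rules. The patterns and tag names are illustrative assumptions; real classification engines use far richer signals than keywords.

```python
# Sketch: a minimal keyword-based sensitivity tagger. The rules are
# illustrative stand-ins for a real data classification engine.

import re

# Rules are checked in order; the first match wins.
RULES = [
    ("confidential", re.compile(r"\b(ssn|salary|password)\b", re.I)),
    ("internal", re.compile(r"\b(roadmap|draft)\b", re.I)),
]

def classify(text):
    """Return the first matching sensitivity tag, else 'public'."""
    for tag, pattern in RULES:
        if pattern.search(text):
            return tag
    return "public"

print(classify("Employee salary review Q3"))  # confidential
print(classify("Product roadmap draft"))      # internal
print(classify("Lunch menu"))                 # public
```

Data tagged `confidential` would then sit at the front of the review queue from the earlier steps.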

9. Align security groups to data
Whenever someone is placed in a group, they get file system access to every folder that lists that group on its ACL. Unfortunately, many organizations have lost track of which folders are exposed to which Active Directory, LDAP, SharePoint, or NIS groups. This uncertainty undermines any access control review project and any role-based access control (RBAC) initiative. In an RBAC methodology, each role has a list of associated groups into which the user is placed when assigned that role. It is impossible to align a role with the right data if the organization cannot verify what data each group provides access to.
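Recovering that group-to-data view is mostly an inversion of the ACL data already collected: from folder → groups to group → folders. A minimal sketch, with illustrative names:

```python
# Sketch: invert a folder -> groups mapping (from an ACL export) into a
# group -> folders view, so reviewers can see exactly what data each
# group grants access to. All names are illustrative.

from collections import defaultdict

def folders_by_group(folder_groups):
    """Map each directory group to the sorted folders it can access."""
    index = defaultdict(list)
    for folder, groups in folder_groups.items():
        for group in groups:
            index[group].append(folder)
    return {group: sorted(folders) for group, folders in index.items()}

folder_groups = {
    "/shares/finance": ["Finance-RW", "Auditors-RO"],
    "/shares/hr": ["HR-RW", "Auditors-RO"],
}

print(folders_by_group(folder_groups)["Auditors-RO"])
# ['/shares/finance', '/shares/hr']
```

With this index, an RBAC role can be validated by checking that each of its groups maps only to data the role legitimately needs.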

10. Lock down, delete, or archive stale, unused data
Not all of the data contained on shared file servers and network attached storage devices is in active use. By archiving stale or unused data to offline storage, or deleting it, IT makes the job of managing the remainder simpler and easier, while freeing up expensive resources.
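Stale data can be identified by comparing last-access times against a retention cutoff. The timestamps below are illustrative; real values would come from filesystem metadata or the access-audit index built in the earlier steps.

```python
# Sketch: identify stale folders by last-access time so they can be
# locked down, archived, or deleted. Paths and dates are illustrative.

from datetime import datetime, timedelta

def stale_folders(last_access, now, max_age_days=365):
    """Return folders not accessed within max_age_days, oldest first."""
    cutoff = now - timedelta(days=max_age_days)
    return sorted((f for f, t in last_access.items() if t < cutoff),
                  key=last_access.get)

now = datetime(2010, 6, 1)
last_access = {
    "/shares/projects/2007-bid": datetime(2007, 11, 3),
    "/shares/finance": datetime(2010, 5, 28),
    "/shares/old-marketing": datetime(2008, 2, 14),
}

print(stale_folders(last_access, now))
# ['/shares/projects/2007-bid', '/shares/old-marketing']
```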

The principle of least privilege is a well-accepted guideline for managing access controls: only those who have an organizational need to access information should be able to do so. However, for most organizations a strict least-privilege model is not feasible, because data is generated far too quickly and personnel change roles too often. Even in small organizations, the growing data set and the pace of organizational change exceed the IT department's ability to keep access control lists and group memberships current. By automating the 10 management tasks outlined above and conducting them frequently, organizations gain the visibility and audit trail required to determine who can access their unstructured data, who is accessing it, and who should have access. This detailed view of data access behavior benefits organizations in many ways, most significantly by securing their data, meeting compliance demands, and freeing up expensive storage resources.

More Stories By Wendy Yale

Wendy Yale leads marketing and brand development for Varonis’ global growth efforts. She is a veteran brand strategist with 16 years of marketing experience. Prior to Varonis, Wendy successfully managed the global integrated marketing communications team at Symantec. She joined Symantec from VERITAS, where she led the interactive media marketing team. Beginning her career as a freelance producer and writer, she has developed projects for organizations such as the University of Hawaii at Manoa, Film and Video Magazine, Aloha Airlines, the International Teleproduction Society and Unitel Video. Wendy has held senior posts at DMEC and ReplayTV, and holds a B.A. degree in Geography from Cal State Northridge. You can contact Wendy at [email protected]


