Ten Things IT Should Be Doing to Manage Unstructured Data – But Isn’t

‘To do’ list reduces the risk of unstructured data loss

When it comes to protecting unstructured data, such as spreadsheets, documents, images and other data on file servers, most organizations acknowledge that their existing processes and risk profiles are less than ideal. Unfortunately, it is typically IT personnel, rather than data owners, who make many of the decisions about permissions, acceptable use, and access reviews. And because IT personnel aren't equipped with adequate business context around the growing volumes of unstructured data, they can only make a best-effort guess at how to manage and protect each data set.

Until organizations shift decision-making responsibility to business data owners, IT carries the burden of enforcing rules for who can access what on shared file systems, and of keeping those structures current through data growth and user role changes. IT needs to determine who can access unstructured data, who should be accessing it, who actually is, and what is likely to be sensitive.

To help streamline this process, here are 10 must-do actions for IT teams to carry out as part of a daily data management routine to maximize unstructured data protection:

1. Identify data owners
IT should keep a current list of business data owners and the folders and SharePoint sites that are their responsibility. With this list at the ready, IT can expedite a number of the other tasks on this list, including verifying permission revocations and reviews and identifying data for archival. The net effect is a marked increase in the accuracy of data entitlement permissions and, therefore, in data protection.
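As a rough illustration, the sketch below (Python, standard library only) assumes the owner registry is kept as a simple CSV of path,owner_email pairs for a hypothetical share root, and flags top-level folders with no registered owner; the file names and paths are placeholders, not a prescribed tool.

```python
# Hypothetical sketch: maintain a folder-to-owner registry in a CSV file
# (path,owner_email, no header row) and flag top-level folders with no owner.
import csv
import os

REGISTRY = "data_owners.csv"   # assumed registry file: path,owner_email
SHARE_ROOT = "/srv/shares"     # assumed file share root

def load_owners(registry_path):
    owners = {}
    with open(registry_path, newline="") as f:
        for row in csv.DictReader(f, fieldnames=["path", "owner_email"]):
            owners[os.path.normpath(row["path"])] = row["owner_email"]
    return owners

def unowned_folders(share_root, owners):
    """Yield top-level folders on the share with no registered business owner."""
    for entry in os.scandir(share_root):
        if entry.is_dir() and os.path.normpath(entry.path) not in owners:
            yield entry.path

if __name__ == "__main__":
    owners = load_owners(REGISTRY)
    for folder in unowned_folders(SHARE_ROOT, owners):
        print("No data owner on record for:", folder)
```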

2. Remove global group access control lists (ACLs) like ‘Everyone’
It is not uncommon for folders on file shares to have access control permissions allowing ‘Everyone,’ or all ‘Domain Users’ (nearly everyone), to access the data they contain. This creates a significant security risk, for any data placed in that folder will inherit those exposed permissions, and those who place data in these wide-open folders may not be aware of the lax access settings. Global access to folders should be removed and replaced with rules that give access only to the explicit groups that need it.
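A minimal sketch of the idea, using Unix mode bits as a stand-in for Windows ACLs: walk a hypothetical share root and report directories that grant read or write to ‘other,’ the rough equivalent of ‘Everyone.’ On Windows you would enumerate ACL entries with platform tooling instead; the path below is an assumption.

```python
# Sketch (Unix analog of the 'Everyone' problem): walk a share and report
# directories whose mode bits grant read or write to 'other', i.e. world access.
import os
import stat

SHARE_ROOT = "/srv/shares"  # assumed file share root

def world_accessible_dirs(root):
    for dirpath, dirnames, _filenames in os.walk(root):
        mode = os.stat(dirpath).st_mode
        if mode & (stat.S_IROTH | stat.S_IWOTH):
            yield dirpath, stat.filemode(mode)

if __name__ == "__main__":
    for path, mode in world_accessible_dirs(SHARE_ROOT):
        print(f"{mode}  {path}  <- open to everyone; restrict to explicit groups")
```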

3. Perform data entitlement (ACL) reviews
Every file and folder on a Windows or Unix file system has access controls assigned to it that determine which users can access the data and how, i.e., read, write, execute, and list. These controls need to be reviewed on a regular basis and the settings documented so that they can be verified as accurate by data business owners and security policy auditors.
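One way to start, sketched here with Unix owner/group/other permissions standing in for full ACLs, is a periodic entitlement report that a data owner can sign off on. The share root and output file name are assumptions, and a real review would also enumerate Windows or NFSv4 ACL entries.

```python
# Sketch of a periodic entitlement report: for each folder, record who owns it
# and which read/write/execute bits are granted to owner, group, and other.
import csv
import grp
import os
import pwd
import stat

SHARE_ROOT = "/srv/shares"          # assumed share root
REPORT = "entitlement_review.csv"   # assumed output file

def owner_name(uid):
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:
        return str(uid)  # orphaned account; report the raw id

def group_name(gid):
    try:
        return grp.getgrgid(gid).gr_name
    except KeyError:
        return str(gid)

with open(REPORT, "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["path", "owner", "group", "mode"])
    for dirpath, dirnames, _filenames in os.walk(SHARE_ROOT):
        st = os.stat(dirpath)
        writer.writerow([dirpath, owner_name(st.st_uid),
                         group_name(st.st_gid), stat.filemode(st.st_mode)])
```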

4. Revoke unused and unwarranted permissions
Users with access to data that is not material to their jobs constitute a security risk for organizations. Most users only need access to a small fraction of the data that resides on file servers. It is important to review and then remove or revoke permissions that are unused.
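A simple way to surface candidates, assuming you can export both a grant list and an access log as CSV (the file names and column names below are hypothetical): any permission that is never exercised becomes a candidate for revocation.

```python
# Sketch: cross-reference granted access against observed access.
# grants.csv (user,folder) is an assumed export from ACL reporting;
# access_log.csv (user,folder,timestamp) is an assumed file-activity export.
import csv

def load_pairs(path, user_col, folder_col):
    with open(path, newline="") as f:
        return {(row[user_col], row[folder_col]) for row in csv.DictReader(f)}

granted = load_pairs("grants.csv", "user", "folder")
used = load_pairs("access_log.csv", "user", "folder")

for user, folder in sorted(granted - used):
    print(f"Candidate for revocation: {user} has unused access to {folder}")
```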

5. Audit permissions changes
Access Control Lists are the fundamental preventive control mechanism that's in place to protect data from loss, tampering, and exposure. IT requires the ability to capture and report on access control changes to data, especially for highly sensitive folders. If access is incorrectly assigned or changed to a more permissive state without a good business reason, IT and the data business owner must be alerted quickly and be able to remediate the situation.
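One lightweight approach, sketched below, is snapshot comparison: record each folder's permission state on every run and diff it against the previous run, alerting on anything that changed. The snapshot format and file names are assumptions, and Unix mode strings stand in for full ACLs.

```python
# Sketch of change detection by snapshot comparison: store a snapshot of each
# folder's permission string and diff it against the previous run.
import json
import os
import stat

SHARE_ROOT = "/srv/shares"      # assumed share root
SNAPSHOT = "acl_snapshot.json"  # assumed snapshot file: {path: mode string}

def take_snapshot(root):
    return {d: stat.filemode(os.stat(d).st_mode) for d, _, _ in os.walk(root)}

current = take_snapshot(SHARE_ROOT)

if os.path.exists(SNAPSHOT):
    with open(SNAPSHOT) as f:
        previous = json.load(f)
    for path, mode in current.items():
        old = previous.get(path)
        if old is not None and old != mode:
            print(f"PERMISSION CHANGE on {path}: {old} -> {mode}")

with open(SNAPSHOT, "w") as f:
    json.dump(current, f, indent=2)
```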

6. Audit group membership changes
Directory groups (in Active Directory, LDAP, NIS, etc.) are the primary entities on access control lists, with membership granting access to unstructured data as well as to many applications, VPN gateways, and other resources. Users are added to existing and newly created groups every day. Without an audit trail of who is being added to and removed from these groups, enforcing access control processes is impossible. Ideally, group membership should be authorized and reviewed by the owner of the data or resource to which the group provides access.
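For example, assuming you can dump group memberships from Active Directory or LDAP into a simple JSON file on each run (the format below is an assumption), a small diff script can report every addition and removal for owner review.

```python
# Sketch: diff two group-membership exports and report additions and removals.
# Assumed JSON format: {"group name": ["member1", "member2", ...]}
import json

def load_groups(path):
    with open(path) as f:
        return {group: set(members) for group, members in json.load(f).items()}

yesterday = load_groups("groups_yesterday.json")  # assumed prior export
today = load_groups("groups_today.json")          # assumed current export

for group in sorted(set(yesterday) | set(today)):
    added = today.get(group, set()) - yesterday.get(group, set())
    removed = yesterday.get(group, set()) - today.get(group, set())
    for user in sorted(added):
        print(f"{group}: ADDED {user}  (confirm with the group's data owner)")
    for user in sorted(removed):
        print(f"{group}: REMOVED {user}")
```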

7. Audit data access
Effective management of any data set is impossible without a record of access. Unless you can reliably observe data use, you cannot observe its misuse, abuse, or non-use. Even if IT could ask every user in the organization whether they used each data set, the answers would not be accurate: the scope of a typical user's access activity is far beyond what anyone can recall. Without a record of data usage, you cannot determine the proper organizational owner for a data set, and neither that owner nor IT can make informed decisions about protecting it, archiving it, or deleting it.
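As an illustration, assuming file activity can be exported as a CSV with user, folder, and ISO-formatted timestamp columns (all hypothetical names), a short script can roll the log up into per-folder usage: the raw material for identifying likely owners, stale data, and misuse.

```python
# Sketch: summarize an access log (assumed CSV columns: user,folder,timestamp,
# with ISO 8601 timestamps so string comparison orders them) per folder.
import csv
from collections import Counter, defaultdict

by_folder = defaultdict(Counter)   # folder -> Counter of user -> event count
last_seen = {}                     # folder -> most recent timestamp string

with open("access_log.csv", newline="") as f:   # assumed export file
    for row in csv.DictReader(f):
        folder = row["folder"]
        by_folder[folder][row["user"]] += 1
        last_seen[folder] = max(last_seen.get(folder, ""), row["timestamp"])

for folder, users in by_folder.items():
    top_user, hits = users.most_common(1)[0]
    print(f"{folder}: most active user {top_user} ({hits} events), "
          f"last access {last_seen[folder]}")
```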

8. Prioritize data
While all data should be protected, some data needs to be protected far more urgently than the rest. Using data owners, data access patterns, and data classification technology, data that is considered sensitive, confidential, or internal should be tagged accordingly, protected, and reviewed frequently.
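Purpose-built classification tools do this far better, but as a toy sketch of the idea: scan text files under a hypothetical share root for a few sensitivity keywords and flag the folders that contain hits for more frequent review. The keywords, paths, and text-only assumption are all illustrative.

```python
# Toy sketch of classification-driven prioritization: flag folders containing
# text files that mention any of a few sensitivity keywords.
import os

SHARE_ROOT = "/srv/shares"  # assumed share root
KEYWORDS = ("confidential", "salary", "ssn", "internal only")  # illustrative

def folders_with_sensitive_hits(root):
    flagged = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith((".txt", ".csv", ".md")):
                continue  # only plain-text formats in this toy example
            try:
                with open(os.path.join(dirpath, name), errors="ignore") as f:
                    text = f.read().lower()
            except OSError:
                continue
            if any(keyword in text for keyword in KEYWORDS):
                flagged.add(dirpath)
                break
    return flagged

for folder in sorted(folders_with_sensitive_hits(SHARE_ROOT)):
    print("Review frequently (possible sensitive content):", folder)
```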

9. Align security groups to data
Whenever someone is placed in a group, they get file system access to every folder that lists that group on its ACL. Unfortunately, many organizations have lost track of which folders carry which Active Directory, LDAP, SharePoint, or NIS groups on their ACLs. This uncertainty undermines any access control review project and any role-based access control (RBAC) initiative. In an RBAC methodology, each role has a list of associated groups into which users are placed when they are assigned that role. It is impossible to align a role with the right data if the organization cannot verify what data each group provides access to.
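A first step, assuming you can export folder-to-group ACL assignments as a CSV (file and column names below are hypothetical), is to invert that export into a group-to-folders map, so you can answer "what data does this group actually grant access to?" before wiring the group into a role.

```python
# Sketch: invert an ACL export (assumed CSV columns: folder,group) into a
# group -> folders map for role alignment reviews.
import csv
from collections import defaultdict

group_to_folders = defaultdict(set)

with open("folder_acls.csv", newline="") as f:   # assumed ACL export
    for row in csv.DictReader(f):
        group_to_folders[row["group"]].add(row["folder"])

def folders_for_group(group):
    """Return every folder whose ACL lists the given group."""
    return sorted(group_to_folders.get(group, set()))

if __name__ == "__main__":
    for folder in folders_for_group("Finance-ReadOnly"):  # hypothetical group
        print(folder)
```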

10. Lock down, delete, or archive stale, unused data
Not all of the data contained on shared file servers and network attached storage devices is in active use. By archiving stale or unused data to offline storage, or deleting it, IT makes the job of managing the remainder simpler and easier, while freeing up expensive resources.
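For instance, here is a minimal sketch that flags files untouched for a configurable number of days as archive candidates. It relies on modification time, since access times are often unreliable or disabled, and the root path and threshold are assumptions.

```python
# Sketch: flag files not modified within a threshold as archive/delete candidates.
import os
import time

SHARE_ROOT = "/srv/shares"   # assumed share root
STALE_AFTER_DAYS = 365       # assumed staleness threshold

cutoff = time.time() - STALE_AFTER_DAYS * 86400

for dirpath, _dirnames, filenames in os.walk(SHARE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.stat(path).st_mtime < cutoff:
                print("Stale, candidate for archive:", path)
        except OSError:
            pass  # file may have been removed mid-scan
```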

The principle of least privilege is a well-accepted guideline for managing access controls: only those who have an organizational need to access information should be able to do so. However, for most organizations a pure least-privilege model is not feasible, because data is generated far too quickly and personnel change too rapidly. Even in small organizations, the growing data set and the pace of organizational change exceed the IT department's ability to keep access control lists and group memberships current. By automating the 10 management tasks outlined above and carrying them out frequently, organizations gain the visibility and audit trail required to determine who can access their unstructured data, who is accessing it, and who should have access. That insight into data access behavior benefits organizations in many ways, most significantly by securing their data, meeting compliance demands, and freeing up expensive storage resources.

More Stories By Wendy Yale

Wendy Yale leads marketing and brand development for Varonis’ global growth efforts. She is a veteran brand strategist with 16 years of marketing experience. Prior to Varonis, Wendy successfully managed the global integrated marketing communications team at Symantec. She joined Symantec from VERITAS, where she led the interactive media marketing team. Beginning her career as a freelance producer and writer, she has developed projects for organizations such as the University of Hawaii at Manoa, Film and Video Magazine, Aloha Airlines, the International Teleproduction Society and Unitel Video. Wendy has held senior posts at DMEC and ReplayTV, and holds a B.A. degree in Geography from Cal State Northridge. You can contact Wendy at [email protected]
