How Do You Kill Something That Lives Forever?

The Dark Side of Big Data

Who doesn't love a good zombie flick, right? Hordes of undead ambling around in tattered clothing looking for something to eat. The low, drawn-out moans of a once-productive member of society, who now possesses the brainpower of a teenager on an 8-hour Call of Duty bender.

While I don't believe the zombie apocalypse is happening anytime soon (at least not for another six months), there is another form of undead that is very much alive and well today. I'm talking of course about... digital data.

We're at the point now where anything you do online leaves a digital footprint, whether it's a photo posted to Instagram, a purchase on Amazon, or a patient intake form completed on an iPad.

This data, stored in the cloud, is often moved and replicated, but it can't really be destroyed, and companies place a great deal of value on it. We often refer to this phenomenon as Big Data: an ever-increasing flow of varied forms of data that ultimately reaches petabyte scale. And it contains little bits and pieces about you that are next to impossible to erase.

Consider the following:

Data can literally be kept forever. Thanks to the nature of big data architectures, most organizations will never run out of storage capacity. So data, regardless of its importance, can be retained forever. That means 40 years from now, a company might still retain all the metadata associated with a purchase you made online last week. It stands to reason that the more data gets scooped up, the more personal data gets scooped up. Organizations, particularly those in Europe that must comply with strict privacy regulations, will need to make some tough decisions about how to keep personally identifiable information (PII) confidential.

Companies should care more about privacy than consumers. While individuals may care about privacy, particularly when it comes to their children, I don't believe that the collective masses do. Social media sharing, providing an email address in exchange for online coupons, giving a mobile gaming app access to your contacts, and the lack of outrage at the NSA spying scandal are all evidence of that. On the other hand, companies care greatly about their reputation and their competitive advantage, so they can't afford to be viewed as having a laissez-faire attitude toward protecting sensitive data. Gazzang works with a number of SaaS companies that have gone to great lengths to keep their customer data private.

Anonymizing certain datasets is not the answer. A commonly held belief is that anonymizing or tokenizing certain personally identifiable information like names, addresses and phone numbers is the best way to ensure user privacy. This is simply not true. With as much user data as there is floating around, today's analytics systems make it possible to take a series of disparate bits of data and piece them together to figure out exactly who an individual is.
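
To make that re-identification risk concrete, here is a toy sketch in Python; the datasets, names and field choices are invented purely for illustration. Two "anonymized" records are linked back to real people using nothing but quasi-identifiers such as ZIP code, birth date and gender.

```python
# Hypothetical illustration: re-identifying "anonymized" records by joining
# on quasi-identifiers (ZIP code, birth date, gender). All data is invented.

# A public dataset that includes names (e.g., a voter roll or social profile).
public_records = [
    {"name": "Alice Smith", "zip": "78701", "dob": "1985-04-12", "gender": "F"},
    {"name": "Bob Jones",   "zip": "78704", "dob": "1990-09-30", "gender": "M"},
]

# An "anonymized" dataset with names stripped but quasi-identifiers intact.
anonymized_purchases = [
    {"zip": "78701", "dob": "1985-04-12", "gender": "F", "purchase": "prenatal vitamins"},
    {"zip": "78704", "dob": "1990-09-30", "gender": "M", "purchase": "allergy medication"},
]

def quasi_id(record):
    """Build a join key from the quasi-identifiers both datasets share."""
    return (record["zip"], record["dob"], record["gender"])

# Index the public records by quasi-identifier, then link each purchase back
# to a real name -- no name or account number ever had to leak.
by_quasi_id = {quasi_id(r): r["name"] for r in public_records}
for row in anonymized_purchases:
    name = by_quasi_id.get(quasi_id(row))
    if name:
        print(f"{name} bought {row['purchase']}")
```

The point isn't the code; it's that once enough quasi-identifiers line up across datasets, stripping names alone offers very little protection.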

How analyzed data is used depends on the company. Whether data is used to predict future behavior or condemn those with past transgressions is up to the company. I suspect there will be use cases for both, since the data and tools are available. Take the airline industry, for example. A frequent flyer in good standing who is known to travel abroad for two weeks in October may, in late September, receive a gratis TSA PreCheck to get them through the security line more quickly. That same airline may also decide to charge business travelers a premium in late March, June and September, because it knows from historical data which salespeople need to travel in order to close out a successful quarter.

Don't just pay lip service to data security. Do something about it. C-level execs need to have a serious security and privacy conversation BEFORE their company embarks on a big data project. You don’t wait until after a burglary to put locks on your doors, and you should not wait until after a breach to secure your data. It is possible to respect customer and employee privacy, even as you pile up terabytes of data. Here are a few tips on how:

  • Encrypt all data at rest. This ensures a data breach or leak won't result in the embarrassing or illegal disclosure of private or confidential data.
  • Establish and enforce access policies. This keeps unauthorized parties from gaining access to the data or the encryption keys.
  • If you store encrypted data in the cloud, make sure your keys are stored locally or on a separate server. Separating the keys from the encrypted data ensures a breach or subpoena doesn't result in the loss of the keys (see the sketch after this list).
  • Don't trade off security for big data performance and availability, because you can have both. Find a security solution that's built to work in a cloud or big data environment. There are plenty of options out there that are lightning quick and don't rely on clunky, expensive hardware.
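
As a rough illustration of the first and third tips, the sketch below uses Python's third-party cryptography package (one of many possible tools, not a specific recommendation) to encrypt a record before it hits storage and to keep the key in a location separate from the ciphertext. The file paths and the simple key-store helper are hypothetical stand-ins for a real key management service.

```python
# Minimal sketch: encrypt data at rest and keep the key separate from the
# ciphertext. Requires the third-party "cryptography" package
# (pip install cryptography). Paths and the key-store helper are hypothetical
# stand-ins for a real key management service.

from pathlib import Path
from cryptography.fernet import Fernet

KEY_STORE = Path("/secure/keystore/customer_data.key")   # separate server/volume in practice
DATA_STORE = Path("/data/customer_records.enc")          # where the encrypted data lives

def get_or_create_key() -> bytes:
    """Load the encryption key from the key store, generating it on first use."""
    if KEY_STORE.exists():
        return KEY_STORE.read_bytes()
    key = Fernet.generate_key()
    KEY_STORE.parent.mkdir(parents=True, exist_ok=True)
    KEY_STORE.write_bytes(key)
    return key

def write_record(plaintext: bytes) -> None:
    """Encrypt a record and persist only the ciphertext to the data store."""
    ciphertext = Fernet(get_or_create_key()).encrypt(plaintext)
    DATA_STORE.parent.mkdir(parents=True, exist_ok=True)
    DATA_STORE.write_bytes(ciphertext)

def read_record() -> bytes:
    """Fetch the ciphertext and decrypt it with the separately stored key."""
    return Fernet(get_or_create_key()).decrypt(DATA_STORE.read_bytes())

if __name__ == "__main__":
    write_record(b"name=Jane Doe, card=**** **** **** 4242")
    print(read_record())
```

In production you would fetch keys from a dedicated key server or KMS over an authenticated channel rather than from a local path, but the separation principle is the same: whoever grabs the encrypted data store alone walks away with nothing readable.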

More Stories By David Tishgart

David Tishgart is a Director of Product Marketing at Cloudera, focused on the company's cloud products, strategy, and partnerships. Prior to joining Cloudera, he ran business development and marketing at Gazzang, an enterprise security software company that was eventually acquired by Cloudera. He brings nearly two decades of experience in enterprise software, hardware, and services marketing to Cloudera. He holds a bachelor's degree in journalism from the University of Texas at Austin.
