Drilling the Data Wells of the Future

Human-generated Big Data could make us rich

Human-generated content comprises all the files and e-mails we create every day: the presentations, word-processing documents, spreadsheets, audio files and other documents our employers ask us to produce hour by hour. These are the files that take up the vast majority of digital storage space in most organizations; they are kept for long periods and carry enormous amounts of metadata. Human-generated content is huge, and its metadata is even bigger. Metadata is the information about a file: who created it, what type of file it is, what folder it is stored in, who has been reading it and who has access to it. Together, the content and metadata make up human-generated Big Data.
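The metadata fields listed above can be pictured as a simple record per file. A minimal sketch in Python; the field names and example values are illustrative assumptions, not any particular product's schema:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class FileMetadata:
    """Illustrative metadata record for one piece of human-generated content."""
    path: str            # what folder it is stored in
    file_type: str       # e.g., "spreadsheet", "presentation"
    created_by: str      # who created it
    created_at: datetime
    readers: List[str] = field(default_factory=list)  # who has been reading it
    acl: List[str] = field(default_factory=list)      # who has access to it

# Hypothetical example record
doc = FileMetadata(
    path="/finance/q3/forecast.xlsx",
    file_type="spreadsheet",
    created_by="alice",
    created_at=datetime(2012, 9, 1),
    readers=["alice", "bob"],
    acl=["alice", "bob", "carol"],
)
```

Even this toy record hints at the scale problem: every open, share, and permission change appends to the metadata trail, so the trail quickly outgrows the file itself.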

The problem is that most of us, organizations and governments alike, are not yet equipped with the tools to exploit human-generated Big Data. A recent survey of more than 1,000 Internet experts and other Internet users, published by the Pew Research Center and the Imagining the Internet Center at Elon University, concluded that the world may not be ready to properly handle and understand Big Data.[1] These experts believe that the huge quantities of data, which they term "digital exhaust," that will be created by the year 2020 could well enhance productivity, improve organizational transparency and expand the frontier of the "knowable future." However, they are concerned about whose hands this information ends up in and whether governments and corporations will use it wisely.

The survey found that "...human and machine analysis of Big Data could improve social, political and economic intelligence by 2020. The rise of what is known as Big Data will facilitate things like real-time forecasting of events; the development of 'inferential software' that assesses data patterns to project outcomes; and the creation of algorithms for advanced correlations that enable new understanding of the world."

Of the Internet experts surveyed, 39% agreed with the counterargument to Big Data's benefits, which posited that "Human and machine analysis of Big Data will cause more problems than it solves by 2020. The existence of huge data sets for analysis will engender false confidence in our predictive powers and will lead many to make significant and hurtful mistakes. Moreover, analysis of Big Data will be misused by powerful people and institutions with selfish agendas who manipulate findings to make the case for what they want."

As entrepreneur Bryan Trogdon, one of the study's participants, put it: "Big Data is the new oil," observing that "...the companies, governments, and organizations that are able to mine this resource will have an enormous advantage over those that don't. With speed, agility, and innovation determining the winners and losers, Big Data allows us to move from a mindset of 'measure twice, cut once' to one of 'place small bets fast.'"[2]

Jeff Jarvis, professor and blogger, said: "Media and regulators are demonizing Big Data and its supposed threat to privacy. Such moral panics have occurred often thanks to changes in technology. But the moral of the story remains: there is value to be found in this data, value in our newfound ability to share. Google's founders have urged government regulators not to require them to quickly delete searches because, in their patterns and anomalies, they have found the ability to track the outbreak of the flu before health officials could, and they believe that by similarly tracking a pandemic, millions of lives could be saved. Demonizing data, big or small, is demonizing knowledge, and that is never wise."[3]

Sean Mead, director of analytics at Mead, Mead & Clark, Interbrand, said: "Large, publicly available data sets, easier tools, wider distribution of analytics skills, and early stage artificial intelligence software will lead to a burst of economic activity and increased productivity comparable to that of the Internet and PC revolutions of the mid to late 1990s. Social movements will arise to free up access to large data repositories, to restrict the development and use of AIs, and to 'liberate' AIs."[4]

These are very interesting arguments and they do begin to get to the heart of the matter - which is that our data sets have grown beyond our ability to analyze and process them without sophisticated automation. We simply have to rely on technology to analyze and cope with this enormous wave of content and metadata.

Analyzing human-generated Big Data has enormous potential. More than potential, harnessing the power of metadata has become essential to manage and protect human-generated content. File shares, emails, and intranets have made it so easy for end users to save and share files that organizations now have more human-generated content than they can sustainably manage and protect using small data thinking. Many organizations face real problems because questions that could be answered 15 years ago on smaller, more static data sets can no longer be answered. These questions include: Where does critical data reside, who accesses it, and who should have access to it? As a consequence, IDC estimates that only half the data that should be protected is protected.
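One way automation can attack the "who should have access" question is to compare each file's access list against the users who actually open it: permissions that are never exercised are candidates for revocation. A hedged sketch in Python; the data shapes and example paths are assumptions for illustration, not a real product's API:

```python
from typing import Dict, Set

def stale_permissions(acl: Dict[str, Set[str]],
                      access_log: Dict[str, Set[str]]) -> Dict[str, Set[str]]:
    """For each file, return the users who hold access but have never used it.

    acl:        file path -> set of users permitted to read it
    access_log: file path -> set of users who actually opened it
    """
    return {
        path: permitted - access_log.get(path, set())
        for path, permitted in acl.items()
        if permitted - access_log.get(path, set())
    }

# Hypothetical example: carol can read the forecast but has never opened it.
unused = stale_permissions(
    acl={"/finance/q3/forecast.xlsx": {"alice", "bob", "carol"}},
    access_log={"/finance/q3/forecast.xlsx": {"alice", "bob"}},
)
# unused == {"/finance/q3/forecast.xlsx": {"carol"}}
```

At the scale of millions of files this kind of set difference is exactly the work that, as the article argues, can no longer be done by hand with small data thinking.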

The problem is compounded with cloud-based file sharing, as these services create yet another growing store of human-generated content requiring management and protection - one that lies outside corporate infrastructure with different controls and management processes.

David Weinberger of Harvard University's Berkman Center said: "We are just beginning to understand the range of problems Big Data can solve, even though it means acknowledging that we're less unpredictable, free, madcap creatures than we'd like to think." If harnessing the power of human-generated Big Data can make data protection and management less unpredictable, free, and madcap, organizations will be grateful.

References

  1. http://www.elon.edu/e-web/predictions/expertsurveys/2012survey/future_Big_Data_2020.xhtml
  2. http://pewinternet.org/Reports/2012/Future-of-Big-Data/Overview.aspx
  3. http://pewinternet.org/Reports/2012/Future-of-Big-Data/Overview.aspx
  4. http://pewinternet.org/Reports/2012/Future-of-Big-Data/Overview.aspx

More Stories By Rob Sobers

Rob Sobers is a designer, web developer, and technical marketing manager for Varonis, where he oversees the company's online marketing strategy. He writes a popular blog on software and security at accidentalhacker.com and is co-author of the book "Learn Ruby the Hard Way", which has been used by thousands of students to learn the Ruby programming language. Rob is a 12-year technology industry veteran and, prior to joining Varonis, held positions in software engineering, design, and professional services.
