By Rob Sobers
October 29, 2012 10:00 AM EDT
Human-generated content comprises all the files and e-mails that we create every day: the presentations, word processing documents, spreadsheets, audio files and other documents our employers ask us to produce hour by hour. These files take up the vast majority of digital storage space in most organizations; they are kept for long periods and carry huge amounts of metadata. Human-generated content is huge, and its metadata is even bigger. Metadata is the information about a file: who created it, what type of file it is, what folder it is stored in, who has been reading it and who has access to it. Together, the content and its metadata make up human-generated Big Data.
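To make this concrete, here is a minimal sketch, in Python using only the standard library, of the kind of metadata record a filesystem keeps for a single file. The path is hypothetical, and a filesystem alone answers only some of the questions above: access history and full permission information typically live in audit logs and ACLs rather than in the file itself.

```python
import os
import pwd  # POSIX-only; maps a numeric uid to a user name
from datetime import datetime

def file_metadata(path):
    """Collect the basic metadata a filesystem keeps about one file."""
    st = os.stat(path)
    return {
        "path": path,
        "folder": os.path.dirname(path),
        "file_type": os.path.splitext(path)[1] or "(none)",
        "owner": pwd.getpwuid(st.st_uid).pw_name,
        "size_bytes": st.st_size,
        "last_read": datetime.fromtimestamp(st.st_atime),
        "last_modified": datetime.fromtimestamp(st.st_mtime),
    }

# Hypothetical path, for illustration only.
print(file_metadata("/shares/finance/q3_forecast.xlsx"))
```

Even this simple record hints at the scale problem: multiply it by the hundreds of millions of files in a large organization, add access events accumulating over time, and the metadata quickly outgrows the content it describes.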
The problem is that most of us, meaning organizations and governments, are not yet equipped with the tools to exploit human-generated Big Data. A recent survey of over 1,000 Internet experts and other Internet users, published by the Pew Research Center and the Imagining the Internet Center at Elon University, concluded that the world may not be ready to properly handle and understand Big Data. These experts believe that the huge quantities of data, which they term "digital exhaust," that will be created by the year 2020 could well enhance productivity, improve organizational transparency and expand the frontier of the "knowable future." However, they are concerned about whose hands this information ends up in, and whether governments and corporations will use it wisely.
The survey found that "...human and machine analysis of Big Data could improve social, political and economic intelligence by 2020. The rise of what is known as Big Data will facilitate things like real-time forecasting of events; the development of ‘inferential software’ that assesses data patterns to project outcomes; and the creation of algorithms for advanced correlations that enable new understanding of the world."
Thirty-nine percent of the experts surveyed agreed instead with the counterargument to Big Data's benefits, which posited that "Human and machine analysis of Big Data will cause more problems than it solves by 2020. The existence of huge data sets for analysis will engender false confidence in our predictive powers and will lead many to make significant and hurtful mistakes. Moreover, analysis of Big Data will be misused by powerful people and institutions with selfish agendas who manipulate findings to make the case for what they want."
As one of the study's participants, entrepreneur Bryan Trogdon, put it: "Big Data is the new oil," observing that "...the companies, governments, and organizations that are able to mine this resource will have an enormous advantage over those that don't. With speed, agility, and innovation determining the winners and losers, Big Data allows us to move from a mindset of ‘measure twice, cut once’ to one of ‘place small bets fast.’"
Jeff Jarvis, professor and blogger, said: "Media and regulators are demonizing Big Data and its supposed threat to privacy. Such moral panics have often occurred thanks to changes in technology. But the moral of the story remains: there is value to be found in this data, value in our newfound ability to share. Google's founders have urged government regulators not to require them to quickly delete searches because, in their patterns and anomalies, they have found the ability to track the outbreak of the flu before health officials could, and they believe that by similarly tracking a pandemic, millions of lives could be saved. Demonizing data, big or small, is demonizing knowledge, and that is never wise."
Sean Mead, director of analytics at Mead, Mead & Clark, Interbrand said: "Large, publicly available data sets, easier tools, wider distribution of analytics skills, and early stage artificial intelligence software will lead to a burst of economic activity and increased productivity comparable to that of the Internet and PC revolutions of the mid to late 1990s. Social movements will arise to free up access to large data repositories, to restrict the development and use of AIs, and to ‘liberate’ AIs."
These arguments are compelling, and they get to the heart of the matter: our data sets have grown beyond our ability to analyze and process them without sophisticated automation. We simply have to rely on technology to analyze and cope with this enormous wave of content and metadata.
Analyzing human-generated Big Data has enormous potential. More than potential: harnessing the power of metadata has become essential to managing and protecting human-generated content. File shares, e-mails and intranets have made it so easy for end users to save and share files that organizations now have more human-generated content than they can sustainably manage and protect using "small data" thinking. Many organizations face real problems because questions that could be answered 15 years ago, on smaller and more static data sets, can no longer be answered: Where does critical data reside? Who accesses it? Who should have access to it? As a consequence, IDC estimates that only half the data that should be protected actually is.
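As a rough illustration of what automated metadata analysis looks like at the file-share level, the sketch below (again Python, standard library only, against a hypothetical share path) walks a directory tree and tallies ownership, file types and total volume. It addresses only the first question, where data resides; answering who accesses it and who should requires ACLs and audit trails on top of a scan like this.

```python
import os
from collections import Counter

def survey_share(root):
    """Walk a file share and tally file ownership, types and volume,
    a first step toward answering 'where does critical data reside?'"""
    owners, file_types = Counter(), Counter()
    total_bytes = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            owners[st.st_uid] += 1          # numeric owner id (POSIX)
            file_types[os.path.splitext(name)[1].lower()] += 1
            total_bytes += st.st_size
    return owners, file_types, total_bytes

# Hypothetical share path, for illustration only.
owners, file_types, total_bytes = survey_share("/shares/corporate")
print(f"{total_bytes / 1e9:.1f} GB in {sum(owners.values())} files")
print("Most common file types:", file_types.most_common(5))
```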
The problem is compounded by cloud-based file sharing, as these services create yet another growing store of human-generated content requiring management and protection - one that lies outside corporate infrastructure, with different controls and management processes.
David Weinberger of Harvard University's Berkman Center said: "We are just beginning to understand the range of problems Big Data can solve, even though it means acknowledging that we're less unpredictable, free, madcap creatures than we'd like to think." If harnessing the power of human-generated Big Data can make data protection and management less unpredictable, free, and madcap, organizations will be grateful.