Big Data = Dropping the Big One?

Big Data is certainly both marvelous and terrifying

Editor’s note: This article was first published on Analyst One, a site focused on analysts and topics of interest to the analytical community. -bg

Eminent network scientist Laszlo Barabasi recently penned an op-ed calling on fellow scientists to spearhead the ethical use of big data. Comparing big data to the atom bomb, Barabasi persuasively argued that the technology and methodologies he and other social network theorists created have far outstripped societal controls on their use.

Barabasi’s op-ed is part of a growing backlash against big data technologies and methodologies. While Barabasi and historian of science George Dyson have the historical perspective, technical insight, and scientific stature to write insightfully about the problems of pervasive data collection and algorithms that structure human decisions, other criticisms have been less than edifying. A frustrated Andrew McAfee, blogging at Harvard Business Review, recently called on pundits to “stop sounding ignorant about big data.” Big data, McAfee points out, is held to unrealistic standards and is often the victim of strawmanning. Critics expect big data to eliminate uncertainty (spoiler: it doesn’t), overestimate the power of qualitative thinking, make broad criticisms of quantification itself, and overstate the willingness of big data advocates to automate important decisions. Listening to some critics talk, you’d think that Palantir or Recorded Future = Skynet.

While insightful in many respects, Barabasi’s op-ed also fails to fully investigate the real implications of his Hadoop ~ ICBM analogy. Many scientists sought to influence the use of nuclear weapons, understandably believing themselves the best informed about the dangers those weapons posed. However, even the most effective of their well-meaning efforts were superseded by Cold War politics. It is within the American political system — teetering between fear of terrorism, fear of big government, love of capitalism, and fear of capitalism — that big data’s societal impact will be decided. And if the rising tide of anti-science sentiment is any indication, politicians couldn’t care less about science or the men and women who practice it. Scientists are no longer viewed as unimpeachable figures of authority — and it’s doubtful they ever really were in predictably populist America.

Second, if big data is a weapon of mass destruction, you aren’t going to see Hans Blix suddenly busting down the doors of startups for snap inspections of Apache software or NoSQL databases. The only thing inherently more “dual use” than offensive cyber tools is big data technologies and methodologies. They are quickly becoming an integral part of modern business, academic research, and intelligence practice. Barabasi and others are correct that, in a world in which the individual is more vulnerable than ever to government and corporate use of data science, we arguably should try to mitigate current and potential harm. The problem with analogizing data to nukes (besides the fact that Google never destroyed a Japanese city) is that nuclear weapons are clumsy weapons of last resort that even bitter enemies had a stake in controlling, while data technologies are ubiquitous aspects of modern life.

While Barabasi and others may have pioneered the techniques industry and government now demand, big data has long since ceased to be a purely academic endeavor. The men and women who use those techniques mostly aren’t scientists; big data is heavily driven by corporate and government needs. Even the most talented PhDs often leave the academy to pursue higher salaries and greater freedom in the corporate world. Perhaps the best big data analogy is not to the atomic science of Einstein or Oppenheimer, but to the mathematics of Newton, Leibniz, and Fourier. Were they alive today, even these eminent scientists would be powerless to prevent their mathematics from being used for military operations research on how to kill more efficiently, or from being fed into faulty, investor-bankrupting financial models. A Taylor series or a differential equation — once out in the wild — belongs to anyone with a pen, paper, and calculator. Likewise, with open-source tools like the Python machine-learning library scikit-learn, anyone with the requisite technical training can apply canonical data science techniques.
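To make that accessibility concrete, here is a minimal sketch; the dataset, model, and parameters are arbitrary illustrative choices, not anything Barabasi or McAfee prescribes. With scikit-learn, fitting a canonical classifier to a benchmark dataset takes roughly a dozen lines:

```python
# A minimal, illustrative sketch of how low the barrier has become.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a benchmark dataset bundled with scikit-learn
# (569 samples, 30 numeric features, binary labels).
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a textbook technique, logistic regression; like a Taylor series,
# it now belongs to anyone with the tooling to run it.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```

The barrier to entry is tooling and training, not the mathematics, which is precisely the governance problem the nuclear analogy misses.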

Big data is certainly both marvelous and terrifying. It offers the opportunity to make money, make new scientific discoveries, and enhance political endeavors from development to national security. It also puts the individual at the mercy of companies and governments. But at the end of the day it is “neither an atomic bomb nor a holy grail.” It should neither be held to unrealistic standards nor feared as a weapon of mass destruction. And everyone who cares about the ethics of data — from the scientist to the layperson — must understand that control over its use is a function of the messy and dysfunctional domestic political scene and the anarchic international system.


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
