Address Your Unique Big Data Challenge

Never Mind Headlines About "60% of This" or "85% of That"

As I've recently written, people are throwing around the term Big Data as if it were duct tape: you can use it here, you can use it there, you can use it anywhere.

This promiscuous use of the term has led to some amusing "findings" in today's headlines, e.g., "60% of companies expect to invest in Big Data dramatically within three years," while at the same time "85% of Fortune 500 companies unable to leverage Big Data for competitive advantage."

The latter quote is from Gartner, so it is easily discounted. Besides, there should be an automatic penalty of having to watch The View on a continuous loop for a year whenever one utters the phrase "leverage for competitive advantage." The first of the two quotes is one of those wildly general, unfocused nonsense predictions that mean nothing, and the use of the word "expect" makes it even less convincing.

So how do we restore some order here? Perhaps a little checklist can help.

  • Are you considering Big Data for a project, an initiative, or a makeover?

  • Are you with a Fortune 500 or similarly large company?

  • Is your data new and unstructured? How much do you know about Hadoop? (A minimal sketch follows this list.)

  • Is your data traditional and structured? How much of it do you have?

  • Are you allowed to go outside the company for public-cloud instances? (See The New York Times and its creation of 11 million PDFs during one large public-cloud session.)

  • How many of your current vendors are touting cloud computing? How many are touting Big Data? How many of them did so at your prompting, rather than theirs?

  • What's your timeframe?

  • What's your budget for using new resources, even if they're rented cloud resources?

  • Why are you considering Big Data in the first place? Have you seen an answer to a longstanding problem, or are you just reading too many articles about it?
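
If the Hadoop question above draws a blank, the canonical first exercise is a word count run as a streaming job. Below is a minimal sketch in Python for Hadoop Streaming; it comes from no particular vendor's tutorial, and the file names mapper.py and reducer.py are my own. The point is the shape of the map-and-reduce pattern, not a production job.

    # mapper.py -- emit "word<TAB>1" for every word read from stdin
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t%d" % (word, 1))

    # reducer.py -- sum the counts for each word; Hadoop Streaming
    # sorts mapper output by key, so identical words arrive on
    # consecutive lines and a single pass suffices
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current_word:
            if current_word is not None:
                print("%s\t%d" % (current_word, count))
            current_word, count = word, 0
        count += int(n)
    if current_word is not None:
        print("%s\t%d" % (current_word, count))

You can dry-run the pair without a cluster at all (cat somefile.txt | python mapper.py | sort | python reducer.py); Hadoop Streaming then runs the same two scripts, distributed, via the hadoop-streaming jar. If that mental model is comfortable, you know enough Hadoop to answer the checklist question honestly.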

I think it's fair to say Big Data can run anywhere from a gigabyte on up, depending on the size of your organization, the size of your problem, and your previous ability to solve that problem. This statement may sound horrifying, as Big Data has until recently been the province of people who work with hundreds of terabytes and dozens of petabytes.

Facebook claims 100 petabytes (i.e., 100 million gigabytes) in its Hadoop-driven repository; the mind boggles at the uselessness of that information for anything other than hawking targeted potpourri. You will likely have much less data, but perhaps at least as serious a use for it.
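
For anyone checking that parenthetical conversion, here's a trivial sketch using decimal (SI) prefixes, which is how such marketing figures are usually quoted; the binary-prefix variant is included only to show the discrepancy is modest.

    # decimal (SI) prefixes: 1 PB = 10**15 bytes, 1 GB = 10**9 bytes
    PB, GB = 10**15, 10**9
    print(100 * PB // GB)        # 100000000, i.e. 100 million GB

    # binary prefixes: 1 PiB = 2**50 bytes, 1 GiB = 2**30 bytes
    PiB, GiB = 2**50, 2**30
    print(100 * PiB // GiB)      # 104857600, i.e. ~104.9 million GiB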

In any case, I never trust predictions of "60% of this" or "85% of that" or whatnot. We are all as unique as the challenges we face. We can perhaps validate our concerns by reading about other people's problems and solutions, but we can only learn how best to work with this stuff by doing it ourselves, whether we're part of the 60 percent, the 85 percent, the 99 percent, or the 1 percent.

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo and @ThingsExpo, and Editor of SYS-CON Media's Cloud Computing, Big Data, and IoT journals. He holds a BA from Knox College and conducted MBA studies at CSU-East Bay.
