The Big Data Bottleneck: Uploading to the Cloud

If only we could get those gigando-bytes into the Cloud in the first place. And there’s the rub.

The problem with Big Data is that, well, Big Data are big. Really big. We’re talking terabytes. Petabytes. Zettabytes. Whatever’s-even-bigger-bytes. And of course, we want to solve all our Big Data challenges in the Cloud. If only we could get those gigando-bytes into the Cloud in the first place. And there’s the rub.

Uploading Big Data from our internal network to the Cloud via an Internet connection is as practical as filling a swimming pool through a drinking straw. It doesn’t matter how sophisticated our Big Data analytics, how super-duper our Hadoopers. If we can’t efficiently get our data where we need them when we need them, we’re stuck.

Optimize the Pipe
Fortunately, the Big Data upload problem isn’t new. It’s been around for years under the moniker Wide Area Network (WAN) Optimization, which is fortunate for us, because vendors have spent those years honing their WAN Optimization techniques, and several of them are now repurposing those techniques for the Cloud.

For example, Aryaka has been peddling WAN Optimization appliances for several years. Put one appliance in your local data center, a second in the remote data center, and proprietary technology moves data from one to the other at a rapid clip. Now that the Cloud has turned their world upside down, they are providing a distributed service at the remote end, a “mesh of network connections” better suited to the Cloud. In other words, Aryaka is building an offering similar to Content Delivery Networks (CDNs) like Akamai.

RainStor, in contrast, focuses primarily on a proprietary compression algorithm that promises to squeeze data to one-fortieth of their original size. Furthermore, RainStor’s compressed data remain directly accessible using standard SQL or even MapReduce on Hadoop, with no storage-eating, time-consuming reinflation.
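RainStor’s algorithm is proprietary, so the following is only a toy sketch of the general idea: dictionary-encode a column so that a filter runs against the compact encoded form rather than a reinflated copy. All names and values here are illustrative.

```python
def encode_column(values):
    """Toy dictionary encoding: store each distinct value once and turn the
    column into a list of small integer codes."""
    codes, dictionary = [], {}
    for v in values:
        if v not in dictionary:
            dictionary[v] = len(dictionary)
        codes.append(dictionary[v])
    return codes, dictionary

def filter_equals(codes, dictionary, value):
    """Evaluate a predicate like "WHERE store = 'Boston'" directly against the
    encoded column -- the original strings are never reinflated."""
    code = dictionary.get(value)
    return [i for i, c in enumerate(codes) if c == code]

stores = ["Boston", "Austin", "Boston", "Denver", "Boston"]
codes, dictionary = encode_column(stores)
print(filter_equals(codes, dictionary, "Boston"))   # [0, 2, 4]
```

Layering deduplication and heavier compression on top of tricks like this is how vendors can plausibly claim large ratios without giving up direct query access.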

Then there’s Aspera, which has found a sophisticated way around the limitations of the Transmission Control Protocol (TCP) itself. After all, TCP’s congestion control, with its penchant for throttling back and resending whenever packets go missing, is a large part of the reason uploading Big Data over the Internet runs like such a dog in the first place. To teach this dog a new trick or two, Aspera’s FASP transfer technology uses one TCP port for session initialization and control, and one User Datagram Protocol (UDP) port for data transfer.

UDP is a simpler, fire-and-forget protocol that doesn’t perform the retries that provide TCP’s reliability, but by combining the two protocols, FASP achieves nearly 100% error-free data throughput. In fact, FASP reaches the maximum transfer speed possible given the hardware on which you deploy it, and maintains maximum available throughput independent of network delay and packet loss. FASP also aggregates hundreds of concurrent transfers on commodity hardware, addressing the drinking straw problem in part by supporting hundreds of straws at once.
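FASP itself is proprietary, but the control-channel/data-channel split is easy to picture. The sender sketch below is purely illustrative (the hosts, ports, and wire format are assumptions, and it presumes a cooperating receiver): it negotiates over TCP, blasts numbered blocks over UDP, then resends whatever the receiver reports missing. Real FASP adds the rate control and congestion handling this toy omits.

```python
import socket
import struct

def send_file(path, host, ctrl_port=33001, data_port=33002, block_size=1400):
    """Toy FASP-style transfer: TCP for control, UDP for the data blocks.
    block_size stays under a typical MTU to avoid IP fragmentation."""
    blocks = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            blocks.append(chunk)

    ctrl = socket.create_connection((host, ctrl_port))        # control channel (TCP)
    data = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)   # data channel (UDP)

    # Announce the session and block count over the reliable channel.
    ctrl.sendall(f"BEGIN {len(blocks)}\n".encode())

    # Fire the blocks over UDP, each prefixed with its sequence number so the
    # receiver can detect gaps.
    for seq, chunk in enumerate(blocks):
        data.sendto(struct.pack("!I", seq) + chunk, (host, data_port))

    # The receiver reports missing sequence numbers back over TCP; resend those.
    ctrl.sendall(b"DONE\n")
    missing = ctrl.recv(65536).decode().split()
    for seq in (int(s) for s in missing):
        data.sendto(struct.pack("!I", seq) + blocks[seq], (host, data_port))

    ctrl.close()
    data.close()
```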

CloudOpt is also a player worth mentioning. Its JetStream technology takes a soup-to-nuts approach that combines compression and transmission-protocol optimization with advanced data deduplication, SSL acceleration, and an ingenious way of getting the most performance out of cached data. Then there’s Attunity CloudBeam, which touts file-to-Cloud upload, file-to-Cloud replication, and Cloud-to-Cloud replication. Attunity’s Managed File Transfer (MFT) incorporates a secure DMZ architecture, security policy enforcement, guaranteed and accelerated transfers, process automation, and audit capabilities across each stage of the file transfer process.
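CloudOpt doesn’t publish JetStream’s internals, but data deduplication itself is straightforward to illustrate: hash each chunk of the stream and only ship chunks the far end hasn’t already seen. A rough sketch with fixed-size chunks follows (production systems typically use content-defined chunking and a persistent remote cache; the helper names are invented):

```python
import hashlib

def dedup_chunks(stream, already_sent, chunk_size=4096):
    """Yield (digest, payload) pairs for a file-like stream; payload is None for
    chunks the remote side already holds, so only new bytes cross the wire."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in already_sent:
            yield digest, None           # remote reassembles this chunk from its cache
        else:
            already_sent.add(digest)
            yield digest, chunk          # ship the new chunk exactly once

# Usage sketch: nightly dumps that barely change send almost nothing the second night.
# with open("nightly-dump.sql", "rb") as f:
#     for digest, payload in dedup_chunks(f, seen_digests):
#         transmit(digest, payload)      # transmit() is left to the imagination
```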

Finally, there’s Amazon Web Services (AWS) itself. Yes, most if not all of the vendors discussed above can firehose data into AWS’s various storage services. But AWS also offers a simple, if decidedly low-tech, alternative: AWS Import/Export. Simply ship your big hard drives to Amazon. They’ll hook them up, copy the data to your Simple Storage Service (S3) buckets or another storage service, and ship the drives back when you’re done. This SneakerNet or “Forklifting” approach, believe it or not, can even be faster than some of the over-the-Internet optimizations for certain Big Data sets, even counting the time it takes to FedEx your drives to AWS.
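Whether the forklift beats the wire is simple arithmetic. A back-of-the-envelope calculation (the link speed, the 70% effective-throughput figure, and the courier turnaround below are all assumptions) shows why:

```python
def upload_days(terabytes, megabits_per_second, efficiency=0.7):
    """Days to push a data set over a link, assuming ~70% effective throughput."""
    bits = terabytes * 8e12                      # decimal terabytes to bits
    seconds = bits / (megabits_per_second * 1e6 * efficiency)
    return seconds / 86400

# 50 TB over a dedicated 100 Mbps link: roughly two months of continuous uploading.
print(round(upload_days(50, 100), 1))            # ~66.1 days
# The same 50 TB on a stack of drives, shipped overnight and copied over a day or
# two at AWS, lands in S3 in well under a week.
```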

On Beyond Drinking Straws
The problem with most of the approaches above (excepting only Aspera and Amazon’s forklift) is that they make the drinking straw we’re using to fill that swimming pool better, faster, and bigger – but we’re still filling that damn pool with a straw. So what’s better than a straw? How about many straws? If any optimization technique improves a single connection to the Internet, then it stands to reason that establishing many connections to your Cloud provider in parallel would multiply your upload speed dramatically.
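In practice the many-straws trick is often just a configuration knob. With AWS’s boto3 SDK, for instance, S3 multipart upload splits a large object into parts and pushes them over parallel connections; the bucket, file name, and tuning numbers below are placeholders.

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Anything over 100 MB gets split into 100 MB parts and pushed over 20 parallel
# connections -- twenty straws into the pool instead of one.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,
    multipart_chunksize=100 * 1024 * 1024,
    max_concurrency=20,
)

s3.upload_file("clickstream-2013-01.tar", "my-big-data-bucket",
               "raw/clickstream-2013-01.tar", Config=config)
```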

Fair enough, but let’s think out of the box here. A fundamental Big Data best practice is to bring your analytics to your data. The reasoning is that it’s hard to move your data but easy to move your software, so once your data are in the Cloud, you should also run your analytics there.

But this argument also works in reverse. If your data aren’t in the Cloud, then it may not make sense to move them to the Cloud simply to run your software there. Instead, bring your software to your data, even if they’re on premise.

Perish the thought, you say! We’re sold on Big Data in the Cloud. We’ve crunched the numbers and we know it’s going to save us money, provide more capabilities, and facilitate sharing information across our organization and the world. Fair enough. Here’s another twist for you.

Why are your Big Data sets outside the Cloud to begin with? Sure, you’re stuck with existing, legacy data sets wherever they happen to be today. But as a rule, those don’t constitute Big Data, or will soon cease to be large enough to warrant the Big Data label. By definition, Big Data sets keep expanding exponentially, which means you keep creating them with each new generation of newfangled tools.

In fact, there are already multitudinous sources for raw Big Data, as varied as the Big Data challenges organizations struggle with today. But many such sources are already in the Cloud, or could easily be moved there. Take, for example, clickthrough data from your Web sites. Such data come from your Web servers, which should be in the Cloud anyway. If your Big Data come from Web servers scattered here and there in the Cloud, then moving the clickthrough data to a Big Data repository for processing can be handled in the same Cloud. No need for uploading.
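When the Web tier and the analytics repository sit in the same cloud, the “move” can even be a server-side copy that never touches your own network. A minimal S3-to-S3 example (bucket names and keys are invented for illustration):

```python
import boto3

s3 = boto3.client("s3")

# Copy a day's worth of clickthrough logs from the Web tier's bucket into the
# analytics landing bucket. The copy happens inside AWS; no bytes cross your straw.
s3.copy_object(
    Bucket="analytics-landing-zone",
    Key="clickstream/2013/01/15/access.log.gz",
    CopySource={"Bucket": "web-tier-logs", "Key": "2013/01/15/access.log.gz"},
)
```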

What about data sources that aren’t already in the Cloud? Many Big Data streams come from instrumentation or sensors of some sort, from seismographs underground to EKGs in hospitals to UPC scanners in supermarkets. There’s no reason why such instrumentation shouldn’t pour its raw data feeds directly into the Cloud. What good is storing a week’s worth of supermarket purchasing data on premise anyway? You’ll want to store, process, manage, and analyze those data in the Cloud, so the sooner you get them there, the better.
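Pointing instrumentation straight at a cloud ingestion endpoint takes very little code. Here’s a hedged sketch using Amazon Kinesis as the example service; the stream name, record format, and scanner scenario are all assumptions, and any cloud’s streaming-ingest API would look much the same.

```python
import json
import time

import boto3

kinesis = boto3.client("kinesis")

def publish_scan(upc, store_id, price):
    """Send one supermarket scan straight to the Cloud; nothing is stored on premise."""
    record = {"upc": upc, "store": store_id, "price": price, "ts": time.time()}
    kinesis.put_record(
        StreamName="pos-scans",                  # hypothetical stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=store_id,                   # keeps each store's scans ordered
    )

publish_scan("012345678905", "store-042", 3.49)
```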

The ZapThink Take
The only reason we have to worry about uploading Big Data to the Cloud in the first place is because our Big Data aren’t already in the Cloud. And broadly speaking, the reason they’re not already in the Cloud is because the Cloud isn’t everywhere. Instead, we think of the Cloud as being locked away in data centers, those alien, air conditioned facilities packed full of racks of high tech equipment.

That may be true today, but as ZapThink has discussed before, there’s nothing in the definition of Cloud Computing that requires Cloud resources to live in data centers. You might have a bit of the Cloud in your pocket, or on your laptop, in your car, or in your refrigerator. For now, this vision of the Internet of Things meeting the Cloud is mostly the stuff of science fiction. We’re only now figuring out what it means to have a ubiquitous global network of sensors, from the aforementioned EKGs and UPC scanners to traffic cameras to home thermostats. But the writing is on the wall. Just as we now don’t think twice about carrying supercomputers in our pockets, it’s only a matter of time until the Cloud itself is fully distributed and ubiquitous. When that happens, the question of moving Big Data to the Cloud will be moot. They will already be there.

Are you one of the vendors mentioned in this article and have a correction, or a vendor who should have been mentioned but wasn’t? Please feel free to comment here.

Image Source: US Navy

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
