In-Memory Data Grids and Cloud Computing

The promise of the cloud is a reduction in total cost of ownership

The use of in-memory data grids (IMDGs) for scaling application performance has rapidly increased in recent years as firms have seen their application workloads explode. This trend runs across nearly every vertical market, touching online applications for financial services, e-commerce, travel, manufacturing, social media, mobile, and more. At the same time, many firms are also looking to leverage cloud computing to meet the challenge of ever-increasing workloads. One of the fundamental promises of the cloud is elastic, transparent, on-demand scalability -- a key capability that has become practical with the use of in-memory data grid technology. As such, IMDGs are becoming a vital factor in the cloud, just as they have been for on-premise applications.

What makes IMDGs such a good fit with cloud computing? The promise of the cloud is a reduction in total cost of ownership. Part of that reduction comes from the ability to quickly provision and use new server capacity (without having to own the hardware). The essential synergy between IMDGs and the cloud derives from their common elasticity. IMDGs can scale out their memory-based storage and performance linearly as servers are added to the grid, and they can gracefully scale back when fewer servers are needed. IMDGs take full advantage of the cloud's ability to easily spin up or remove servers. IMDGs enable cloud-hosted applications to be quickly and easily deployed on an elastic pool of cloud servers to deliver scalable performance, maintaining fast data access even as workloads increase. This is an ideal solution for fast-growing companies and for applications whose workloads create widely varying demands (like online flowers for Mother's Day, concert tickets, etc.). These companies no longer need to create space, power, and cooling for new hardware to meet these fluctuating workloads. Instead, with a few button clicks, they can start up an IMDG-enabled cloud architecture, which transparently meets their performance demands at a cost that is based solely on usage.
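To make the elasticity concrete, here is a minimal sketch of the idea behind an IMDG's scale-out storage: keys are hashed across whatever servers currently belong to the grid, so adding nodes adds memory capacity. The `MiniGrid` class and its method names are hypothetical illustrations, not any vendor's API; production grids use consistent hashing and rebalance data automatically when membership changes.

```python
import hashlib

# Hypothetical sketch of IMDG scale-out storage: each key is hashed to an
# owning server, so total in-memory capacity grows as servers join the grid.
class MiniGrid:
    def __init__(self, servers):
        self.servers = list(servers)             # current grid membership
        self.stores = {s: {} for s in servers}   # in-memory store per server

    def _owner(self, key):
        # Hash the key to pick the owning server. Real IMDGs use consistent
        # hashing and migrate data when servers are added or removed.
        digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
        return self.servers[digest % len(self.servers)]

    def put(self, key, value):
        self.stores[self._owner(key)][key] = value

    def get(self, key):
        return self.stores[self._owner(key)].get(key)

grid = MiniGrid(["node-1", "node-2", "node-3"])
grid.put("cart:42", {"items": ["roses"]})
print(grid.get("cart:42"))  # {'items': ['roses']}
```

Because each key has exactly one owner, reads and writes stay fast regardless of how many servers are in the pool, which is what lets the grid scale linearly as cloud servers are spun up.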

Expanding on the promise of the cloud, some in-memory data grids can span both on-premise and cloud environments to provide seamless "cloud bursting" for handling high workloads. Let's say your e-commerce application stores shopping carts in an IMDG to give customers fast response times. To spur sales, your marketing group plans to run a special online sales event. Because projected traffic is expected to double during this event, additional web servers will be needed to handle the workload. Of course, maintaining fast response times as the workload increases is essential to success. By deploying your web app in the cloud and connecting it to your on-premise server farm with an IMDG, you can seamlessly double your traffic-handling capacity without interrupting current shopping activity on your site. You don't even need to make changes to your application. The combined deployments transparently work together to serve web traffic, and data freely flows between them within the IMDGs at both sites.
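The cloud-bursting scenario above can be sketched as two linked sites sharing one logical store. This is an illustrative toy, not a real replication protocol: the `Site` class and its `link`/`put`/`get` methods are assumptions for the example, standing in for the IMDG's built-in cross-site data flow.

```python
# Hypothetical sketch of cloud bursting: the on-premise farm and the cloud
# deployment mirror writes to each other, so web servers added in the cloud
# during a sales event see the same shopping carts as the on-premise site.
class Site:
    def __init__(self, name):
        self.name = name
        self.store = {}
        self.peer = None

    def link(self, peer):
        # Connect two sites into one logical grid (both directions).
        self.peer, peer.peer = peer, self

    def put(self, key, value, _replicating=False):
        self.store[key] = value
        if self.peer and not _replicating:
            self.peer.put(key, value, _replicating=True)  # mirror to peer site

    def get(self, key):
        return self.store.get(key)

on_prem = Site("on-premise")
cloud = Site("cloud")
on_prem.link(cloud)

on_prem.put("cart:1001", ["tulips"])  # shopper starts on the on-premise farm
print(cloud.get("cart:1001"))         # cloud web servers see the same cart
```

In a real IMDG the replication is asynchronous and conflict-aware, but the application-level effect is the same: neither the shopper nor the application code needs to know which site is serving the request.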

These synergies form a solid basis for making 2014 a watershed year for IMDGs in the cloud. But, there's another big trend that will further drive adoption. As the discussion around "Big Data" analysis heats up, the emerging combination of Big Data and cloud computing - cloud-based analytics - promises to fundamentally change the technology of data mining, machine learning and many other analytics use cases. In 2014, we expect to see the trend toward in-memory, predictive analytics sharply increase, and cloud computing will be a fundamental enabler of that trend.

IMDGs integrate memory-based data storage and computing to make real-time data analysis easily accessible to users and help extend a company's competitive edge. IMDGs automatically take full advantage of the cloud's elasticity to run analytics in parallel across cloud servers with lightning fast performance. Now it's possible to host a real-time analytics engine in the cloud and provide on-demand analytics to a wide range of users, from SaaS services for mobile devices to business simulations for corporate users. Or, maybe you want to spin up servers with, say, a terabyte of memory, load the grid, run analytics across that data, and then release the resources. In an extreme example, chemistry researchers recently used Amazon Web Services to achieve a "petaflop" of computing power running an analysis of 205,000 molecules for just one week. The elasticity of the cloud again makes the difference by providing the equivalent of a parallel processing supercomputer at your fingertips without the huge capital investment (at a total cost of $33,000).
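The parallel analytics pattern described above follows a simple shape: run the analysis method on each server's local data partition simultaneously, then merge the partial results. A minimal sketch, with a thread pool standing in for the grid's servers and hypothetical partition data (the function names are illustrative, not any vendor's API):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch of data-parallel grid analytics: each "server" analyzes
# its local partition concurrently, and the partial results are merged.
def analyze_partition(orders):
    # Per-partition work: total revenue (in cents) for one server's orders.
    return sum(cents for _, cents in orders)

def grid_analyze(partitions):
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(analyze_partition, partitions))
    return sum(partials)  # merge step combines per-server results

partitions = [
    [("cart:1", 1999), ("cart:2", 500)],  # data held on server 1
    [("cart:3", 4250)],                   # data held on server 2
]
print(grid_analyze(partitions))  # 6749
```

Because the data already lives in memory on each grid server, shipping the computation to the data like this avoids moving large datasets over the network, which is what makes real-time response feasible.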

To sum up, in 2014 we expect firms to adopt cloud computing and cloud-hosted IMDGs at a rapid rate, and the trends of in-memory computing and data analytics will converge to enable fast adoption of in-memory data grid technology in public, private, and hybrid cloud environments. Enterprises that take advantage of this convergence are expected to enjoy a quantum leap in the value of their data without the need to break their IT budgets.

More Stories By William Bain

Dr. William L. Bain is founder and CEO of ScaleOut Software, Inc. Bill has a Ph.D. in electrical engineering/parallel computing from Rice University, and he has worked at Bell Labs research, Intel, and Microsoft. Bill founded and ran three start-up companies prior to joining Microsoft. In the most recent company (Valence Research), he developed a distributed Web load-balancing software solution that was acquired by Microsoft and is now called Network Load Balancing within the Windows Server operating system. Dr. Bain holds several patents in computer architecture and distributed computing. As a member of the Seattle-based Alliance of Angels, Dr. Bain is actively involved in entrepreneurship and the angel community.
