Predictive Analytics, Cloud Computing, and Healthcare

The fact of the matter is that most healthcare providers are under-funded

Editor’s note: Gathering Clouds is pleased to welcome noted thought leader and Cloud Player David Linthicum as a regular contributor. David is a renowned expert in cloud computing, SOA, Health IT, SaaS, Big Data, and many other IT topics. Check back every week for more from David!

____

By David Linthicum - Big Data, with predictive analytics systems layered on top, has tremendous potential in the healthcare market. Indeed, when paired with cloud-based platforms, it gives healthcare organizations the chance to become more cost-effective and much better at delivering services.

[Image: Big data analytics can perform miracles for the cloud-enabled healthcare organization.]

The fact of the matter is that most healthcare providers are under-funded, which leaves them under-automated and under-innovative. Moreover, there seems to be a growing chasm between those who deliver healthcare to patients and those who drive IT within healthcare provider organizations.

The statistics back this up. According to Gartner, anticipated growth opportunities put some industries at the top when it comes to global IT spending. Healthcare providers, however, were not among the leaders, coming in at a projected $15,311 million; even utilities beat them out at a projected $18,756 million. Think about the number of changes under way in the world of healthcare providers. These numbers are surprising at best, and very scary at worst.

The solution to this problem of “too much to do and not enough resources to do it” is to leverage the right new technologies, apply careful planning, and move from a reactive to a proactive state in the world of healthcare IT.

The objective is to manage patient data holistically and in new, innovative ways. The rise of big data as a set of new technologies provides new options for both the storage and analysis of information, which leads to better patient care and cost reductions. The use of cloud computing provides the elastic capacity required at costs that almost all healthcare provider organizations can afford. Combined, the two are clearly a game changer.

The data points around the use of big data for predictive analytics are beginning to show up. In a recent study, Indiana University researchers found that a pair of predictive modeling techniques can make significantly better decisions about patients’ treatments than doctors acting alone. Indeed, they claim a better than 50 percent reduction in costs and more than 40 percent better patient outcomes. (See the story by Derrick Harris over at GigaOM.)

The use case for big data, cloud computing, and predictive analytical models is compelling.  The researchers leveraged clinical and demographic data on more than 6,700 patients with clinical depression diagnoses.  Within that population, about 65 to 70 percent had co-occurring chronic physical disorders, including diabetes and hypertension.

[Photo: David Linthicum, CTO and Founder of Blue Mountain Labs; friend of and contributor to Gathering Clouds.]

Leveraging Markov decision processes, they built a model that predicts the probabilities of future events based on the events that immediately preceded them. They also leveraged dynamic decision networks, which can consider the specific features of those events when determining probabilities. In other words, the model looks at a patient’s current attributes and then uses huge amounts of data to provide the likely diagnosis and the treatment most likely to drive the best possible outcome.
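
To make that reasoning concrete, here is a minimal Python sketch of a one-step, Markov-style treatment decision. Every state name, transition probability, and dollar figure below is invented for illustration; none of it comes from the Indiana University model.

# A minimal illustrative sketch of the "current state -> likely next state ->
# best action" reasoning described above. All states, transition probabilities,
# and costs are hypothetical; they only show the shape of the decision.

STATES = ["stable", "depressive_episode", "hypertensive_crisis"]
ACTIONS = ["standard_care", "intensive_followup"]

# P(next_state | current_state, action) -- illustrative numbers only.
TRANSITIONS = {
    ("depressive_episode", "standard_care"):
        {"stable": 0.45, "depressive_episode": 0.40, "hypertensive_crisis": 0.15},
    ("depressive_episode", "intensive_followup"):
        {"stable": 0.70, "depressive_episode": 0.25, "hypertensive_crisis": 0.05},
}

# Hypothetical dollar costs of taking an action and of ending up in a state.
ACTION_COST = {"standard_care": 100.0, "intensive_followup": 180.0}
STATE_COST = {"stable": 0.0, "depressive_episode": 250.0, "hypertensive_crisis": 900.0}

def expected_cost(current_state: str, action: str) -> float:
    """Expected one-step cost: action cost plus probability-weighted outcome cost."""
    outcomes = TRANSITIONS[(current_state, action)]
    return ACTION_COST[action] + sum(p * STATE_COST[s] for s, p in outcomes.items())

def best_action(current_state: str) -> str:
    """Choose the action with the lowest expected one-step cost for this patient state."""
    return min(ACTIONS, key=lambda a: expected_cost(current_state, a))

if __name__ == "__main__":
    state = "depressive_episode"
    for action in ACTIONS:
        print(f"{action}: expected cost ${expected_cost(state, action):.2f}")
    print("recommended:", best_action(state))

A production model would learn the transition probabilities from historical patient records and look many steps ahead rather than one, which is exactly where massive data sets and elastic cloud capacity come in.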

The use of core data points, along with well-designed analytical models, led to a cost reduction from $497 to $189 per unit (a 58.5 percent reduction), and patient outcomes improved by about 35 percent.

What’s critical to predictive modeling running on cloud-based platforms is the ability to access massive amounts of data and consider that data within the models. This is not merely nice-to-have technology. The use of predictive analytics and the tools that support the creation of these models, along with the strategic use of data integration technology, changes the game.

