
Predictive Analytics, Cloud Computing, and Healthcare

The fact of the matter is that most healthcare providers are under-funded

Editor’s note: Gathering Clouds is pleased to welcome noted thought leader and Cloud Player David Linthicum as a regular contributor. David is a renowned expert in all things cloud computing, SOA, Health IT, SaaS, Big Data, and many other IT-related topics. Check back every week for more from David!


By David Linthicum - The use of Big Data, with predictive analytics systems layered on top, has tremendous potential in the healthcare market. Indeed, when paired with cloud-based platforms, healthcare providers have the potential to become more cost-effective and much better at delivering healthcare services.


Big data analytics can perform miracles for the cloud-enabled healthcare organization.

The fact of the matter is that most healthcare providers are under-funded, which leads to their being under-automated and under-innovative. Moreover, there seems to be a growing chasm between those who deliver healthcare to patients and those who drive IT within healthcare provider organizations.

The statistics back this up. According to Gartner, anticipated growth opportunities put some industries at the top when it comes to global IT spending. Healthcare providers, however, were not among the top for growth opportunities, coming in at $15,311M; even utilities beat them out, with a projected $18,756M. Given the number of changes underway in the world of healthcare providers, these numbers are surprising at best and very scary at worst.

The solution to this problem of “too much to do and not enough resources to do it” is to leverage the right new technologies, apply careful planning, and move from a reactive to a proactive state in the world of healthcare IT.

The objective is to manage patient data holistically, and in new, innovative ways. The rise of big data as a set of new technologies provides new options for both the storage and analysis of information. This leads to better patient care and cost reductions. The use of cloud computing provides the elastic capacity required at costs that almost all healthcare provider organizations can afford. When combined, these technologies are clearly a game changer.

The data points around the use of big data for predictive analytics are beginning to show up.  In a recent story, Indiana University researchers found that a pair of predictive modeling techniques can make significantly better decisions about patients’ treatments than can doctors acting alone.  Indeed, they claim a better than 50 percent reduction in costs and more than 40 percent better patient outcomes.  (See a story by Derrick Harris over at GigaOM.)

The use case for big data, cloud computing, and predictive analytical models is compelling.  The researchers leveraged clinical and demographic data on more than 6,700 patients with clinical depression diagnoses.  Within that population, about 65 to 70 percent had co-occurring chronic physical disorders, including diabetes and hypertension.
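As an aside, the kind of cohort profiling described above is straightforward to reproduce on any analytics stack. The short sketch below is purely illustrative and is not the researchers’ dataset or code; the DataFrame, column names, and values are hypothetical, but it shows how the co-occurrence of chronic conditions within a diagnosed population might be tallied.

```python
# Illustrative sketch only: a toy cohort query of the kind described above.
# The DataFrame, column names, and values are hypothetical, not the study data.
import pandas as pd

patients = pd.DataFrame({
    "patient_id":    [1, 2, 3, 4, 5],
    "depression_dx": [True, True, True, True, True],
    "diabetes":      [True, False, True, False, True],
    "hypertension":  [False, True, True, False, False],
})

# Restrict to patients with a clinical depression diagnosis,
# then count those with at least one co-occurring chronic condition.
cohort = patients[patients["depression_dx"]]
comorbid = cohort[cohort["diabetes"] | cohort["hypertension"]]
rate = len(comorbid) / len(cohort)
print(f"Co-occurring chronic conditions: {rate:.0%} of the cohort")
```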

David Linthicum, CTO and founder of Blue Mountain Labs, is a friend of and contributor to Gathering Clouds.

Leveraging Markov decision processes, they built a model that predicts the probabilities of future events based upon the events that immediately preceded them. Moreover, they leveraged dynamic decision networks, which can consider the specific features of those events to determine probabilities. In other words, the model looks at the current attributes of a patient, then uses huge amounts of data to suggest the likely diagnosis and the treatment most likely to drive the best possible outcome.
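To make the mechanics more concrete, here is a minimal sketch of a Markov decision process solved with value iteration, picking the treatment with the lowest expected long-run cost. It is not the Indiana University model: the states, treatments, transition probabilities, and costs are all invented for illustration, and a real dynamic decision network would further condition these probabilities on individual patient attributes.

```python
# Minimal MDP sketch (illustrative only, not the researchers' actual model).
# States, treatments, transition probabilities, and costs are all invented.

STATES = ["stable", "moderate", "severe"]          # hypothetical patient states
ACTIONS = ["standard_care", "intensive_care"]      # hypothetical treatments

# P[action][state] -> {next_state: probability} (illustrative numbers)
P = {
    "standard_care": {
        "stable":   {"stable": 0.90, "moderate": 0.09, "severe": 0.01},
        "moderate": {"stable": 0.30, "moderate": 0.55, "severe": 0.15},
        "severe":   {"stable": 0.05, "moderate": 0.35, "severe": 0.60},
    },
    "intensive_care": {
        "stable":   {"stable": 0.95, "moderate": 0.04, "severe": 0.01},
        "moderate": {"stable": 0.55, "moderate": 0.40, "severe": 0.05},
        "severe":   {"stable": 0.20, "moderate": 0.50, "severe": 0.30},
    },
}

# Immediate cost of an action in a state (treatment cost plus a penalty
# for poor health states); purely illustrative figures.
COST = {
    "standard_care":  {"stable": 50, "moderate": 120, "severe": 300},
    "intensive_care": {"stable": 90, "moderate": 180, "severe": 380},
}

GAMMA = 0.95  # discount factor for future costs


def value_iteration(tol=1e-6):
    """Compute the minimum expected discounted cost from each state."""
    value = {s: 0.0 for s in STATES}
    while True:
        new_value = {
            s: min(
                COST[a][s] + GAMMA * sum(p * value[s2] for s2, p in P[a][s].items())
                for a in ACTIONS
            )
            for s in STATES
        }
        if max(abs(new_value[s] - value[s]) for s in STATES) < tol:
            return new_value
        value = new_value


def best_treatment(state, value):
    """Pick the treatment with the lowest expected total cost from `state`."""
    return min(
        ACTIONS,
        key=lambda a: COST[a][state]
        + GAMMA * sum(p * value[s2] for s2, p in P[a][state].items()),
    )


if __name__ == "__main__":
    v = value_iteration()
    for s in STATES:
        print(f"{s:>8}: expected cost {v[s]:8.1f}, recommend {best_treatment(s, v)}")
```

Run against real clinical data, the transition probabilities and costs would be estimated from historical records rather than hard-coded, which is exactly where the scale of big data and the elastic capacity of cloud platforms come into play.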

The use of core data points, along with well-designed analytical models, led to a cost reduction from $497 to $189 per unit (a 58.5 percent reduction), and patient outcomes improved by about 35 percent.

What’s critical to predictive modeling running on cloud-based platforms is the ability to access massive amounts of data and consider that data within these models. This is not just nice-to-have technology. The use of predictive analytics and the tools that support the creation of these models, along with the strategic use of data integration technology, changes the game.
