By Gathering Clouds
June 27, 2013 09:00 AM EDT
Editor’s note: Gathering Clouds is pleased to welcome noted cloud thought leader David Linthicum as a regular contributor. David is a renowned expert in all things cloud computing, SOA, health IT, SaaS, Big Data, and many other IT-related topics. Check back every week for more from David!
By David Linthicum - Big Data, with predictive analytics systems layered on top, has tremendous potential in the healthcare market. Indeed, when paired with cloud-based platforms, it gives healthcare organizations the potential to become more cost-effective and much better at delivering healthcare services.
The fact of the matter is that most healthcare providers are underfunded, which leaves them under-automated and slow to innovate. Moreover, there seems to be a growing chasm between those who deliver healthcare to patients and those who drive IT within healthcare provider organizations.
The statistics back this up. Gartner ranks industries by anticipated growth opportunities in global IT spending, and healthcare providers are nowhere near the top: their projected spending comes in at $15,311M, behind even utilities at a projected $18,756M. Given the number of changes sweeping the world of healthcare providers, these numbers are surprising at best, or very scary at worst.
The solution to this problem of “too much to do and not enough resources to do it” is to leverage the right new technologies, apply careful planning, and move from a reactive to a proactive state in the world of healthcare IT.
The objective is to manage patient data holistically, and in new, innovative ways. The rise of big data as a set of new technologies provides new options for both the storage and analysis of information, which leads to better patient care and cost reductions. Cloud computing, in turn, meets the elastic capacity requirements at costs that almost all healthcare provider organizations can afford. Combined, the two are clearly a game changer.
The data points around the use of big data for predictive analytics are beginning to show up. In a recent story, Indiana University researchers found that a pair of predictive modeling techniques can make significantly better decisions about patients’ treatments than can doctors acting alone. Indeed, they claim a better than 50 percent reduction in costs and more than 40 percent better patient outcomes. (See a story by Derrick Harris over at GigaOM.)
The use case for big data, cloud computing, and predictive analytical models is compelling. The researchers leveraged clinical and demographic data on more than 6,700 patients with clinical depression diagnoses. Within that population, about 65 to 70 percent had co-occurring chronic physical disorders, including diabetes and hypertension.
Leveraging Markov decision processes, they built a model that predicts the probabilities of future events based upon the events that immediately preceded them. Moreover, they leveraged dynamic decision networks, which can consider the specific features of those events when determining probabilities. In other words, the model looks at the current attributes of a patient, and then uses huge amounts of data to provide the likely diagnosis and the best treatment to drive the best possible outcome.
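To make the Markov idea concrete, here is a minimal sketch of how a one-step decision model scores candidate treatments from a patient's current state. The states, treatments, probabilities, and utilities below are entirely hypothetical illustrations, not the Indiana University researchers' actual model, which learned its parameters from thousands of patient records.

```python
# Minimal Markov-style treatment scorer. All names and numbers are
# hypothetical; a real model would estimate them from patient data.

# TRANSITIONS[treatment][state] -> {next_state: probability}
TRANSITIONS = {
    "drug_a": {
        "depressed": {"remission": 0.6, "depressed": 0.3, "worse": 0.1},
        "remission": {"remission": 0.8, "depressed": 0.2, "worse": 0.0},
    },
    "drug_b": {
        "depressed": {"remission": 0.4, "depressed": 0.5, "worse": 0.1},
        "remission": {"remission": 0.9, "depressed": 0.1, "worse": 0.0},
    },
}

# A simple utility per outcome state (again, purely illustrative).
UTILITY = {"remission": 1.0, "depressed": 0.0, "worse": -1.0}

def expected_utility(treatment: str, state: str) -> float:
    """Expected utility of one treatment step from the current state."""
    dist = TRANSITIONS[treatment][state]
    return sum(p * UTILITY[s] for s, p in dist.items())

def best_treatment(state: str) -> str:
    """Pick the treatment whose next-state distribution scores highest."""
    return max(TRANSITIONS, key=lambda t: expected_utility(t, state))
```

The dynamic decision network described above extends this idea by conditioning the transition probabilities on specific patient features (such as co-occurring diabetes or hypertension) rather than on a single state label.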
The use of core data points, along with well-designed analytical models, led to a cost reduction from $497 to $189 per unit, a drop of roughly 62 percent. Patient outcomes also improved by about 35 percent.
What’s critical to predictive modeling on cloud-based platforms is the ability to access massive amounts of data and bring that data into the models. This is not merely nice-to-have technology. The use of predictive analytics and the tools that support the creation of these models, along with the strategic use of data integration technology, changes the game.