Two types of skewness: the statistical skew impacts data analysis, and the operational skew impacts operational processes

The Taming of the Skew
By Dr. Laura Gardner, VP, Products, CLARA Analytics

In the famous comedy by William Shakespeare, "The Taming of the Shrew," the main plot depicts the courtship of Petruchio and Katherina, the headstrong, uncooperative shrew. Initially, Katherina is an unwilling participant in the relationship, but Petruchio breaks down her resistance with various psychological torments, which make up the "taming" - until she finally becomes agreeable.

An analogous challenge exists when using predictive analytics with healthcare data. Healthcare data can often seem quite stubborn, like Katherina. One of the main features of healthcare data that needs to be "tamed" is the "skew" of the data. In this article, we describe two types of skewness: the statistical skew, which impacts data analysis, and the operational skew, which impacts operational processes.

The Statistical Skew
Because the distribution of healthcare costs is bounded on the lower end - that is, the cost of healthcare services is never less than zero - but ranges widely on the upper end, sometimes into the millions of dollars, the frequency distribution of costs is skewed. More specifically, in a plot of frequency by cost, the distribution of healthcare costs is right-skewed: the long tail is on the right, and the coefficient of skewness is positive.
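To make this concrete, here is a minimal sketch in Python using synthetic data; the lognormal shape and its parameters are illustrative assumptions, not real claims data:

    # Illustrative only: synthetic "claim costs" from a lognormal distribution,
    # a common stand-in for right-skewed, zero-bounded healthcare costs.
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(42)
    costs = rng.lognormal(mean=8.0, sigma=1.5, size=100_000)  # dollars, always > 0

    print(f"mean:   ${costs.mean():,.0f}")
    print(f"median: ${np.median(costs):,.0f}")         # far below the mean
    print(f"skewness coefficient: {skew(costs):.2f}")  # positive => right-skewed

The mean sitting well above the median is the telltale sign of a right-skewed distribution: a handful of very expensive claims pulls the average up.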

This skewness is present whether we are looking at total claim expense in the workers' compensation sector or annual expenses in the group health sector. Why is this a problem? Because the most common methods for analyzing data assume a normal distribution, and a right-skewed distribution plainly violates that assumption. To produce reliable, accurate predictions and generalizable results from analyses of healthcare costs, the data need to be "tamed" - that is, analytic techniques that handle right-skewness must be applied. Among these techniques are logarithmic transformation of the dependent variable, random forest regression, other machine learning methods and topical analysis.
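As a sketch of two of these techniques - with synthetic data and arbitrary parameters, not a production model - the log1p/expm1 pair lets an ordinary linear regression work on a roughly symmetric target, while a random forest imposes no normality assumption at all:

    # Sketch of two ways to tame a right-skewed cost target (synthetic data).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5_000, 4))                  # stand-in claim features
    beta = np.array([0.8, 0.5, 0.3, 0.1])
    y = np.exp(1.0 + X @ beta + rng.normal(0, 0.7, 5_000))  # right-skewed costs

    # Technique 1: log-transform the dependent variable, fit, then invert.
    lin = LinearRegression().fit(X, np.log1p(y))
    pred_dollars = np.expm1(lin.predict(X))

    # Technique 2: random forest regression trains on the raw, skewed costs.
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

One caveat: naively exponentiating log-scale predictions tends to understate the mean cost (the retransformation problem), which is one reason practitioners also reach for models that handle skew natively.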

It's essential to keep this in mind in any analytic effort with healthcare data, especially in workers' compensation. To get the required level of accuracy, we need to think "non-normal" and get comfortable with the "skewed" behavior of the data.

The Operational Skew
There is an equally pervasive operational skew in workers' compensation that calls out for a radical change in business models. The operational skew is exemplified by:

  • The 80/20 split between simple, straightforward claims that can be auto-adjudicated and more complex claims that have the potential to escalate or incur attorney involvement (i.e., 80 percent of the costs come from 20 percent of the claims; see the sketch after this list).
  • The even more extreme 90/10 split between good providers delivering state-of-the-art care and the "bad apples" whose care is less effective, less often compliant with evidence-based guidelines or more expensive for a similar or worse result (i.e., 90 percent of the costs come from 10 percent of the providers).
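A quick way to check this kind of concentration on your own book of business is to sort claims by cost and measure the share held by the top fifth. The sketch below uses synthetic lognormal costs, so the exact share it prints is illustrative only:

    # Sketch: what share of total cost comes from the top 20% of claims?
    import numpy as np

    rng = np.random.default_rng(1)
    claim_costs = rng.lognormal(mean=8.0, sigma=1.5, size=10_000)  # synthetic

    sorted_costs = np.sort(claim_costs)[::-1]           # most expensive first
    top_fifth = sorted_costs[: len(sorted_costs) // 5]
    share = top_fifth.sum() / sorted_costs.sum()
    print(f"Top 20% of claims account for {share:.0%} of total cost")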

How can we deal with operational skew? The first step is to be aware of it and be prepared to use different tactics depending on which end of the skew you're dealing with. In the two examples just given, we have observed that by using the proper statistical approaches:

  • Claims can be categorized as early as Day 1 into low vs. high risk with respect to potential for cost escalation or attorney involvement. This enables payers to apply the appropriate amount of oversight, intervention and cost containment resources based on the risk of the claim (a minimal triage sketch follows this list).
  • Provider outcomes can be evaluated, summarized and scored, thus empowering network managers to fine-tune their networks and claims adjusters to recommend the best doctors to each injured worker.
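To show the shape of such a Day-1 triage step, here is a minimal sketch; the features, labels, threshold and model choice are assumptions for illustration, not CLARA's actual method:

    # Sketch: Day-1 triage of claims into low- vs. high-risk (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    X = rng.normal(size=(8_000, 5))  # stand-ins: injury type, age, comorbidities...
    escalated = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 8_000)) > 1.2

    X_tr, X_te, y_tr, y_te = train_test_split(X, escalated, random_state=0)
    clf = LogisticRegression().fit(X_tr, y_tr)

    # Route each incoming claim by its predicted escalation probability.
    p_escalate = clf.predict_proba(X_te)[:, 1]
    route = np.where(p_escalate > 0.5, "high-touch review", "light-touch oversight")

In practice, the probability cutoff would be tuned to the payer's capacity and to the relative cost of missing a claim that later escalates.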

Both of these examples show that what used to be a single business process - managing every claim with the high-touch, "throw a nurse or a doctor at every claim" approach, as noble as that sounds - now requires the discipline to enact two entirely different business models in order to be operationally successful. Let me explain.

The difference between low- and high-risk claims is not a subtle distinction. Low-risk claims should receive a minimum amount of intervention, just enough oversight to ensure that they are going well and staying within expected parameters. Good technology can help provide this oversight. Added expense, such as nurse case management, is generally unnecessary. Conversely, high-risk claims might need nurse and/or physician involvement, weekly or even daily updates, multiple points of contact and a keen eye for opportunities to do a better job navigating this difficult journey with the recovering worker.

The same is true for managing your network. It would be nice if all providers could be treated alike, but in fact, a small percentage of providers drives the bulk of the opioid prescribing, attorney involvement, liens and independent medical review (IMR) requests. These "bad apples" are difficult to reform and are best avoided, using a sophisticated provider scoring system that focuses on multiple aspects of provider performance and outcomes.
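As a toy illustration of what a composite provider score could look like - the metrics, weights and scale below are invented for this example, and a real system would draw on many more dimensions of performance and outcomes:

    # Toy composite provider score: higher is better, on a 0-100 scale.
    # Metrics, weights and cutoffs are invented for illustration.
    providers = {
        #             opioid_rate  attorney_rate  cost_per_claim ($)
        "Provider A": (0.05,       0.02,          4_200),
        "Provider B": (0.40,       0.18,          9_800),
    }

    def score(opioid_rate, attorney_rate, cost_per_claim):
        # Lower is better on every input, so convert to a weighted penalty.
        penalty = (40 * opioid_rate
                   + 40 * attorney_rate
                   + 20 * min(cost_per_claim / 20_000, 1.0))
        return round(100 - penalty, 1)

    for name, metrics in providers.items():
        print(f"{name}: {score(*metrics)}")  # A scores 93.0, B scores 67.0

Even a crude score like this makes the 90/10 split visible at a glance; the point is not the exact formula but that outcomes can be rolled up into a single comparable number per provider.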

Once you have tamed your statistical skew with the appropriate data science techniques and your operational skew with a new business model, you will be well on your way to developing actionable insights from your predictive modeling. With assistance from the appropriate technology and operational routines, the most uncooperative skewness generally can be tamed. Are you ready to "tame the skew"?

Read Dr. Gardner's first two articles in this series:

Five Best Practices to Ensure the Injured Worker Comes First

Cycle Time is King

As first published in Claims Journal.

###

Laura B. Gardner, M.D., M.P.H., Ph.D., is an expert in analyzing U.S. health and workers' compensation data with a focus on predictive modeling, outcomes assessment, design of triage and provider evaluation software applications, program evaluation and health policy research. She is a successful entrepreneur with more than 20 years of experience in starting and building Axiomedics Research, Inc.

Dr. Gardner earned her bachelor's degree in biology (magna cum laude) from Brandeis University, her M.D. from Albert Einstein College of Medicine and both an M.P.H. in health policy and a Ph.D. in health economics from the University of California at Berkeley. As a physician, she is board certified in General Preventive Medicine and Public Health and is a fellow of the American College of Preventive Medicine.

For more information, visit http://www.claraanalytics.com/ and follow CLARA Analytics on LinkedIn, Facebook and Twitter.

Microservices Articles
Lori MacVittie is a subject matter expert on emerging technology responsible for outbound evangelism across F5's entire product suite. MacVittie has extensive development and technical architecture experience in both high-tech and enterprise organizations, in addition to network and systems administration expertise. Prior to joining F5, MacVittie was an award-winning technology editor at Network Computing Magazine where she evaluated and tested application-focused technologies including app secu...
Using new techniques of information modeling, indexing, and processing, new cloud-based systems can support cloud-based workloads previously not possible for high-throughput insurance, banking, and case-based applications. In his session at 18th Cloud Expo, John Newton, CTO, Founder and Chairman of Alfresco, described how to scale cloud-based content management repositories to store, manage, and retrieve billions of documents and related information with fast and linear scalability. He addresse...
Adding public cloud resources to an existing application can be a daunting process. The tools that you currently use to manage the software and hardware outside the cloud aren’t always the best tools to efficiently grow into the cloud. All of the major configuration management tools have cloud orchestration plugins that can be leveraged, but there are also cloud-native tools that can dramatically improve the efficiency of managing your application lifecycle. In his session at 18th Cloud Expo, ...
The now mainstream platform changes stemming from the first Internet boom brought many changes but didn’t really change the basic relationship between servers and the applications running on them. In fact, that was sort of the point. In his session at 18th Cloud Expo, Gordon Haff, senior cloud strategy marketing and evangelism manager at Red Hat, will discuss how today’s workloads require a new model and a new platform for development and execution. The platform must handle a wide range of rec...
Containers and Kubernetes allow for code portability across on-premise VMs, bare metal, or multiple cloud provider environments. Yet, despite this portability promise, developers may include configuration and application definitions that constrain or even eliminate application portability. In this session we'll describe best practices for "configuration as code" in a Kubernetes environment. We will demonstrate how a properly constructed containerized app can be deployed to both Amazon and Azure ...
SYS-CON Events announced today that DatacenterDynamics has been named “Media Sponsor” of SYS-CON's 18th International Cloud Expo, which will take place on June 7–9, 2016, at the Javits Center in New York City, NY. DatacenterDynamics is a brand of DCD Group, a global B2B media and publishing company that develops products to help senior professionals in the world's most ICT dependent organizations make risk-based infrastructure and capacity decisions.
Discussions of cloud computing have evolved in recent years from a focus on specific types of cloud, to a world of hybrid cloud, and to a world dominated by the APIs that make today's multi-cloud environments and hybrid clouds possible. In this Power Panel at 17th Cloud Expo, moderated by Conference Chair Roger Strukhoff, panelists addressed the importance of customers being able to use the specific technologies they need, through environments and ecosystems that expose their APIs to make true ...
In his keynote at 19th Cloud Expo, Sheng Liang, co-founder and CEO of Rancher Labs, discussed the technological advances and new business opportunities created by the rapid adoption of containers. With the success of Amazon Web Services (AWS) and various open source technologies used to build private clouds, cloud computing has become an essential component of IT strategy. However, users continue to face challenges in implementing clouds, as older technologies evolve and newer ones like Docker c...
CloudEXPO New York 2018, colocated with DXWorldEXPO New York 2018 will be held November 11-13, 2018, in New York City and will bring together Cloud Computing, FinTech and Blockchain, Digital Transformation, Big Data, Internet of Things, DevOps, AI, Machine Learning and WebRTC to one location.
DevOpsSummit New York 2018, colocated with CloudEXPO | DXWorldEXPO New York 2018 will be held November 11-13, 2018, in New York City. Digital Transformation (DX) is a major focus with the introduction of DXWorldEXPO within the program. Successful transformation requires a laser focus on being data-driven and on using all the tools available that enable transformation if they plan to survive over the long term.