
MaaS applied to Healthcare – Use Case Practice

MaaS (Model as a Service) might allow building and controlling shared, Cloud-ready healthcare data, affording agile data design and economies of scale while maintaining a trusted environment and scalable security. With MaaS, models map the infrastructure and control persistent storage and deployment audit in order to certify that data are coherent and remain linked to specific storage. As a consequence, models make it possible to check where data is deployed and stored. MaaS can play a crucial role in supplying healthcare services: the model containing the infrastructure properties includes the information needed to classify the on-premise data Cloud service in terms of data security, coherence, outage, availability and geo-location, and to secure an assisted service deployment and virtualization.
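To make the idea of "models mapping infrastructure" concrete, here is a minimal sketch in Python. All names (StorageBinding, DataModel, the audit method and the example partitions) are hypothetical illustrations, not part of the original text; it only shows how a model could declare where each partition must live and how a deployment audit could verify that the declared and actual storage still match.

```python
# Hypothetical sketch: a MaaS-style model that maps data partitions to
# declared storage locations, so a deployment audit can verify where data resides.
from dataclasses import dataclass, field


@dataclass
class StorageBinding:
    partition: str            # e.g. "clinical_files"
    provider: str             # e.g. "cloud-provider-A"
    region: str               # geo-location constraint, e.g. "EU"
    persistent: bool = True


@dataclass
class DataModel:
    name: str
    version: str
    bindings: list[StorageBinding] = field(default_factory=list)

    def audit(self, deployed: dict[str, tuple[str, str]]) -> list[str]:
        """Compare declared bindings with what is actually deployed.

        `deployed` maps partition -> (provider, region) as reported by the
        infrastructure; every mismatch becomes a finding for the audit log.
        """
        findings = []
        for b in self.bindings:
            actual = deployed.get(b.partition)
            if actual is None:
                findings.append(f"{b.partition}: not deployed")
            elif actual != (b.provider, b.region):
                findings.append(
                    f"{b.partition}: expected {(b.provider, b.region)}, found {actual}"
                )
        return findings


# Usage: certify that data remain linked to the declared storage.
model = DataModel("healthcare", "1.0",
                  [StorageBinding("clinical_files", "cloud-provider-A", "EU")])
print(model.audit({"clinical_files": ("cloud-provider-A", "US")}))  # -> one mismatch finding
```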

Municipalities are opening new information exchanges with healthcare institutes. The objective is to share medical research, hospital admissions by pathology, assistance and hospitalization data with doctors, hospitals, clinics and, of course, patients. This open data [6] should improve patient care, prevention, prophylaxis and appropriate medical booking and scheduling by making information sharing more timely and efficient. From the data management point of view, this means the service should assure data elasticity, multi-tenancy, scalability and security, together with the physical and logical architectures that provide the guidelines for designing healthcare services.

Accordingly, healthcare services in the Cloud must primarily secure the following data properties [2]:
-      data location;
-      data persistence;
-      data discovery and navigation;
-      data inference;
-      confidentiality;
-      availability;
-      on-demand secure data deletion/shredding [4] [5] [11] [12].

These properties should be defined during service design, and data models play the "on-premise" integral role in defining, managing and protecting healthcare data in the Cloud. When the healthcare data models are created, the service is created as well, and the properties for confidentiality, availability, authenticity, authorization, authentication and integrity [12] have to be defined inside them: this is how MaaS provides preconfigured service properties.
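As a purely illustrative sketch (the property names come from the list and paragraph above, but the structure, field values and validate_service_properties function are assumptions), this is one way the security properties could be carried inside the model itself so the generated service inherits them as preconfigured defaults:

```python
# Hypothetical sketch: security properties declared "on-premise" inside the
# data model, so the service inherits them as preconfigured defaults.
REQUIRED_PROPERTIES = {
    "confidentiality", "availability", "authenticity",
    "authorization", "authentication", "integrity",
}

healthcare_model = {
    "name": "patient_records",
    "properties": {
        "confidentiality": {"encryption": "AES-256", "tenants_isolated": True},
        "availability":    {"replicas": 3, "recovery_point_minutes": 15},
        "authenticity":    {"signed_changes": True},
        "authorization":   {"roles": ["physician", "lab_technician", "patient"]},
        "authentication":  {"method": "federated_sso"},
        "integrity":       {"checksums": True, "versioned": True},
    },
}


def validate_service_properties(model: dict) -> None:
    """Refuse to generate the service if any required property is missing."""
    missing = REQUIRED_PROPERTIES - set(model["properties"])
    if missing:
        raise ValueError(f"model incomplete, missing: {sorted(missing)}")


validate_service_properties(healthcare_model)  # passes for the example above
```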

Applying MaaS to Healthcare – Getting to Practice
Applying MaaS to design and deploy healthcare services means explaining how to apply the DaaS (Database as a Service, see [2] and [4]) lifecycle to achieve faster, positive impacts on go-live preparation with Cloud services. The Use Case introduces practices for defining the healthcare service and then translating them into the appropriate guidelines. The DaaS lifecycle service practices we apply are described in [4].

Bear in mind that healthcare is a complex, dynamic environment with many actors: patients, physicians, IT professionals, chemists, lab technicians, researchers, health operators and so on. The Use Case we introduce tries to consider the whole system. It describes the main tasks along the DaaS lifecycle and, with them, how medical information might be managed and securely exchanged [12] among stakeholders across multiple entities such as hospitals, clinics, pharmacies, labs and insurance companies.

The Use Case
Here is how MaaS might cover the Use Case and how the DaaS lifecycle best practices integrate the properties and directions above:

Objective
- To facilitate services to healthcare users and to improve the information exchange experience among stakeholders;
- To reduce the cost of services through rapid data design, update and deployment, and to provide data audit and control;
- To improve the user experience with healthcare knowledge.

Description
- Current costs of data design, update and deployment are high, and healthcare information (clinical, pharmaceutical, prevention, prophylaxis…) is not delivered fast enough based upon user experience;
- Costs for hospitalization and treatment information should be predictable based upon user experience and interaction.

Actors
- Clinical and Research Centres;
- Healthcare Institute/Public Body (Access Administrators);
- Healthcare Institute/Public Body (Credentials, Roles Providers);
- IT Operations (Cloud Providers, Storage Providers, Clinical Application Providers).

Requirements
- Reducing costs and rapidly delivering relevant data to users, stakeholders and healthcare institutes;
- Enabling decision-making information for actors who regularly need access [11] [12] to healthcare services but lack the scale to exchange (and require) more dedicated services and support;
- Rapidly supporting and updating healthcare data for users across a large reference base with many locations and disparate applications;
- Ensuring that compliance and governance directions are currently applied, revised and supervised;
- Data security, confidentiality, availability, authenticity, authorization, authentication and integrity to be defined "on-premise".

Pre-processing and post-processing
- Implementing and sharing data models;
- Designing data model properties according to private, public and/or hybrid Cloud requirements;
- Designing the data storage model "on-premise";
- Modeling data to calculate physical resource allocation "a priori";
- Modeling data to predict usage early and to optimize database handling;
- Covering outages through versions and changes archived on the basis of model partitioning;
- Using content discovery to identify and audit data, to restore the service to previous versions and, if required by regulations, to irrecoverably destroy the data (a sketch of partition versioning with restore and secure delete follows this table).

Included and extended use case
- Deployment is guided by the model properties and the architecture definition;
- The mapping of data is defined and updated, checking whether the infrastructure provider offers persistence and whether outages are related to on-line tasks;
- Deploying and sharing are guided by the model properties and the architecture definition.
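The pre-processing and post-processing items above rely on partition-level versioning for outage recovery and on regulation-driven destruction of data. The following minimal sketch (class and method names are hypothetical, not from the original text) illustrates that idea: each partition keeps an archive of versions that can be restored, or irrecoverably dropped when a regulation requires it.

```python
# Hypothetical sketch: partition-level version archive supporting restore
# to a previous version and regulation-driven secure deletion.
import copy


class PartitionArchive:
    def __init__(self):
        self._versions: dict[str, list[dict]] = {}   # partition -> list of snapshots

    def archive(self, partition: str, snapshot: dict) -> int:
        """Store a new version of a partition's data/schema snapshot."""
        self._versions.setdefault(partition, []).append(copy.deepcopy(snapshot))
        return len(self._versions[partition]) - 1    # version index

    def restore(self, partition: str, version: int) -> dict:
        """Recover a previous version after an outage or a bad change."""
        return copy.deepcopy(self._versions[partition][version])

    def secure_delete(self, partition: str) -> None:
        """Irrecoverably drop every archived version of a partition
        (a real service would also shred the underlying storage)."""
        self._versions.pop(partition, None)


archive = PartitionArchive()
v0 = archive.archive("ambulatory_care", {"schema": "v1", "rows": 120})
print(archive.restore("ambulatory_care", v0))
archive.secure_delete("ambulatory_care")   # if required by regulation
```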

In what follows, we apply a subset of MaaS properties to the above healthcare Use Case; correspondingly, a subset of Data Model properties is applied along the DaaS lifecycle states (a small sketch encoding this mapping follows the table):

Data location
- DaaS lifecycle states: Create Data Model; Model Archive and Change; Deploy and Share
- Healthcare data model properties: Data models contain partitioning properties and can include data location constraints. User tagging of data (a common Web 2.0 practice, through clinic user-defined properties) should be managed. Support for compliant storage of preventative-care data records should be provided.

Data persistence
- DaaS lifecycle states: Create Data Model; Model Archive and Change; Secure Delete
- Healthcare data model properties: For any partition, sub-model or version of the models, the data model has to label and trace the data location. The model defines a map specifying where data is stored (ambulatory care and clinical files, for example, have different storages). Provider persistence can be registered. Data discovery can update partition properties to identify where data is located.

Data inference
- DaaS lifecycle states: Create Data Model
- Healthcare data model properties: The data model has to support inference and special data aggregation: ambulatory data might allow inference of a patient's insurance file. All inferences and aggregations are defined, updated and tested in the model.

Confidentiality
- DaaS lifecycle states: Create Data Model; Populate, Use and Test
- Healthcare data model properties: The data model guides rights assignment, access controls, rights management and application data security. As different tenants (hospitals, clinics, insurance companies and pharmacies) access the data, users and tenants should be defined inside the model. Logical and physical controls have to be set.

High availability
- DaaS lifecycle states: Deploy and Share; Model Archive and Change
- Healthcare data model properties: The data model and partitioning configuration, together with model changes and versions, permit mastering a recovery scheme and restoration when needed. Data inventory (classified by Surgery, Radiology, Cardiology, for example) versus discovery has to be traced and set.

Fast updates at low cost
- DaaS lifecycle states: Create Data Model; Generate Schema/Update Data Model
- Healthcare data model properties: Data reverse and forward engineering permits change management and version optimization in real time, directly on the deployed data properties.

Multi-database partitioning
- DaaS lifecycle states: Create Data Model; Deploy and Share
- Healthcare data model properties: Bi-directional partitioning in terms of deployment, storage and evolution through model versioning has to be set. Multi-DBMS version management helps in sharing multi-partitioning deployments: for example, Insurance and Surgery by Patient are normally partitioned and belong to different tenants and different databases.

Near-zero configuration and administration
- DaaS lifecycle states: Create Data Model; Generate Schema/Update Data Model
- Healthcare data model properties: Data models cover and contain all data properties, including scripts, stored procedures, queries, partitions, changes and all configuration and administration properties. Administrative actions therefore decrease, leaving more time for data design, update and deployment. Regulation compliance can be a frequent administration task: models ensure that healthcare compliance and governance remain aligned.
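The sketch below simply encodes the mapping from the table above as data, so tooling could report which MaaS properties must be honored in a given DaaS lifecycle state. The dictionary keys, state names and the properties_for_state helper are hypothetical naming choices; the associations themselves come from the table.

```python
# Sketch: the property-to-lifecycle mapping from the table above, encoded so
# tooling can report which properties each DaaS state must enforce.
MAAS_TO_DAAS = {
    "data_location":         ["create_data_model", "model_archive_and_change", "deploy_and_share"],
    "data_persistence":      ["create_data_model", "model_archive_and_change", "secure_delete"],
    "data_inference":        ["create_data_model"],
    "confidentiality":       ["create_data_model", "populate_use_and_test"],
    "high_availability":     ["deploy_and_share", "model_archive_and_change"],
    "fast_updates":          ["create_data_model", "generate_schema_update"],
    "multi_db_partitioning": ["create_data_model", "deploy_and_share"],
    "near_zero_admin":       ["create_data_model", "generate_schema_update"],
}


def properties_for_state(state: str) -> list[str]:
    """Which MaaS properties must be honored while in a given lifecycle state."""
    return sorted(p for p, states in MAAS_TO_DAAS.items() if state in states)


print(properties_for_state("deploy_and_share"))
# ['data_location', 'high_availability', 'multi_db_partitioning']
```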

The Outcome
MaaS defines the service properties through which the DaaS process can be implemented and maintained. As a consequence, applying the Use Case along the directions introduced above, the following results should be expected.

Qualitative Outcomes:
1)    Healthcare actors share information on the basis of defined “on-premise” data models: models can be implemented and deployed using a model-driven paradigm;
2)    Data Models are standardized in terms of naming conventions and conceptual templates (Pharma, Insurance, Municipality and so on): in fact, models can be modified and updated with respect to the knowledge for which they were initially designed;
3)    Storage and partitioning in the Cloud can be defined “a priori” and periodic audits can be set to certify that data are coherent and remain linked to specific sites;
4)    Users consult the information and perform two tasks:
4.1) they search and navigate the knowledge for personal and work activities;
4.2) they give back information about user experience and about practices/procedures that should be updated, rearranged, downsized or extended depending upon community needs, types of interaction, events or specific public situations.
5)    Models are "on-premise" policy-driven tools. Regulation compliance rules can be included in the data model. Changes to the current compliance constraints mean changes to the data model before the new version is deployed (see the sketch after this list).
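As a minimal sketch of outcome 5 (the rule set, version dictionaries and may_deploy gate are hypothetical illustrations), compliance rules can travel inside the model, and a deployment gate can refuse any model version that does not yet embed the current constraints:

```python
# Hypothetical sketch: regulation compliance rules carried inside the data
# model; a new model version must embed the current rule set before deployment.
CURRENT_COMPLIANCE_RULES = {"retention_years": 10, "geo_restriction": "EU", "audit_trail": True}


def may_deploy(model_version: dict) -> bool:
    """Deployment gate: the model version must carry the up-to-date rules."""
    return model_version.get("compliance_rules") == CURRENT_COMPLIANCE_RULES


old_version = {"version": "1.3",
               "compliance_rules": {"retention_years": 7, "geo_restriction": "EU", "audit_trail": True}}
new_version = {"version": "1.4", "compliance_rules": dict(CURRENT_COMPLIANCE_RULES)}

print(may_deploy(old_version))  # False: constraints changed, the model must change first
print(may_deploy(new_version))  # True
```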

Quantitative Outcomes:
1)    Measurable and traceable cost reduction (to be calculated as a function of the annual Cloud fee, resource tuning and TCO; a purely illustrative calculation follows this list);
2)    Time reduction in terms of fast knowledge design, update, deployment, portability and reuse (to be calculated as a function of SLAs, data and application management effort and ROI);
3)    Risk reduction according to "on-premise" Cloud service design and control (to be calculated as a function of recovery time and chargeback on the cost of applied countermeasures, compared with periodic audits based upon model information).
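The post names the cost factors but gives no formula, so the following is a purely illustrative calculation under assumed figures and an assumed relationship between them, only to show how outcome 1 could be made measurable:

```python
# Purely illustrative sketch with assumed figures: the post only names the
# factors (annual Cloud fee, resource tuning, TCO), not a formula.
baseline_annual_tco  = 500_000   # assumed on-premise TCO before the service
annual_cloud_fee     = 180_000   # assumed
resource_tuning_cost =  40_000   # assumed

cost_reduction = baseline_annual_tco - (annual_cloud_fee + resource_tuning_cost)
print(f"cost reduction: {cost_reduction} ({cost_reduction / baseline_annual_tco:.0%})")
# -> cost reduction: 280000 (56%)
```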

MaaS might provide a real opportunity to offer a unique, utility-style model life cycle that accelerates cloud data optimization and performance in the healthcare network. MaaS applied to healthcare services might be the right way to transform medical service delivery in the Cloud. MaaS defines "on-premise" data security, coherence, outage, availability, geo-location and an assisted service deployment. Models are adaptable to various departmental needs and organizational sizes, and they simplify and align healthcare domain-specific knowledge by combining the data model approach with the on-demand nature of cloud computing. MaaS agility addresses the key requirements of data service design, incremental data deployment and progressive data structure provisioning. Finally, the model approach allows the validation of service evolution: the models' versions and configurations form a catalogue for managing both data regulation compliance [12] and data contract clauses in the Cloud among IT, Providers and Healthcare actors [9].

[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler's Role in the Relational Cloud
[3] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data, http://cloudbestpractices.net/2012/10/21/maas/
[7] N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
[8] N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – Applying MaaS to DaaS (Database as a Service) Contracts. An Introduction to the Practice, http://cloudbestpractices.net/2012/11/04/applying-maas-to-daas/
[10] N. M. Josuttis – SOA in Practice
[11] H. A. J. Narayanan, M. H. Güneş – Ensuring Access Control in Cloud Provisioned Healthcare Systems
[12] Kantara Initiative – http://kantarainitiative.org/confluence/display/uma/UMA+Scenarios+and+Use+Cases

This document is provided AS-IS for your informational purposes only. In no event will the author of "How MaaS might be applied to Healthcare – A Use Case" be liable to any party for direct, indirect, special, incidental, economic (including lost business profits, business interruption, loss or damage of data, and the like) or consequential damages, without limitation, arising out of the use or inability to use this documentation or the products, regardless of the form of action, whether in contract, tort (including negligence), breach of warranty, or otherwise, even if advised of the possibility of such damages. Specifically, any warranties are disclaimed, including, but not limited to, the express or implied warranties of merchantability, fitness for a particular purpose and non-infringement, regarding this document or the products' use or performance. All trademarks, trade names, service marks and logos referenced herein belong to their respective companies/offices.
