By Robert Eve
July 5, 2011
In the life sciences industry, the latest blockbuster drug or device can mean billions in revenue.
But developing these new offerings often takes ten or more years.
Moving ahead a year or two can be worth hundreds of millions.
The question is how?
Data, Data, Data
Data is the lifeblood of new product development. Research findings, clinical trial results, manufacturing process validation, and more are information-intensive activities where each new data point can result in a major shift in plans and timing.
To accelerate time to market in this data rich environment, life science R&D teams look to IT.
Unfortunately, IT is not always well equipped to meet this challenge. The problem is not a lack of data. The problem is a lack of data integration agility.
Most life sciences companies have made significant investments in their data. These investments have resulted in data silos and complexity which slow down their ability to respond to new information requests. To overcome these silos, life sciences companies are seeking new ways to integrate their new product development data.
How Data Virtualization Helps
Data virtualization has been widely adopted by pharmaceutical companies and has recently seen increasing acceptance among medical device makers. Integrating product development data has been the primary use case. The reasons are several:
- Gaining Timely Insight – Up-to-the-minute data is critical throughout every stage in the new product cycle. Data virtualization provides query optimization algorithms and techniques that deliver timely information whenever needed.
- Seeing the Complete Picture – Multiple types of data from multiple sources must be combined to provide researchers, analysts, and managers with the full picture that effective decision making requires. Data virtualization provides data federation that virtually integrates data from multiple sources in memory, without the cost and overhead of physical data consolidation in data warehouses.
- Controlling Data Proliferation – Identifying and understanding data assets distributed across a range of R&D repositories and locations requires significant manual effort. Data virtualization provides data discovery that saves time by automating entity and relationship identification and accelerating data modeling.
- Addressing Data Complexity – Incredible complexity challenges IT’s ability to leverage existing R&D data for new R&D questions. Data virtualization provides powerful data abstraction tools that simplify complex data, transforming it from native structures and syntax into easy-to-understand, reusable views and data services with common, business-friendly semantics.
- Improving Data Availability – With so many technologies, formats and standards, successfully surfacing R&D life cycle data consumes significant IT resources. Data virtualization supports numerous standards-based data access, caching and delivery options that allow IT to flexibly publish all the information that R&D users require.
- Providing Proper Data Controls – Data is a critical asset that must be governed, especially in life science R&D with its myriad compliance requirements. Data virtualization provides data governance that centralizes metadata management, ensures data security and improves data quality to meet these stringent control requirements.
- Keeping Pace with Non-Stop Change – Ever-changing research results, clinical trial findings, and compliance requirements make frequent change inevitable. Data virtualization provides a loosely coupled virtualization layer, rapid development tools, automated impact analysis and an extensible architecture that together deliver the information agility required to keep pace.
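The federation and abstraction ideas above can be illustrated with a minimal sketch. The silo contents, field names, and the `compound_view` function below are hypothetical illustrations, not any vendor's actual API: two simulated R&D "silos" are combined at query time into one business-friendly view, with no physical copy into a warehouse.

```python
# Minimal data-federation sketch. All names and values are hypothetical.

# Silo 1: research findings, keyed by compound ID
research = {
    "CPD-001": {"target": "JAK2", "potency_nm": 12.5},
    "CPD-002": {"target": "EGFR", "potency_nm": 48.0},
}

# Silo 2: clinical trial results, keyed by the same compound ID
trials = {
    "CPD-001": {"phase": 2, "enrolled": 340},
}

def compound_view(compound_id):
    """Federate both silos into one record at query time (a 'virtual view')."""
    record = {"compound_id": compound_id}
    record.update(research.get(compound_id, {}))
    record.update(trials.get(compound_id, {}))
    return record

# The consumer sees one unified record; the underlying silos stay in place.
print(compound_view("CPD-001"))
```

A real data virtualization platform adds query optimization, caching, security, and metadata management on top of this basic pattern, but the core idea is the same: integrate on demand rather than consolidate in advance.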
Pfizer Finds a Successful Formula
The R&D team at Pfizer was an early adopter of data virtualization and has realized a number of positive business benefits.
Their successes have been recognized in a number of recent articles including:
- Business Intelligence: How To Get Agile
- Pfizer's Michael Linhares Talks About Data Virtualization Software
- Pfizer's Prescription for Data
- SOA to the Rescue, When Drug Discovery Needs Data Fast!
- Rx for Data Woes
- Faster Data Integration Helps Pfizer Speed Decisions on Drugs
Fortunately, the path to successful data virtualization adoption is far shorter than the path to a new drug or device. But as with R&D, integrating available data is the key.
Here are some resources to help you get started:
- The Data Virtualization Leadership Series is a great forum to gain data virtualization insights from top IT analysts.
- The Data Virtualization Leadership Blog helps you stay abreast of new data virtualization developments.
- How to Evaluate a Data Virtualization Platform and How Enterprises Measure Data Virtualization Platform Maturity provide coaching on how to evaluate data virtualization vendors.
- Ten Mistakes to Avoid When Virtualizing Data provides practical implementation guidance.
- And a number of published case studies provide models for data virtualization success.