By Robert Eve
July 5, 2011
In the life sciences industry, the latest blockbuster drug or device can mean billions in revenue.
But developing these new offerings often takes ten or more years.
Moving ahead a year or two can be worth hundreds of millions.
The question is how?
Data, Data, Data
Data is the lifeblood of new product development. Research findings, clinical trial results, manufacturing process validation, and more are information-intensive activities where each new data point can trigger a major shift in plans and timing.
To accelerate time to market in this data rich environment, life science R&D teams look to IT.
Unfortunately, IT is not always well equipped to meet this challenge. The problem is not a lack of data. The problem is a lack of data integration agility.
Most life sciences companies have made significant investments in their data. These investments, however, have also produced data silos and complexity that slow their ability to respond to new information requests. To overcome these silos, life sciences companies are seeking new ways to integrate their new product development data.
How Data Virtualization Helps
Data virtualization has been widely adopted by pharmaceutical companies and has recently seen increasing acceptance among medical device makers. Integrating product development data has been the primary use case, for several reasons:
- Gaining Timely Insight – Up-to-the-minute data is critical throughout every stage in the new product cycle. Data virtualization provides query optimization algorithms and techniques that deliver timely information whenever needed.
- Seeing the Complete Picture – Multiple types of data from multiple sources must be combined to provide researchers, analysts, and managers with the full picture that effective decision making requires. Data virtualization provides data federation that virtually integrates data from multiple sources in memory, without the cost and overhead of physical data consolidation in data warehouses.
- Controlling Data Proliferation – Identifying and understanding data assets distributed across a range of R&D repositories and locations requires significant manual effort. Data virtualization provides data discovery that saves time by automating entity and relationship identification and accelerating data modeling.
- Addressing Data Complexity – Incredible complexity challenges IT’s ability to leverage existing R&D data for new R&D questions. Data virtualization provides powerful data abstraction tools that simplify complex data, transforming it from native structures and syntax into easy-to-understand, reusable views and data services with common, business-friendly semantics.
- Improving Data Availability – With so many technologies, formats and standards, successfully surfacing R&D life cycle data consumes significant IT resources. Data virtualization supports numerous standards-based data access, caching and delivery options that allow IT to flexibly publish all the information that R&D users require.
- Providing Proper Data Controls – Data is a critical asset that must be governed, especially in life science R&D with its myriad compliance requirements. Data virtualization provides data governance that centralizes metadata management, ensures data security and improves data quality to meet these stringent control requirements.
- Keeping Pace with Non-Stop Change – Ever-changing research results, clinical trial findings, and compliance requirements make frequent change inevitable. Data virtualization provides a loosely coupled integration layer, rapid development tools, automated impact analysis and an extensible architecture, delivering the information agility required to keep pace.
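The federation and abstraction ideas above can be sketched in miniature with Python's standard sqlite3 module. This is an illustrative assumption, not any vendor's actual product or API: two hypothetical source systems (a clinical trials store and a research store, with made-up schemas) are attached to one connection, and a business-friendly view joins them in place, so consumers query one virtual table while the underlying data never moves into a warehouse.

```python
import sqlite3

# The "virtualization layer": one connection that federates two attached
# in-memory databases standing in for separate source systems.
# All table and column names below are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS trials")
conn.execute("ATTACH DATABASE ':memory:' AS research")

# Populate the two "source systems" with sample data.
conn.execute("CREATE TABLE trials.results (compound_id TEXT, phase INTEGER, outcome TEXT)")
conn.execute("CREATE TABLE research.compounds (compound_id TEXT, target TEXT)")
conn.execute("INSERT INTO trials.results VALUES ('CPD-001', 2, 'positive')")
conn.execute("INSERT INTO research.compounds VALUES ('CPD-001', 'JAK2')")

# A reusable, business-friendly view that joins the sources in place.
# (A TEMP view is used because SQLite only lets temporary views reference
# objects across attached databases.)
conn.execute("""
    CREATE TEMP VIEW compound_status AS
    SELECT c.compound_id, c.target, r.phase, r.outcome
    FROM research.compounds AS c
    JOIN trials.results AS r ON r.compound_id = c.compound_id
""")

# Consumers see one virtual table; no data was physically consolidated.
rows = conn.execute("SELECT * FROM compound_status").fetchall()
print(rows)  # [('CPD-001', 'JAK2', 2, 'positive')]
```

Real data virtualization platforms add distributed query optimization, caching, security, and connectivity to many source types on top of this basic pattern, but the core idea, a virtual view over data left in place, is the same.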
Pfizer Finds a Successful Formula
The R&D team at Pfizer was an early adopter of data virtualization and has realized a number of positive business benefits.
Their successes have been recognized in a number of recent articles including:
- Business Intelligence: How To Get Agile
- Pfizer's Michael Linhares Talks About Data Virtualization Software
- Pfizer's Prescription for Data
- SOA to the Rescue, When Drug Discovery Needs Data Fast!
- Rx for Data Woes
- Faster Data Integration Helps Pfizer Speed Decisions on Drugs
Fortunately, the path to successful data virtualization adoption is far shorter than the new drug or device development path. But as with R&D, integrating available data is the key.
Here are some resources to help you get started:
- The Data Virtualization Leadership Series is a great forum to gain data virtualization insights from top IT analysts.
- The Data Virtualization Leadership Blog helps you stay abreast of new data virtualization developments.
- How to Evaluate a Data Virtualization Platform and How Enterprises Measure Data Virtualization Platform Maturity provide coaching on how to evaluate data virtualization vendors.
- Ten Mistakes to Avoid When Virtualizing Data provides practical implementation guidance.
- And a number of published case studies provide models for data virtualization success.