Why Energy Companies Like Data Virtualization

Integrated data powers business success

Discovering new upstream sources, delivering products smoothly through downstream distribution channels, and complying with extensive regulations are keys to success in the energy business. Information is the fuel that enables these successes.

Unfortunately, prior IT investments have resulted in numerous data silos and significant complexity, making it harder than ever to turn information into business success.

Four of the top five global energy companies rely on data virtualization to provide the diverse information required across a range of strategic initiatives and business-critical IT projects.

Here is how.

Data Virtualization at Work in Energy Companies
Data virtualization use by energy companies is extensive. Here are but a few of the use cases these innovative IT organizations have deployed:

  • Well Maintenance and Repair - Keeping wells up and pumping drives revenue. When wells go down, getting the right repair rigs and teams on site fast is critical. Allocating these scarce resources optimally requires that dispatchers and triage teams have real-time access to repair rig status, staffing availability, best-practice procedures, maintenance records, flow rates, and more. Data virtualization accesses and combines this diverse data so the oil and gas keep flowing; a minimal sketch of such a federated view follows this list.
  • Regulatory Reporting - The energy industry is one of the most highly regulated industries today. EPA, OSHA, DOT, and many other federal and state agencies require hundreds of compliance reports. Because internal systems have been optimized for operations, not compliance, integrating the data needed from across those systems is often the biggest component of compliance reporting costs. Data virtualization quickly and easily federates diverse data from across operational systems while leaving operating data in place, avoiding the extra costs that result from unnecessary data replication.
  • Research and Development - Effective R&D pipeline management is the key to expanding energy sources, enhancing recovery and yield, and reducing costs in the energy business. Currently, most research project information resides in multiple systems. Managers have to access these separate systems to extract, compile, and assemble reports from multiple laboratory operations in order to keep up. Data virtualization offers easier access and visibility across the entire R&D process.
  • Human Resource Management - The shortage of skilled petroleum engineers, roustabouts, geologists, and others continues to worsen. Attracting, developing, and retaining a highly skilled professional workforce has become a top priority. Information is critical here, including data on skills development, legal and regulatory mandates, and demographic shifts in workforce composition. Data virtualization easily integrates all this data to get the most from the workforce.
  • Sales Management - Sales managers need up-to-the-minute information at their fingertips to increase revenue and drive sales productivity. The costs associated with lost opportunities or problems left unresolved are significant. Every day, managers ask questions such as: Am I on track to make my numbers this month? What are the hot products? How is my customer satisfaction? Where can I find additional sales? Although this information resides in multiple silos, managers need it no matter where it lives. Data virtualization can bring that information to sales leaders so they can hit their revenue objectives.
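
For illustration, here is a minimal sketch of the kind of federated "virtual view" described above, in Python, with two in-memory SQLite databases standing in for live operational systems. All table, column, and identifier names are invented for the example; a real deployment would connect to production sources through a data virtualization platform rather than SQLite.

```python
import sqlite3
import pandas as pd

# Hypothetical stand-ins for two operational systems: a rig-status
# database and a separate crew-staffing system.
rigs = sqlite3.connect(":memory:")
rigs.execute("CREATE TABLE rig_status (rig_id TEXT, status TEXT, site TEXT)")
rigs.executemany(
    "INSERT INTO rig_status VALUES (?, ?, ?)",
    [("R-101", "available", "Midland"), ("R-102", "in_service", "Odessa")],
)

staffing = sqlite3.connect(":memory:")
staffing.execute("CREATE TABLE crews (crew_id TEXT, skill TEXT, site TEXT)")
staffing.executemany(
    "INSERT INTO crews VALUES (?, ?, ?)",
    [("C-7", "workover", "Midland"), ("C-9", "wireline", "Odessa")],
)

def dispatch_view():
    """A federated 'virtual view': each source is queried where it lives,
    at request time, and the results are joined on the fly. Nothing is
    replicated into a separate reporting store."""
    available = pd.read_sql(
        "SELECT rig_id, site FROM rig_status WHERE status = 'available'", rigs
    )
    crews = pd.read_sql("SELECT crew_id, skill, site FROM crews", staffing)
    return available.merge(crews, on="site")

print(dispatch_view())  # pairs each available rig with crews at the same site
```

Because the view is a query rather than a copy of the data, dispatchers always see current rig and crew status, and adding another source is a matter of extending the join rather than building a new pipeline.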

Data Virtualization Pays Off!
Data virtualization is a more agile, lower-cost data integration approach that successfully addresses complex upstream and downstream data silos and delivers significant business benefits, including:

  • Increase Revenues - Maximize output from wells and refineries
  • Improve Productivity - Ensure engineers, analysts, and managers have all the information they require
  • Reduce Costs - Avoid long data integration development cycles and excess data replication; the sketch after this list contrasts a replicated snapshot with a virtualized query
  • Decrease Risk - Improve visibility across upstream, downstream and back-office operations
  • Ensure Compliance - Meet DOE, EPA, DOT, OSHA and EU compliance data requirements faster, for less
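
To make the replication contrast concrete, here is a small sketch, again in Python with an in-memory SQLite source and invented names, comparing a replicated snapshot of the kind a batch copy process would produce with a virtualized query that runs against the source at report time.

```python
import sqlite3
import pandas as pd

# Hypothetical operational source holding compliance-relevant data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE emissions (facility TEXT, co2_tons REAL)")
src.execute("INSERT INTO emissions VALUES ('Plant A', 120.0)")

# Replication approach: copy the data once into a reporting store.
snapshot = pd.read_sql("SELECT * FROM emissions", src)

# Virtualized approach: re-query the source each time the report runs.
def emissions_report():
    return pd.read_sql("SELECT * FROM emissions", src)

# The operational data changes after the copy was taken...
src.execute("UPDATE emissions SET co2_tons = 135.0 WHERE facility = 'Plant A'")

print(snapshot)            # replicated copy is now stale: still shows 120.0
print(emissions_report())  # virtual query reflects the current value: 135.0
```

The replicated copy must be refreshed, stored, and reconciled on an ongoing basis, while the virtual query simply reads the source of record; that difference is where the cost and risk savings come from.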

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
