From Efficiency to Effectiveness: The Role of Data

The excitement about Big Data is really about being able to take advantage of the data in which we are all awash

Efficiency may be the most commonly used term in enterprise software marketing - that or "ensure." And not without reason - efficiency is one of the key value propositions of most enterprise software, from collaboration tools, to productivity tools, to integration tools and beyond. At a certain point, though, the gains to be had from efficiency become smaller and smaller, and of diminishing business significance.

This is driving a shift in focus from efficiency to effectiveness. At times these goals go hand in hand, but in many cases they do not - the most effective allocation of resources may not be the most efficient, at least in the short term. Managing an organization with an eye toward effectiveness can be a challenge, because business metrics are often tied to processes and other "discrete" pieces of work, and to how quickly and efficiently they are completed. As a result, when an organization makes the shift to managing for effectiveness rather than efficiency, the metrics used to evaluate success typically have to be "leveled up," that is, taken up to the level that really matters to the business. An example of this leveling up occurred several years back when customer service organizations changed their focus from shortening call times to increasing the rate of first call resolution. Resolving a customer issue on the first call may lengthen that call, but over the long term it is the more effective approach, because it may reduce the overall expenditure of the Customer Service Representatives' (CSRs') aggregated time, and it will certainly result in more satisfied customers.
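To make the contrast concrete, here is a minimal sketch that computes both metrics over the same set of call records. The field names and sample data are invented for this illustration: the efficiency metric (average handle time) rewards short calls, while the effectiveness metric (first-call resolution rate) rewards calls that actually close the issue.

```python
# Hypothetical call records; "customer_id", "minutes" and "resolved"
# are invented field names for this sketch.
calls = [
    {"customer_id": "C1", "minutes": 3, "resolved": False},
    {"customer_id": "C1", "minutes": 3, "resolved": False},
    {"customer_id": "C1", "minutes": 4, "resolved": True},
    {"customer_id": "C2", "minutes": 9, "resolved": True},
]

# Efficiency metric: average handle time per call (shorter looks "better").
avg_handle_time = sum(c["minutes"] for c in calls) / len(calls)

# Effectiveness metric: share of customers whose issue was resolved
# on their first call (records assumed to be in chronological order).
first_calls = {}
for c in calls:
    first_calls.setdefault(c["customer_id"], c)  # keep only the first call seen
fcr_rate = sum(c["resolved"] for c in first_calls.values()) / len(first_calls)

print(f"Average handle time: {avg_handle_time:.1f} min")
print(f"First-call resolution rate: {fcr_rate:.0%}")
```

On this toy data the average handle time looks respectable, while the first-call resolution rate of 50% tells the story the business actually cares about.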

Operationalizing this "leveling up" is not an easy task, and most of the greatest challenges in doing so relate to data. First, organizations must recognize that their current efficiency-based metrics are not serving them well, and the only way to know that is to capture the data that makes the point. In the CSR example above, that means being able to tell that a customer has called multiple times. But because of the way calls are typically handled - a new case is created for each one - the data doesn't tell the story of a customer calling multiple times and taking up the time of many different CSRs; instead, it tells of ten individual calls, each lasting three minutes. The problem is actually more complex than this, because more often than not the customer will also try to resolve the issue by contacting the organization through multiple channels - phone, Web, email, chat. Because the data is so fragmented, organizations typically find out about such broken practices through a series of irate letters and phone calls or, in the worst case, through a drop-off in customers. Whatever the means of notification, at some point it becomes clear to the organization that it has not only a problem of misaligned incentives, but also a data problem. It then turns to the data to understand what has been going on and how to manage more effectively.
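Piecing that fragmented story back together is largely a grouping problem. The sketch below assumes that contact records from the phone, Web, email and chat systems have already been pulled into one list with an invented common schema; grouping them by customer rather than by case is what makes the repeat contacts visible.

```python
from collections import defaultdict

# Hypothetical contact records pulled from several channel systems;
# the schema (customer_id, channel, case_id) is assumed for this sketch.
contacts = [
    {"customer_id": "C1", "channel": "phone", "case_id": "P-101"},
    {"customer_id": "C1", "channel": "email", "case_id": "E-337"},
    {"customer_id": "C1", "channel": "chat",  "case_id": "H-042"},
    {"customer_id": "C2", "channel": "phone", "case_id": "P-102"},
]

# Group by customer instead of by case, so repeat contacts become visible.
by_customer = defaultdict(list)
for c in contacts:
    by_customer[c["customer_id"]].append(c)

for customer, recs in by_customer.items():
    channels = {r["channel"] for r in recs}
    if len(recs) > 1:
        print(f"{customer}: {len(recs)} contacts across {sorted(channels)}")
```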

The story likely can be pieced together from the data, but the organization must still make sure it is asking the right questions - if "average call time" is not the right metric, what is? Once the right questions have been identified, it's time to turn to the data. Because in most organizations the data being captured was not set up with these higher-level goals in mind, getting the right answer from it requires some work. The data across these various systems must be integrated and federated - all of the necessary data must be extracted from the systems inside and outside the organization and loosely coupled so that, together, it tells the whole story. It also requires cleansing and rationalizing the data so that records about the same thing captured in different systems stay in sync.
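The cleansing and rationalization step often comes down to deciding when two records from different systems describe the same thing. Below is a minimal sketch of that matching step, using a normalized email address as the (assumed) join key; real master data management tools apply far richer matching and survivorship rules, so treat this only as an illustration of the idea.

```python
# Hypothetical customer records exported from two systems; the field
# names and the email-based match key are assumptions for this sketch.
crm_records = [
    {"name": "Pat Lee", "email": "Pat.Lee@Example.com", "phone": None},
]
billing_records = [
    {"name": "P. Lee", "email": "pat.lee@example.com", "phone": "555-0100"},
]

def match_key(record):
    """Normalize the email so the same customer lines up across systems."""
    return record["email"].strip().lower()

# Federate: index one system by the match key, then merge the other into it,
# filling in missing values so the combined record tells the whole story.
merged = {match_key(r): dict(r) for r in crm_records}
for r in billing_records:
    target = merged.setdefault(match_key(r), {})
    for field, value in r.items():
        if value is not None and not target.get(field):
            target[field] = value

print(merged)  # one rationalized record per customer
```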

It may be that even after all of the data is rationalized and accessible, the crucial data needed to manage the business more effectively is not currently being captured. This is a relatively small problem: with practically everything digitized and virtualized, there is very likely a way to capture the data an organization seeks. A common scenario is that the data is being captured, but in an off-premise, cloud-based application or in a partner's application, or it may be embedded in the activities carried out on social networks. In all of these cases, new technology makes the data accessible and manageable - and with it, the answers to the real business questions of how to manage more effectively.
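When the missing data lives in an off-premise or partner application, it is usually reachable through that application's API. The sketch below assumes a hypothetical paginated REST endpoint and API token - the URL, parameters and response format are invented for this example - and a real integration would follow the vendor's documented API or use a purpose-built integration tool.

```python
import requests  # third-party HTTP library (pip install requests)

# Hypothetical endpoint and credentials -- placeholders for this sketch only.
BASE_URL = "https://partner.example.com/api/v1/cases"
HEADERS = {"Authorization": "Bearer <api-token>"}

def fetch_all_cases(page_size=100):
    """Pull every case record from the (assumed) paginated endpoint."""
    cases, page = [], 1
    while True:
        resp = requests.get(BASE_URL, headers=HEADERS,
                            params={"page": page, "per_page": page_size},
                            timeout=30)
        resp.raise_for_status()
        batch = resp.json()  # assumed to return a JSON list per page
        if not batch:
            break
        cases.extend(batch)
        page += 1
    return cases

# cases = fetch_all_cases()  # would then be merged with on-premise data
```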

Data integration tools make it possible to integrate and federate data from cloud-based applications with on-premise systems, and to incorporate data from third parties. The ability to use Hadoop MapReduce to take in and manage unprecedented volumes of data from social networks and other non-traditional sources makes it possible to truly have, manage and analyze all of your data. New social MDM (master data management) technology means that you can tap into the data embedded in interactions on social networks and use it to create an even more fully fleshed-out golden record for your customers.
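To give the MapReduce part a concrete shape, here is a minimal Hadoop Streaming sketch that counts social-network mentions per customer handle. The input format (one tab-separated record of handle and message text per line) and the script names are assumptions for this illustration, not a prescribed pipeline.

```python
#!/usr/bin/env python
# mapper.py -- reads raw social records from stdin, emits "handle<TAB>1" per mention.
import sys

for line in sys.stdin:
    parts = line.rstrip("\n").split("\t")
    if len(parts) < 2:
        continue                       # skip malformed records
    handle = parts[0].strip().lower()  # assumed: first field is the customer handle
    print(f"{handle}\t1")
```

```python
#!/usr/bin/env python
# reducer.py -- sums the counts for each handle (input arrives sorted by key).
import sys

current, total = None, 0
for line in sys.stdin:
    handle, count = line.rstrip("\n").split("\t")
    if handle != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = handle, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")
```

These would typically be submitted with the Hadoop Streaming jar, for example: hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper "python mapper.py" -reducer "python reducer.py" -input /social/mentions -output /social/mention_counts - with the jar location and HDFS paths depending on the installation.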

In truth, it is the gains we have made in efficiency - in finding ever-more efficient ways to access, store and analyze data - that make this turn toward effectiveness possible. Without being able to do all of the above in a time- and cost-efficient manner, it is not possible to use the data to manage more effectively.

In many ways, this is what the hype about Big Data is all about. The unarticulated, implicit excitement about Big Data is really about being able to take advantage of the data in which we are all awash and to use it to manage our organizations more effectively than ever before. Managing for effectiveness looks different in every industry. In retail, it means understanding customers - catering to them when, where, how and with what they want. In pharma, it means limiting physician washout, getting more clinical trial data more quickly, and being able to complete or pull the plug on trials faster based on that data. In every industry, managing for effectiveness means using the power of data to make the best business decisions possible - getting a true return on data.

More Stories By Emily Burns

Emily Burns is responsible for Platform Product Marketing at Informatica. In that capacity, she has two principal roles. First, she evangelizes the benefits to be achieved from managing data as a key corporate asset, especially using the Informatica Platform. Second, she works to identify and communicate best practices and methodologies that demonstrate how to manage data as a corporate asset.

Prior to Informatica, Emily worked at Pegasystems and at TIBCO. While at Pegasystems, she led their case management product initiative. At TIBCO she was responsible for product marketing for the BPM suite. Emily holds a BS with majors in biochemistry and music, with an emphasis on piano performance. She is an avid reader, cook, and triathlete. Emily lives in Boston with her husband and two young sons.
