
Big Data Conundrum: Show Me the Money!

To glean value from Big Data efforts, companies need to embrace the real-time value provided by the cloud

Inventory levels. Sales results. Negative comments on Facebook. Positive comments on Twitter. Shopping on Amazon. Listening to Pandora. Online search habits. No matter what you call it or what the information describes, it’s all data being collected about you.

Thanks to new technologies like Hadoop, once-unquantifiable data (like Facebook conversations and Tweets) can now be quantified. Because nearly everything is measurable, everything is measured. The result: companies are spending big dollars to collect, store and measure astronomical amounts of data.
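
To make that concrete, here is a minimal, purely illustrative Python sketch of the map/reduce pattern that Hadoop popularized, counting positive and negative keyword mentions in a few made-up social posts. The posts and keyword lists are hypothetical, and a real Hadoop job would distribute the map and reduce steps across a cluster rather than run them over an in-memory list.

from collections import Counter

# Hypothetical social posts; a real Hadoop job would read shards of a huge
# corpus from HDFS instead of a small in-memory list.
posts = [
    "Loving the new release, great job!",
    "Terrible support experience, very disappointed.",
    "Great value, would buy again.",
]

POSITIVE = {"loving", "great", "love"}
NEGATIVE = {"terrible", "disappointed", "awful"}

def map_post(post):
    """Map step: emit (sentiment_label, 1) for each keyword found in a post."""
    for word in post.lower().split():
        word = word.strip(".,!?")
        if word in POSITIVE:
            yield ("positive", 1)
        elif word in NEGATIVE:
            yield ("negative", 1)

def reduce_counts(pairs):
    """Reduce step: sum the emitted counts per sentiment label."""
    totals = Counter()
    for label, count in pairs:
        totals[label] += count
    return totals

pairs = (pair for post in posts for pair in map_post(post))
print(reduce_counts(pairs))  # Counter({'positive': 3, 'negative': 2})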

Show me the data!

There’s a name for this movement: Big Data. More than just a name, it has been the “it” topic of 2012, possibly trumping “the cloud.”

IDC defines Big Data as projects collecting 100 terabytes of data (hence the name), comprising two or more data formats. Earlier this year, the research firm predicted the market for Big Data technology and services will reach $16.9 billion by 2015, from $3.2 billion in 2010. That’s an astounding 40 percent annual growth rate.

The interesting thing is that IDC expects most of this spending to focus on infrastructure — the plumbing that enables companies to download, collect and store vast amounts of data.

To me, this is a missed opportunity. Why? We need to focus on unlocking the real business benefits from all this data.

Companies have not yet grasped the business potential of all the data pouring in from hundreds of sources—think cloud apps, on-premise partner software and their own enterprise systems. In effect, businesses haven’t figured out how to make money from this fire hose of disparate data sources.

My point of view is that Big Data’s only real value lies in businesses’ ability to transform data into insight they can act on.

This means enabling sales managers to quickly analyze sales reps’ results, view new contracts lost or signed, and react to how actual performance compares against the plan they set months earlier. It means showing help-desk staff how individual customers affect sales and profit, so they know when to go above and beyond to retain certain customers while allowing low-flyers to churn. And it means helping insurance agents predict the kinds and amounts of damage as hurricanes hurtle toward their region.
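
As a purely illustrative sketch (the reps and figures below are invented), the actual-versus-plan comparison for sales managers boils down to something this simple once the data is accessible in one place:

# Hypothetical quota and booking numbers; in practice these would come from
# a CRM or planning system rather than hard-coded dictionaries.
plan = {"Alice": 120_000, "Bob": 90_000, "Carol": 150_000}
actual = {"Alice": 134_500, "Bob": 71_200, "Carol": 151_000}

for rep, target in plan.items():
    booked = actual.get(rep, 0)
    variance = booked - target
    status = "ahead of" if variance >= 0 else "behind"
    print(f"{rep}: booked {booked:,} vs plan {target:,} ({status} plan by {abs(variance):,})")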

Steps to Monetize Big Data
To glean value from Big Data efforts, companies need to embrace the real-time value provided by the cloud. Viewing one’s data in real time through the lens of cloud computing enables anyone, in any company, to make smart business decisions from the mammoth amounts of data coming in from everywhere.

Therefore, companies looking to monetize Big Data need to take these steps:

Use the cloud: These days, businesses can tap into an enormous range of cloud services. They can subscribe to high-performance infrastructure services like Amazon Web Services, rent platforms as a service (comprising hardware, operating systems, storage and network capacity) from salesforce.com, store information in services like Box or automate billing with companies like Zuora. These are just examples.

Companies can also pick and choose from a long list of cloud-based apps to handle business tasks, from customer relationship management and marketing to human resources and financial management. In fact, I would argue that cloud services will become the business application suite, eventually displacing behemoth on-premise packages from SAP or Oracle. Emphasis on “eventually,” since few enterprises are ready to jettison their million-dollar investments in Oracle and SAP.

For this reason, I advise companies to:

Start with what’s important: Don’t try to tie together every separate data source on day one. Data today spews in from hundreds of sources, be it sales and customer data from salesforce.com, inventory levels from SAP, logistics information from your suppliers or employee data from Oracle. Companies run into trouble when they start off boiling the ocean, which is why I suggest beginning with a few sources and then building up from there.

Fortunately, there is a way to connect them, thanks to a new generation of application programming interfaces (APIs) that allows software from different makers to communicate with each other, regardless of location. As a result, any company, regardless of size, can access the data it needs to make better decisions.
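
As an illustration only, the sketch below pulls records from two hypothetical REST endpoints and joins them by product ID. The URLs, tokens and field names are invented, and real salesforce.com or SAP APIs have their own authentication schemes and payload formats.

import requests  # third-party HTTP library, assumed installed

# Hypothetical endpoints standing in for a CRM and an ERP system.
CRM_URL = "https://crm.example.com/api/opportunities"
ERP_URL = "https://erp.example.com/api/inventory"

def fetch_json(url, token):
    """Pull JSON from a cloud API using a bearer token."""
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def join_by_product(opportunities, inventory):
    """Attach on-hand stock from the ERP feed to each CRM opportunity."""
    stock = {item["product_id"]: item["on_hand"] for item in inventory}
    return [{**opp, "on_hand": stock.get(opp["product_id"], 0)} for opp in opportunities]

if __name__ == "__main__":
    opportunities = fetch_json(CRM_URL, token="CRM_API_TOKEN")
    inventory = fetch_json(ERP_URL, token="ERP_API_TOKEN")
    for row in join_by_product(opportunities, inventory):
        print(row["product_id"], row.get("amount"), row["on_hand"])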

Which is why my next point is:

Make Big Data insight democratic: Five years ago, only executives at very large companies had access to business intelligence tools that culled patterns from data.

The cloud makes everything democratic — not just access to the data itself, but the insight as well, including best practices that don’t require the expertise of a SQL or MapReduce programmer. The cloud enables anyone, anywhere, to recognize patterns in data and make smart decisions, faster. And that means any business professional, at any company, should be able to monetize their Big Data.

When Big Data finally becomes useful to the rest of us, and not just to IT wizards, it will take on an even larger role, today and into the future.


More Stories By Roman Stanek

Roman Stanek is a technology visionary who has spent the past fifteen years building world-class technology companies. Currently Founder & CEO of Good Data, which provides collaborative analytics on demand, he previously co-founded NetBeans, now part of Sun Microsystems and one of the leading Java IDEs, and then Systinet, now owned by Hewlett-Packard and the leading SOA governance platform on the market.
