By Roman Stanek
November 19, 2012 07:00 AM EST
Inventory levels. Sales results. Negative comments on Facebook. Positive comments on Twitter. Shopping on Amazon. Listening to Pandora. Online search habits. No matter what you call it or what the information describes, it’s all data being collected about you.
Thanks to new technologies like Hadoop, once-unquantifiable data (like Facebook conversations and Tweets) can now be quantified. Now, because nearly everything is measurable, everything is measured. The result: companies are spending big dollars to collect, store and measure astronomical amounts of data.
Show me the data!
There’s a name for this movement: Big Data. It’s more than a name; it has been the “it” term of 2012, possibly trumping “the cloud.”
IDC defines Big Data as projects collecting 100 terabytes of data (hence the name), comprising two or more data formats. Earlier this year, the research firm predicted the market for Big Data technology and services will reach $16.9 billion by 2015, from $3.2 billion in 2010. That’s an astounding 40 percent annual growth rate.
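The growth-rate claim checks out. A quick back-of-the-envelope calculation, taking 2010 and 2015 as the endpoints:

```python
# Compound annual growth rate (CAGR) implied by IDC's figures:
# $3.2 billion in 2010 growing to $16.9 billion in 2015 (5 years).
start, end, years = 3.2, 16.9, 5

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 40 percent per year
```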
The interesting thing is that IDC expects most of this spending to focus on infrastructure — the plumbing that enables companies to download, collect and store vast amounts of data.
To me, this is a missed opportunity. Why? We need to focus on unlocking the real business benefits from all this data.
Companies have not yet grasped the business potential of all the data pouring in from hundreds of sources: apps in the cloud, on-premise partner software and their own enterprise systems. In effect, businesses haven’t figured out how to make money from this fire hose of disparate data sources.
My point of view is that Big Data’s only real value lies in a business’s ability to transform data into insight it can act on.
This means enabling sales managers to quickly analyze sales reps’ results, view new contracts lost or signed, and react to how actual performance compares against the plan set months earlier. It means help-desk staff seeing how individual customers affect sales and profit, so they know when to go above and beyond to retain certain customers while allowing low-flyers to churn. And it means helping insurance agents predict the kinds and amounts of damage as hurricanes hurtle toward their region.
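The first scenario, comparing reps’ actual results against plan, is simple enough to sketch. The figures below are made up for illustration:

```python
# Hypothetical sales data: each rep's plan (target) vs. actual results.
plan = {"Alice": 120_000, "Bob": 90_000, "Carol": 150_000}
actual = {"Alice": 135_000, "Bob": 70_000, "Carol": 150_000}

# For each rep, compute the variance against plan and flag the status.
for rep, target in plan.items():
    delta = actual[rep] - target
    pct = delta / target
    status = "ahead of" if delta > 0 else "behind" if delta < 0 else "on"
    print(f"{rep}: {actual[rep]:,} vs. plan {target:,} ({pct:+.1%}, {status} plan)")
```

The point is not the code but the turnaround: with the data in one place, this comparison runs the moment a deal closes, not months later.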
Steps to Monetize Big Data
To glean value from Big Data efforts, companies need to embrace the real-time value provided by the cloud. Viewing one’s data in real time through the lens of cloud computing enables anyone, in any company, to make smart business decisions from the mammoth amounts of data pouring in from everywhere.
Therefore, companies looking to monetize Big Data need to take these steps:
Use the cloud: These days businesses can tap into an enormous range of cloud services. They can subscribe to high-performance infrastructure services like Amazon Web Services, rent platforms as a service (comprising hardware, operating systems, storage and network capacity) from salesforce.com, store information in services like Box or automate billings with companies like Zuora. These are just examples.
Companies can also pick and choose from a long list of cloud-based apps to handle business tasks, from customer relationship management and marketing to human resources and financial management. In fact, I would argue that cloud services will become the business application suite, eventually displacing behemoth on-premise packages from SAP or Oracle. Emphasis on “eventually,” since few enterprises are ready to jettison their million-dollar investments in Oracle and SAP.
For this reason, I advise companies to:
Start with what’s important: Don’t try to tackle every data source at once. Data today spews in from hundreds of sources, be it sales and customer data from salesforce.com, inventory levels from SAP, logistics information from your suppliers or employee data from Oracle. Companies run into trouble when they start off trying to boil the ocean, which is why I suggest they begin with a few sources and build up from there.
Fortunately, there is a way, thanks to a new generation of application programming interfaces (APIs) that allows more kinds of software, from different software makers, to communicate with each other, regardless of location. As a result, any company, regardless of size, can access the data it needs to make better decisions.
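In practice, combining two sources often comes down to joining records on a shared key. A minimal sketch, using hard-coded sample records where real code would call the two hypothetical APIs (e.g. a CRM and an ERP endpoint) and parse their JSON responses:

```python
# Sample records standing in for API responses; in real code these
# would come from something like requests.get(url).json() against
# your CRM and ERP endpoints (both endpoints hypothetical here).
crm_records = [  # e.g. sales data from a salesforce.com-style API
    {"customer_id": 1, "name": "Acme", "open_deals": 3},
    {"customer_id": 2, "name": "Globex", "open_deals": 1},
]
erp_records = [  # e.g. inventory data from an SAP-style API
    {"customer_id": 1, "backordered_items": 0},
    {"customer_id": 2, "backordered_items": 7},
]

def join_by_customer(crm, erp):
    """Merge the two sources on customer_id into one view per customer."""
    erp_by_id = {r["customer_id"]: r for r in erp}
    return [{**c, **erp_by_id.get(c["customer_id"], {})} for c in crm]

for row in join_by_customer(crm_records, erp_records):
    print(row)
```

Start with two sources like this, prove the combined view is useful, then add the next source.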
Which is why my next point is:
Make Big Data insight democratic: Five years ago, only executives at very large companies had access to business intelligence tools that culled patterns from data.
The cloud makes everything democratic — not just access to the data itself, but the insight as well, including best practices that don’t require the expertise of a SQL or MapReduce programmer. The cloud enables anyone, anywhere, to recognize patterns in data and make smart decisions, faster. And that means any business professional, at any company, should be able to monetize their Big Data.
When Big Data finally becomes useful to the rest of us, and not just IT wizards, it will take on an even larger role in business in the years ahead.