By Nicos Vekiarides
January 3, 2012 05:30 AM EST
It’s time to make a few predictions for 2012 in the cloud data space. 2011 was a year of adoption, during which many companies started to leverage the cloud, enjoying the economies of scale, security and ease in managing their growing data needs. Those successes promise even greater cloud adoption in 2012. With that in mind, here are 10 predictions for hot trends to watch for in the cloud data space:
- Hybrid data storage environments combining cloud storage with existing storage. For most companies, moving all of their data to the cloud is simply not realistic. However, continuously expanding data storage needs are fueling a need for more capacity. What better way to address this need than with cloud storage? The benefits include access to a secure, virtually limitless pool of storage capacity, no future need for upgrades or replacements, and reduced capital expenses. Look for auto-tiering technologies that seamlessly combine cloud and on-premise environments in a way that works with existing applications.
- Private cloud environments in enterprise companies. Enterprises looking to leverage the economies, efficiencies and scale of cloud providers are adopting cloud models in-house, such as OpenStack, for both compute and storage environments. These private clouds offer scale, agility and price/performance typically unmatched by traditional infrastructure solutions and can reside inside a company’s firewall. In the storage space, look for technologies that can combine existing SAN infrastructure and private cloud storage into a unified Cloud SAN.
- Disaster recovery to the cloud as a viable option. Traditionally, companies that need disaster recovery (DR) and business continuity (BC) have relied on dedicated replicated infrastructure at an offsite location to be able to recover from physical disaster. This means paying for idle hardware that sits waiting for a disaster. DR in the cloud, on the other hand, means not having to pay for this infrastructure except when it is needed. The tradeoff? Cloud DR is not necessarily a zero-downtime solution, but look for offerings with recovery time objectives (RTOs) measured in hours.
- Disaster recovery from the cloud as a new need. What happens to business data stored by a SaaS application in the event of a disaster? The truth is that most SaaS providers do have a DR strategy, but many businesses will demand a recovery strategy under their own control. Look for the emergence of solutions that back up SaaS data either locally or to an alternate provider as an extra level of protection.
- Simplified onboarding of applications to the cloud. Certain business applications can move entirely to the cloud, thereby saving the administrative and maintenance headaches of their hardware/software platforms onsite. Many IT-strapped businesses can benefit from tools to make this migration viable. Look for robust tool sets that can migrate applications to a choice of cloud providers – and also bring those applications back on-premise should the need arise.
- Non-relational databases for big data. NoSQL databases, like Apache CouchDB, enable tremendous scalability in order to meet the needs of terabytes and petabytes of data accessed by millions of users. Big data will force many companies to consider these alternatives to traditional relational databases, and cloud deployment models will simplify the roll-out. Look for vendors providing supported NoSQL solutions.
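One reason document stores in the NoSQL family scale horizontally is that documents can be partitioned across nodes by a hash of their key rather than joined relationally. Here is a minimal sketch of that idea in Python; the four-node cluster and the `user:42` key are illustrative assumptions, not how CouchDB's actual clustering works:

```python
import hashlib

NUM_NODES = 4  # hypothetical cluster size

def node_for(doc_id: str) -> int:
    """Map a document ID to a node via a stable hash, so the
    same ID always lands on the same node."""
    digest = hashlib.sha256(doc_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_NODES

# Each "node" is just a dict here; in a real cluster these
# would be separate servers.
nodes = [dict() for _ in range(NUM_NODES)]

def put(doc_id: str, doc: dict) -> None:
    nodes[node_for(doc_id)][doc_id] = doc

def get(doc_id: str) -> dict:
    return nodes[node_for(doc_id)][doc_id]

put("user:42", {"name": "Ada", "plan": "enterprise"})
print(get("user:42"))
```

Because placement depends only on the key, reads and writes for different documents spread across nodes with no central coordinator. Note that simple modulo placement forces mass rehashing when nodes are added; production systems use consistent hashing for that reason.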
- Use of the cloud for analytics. Analytics tend to require a scalable compute and storage environment as well as rather expensive software. Similar to idle hardware for disaster recovery purposes, analytics for many businesses may represent a seasonal need that only runs in short bursts and may not justify purchasing a dedicated software/hardware environment. Analytic environments in the cloud can turn the expense into a “pay-per-use” bill, meeting business goals at a far lower price point.
- SSD tiers of storage in the cloud. Moving higher-performance applications into the cloud doesn’t guarantee that they will get the level of performance they need from their data storage. By offering high-performance tiers of storage that are SSD-based (i.e., flash), cloud providers will be able to address the need for predictable and faster application response times.
- Improvements in data reduction technology. With cloud storage commanding a per-GB operating expense, deduplication and compression technologies have become rather ubiquitous in minimizing costs. While some may argue that the capacity optimization game has played out, two challenges remain: optimizing capacity on a more global scale across multiple tenants, and handling rich media content, which does not fare particularly well with today’s reduction technologies. Look for the introduction of new data reduction technologies that address both needs.
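Block-level deduplication is conceptually simple: split data into chunks, hash each chunk, and store only the unique ones plus a recipe for reassembly. A toy sketch in Python follows; the fixed 4 KB chunk size is an illustrative assumption, and production systems typically use variable-size, content-defined chunking instead:

```python
import hashlib

CHUNK_SIZE = 4096  # illustrative fixed chunk size, in bytes

def dedup_store(data: bytes) -> tuple[dict, list]:
    """Split data into fixed-size chunks and keep one copy of
    each unique chunk. Returns (chunk_store, recipe), where the
    recipe is the ordered list of hashes needed to reassemble."""
    store, recipe = {}, []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)  # duplicate chunks stored once
        recipe.append(h)
    return store, recipe

def reassemble(store: dict, recipe: list) -> bytes:
    return b"".join(store[h] for h in recipe)

# Highly repetitive data dedupes extremely well:
data = b"A" * CHUNK_SIZE * 100
store, recipe = dedup_store(data)
assert reassemble(store, recipe) == data
assert len(store) == 1  # 100 identical chunks stored once
```

This also illustrates why rich media fares poorly: compressed photo and video bytes rarely repeat, so nearly every chunk hashes to a unique value and little or nothing is saved.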
- “Cloud-envy” from cloud laggards. While many companies have already adopted the cloud and many more will adopt in 2012, others may still wait and ponder well past 2012. Regardless of which category a company falls into, the economics and efficiencies of the cloud have become irrefutable. As a result, some of the laggards will likely seek ways to leverage cloud methodologies that improve IT efficiency on-premise. Undoubtedly, some will fall prey to cloudwashing by purchasing traditional IT infrastructure named “cloud” in an attempt to satisfy their “cloud-envy.”
Bottom line? Cloud deployments are becoming simpler and more secure and the economics continue to improve. Which of these trends will your business follow in 2012?