Challenges in Virtualization

Companies looking at virtualization solutions need storage solutions that are flexible

By Sue Poremba

Virtualization has been a boon to enterprises because it makes IT operations more efficient. Some like its green qualities, since virtualization cuts energy consumption; others appreciate the storage capacity, as well as the data recovery options should disaster strike.

However, the virtual environment is invisible, and with that come more challenges in keeping it running smoothly. The cloud may be simple to set up, but it grows more complex over time. And the more machines and data involved, the harder it becomes to monitor disk space, CPU spikes, network security and other indicators.
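That kind of monitoring doesn't have to start with a heavyweight tool. As a minimal sketch of the idea, the check below watches one of the indicators mentioned above, free datastore space, using only the Python standard library; the 15% threshold and the path are illustrative placeholders, not recommendations:

```python
import shutil

def check_capacity(path: str, min_free_fraction: float = 0.15) -> bool:
    """Return True if the filesystem holding `path` still has enough free space.

    `min_free_fraction` is an illustrative threshold; tune it to your own
    environment and growth rate.
    """
    usage = shutil.disk_usage(path)
    return usage.free / usage.total >= min_free_fraction

# Example: flag a datastore that drops below 15% free space.
if not check_capacity("/"):
    print("WARNING: datastore running low on space")
```

A real deployment would run a check like this on a schedule and alert a human well before the customer notices, which is exactly the point Caldwell makes below.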

“If there is a bug or a discrepancy, I need to know that there’s a problem before my customer does. And though that is the biggest challenge, it’s also a great opportunity,” said Russ Caldwell, CTO of Emcien Corporation.

One of those challenges is making sure storage in the virtualized environment is adequate. “We focus on storage and database environments that scale as the customers grow,” said Caldwell. “Determining how fast customers grow and change is the biggest factor for determining the adequate storage size.”

Companies looking at virtualization solutions need storage solutions that are flexible, so they can add or remove storage as needed. What was the right size at the beginning of a project rarely stays that way, and a flexible virtualization tool provides peace of mind when things change. For example, Caldwell noted, with slow-moving manufacturing data it is easier to determine the adequate storage size than with hundreds of millions of bank nodes, where the growth is much more dramatic.
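The difference between those two growth profiles is easy to make concrete with a little compound-growth arithmetic. The rates and starting sizes below are made-up placeholders; in practice you would derive them from your own usage history:

```python
def projected_storage_gb(current_gb: float, monthly_growth_rate: float,
                         months: int) -> float:
    """Project future storage need under compound monthly growth."""
    return current_gb * (1 + monthly_growth_rate) ** months

# Slow-moving manufacturing data: ~1% growth per month over a year.
slow = projected_storage_gb(500, 0.01, 12)   # roughly 563 GB
# Fast-growing graph data: ~15% growth per month over a year.
fast = projected_storage_gb(500, 0.15, 12)   # roughly 2,675 GB
```

Starting from the same 500 GB, the fast-growing workload needs more than five times as much capacity after a year, which is why flexible, scale-as-you-grow storage matters more than getting the initial size exactly right.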

The key, according to John Ross with virtual solution company Phantom Business Development at Net Optics, is to truly assess the performance of the servers and the requirements of the virtual machines. This requires monitoring to be in place for the life of the systems to predict utilization and to modify placement based on performance. “When this is not accounted for, it can appear as though there is high CPU utilization on the hosts as well as the VMs,” said Ross. “With the use of protocols such as NFS and iSCSI, it can put quite a load on the network.”

Companies moving to the cloud also have to change how they think about networking. “It can be hard to understand how network connection works when there aren’t wires to simply plug into a box, but instead virtual, invisible connections that need to be managed through APIs or online interfaces,” said Caldwell. One of the challenges for a company with multiple clients is keeping client data separate. Grouping machines together and isolating them in their own network is the best approach to tackling this challenge, and good monitoring tools, used smartly, can help ensure that the network is as reliable as possible.
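At its core, that per-client isolation is a mapping from tenants to their own network segments. Here is a toy model of that grouping; the VM and tenant names are invented, and in practice the mapping would be enforced through your provider's API (VLANs, security groups and the like), not a Python dictionary:

```python
from collections import defaultdict

def group_by_tenant(vms: list[dict]) -> dict[str, list[str]]:
    """Group VMs into per-tenant segments (a toy model of network isolation)."""
    segments: dict[str, list[str]] = defaultdict(list)
    for vm in vms:
        segments[vm["tenant"]].append(vm["name"])
    return dict(segments)

fleet = [
    {"name": "web-1", "tenant": "acme"},
    {"name": "db-1",  "tenant": "acme"},
    {"name": "web-1", "tenant": "globex"},
]
print(group_by_tenant(fleet))
# {'acme': ['web-1', 'db-1'], 'globex': ['web-1']}
```

The value of making the grouping explicit is that it can be audited: any VM that lands in the wrong segment is immediately visible.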

“Network connectivity comes down to whether the network connection is a single point of failure: If your virtualization solution is off-site, it’s only as good as the quality of the Internet connection between you and your provider,” said William L. Horvath with DoX Systems. A single connection between you and the Internet is one such problem. (You can reduce the risk by contracting with two or more ISPs and using routers that support trunking.) Likewise, if your virtualization provider’s facility sits in a single geographic location (say, Manhattan) that loses functionality for an extended period due to a natural disaster, you’re hosed. Horvath recalled that his local Chamber of Commerce lost access to a cloud-based service not long ago because someone in the data center, which wasn’t owned by the service provider, forgot to disable the fire suppression system during emergency testing, destroying most of the hard drives in the servers.
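The single-point-of-failure test Horvath describes reduces to a simple count: how many independent, healthy paths do you have? A minimal sketch, with invented uplink names standing in for real ISP circuits:

```python
def has_single_point_of_failure(links: dict[str, bool]) -> bool:
    """True if at most one configured uplink is currently usable.

    `links` maps uplink names to reachability. With two or more healthy,
    independent uplinks, losing any one need not cut you off.
    """
    healthy = sum(1 for up in links.values() if up)
    return healthy <= 1

print(has_single_point_of_failure({"isp-a": True}))                 # True
print(has_single_point_of_failure({"isp-a": True, "isp-b": True}))  # False
print(has_single_point_of_failure({"isp-a": False, "isp-b": True})) # True
```

Note the third case: two contracted ISPs still leave you one outage away from being offline if one circuit is already down, which is why the monitoring has to be continuous, not a one-time audit.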

To avoid the challenges involved in virtualization, Ross provided the following tips:

1. Plan on virtualizing everything — not just the servers but the network, the storage, the security … everything!

2. Standardize everything, from the operating systems on upwards through middleware and applications. The more uniformity exists within configurations, the easier it will be to scale and move these workloads optimally around the environment.

3. Ensure network capabilities are met. Network demands will change dynamically as workloads are consolidated, and there will be huge shifts in traffic flows as virtualization and cloud are adopted.

4. Implement resource monitoring. Existing legacy tools will not provide the data or detail needed.

5. Implement a decommissioning process. Ross repeatedly finds several unused machines running. In a virtual environment, this can become a major issue, consuming resources and driving up costs.

6. Plan for backup and disaster recovery. This will drastically change in virtualization and must be addressed.

7. Train your team for the ongoing management of the virtualized environment, not just the migration.
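Tip 5, the decommissioning process, is the one Ross sees skipped most often, and it lends itself to automation. As a hedged sketch (the VM names, dates and 30-day idle threshold are all illustrative), a periodic job could flag inventory with no recorded activity for review:

```python
from datetime import datetime, timedelta

def stale_vms(last_seen: dict[str, datetime], now: datetime,
              idle_days: int = 30) -> list[str]:
    """Return VMs with no recorded activity for `idle_days` or more.

    These are candidates for review and decommissioning, not automatic
    deletion; the threshold is an example, not a rule.
    """
    cutoff = now - timedelta(days=idle_days)
    return sorted(name for name, ts in last_seen.items() if ts < cutoff)

now = datetime(2016, 3, 1)
inventory = {
    "build-agent-7": datetime(2015, 11, 2),   # idle for months
    "web-frontend":  datetime(2016, 2, 28),   # recently active
}
print(stale_vms(inventory, now))  # ['build-agent-7']
```

In a real environment the `last_seen` data would come from your hypervisor or monitoring platform, and flagged machines would go to an owner for sign-off before shutdown.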

The cloud solves certain problems really well, and it allows SMBs to have the flexible infrastructure they require without heavy capital, hardware, or payroll costs. Used wisely with the right tools, the cloud can give companies a leg up.

Sue Poremba is a freelance writer focusing primarily on security and technology issues and occasionally blogs for Rackspace Hosting.


More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
