
Three Public Cloud Providers, One Monitoring Goal
By Joe Michalowski

So you've decided to take the infrastructure as a service (IaaS) approach to cloud migration. That's great: you're on your way to realizing the cost savings and flexibility of cloud computing.

But the decisions don't stop when you choose between SaaS, PaaS, and IaaS. The next step is choosing your public cloud provider: Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).

Each public cloud provider requires unique considerations, but you should have one goal in mind, regardless of the provider: continuous monitoring.

Monitoring Considerations for Amazon Web Services
AWS is loaded with services that give you flexibility in application migration. That flexibility, however, leaves you with choices about which services will work best for your specific business needs.

The following two choices could negatively impact end-user experience if not handled properly:

  • AWS Lambda vs. EC2 for Computing: This decision is similar to the one you make regarding SaaS, PaaS, and IaaS. While Lambda takes administrative tasks like provisioning capacity, monitoring fleet health, and security patching out of your hands, EC2 offers the freedom and flexibility to support customized applications. Code deployment and monitoring are critical to maintaining the user experience, and even if you choose Lambda to offload admin tasks, you still have to remain proactive with continuous monitoring.
  • Amazon S3 vs. Glacier for Storage: Storage is a critical component of application design if you want to maintain a good end-user experience in the cloud. S3 and Glacier are often discussed in the context of backup, but the use cases for these services are nearly endless. The goal is to keep application availability seamless for users without overpaying for storage: Amazon S3 is a robust, low-latency storage system for mission-critical applications, whereas Glacier is meant as a "write once, retrieve rarely/never" service that minimizes costs. (A sketch of a lifecycle rule that archives aging objects from S3 to Glacier follows this list.)
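
To make that trade-off concrete, here is a minimal sketch, assuming boto3 and a hypothetical bucket name, prefix, and 90-day window, of a lifecycle rule that transitions aging objects from S3 into Glacier:

    import boto3

    s3 = boto3.client("s3")

    # Archive objects under reports/ to Glacier 90 days after creation.
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-app-assets",  # hypothetical bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-old-reports",
                    "Filter": {"Prefix": "reports/"},  # hypothetical prefix
                    "Status": "Enabled",
                    "Transitions": [
                        {"Days": 90, "StorageClass": "GLACIER"}
                    ],
                }
            ]
        },
    )

Because Glacier retrievals can take hours, a rule like this only makes sense for data end users will rarely, if ever, need interactively; anything that affects day-to-day experience should stay in S3.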

Many AWS services come with built-in monitoring capabilities, but modern end-user experience demands require more than those built-in tools alone.
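
For context on what the built-in route typically looks like, the sketch below pulls a standard CloudWatch metric with boto3; the Lambda function name and time window are hypothetical:

    from datetime import datetime, timedelta

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Average and maximum Lambda duration over the last hour, in 5-minute buckets.
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/Lambda",
        MetricName="Duration",
        Dimensions=[{"Name": "FunctionName", "Value": "checkout-handler"}],  # hypothetical
        StartTime=datetime.utcnow() - timedelta(hours=1),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=["Average", "Maximum"],
    )

    for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], point["Average"], point["Maximum"])

Numbers like these describe how the function performed, not what users actually experienced, which is the gap continuous monitoring from the end user's perspective is meant to close.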

NoSQL vs. SQL Databases in Microsoft Azure
The type of database you choose for your cloud application is critical for performance and end-user experience. SQL has been the traditional go-to database for decades, but modern demands are causing developers to look elsewhere. SQL databases just can't keep up with the high volume, velocity, and variety of data in some use cases.

NoSQL databases have emerged to enable storage of unstructured data at scale. For use cases like social and Internet of Things applications, where document databases and key-value stores are common, NoSQL will deliver better performance.

Microsoft Azure offers services for both NoSQL and SQL needs, including Azure Redis Cache on the NoSQL side and Azure SQL Database on the relational side. Once you've decided which type of database is right for your application, you have to set yourself up for success by implementing continuous monitoring.
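
As a rough illustration of how different the two paths look in code, here is a minimal sketch, assuming the redis and pyodbc packages plus hypothetical hostnames, credentials, and table/key names, that performs the same lookup against Azure Redis Cache and Azure SQL Database:

    import os

    import pyodbc
    import redis

    # NoSQL path: key-value lookup against Azure Redis Cache (TLS on port 6380).
    cache = redis.StrictRedis(
        host="example-cache.redis.cache.windows.net",  # hypothetical cache
        port=6380,
        password=os.environ["REDIS_KEY"],
        ssl=True,
    )
    profile = cache.get("user:42:profile")

    # SQL path: the same record from Azure SQL Database via ODBC.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-server.database.windows.net;"  # hypothetical server
        "DATABASE=appdb;UID=appuser;PWD=" + os.environ["SQL_PASSWORD"]
    )
    row = conn.execute(
        "SELECT profile_json FROM user_profiles WHERE user_id = ?", 42
    ).fetchone()

Whichever path you choose, the latency of calls like these is exactly the kind of signal continuous monitoring should track, because it feeds directly into what end users feel.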

Maintaining End-User Experience in Google Cloud Platform
Google Compute Engine and BigQuery are major components of the Google Cloud Platform. Compute Engine promises unmatched virtual machine performance to maximize your workloads, but GCP does little to highlight its built-in monitoring capabilities. If you want to ensure the end-user experience consistently matches GCP workload performance, you'll need continuous monitoring.

In addition to Compute Engine, BigQuery works on the back end to power your data analytics processes and, importantly, comes with monitoring functions that help track workloads. Workload performance, however, is just one piece of the puzzle. You need continuous monitoring from the end user's perspective to maximize the ROI of your GCP investment.
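
To give a concrete sense of that built-in workload visibility, the sketch below runs a query with the google-cloud-bigquery client and reads the job statistics it reports; the dataset and query are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    # Run a (hypothetical) aggregation and wait for it to finish.
    job = client.query(
        "SELECT status, COUNT(*) AS n FROM `example_dataset.orders` GROUP BY status"
    )
    rows = list(job.result())

    # Job-level statistics BigQuery exposes for workload tracking.
    print("bytes processed:", job.total_bytes_processed)
    print("cache hit:", job.cache_hit)
    print("elapsed seconds:", (job.ended - job.started).total_seconds())

Statistics like bytes processed and elapsed time describe the workload itself; pairing them with measurements taken from the end user's vantage point is what closes the loop.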

Continuous Monitoring: The Link Between All Public Cloud Providers
Many factors go into choosing a public cloud provider, but no matter which one you pick, the end user has to be your top priority. However much money you save by operating in the cloud, you'll never see ROI if end users can't actually access and make the most of cloud-hosted applications.

With continuous monitoring, you can gain visibility into every aspect of application performance and the end-user experience. Your infrastructure may be hosted in the cloud, but that doesn't mean you can offload all responsibility for maintaining it.


More Stories By AppNeta Blog

AppNeta is the leader in proactive end-user performance monitoring solutions built for the distributed digital enterprise. With AppNeta, IT and Network Ops teams can assure continual and exceptional delivery of business-critical applications. AppNeta’s SaaS-based solutions give IT teams essential application and network performance data, allowing them to continuously monitor user experience across any application, network, data center or cloud. For more information, visit www.appneta.com.
