Memory Monitoring and Limiting with LXC

Building a Metrics Infrastructure

Original article can be found here: Memory Monitoring with LXC

For a long time we didn't limit the amount of memory a build could use on Codeship. That left open the possibility of a bad build eating up all of our memory.

A few weeks ago that bad build happened, using up so much memory that it degraded performance and eventually killed the test server. Even though we measure the memory usage of the whole test server, we didn't have the data to figure out exactly which build caused the trouble.

Combined maximum and minimum memory usage of Amazon EC2 instances.

How to avoid builds eating up all of your memory
We couldn't risk this problem happening again and threatening other builds on that test server. We didn't have enough data about memory limits at this point, so we had to take an educated guess. My first assumption was the most conservative one: each test server has 60GB of memory and we run 22 builds on each instance, so each build can get a maximum of 2.5GB of memory. As LXC manages memory with cgroups, it is simple to set a memory limit for each container.

Setting lxc.cgroup.memory.limit_in_bytes in the container config:

lxc.cgroup.memory.limit_in_bytes = 2560M
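
To check that the limit actually took effect, you can read the same value back from the cgroup filesystem on the LXC host. A minimal Ruby sketch, where my_container is a placeholder for the running container's name:

# Read the effective memory limit back from the cgroup on the host.
# "my_container" is a placeholder; the path assumes the cgroup v1 layout.
limit = File.read('/sys/fs/cgroup/memory/lxc/my_container/memory.limit_in_bytes').to_i
puts "memory limit: #{limit / 1024 / 1024}M"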

This worked; the bad build was no longer a threat.

We got a few support requests from people asking about their builds being stuck. While 2.5GB is enough for 95% of builds, the other 5% were hitting the limit. We needed to increase the memory limit to ensure that all builds run fine. After two incremental steps, the Codeship is sailing smoothly with a 10GB memory limit for each build.
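
The corresponding change in the container config would presumably look like this:

lxc.cgroup.memory.limit_in_bytes = 10240M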

Monitoring in the Future
We wanted more data to improve the memory limit in the future, so we started measuring the maximum memory usage of each build. We export it to Librato Metrics and save it as metadata of the build in our database. This allows us to inspect memory usage easily. We plan to show the memory usage to our users in the future.

LXC tracks the memory usage for each container and exposes that, along with many other values, in the cgroup. Right before we shut down the build, we read the memory usage from the cgroup. You can read the memory usage from the cgroup on the LXC host:

/sys/fs/cgroup/memory/lxc/name_of_running_container/memory.max_usage_in_bytes

and send the data to Librato Metrics:

Metriks.timer('build.memory_usage').update(metrics[:max_memory])
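
Putting both steps together, here is a minimal sketch. report_max_memory is a hypothetical helper, and it assumes the metriks gem is loaded and a reporter forwarding to Librato Metrics is already configured:

require 'metriks'

# Hypothetical helper: read the peak memory usage (in bytes) that the
# cgroup recorded for this container, then report it via Metriks.
def report_max_memory(container_name)
  path = "/sys/fs/cgroup/memory/lxc/#{container_name}/memory.max_usage_in_bytes"
  max_memory = File.read(path).to_i
  Metriks.timer('build.memory_usage').update(max_memory)
  max_memory
end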

Building a Metrics Infrastructure
It is important to back your actions with data; data is your best argument. If you don't have any useful data to solve the problem at hand, it should at least be easy to add more metrics to your infrastructure. Sit down with your coworkers and think about good metrics for your product. Sometimes they're not easy to spot at first glance. We often talk about our metrics.

We discuss new metrics, but also talk about removing redundant ones. It is very exciting to talk about data and what your coworkers conclude from it. We are able to add new metrics to our infrastructure in minutes. By building this kind of metrics infrastructure you can handle any new challenge by quickly adding measurements that help you decide on the next steps. This infrastructure can be the difference between life and death for your service, so make sure you have it in place.

Do you monitor? Which tools and services do you use for it? Let us know in the comments!

Download Efficiency in Development Workflows: A free eBook for Software Developers. This book will save you a lot of time and make you and your development team happy.

Go ahead and try Codeship for Continuous Integration and Continuous Deployment! Set up for your GitHub and BitBucket projects only takes 3 minutes. It's free!

More Stories By Manuel Weiss

I am the co-founder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
