Ten Tips for Integrating Security into DevOps
By Gene Kim

Imagine a world where product owners, Development, QA, IT Operations, and Infosec work together, not only to help each other, but also to ensure that the overall organization succeeds. By working toward a common goal, they enable the fast flow of planned work into production (e.g., performing tens, hundreds, or even thousands of code deploys per day), while achieving world-class stability, reliability, availability, and security.

In this world, Infosec is always working on ways to reduce friction for the team, creating the work systems that enable developers to be more productive and get better outcomes. By doing this, small teams can fully leverage the collective experience and knowledge of not just Infosec, but also QA and Ops, in their daily work, and can deploy safely, securely, and quickly into production without being dependent on other teams.

This enables organizations to create a safe system of work, where small teams are able to quickly and independently develop, test, and deploy code, delivering value safely, securely, and reliably to customers. This allows organizations to maximize developer productivity, enable organizational learning, create high employee satisfaction, and win in the marketplace.

Instead of inspecting security into our product at the end of the process, we will create and integrate security controls into the daily work of Development and Operations, so that security is part of everyone's job, every day.

The Need for Force Multiplication
One interpretation of DevOps is that it came from the need to enable developer productivity: as the number of developers grew, there weren't enough Ops people to handle all the resulting deployment work.

This shortage is even worse in Infosec. James Wickett vividly described why Infosec needs DevOps:

The ratio of engineers in Development, Operations, and Infosec in a typical technology organization is 100:10:1. When Infosec is that outnumbered, without automation and integrating information security into the daily work of Dev and Ops, Infosec can only do compliance checking, which is the opposite of security engineering-and besides, it also makes everyone hate us.

Getting Started

1. Integrate security into development iteration demonstrations.
Here's an easy way to prevent Infosec from being a blocker at the end of the project: invite Infosec into product demonstrations at the end of each development interval. This helps Infosec understand team goals as they relate to organizational goals, see implementations as they're built, and offer input into what's needed to meet security and compliance objectives while there's still ample time to make corrections.

2. Ensure security work is in our Dev and Ops work tracking systems.
Infosec work should be as visible as all other work in the value stream. We can do this by tracking it in the same work tracking system that Development and Operations use daily, so it can be prioritized alongside everything else.

3. Integrate Infosec into blameless post-mortem processes.
Also consider doing a postmortem after every security issue to prevent a repeat of the same problem. In a presentation at the 2012 Austin DevOpsDays, Nick Galbreath, who headed up Information Security at Etsy for many years, described how they treated security issues: "We put all security issues into JIRA, which all engineers use in their daily work, and they were either 'P1' or 'P2,' meaning that they had to be fixed immediately or by the end of the week, even if the issue is only an internally-facing application."

4. Integrate preventive security controls into shared source code repositories and shared services.
Shared source code repositories are a fantastic way to enable anyone to discover and reuse the collective knowledge of the organization, not only for code, but also for toolchains, deployment pipelines, standards, and security. Security information should include any mechanisms or tools for safeguarding applications and environments, such as libraries pre-blessed by security to fulfill their specific objectives. Also, putting security artifacts into the version control system that Dev and Ops use daily keeps security needs on their radar.
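
For illustration, here is a minimal Python sketch of the kind of "pre-blessed" artifact a security team might publish in a shared repository: a thin HTTP client wrapper that enforces TLS verification and timeouts so product teams inherit safe defaults. The class name, allowed schemes, and timeout values are hypothetical, not from the original article.

```python
# A hypothetical "pre-blessed" HTTP client that a security team might publish
# in a shared repository so product teams inherit safe defaults.
import requests

DEFAULT_TIMEOUT = 10             # seconds; avoid indefinitely hung connections
ALLOWED_SCHEMES = ("https://",)  # plaintext HTTP is rejected


class BlessedHTTPClient:
    """Thin wrapper around requests that enforces TLS verification and timeouts."""

    def __init__(self, timeout: float = DEFAULT_TIMEOUT):
        self._session = requests.Session()
        self._timeout = timeout

    def get(self, url: str, **kwargs) -> requests.Response:
        return self.request("GET", url, **kwargs)

    def request(self, method: str, url: str, **kwargs) -> requests.Response:
        if not url.lower().startswith(ALLOWED_SCHEMES):
            raise ValueError(f"Only HTTPS URLs are allowed, got: {url}")
        # Never let callers disable certificate verification.
        kwargs.pop("verify", None)
        kwargs.setdefault("timeout", self._timeout)
        return self._session.request(method, url, verify=True, **kwargs)


if __name__ == "__main__":
    client = BlessedHTTPClient()
    print(client.get("https://example.com").status_code)
```

Teams depend on a wrapper like this instead of calling the HTTP library directly, which keeps the security defaults in one reviewable, version-controlled place.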

5. Integrate security into the deployment pipeline.
To keep Infosec issues top of mind for Dev and Ops, we want to continually give those teams fast feedback about potential risks associated with their code. Integrating security into the pipeline involves automating as many security tests as possible so that they run alongside all other automated tests. Ideally, these tests should be performed on every code commit by Dev or Ops, even in the earliest stages of a software project.
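
As one hedged example of running security tests alongside all other automated tests, the following pytest-style check fails the pipeline when anything resembling a hard-coded credential appears in source. It is only a sketch: the patterns are intentionally simplistic, the `src` directory is an assumption, and dedicated secret-scanning tools are far more thorough.

```python
# A sketch of a security check that runs with the rest of the test suite
# (e.g. via pytest): fail the build if anything resembling a hard-coded
# secret appears in source. Patterns are intentionally simplistic.
import re
from pathlib import Path

SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # shape of an AWS access key ID
    re.compile(r"(?i)(password|api_key|secret)\s*=\s*['\"][^'\"]+['\"]"),
]


def find_suspected_secrets(root: str = "src") -> list:
    """Return 'path:line' strings for every suspicious match under root."""
    findings = []
    for path in Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if any(pattern.search(line) for pattern in SECRET_PATTERNS):
                findings.append(f"{path}:{lineno}")
    return findings


def test_no_hardcoded_secrets():
    findings = find_suspected_secrets()
    assert not findings, f"Possible hard-coded secrets found: {findings}"
```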

6. Protect the deployment pipeline from malicious code.
Unfortunately, malicious code can be injected into the infrastructures that support CI/CD. A good place to hide that code is in unit tests because no one looks at them and because they're run every time someone commits code to the repo. We can (and must) protect deployment pipelines through steps such as:

  • Hardening continuous build and integration servers so we can reproduce them in an automated manner
  • Reviewing all changes introduced into version control to prevent continuous integration servers from running uncontrolled code
  • Instrumenting the repository to detect when test code contains suspicious API calls (a minimal sketch of such a check follows this list)
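
As a sketch of that last idea, the script below uses the standard library's `ast` module to flag test files that import or call things unit tests rarely need, such as `subprocess` or `eval`. It assumes Python test code under a `tests/` directory, and the lists of suspicious modules and calls are illustrative.

```python
# A sketch: statically scan test code (assumed to live under tests/) for
# imports and calls that unit tests rarely need.
import ast
from pathlib import Path

SUSPICIOUS_MODULES = {"subprocess", "socket", "ctypes"}  # tune to your codebase
SUSPICIOUS_CALLS = {"eval", "exec", "__import__"}


def suspicious_calls_in(path: Path) -> list:
    """Return human-readable findings for one test file."""
    findings = []
    tree = ast.parse(path.read_text(), filename=str(path))
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            names = [alias.name.split(".")[0] for alias in node.names]
            if isinstance(node, ast.ImportFrom) and node.module:
                names.append(node.module.split(".")[0])
            findings += [f"{path}:{node.lineno} imports {name}"
                         for name in names if name in SUSPICIOUS_MODULES]
        elif isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                findings.append(f"{path}:{node.lineno} calls {node.func.id}()")
    return findings


if __name__ == "__main__":
    findings = [f for p in Path("tests").rglob("test_*.py")
                for f in suspicious_calls_in(p)]
    for finding in findings:
        print("SUSPICIOUS:", finding)
    raise SystemExit(1 if findings else 0)
```

Run as a pipeline step, a nonzero exit code blocks the build until a human reviews the flagged test code.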

7. Secure your applications.
Development testing usually focuses on the correctness of functionality. Infosec, however, often focuses on testing for what can go wrong. Instead of performing these tests manually, aim to generate them as part of automated unit or functional tests so that they can be run continuously in the deployment pipeline. It's also useful to define design patterns to help developers write code to prevent abuse, such as putting in rate limits for services and graying out submit buttons after they've been pressed.
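
The rate-limit pattern mentioned above can be captured as a small, reusable piece of code. The token-bucket sketch below limits how fast any single client can hit a service; the numbers and the `handle_request` wrapper are illustrative assumptions, not part of the original article.

```python
# A sketch of the rate-limit pattern: a simple token bucket allowing bursts
# up to `capacity` and a sustained `rate` of requests per second per client.
import time


class TokenBucket:
    def __init__(self, rate: float = 5.0, capacity: float = 10.0):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        """Return True if the caller may proceed, False if rate-limited."""
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


buckets = {}  # one bucket per client


def handle_request(client_id: str) -> str:
    bucket = buckets.setdefault(client_id, TokenBucket())
    if not bucket.allow():
        return "429 Too Many Requests"
    return "200 OK"
```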

8. Secure the software supply chain.
It's not enough to protect our applications, environment, data and our pipelines - we must also ensure the security of our software supply chain, particularly in light of startling statistics* about just how vulnerable it is. While the use of and reliance on commercial and open source components is convenient, it's also extremely risky. When selecting software, then, it's critical to detect components or libraries that have known vulnerabilities and work with developers to carefully select components with a track record of being fixed quickly.
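
One way to automate that detection, sketched below under the assumption of Python/PyPI components, is to query a public vulnerability database such as OSV.dev for each pinned dependency; a CI job can then fail the build when advisories come back.

```python
# A sketch of automating the "known vulnerabilities" check by querying the
# public OSV.dev database (here for a PyPI package; adapt to your ecosystem).
import requests

OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def known_vulnerabilities(name: str, version: str, ecosystem: str = "PyPI") -> list:
    """Return OSV advisory IDs affecting the given package version."""
    payload = {"version": version,
               "package": {"name": name, "ecosystem": ecosystem}}
    response = requests.post(OSV_QUERY_URL, json=payload, timeout=10)
    response.raise_for_status()
    return [vuln["id"] for vuln in response.json().get("vulns", [])]


if __name__ == "__main__":
    # An old release is used here only as an example input.
    advisories = known_vulnerabilities("requests", "2.19.0")
    print("Known vulnerabilities:", advisories or "none found")
```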

9. Secure your environments.
We must ensure that all our environments are in a hardened, risk-reduced state. This involves generating automated tests to ensure that all appropriate settings have been correctly applied for configuration hardening, database security, key lengths, and so forth. It also involves using tests to scan environments for known vulnerabilities and using a security scanner to map them out.
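
Such hardening tests can be as simple as parsing a configuration file and asserting on its effective settings. The pytest-style sketch below checks a few common `sshd_config` directives; the expected values are an illustrative subset, so align them with your own baseline (for example, a CIS benchmark).

```python
# A sketch of a hardening test: parse sshd_config and assert on a few
# expected directives.
from pathlib import Path

SSHD_CONFIG = Path("/etc/ssh/sshd_config")

EXPECTED = {
    "permitrootlogin": "no",
    "passwordauthentication": "no",
    "x11forwarding": "no",
}


def effective_directives(text: str) -> dict:
    """Map each directive to its value; the first occurrence wins, as in sshd."""
    directives = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split(None, 1)
        if len(parts) == 2:
            directives.setdefault(parts[0].lower(), parts[1].strip().lower())
    return directives


def test_sshd_is_hardened():
    directives = effective_directives(SSHD_CONFIG.read_text())
    for directive, expected in EXPECTED.items():
        assert directives.get(directive) == expected, \
            f"{directive} should be '{expected}', got '{directives.get(directive)}'"
```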

10. Integrate information security into production telemetry.
Internal security controls are often ineffective in quickly detecting breaches because of blind spots in monitoring or because no one is examining the relevant telemetry every day. To adapt, integrate security telemetry into the same tools that Development, QA, and Operations use. This gives everyone in the pipeline visibility into how applications and environments are performing in a hostile threat environment where attackers are constantly attempting to exploit vulnerabilities, gain unauthorized access, plant backdoors, and commit fraud (among other insidious things!).
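
A lightweight way to start is to emit security events as structured log lines into the same logging and monitoring stack everyone else already watches. The sketch below (field names and the login-handler hook are illustrative) logs each login attempt as one JSON line so failed logins can be graphed and alerted on alongside other production telemetry.

```python
# A sketch of emitting security events as structured log lines so they land
# in the same logging/monitoring stack the rest of the team uses.
import json
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("security.telemetry")


def emit_security_event(event_type: str, **fields) -> None:
    """Log one security event as a single JSON line for downstream dashboards."""
    event = {"ts": time.time(), "event": event_type}
    event.update(fields)
    logger.info(json.dumps(event))


def on_login_attempt(username: str, source_ip: str, success: bool) -> None:
    # Instrument the login path so failed attempts show up in the same
    # dashboards that Development, QA, and Operations already watch.
    emit_security_event("login_attempt", username=username,
                        source_ip=source_ip, success=success)


if __name__ == "__main__":
    on_login_attempt("alice", "203.0.113.7", success=False)
```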

You can read the full details of each of these steps and more in The DevOps Handbook.

*See Sonatype's 2015 "State of the Software Supply Chain" Report and Verizon's 2014 "Data Breach Investigations Report."

(Adapted from portions of The DevOps Handbook)
