
How Infrastructure as Code Automates IT Operations in the Application Delivery Lifecycle

Infrastructure as Code (IaC) enables infrastructure management through a software-defined layer

As more technologies become software-defined, their adoption demands a significant shift in thinking about how a business organizes its value stream. That shift can be difficult, but understanding it lets you anticipate change and position your business to react as new software-defined technologies emerge.

Recently a new set of tools and practices has emerged to create and manage environments. Known as Infrastructure as Code (IaC), it enables infrastructure management through a software-defined layer.

Application Delivery: Getting It Right
All applications run inside what we call an "environment" - a stack of hardware and software components built to support the application. This stack includes networking, storage, virtual machines, operating systems, databases, libraries, dependencies and the application itself. Building an environment means bringing up that stack: provisioning and configuring each component according to the requirements of the application.

All of this is done to serve the application, which demands that things be "just so." The processes used to get an environment "just right" (and keep it that way) have been the subject of much analysis and design over the years, becoming part of the body of work known as IT Service Management (ITSM).

Infrastructure as Code
IaC replaces many of the ITSM processes involved in the deployment and ongoing management of the complete hardware-software environment in which an application will run.

While IT professionals have always used some automation such as scripting to help deploy environments, IaC is a recent development characterized by use of the following:

  1. Code - At the core of IaC is the code: definition files that declare the specification for each component of the environment and how it is configured. These files might be written in YAML or JSON, and are checked into a version control system like Git (a minimal example follows this list).
  2. Automation tooling - Specialized tools read the definition files and use them to construct the environment and configure components according to specification.
  3. Application Programming Interfaces (APIs) - Automation tools perform the actions described in the definition files against APIs. Not only will the automation tools use APIs to provision and configure the components of the environment being managed, but the tool itself will be programmable through its own API.
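
To make these three characteristics concrete, here is a minimal sketch of a definition file, written as an Ansible-style playbook in YAML. The host group, package and service names are illustrative assumptions rather than anything prescribed by the article; any of the tools discussed later would use a similarly declarative format.

```yaml
# site.yml - a hypothetical, minimal definition file (Ansible playbook).
# Checked into version control, it declares the desired state of a component;
# the automation tool reads it and calls the relevant APIs to realize that state.
- name: Configure web servers
  hosts: webservers          # illustrative inventory group
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and starts on boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Running `ansible-playbook site.yml` would apply the declared state; running it again makes no changes if the environment already matches, which is part of what makes definition files easy to share, review and repeat.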

The development of powerful automation tools, along with the widespread proliferation of APIs, has allowed IaC to emerge as a very effective means of managing IT operations processes.

Rather than working with GUIs, scripts and command-line interfaces to perform actions, we are able to work with documents (code) that exhaustively describe the environment. These are easily shared, reviewed and versioned.

With IaC, the actions at each step are executed by tooling rather than performed by hand, and are therefore much less prone to human error. The following steps demonstrate how IaC is actually used.

Putting it to work
While setting up an environment requires a number of different components and services, we can group these into three distinct steps:

  1. Provisioning - The first step is to provision the foundational infrastructure systems - servers, networks, databases and storage. Provisioning tools perform this task and are usually supplied by the infrastructure vendor. For example, Amazon provides CloudFormation to create VPCs (networks) and spin up EC2 instances (servers); likewise, Azure provides Resource Manager to create Network Security Groups and bring up virtual machines. There are also vendor-agnostic provisioning tools like Terraform, which make switching between infrastructure vendors easier. (A brief provisioning sketch follows this list.)
  2. Configuration - The second step is to configure the provisioned components; configuration management tools accomplish this task. This is a broader set of tools used to perform operations like transferring files, installing services, configuring settings and so on. There are many tools in this space, but the "Big Three" are Puppet, Chef and Ansible. Each has its own advantages and disadvantages; however, they all accomplish the same goal: configuring the components with the required dependencies and settings.
  3. Deployment - The third step is to deploy the application. More and more this involves the use of container technologies like Docker. Container technologies are a recent advancement in IT that deserve their own article and explanation; suffice it to say that a container allows an application and its dependencies to be wrapped up into a package that is easy to deploy into its own isolated space on a machine. Containers provide an additional layer of abstraction from the provisioning and configuration.
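
As a brief illustration of step 1, the following is a hypothetical CloudFormation template fragment in YAML that declares a network and a server; the CIDR ranges, AMI ID and instance type are placeholder values. Step 2 would then look much like the Ansible playbook sketched earlier, with one of the "Big Three" tools applying configuration to the machines this template creates.

```yaml
# provision.yml - a hypothetical CloudFormation sketch for the provisioning step.
# Deployed via the AWS CLI or console, it creates a VPC, a subnet and one server.
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  AppVpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16           # placeholder network range
  AppSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref AppVpc
      CidrBlock: 10.0.1.0/24
  AppServer:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0abcdef1234567890   # placeholder AMI ID
      InstanceType: t3.micro
      SubnetId: !Ref AppSubnet
```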

The tooling landscape used to perform these steps is highly fragmented, and strategies often use an opinionated, best-of-breed approach with a different tool at each of the three steps. For example, a team might use CloudFormation to set up the virtual machines and connect them to the network, Chef to configure and secure the virtual machines, and Docker to load the application into an isolated container.
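For the final step in that example toolchain, a Compose file is one common way to describe how the containerized application is deployed; the image names, port and environment values below are illustrative assumptions only.

```yaml
# docker-compose.yml - a hypothetical deployment sketch for the containerized app.
services:
  app:
    image: example/app:1.0            # placeholder application image
    ports:
      - "8080:8080"                   # host:container port mapping
    environment:
      DATABASE_URL: postgres://db:5432/appdb
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example      # use a secrets mechanism in practice
```

Running `docker compose up -d` brings the application and its database up in isolated containers on any host with a container runtime.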

There is no "correct" way to set up your automation stack - this will depend on the limitations of the tools and the needs of the organization - but it should be understood that adopting this technology will invariably change the way the organization manages IT work.

Anticipating Change
IaC presents us with a large and increasingly complex software-defined layer that is used to perform infrastructure management functions. It is important to note, however, that what becomes software-defined here is not the infrastructure itself, although software-defined infrastructure is a prerequisite for IaC; rather, it is the systems and processes used to manage that infrastructure, such as asset management, change management and configuration management. These functions can now be expressed in code.

This can create major challenges for traditional service management strategies - the skills, roles, responsibilities, methods and practices used to manage infrastructure change considerably. On the other hand, it creates great opportunities by providing a catalyst to launch DevOps initiatives and increase the scope of Agile practices across the value stream.

By understanding IaC as a software-defined technology, we can gain insight into the impact on the enterprise.

Redefining IT
Because IaC is a software-defined technology, we should expect it to have a significant impact on the business and how the business is organized. Looking at IaC through this lens gives us an understanding of what to expect as we start to adopt the technology in the enterprise. Analysis using three key attributes of software definition sheds some light on the impacts:

1. Abstraction
IaC requires that all operations on infrastructure are declared in definition files and executed using an automation tool. This automation layer provides an abstraction from operations like deploying, configuring and managing components. This means that these operations shift left in the software supply chain - they are performed earlier and all together, rather than sequentially at the final stages of activity.

With this abstraction, the skill specialization for managing infrastructure shifts from traditional vendor and application specific sysadmin skills to the ability to write code and think through the abstraction. The roles and responsibilities for managing infrastructure can move to anyone proficient in writing code.

2. Control
Since infrastructure can be fully documented in code, we can "read" the environment - see everything that was deployed and how it was configured by simply reading the definition files.

The focus of service management and control systems therefore shifts to managing the automation tooling and definition files. For example, use of a version control system to manage the definition files brings about the idea of "versioned infrastructure," where the change record is reflected in the version history.

Similarly, change management can be accomplished through code reviews performed as each change is checked in, rather than by putting batched changes before a change review board.
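
As a sketch of what review-driven change management can look like in practice, the pipeline below (a hypothetical GitHub Actions workflow; the repository layout, file paths and tool choice are assumptions, not something prescribed by the article) validates the definition files on every pull request, so reviewers see both the proposed change and an automated check before it is merged.

```yaml
# .github/workflows/infra-review.yml - a hypothetical change-review pipeline.
name: infra-change-review
on:
  pull_request:
    paths:
      - "infra/**"                 # assumed location of the definition files
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Ansible      # assumes pipx is available on the runner image
        run: pipx install ansible-core
      - name: Syntax-check the definition files
        run: ansible-playbook --syntax-check infra/site.yml
```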

3. Mutability
The environment is exhaustively described in definition files, which declare the desired state rather than depending on the attributes of running infrastructure. Ideally, the infrastructure itself is "immutable" - no changes are made directly to the infrastructure once it is deployed. Infrastructure components are locked down and not directly accessible to humans; changes are deployed only through the automation tooling.

This has two impacts. First, it commoditizes the deployment process. The same definition file can be used to bring up one server or a hundred. Additional assets can be deployed as needed, just-in-time, and torn down when they are no longer required. Elastic assets mean infrastructure is always "right-sized."
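
As a sketch of that "one server or a hundred" point, the hypothetical CloudFormation fragment below scales by changing a single parameter value; the AMI ID, instance type and size limits are illustrative assumptions.

```yaml
# scale.yml - a hypothetical sketch: one definition file, one server or a hundred.
AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  DesiredServers:
    Type: Number
    Default: 1                       # change to 100 to deploy a hundred servers
  SubnetIds:
    Type: List<AWS::EC2::Subnet::Id> # supplied at deploy time
Resources:
  AppLaunchTemplate:
    Type: AWS::EC2::LaunchTemplate
    Properties:
      LaunchTemplateData:
        ImageId: ami-0abcdef1234567890   # placeholder AMI ID
        InstanceType: t3.micro
  AppGroup:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      MinSize: "0"
      MaxSize: "100"
      DesiredCapacity: !Ref DesiredServers
      VPCZoneIdentifier: !Ref SubnetIds
      LaunchTemplate:
        LaunchTemplateId: !Ref AppLaunchTemplate
        Version: !GetAtt AppLaunchTemplate.LatestVersionNumber
```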

Second, the underlying infrastructure becomes modular. We can use our definition file to bring up our components on-premises or in the cloud - on AWS or on Azure. While the automation tooling introduces some dependencies of its own, overall the infrastructure layer has few enough hardware or vendor dependencies that operators have more freedom to choose where to host their infrastructure.

Adoption Through Transformation
For a business that relies on a software supply chain to deliver value to customers, IaC represents a significant opportunity to increase efficiency, lower costs and reduce risk.

Further, IaC has emerged as a vital driver in transforming and modernizing the software supply chain. Digital-first businesses like Amazon, Netflix and Facebook live and breathe software-defined infrastructure. This is because they have been able to build their businesses, cultures and value streams in a greenfield without the encumbrance of legacy systems and practices.

As with all software-defined technologies, incumbent enterprises that have a well-established value stream will have difficulty with wide-scale adoption. Some of the challenges they face include:

  • Culture: All software-defined technologies present the significant challenge of redefining roles, shifting responsibilities and altering work structures. Part of the transformation to adopt these technologies, therefore, involves a cultural change across the technical side of the organization. DevOps, as a culture, has emerged from this, and embraces these new roles and responsibilities. This needs to be nurtured and allowed to grow through a process of sharing and collaborating.
  • Practices: IT professionals are typically accustomed to working within project-based structures like PMP and PRINCE2. IaC, however, calls for software development practices like Agile to manage work. Implementing a software-defined infrastructure will proliferate management techniques like Scrum and Kanban. This is a great opportunity, but equally a challenge to get everyone on board and trained on the new methodologies and the tools they use.
  • Value: A fundamental characteristic of software-defined technologies is that they recast management strategies. With IaC, the IT supply chain is altered beyond recognition, requiring new thinking about service management strategies. The changes in where, when and how infrastructure will be managed and deployed mean organizations need to undergo a paradigm shift in thinking about how value delivery is organized.

In order to fully adopt software-defined infrastructure, incumbent enterprises will need to be ready to undergo a transformation. There needs to be a commitment and willingness to invest in new tools, new skills and new relationships.

Automation of any kind is a threat to the status quo, and change often brings on a crisis of identity for those who are content to stay the same. In this case, IT automation through IaC challenges the processes and systems used by ITSM. However, IT leaders will do well to be open-minded. Through a process of experimentation and discovery we can ease the adoption of this software-defined technology and use it as a catalyst to grow the value of IT across the business.

More Stories By John Rauser

John Rauser is the IT Manager at Tasktop Technologies, a global enterprise software company. He also serves as VP of Operations on the board of the Project Management Institute - Canadian West Coast Chapter, providing leadership and expertise on technology issues. He has a passion for discussing the business impacts of technology and analyzing strategies for managing IT.

