Getting at the Heart of Security in the Cloud

CloudPassage digs a bit deeper into the issue of security and public cloud computing and finds some interesting results

Security is a pretty big word. It’s used to represent everything from attack prevention to authentication and authorization to securing transport protocols. It’s used as an umbrella term for such a wide variety of concerns that it has become virtually meaningless when applied to technology.

For some time, purveyors of security studies have asked the market, “What’s stopping you from adopting cloud?” Invariably, one of the most often cited show-stoppers is “security.” Pundits raced to tell us this, but they rarely offered any deeper insight into what, exactly, “security” meant.

So it was nice to see CloudPassage dig deeper into “security in the cloud” with a recent survey it conducted. You may recall that CloudPassage has a more than passing interest in the subject, as its focus is cloud-based security with an emphasis on host-based firewalls. The survey, published in February 2012, sheds some light on what IT professionals consider most important with respect to public cloud security.

Not surprisingly, “lack of perimeter defenses and/or network control” was the most often cited concern with respect to security in public cloud environments, with 25% of respondents indicating it was troubling. This response goes hand in hand with the 12% who cited an inability to leverage “enterprise security tools” in public cloud environments. It is no secret that duplicating security architectures and processes in the cloud is not something we have seen done at this juncture. When you combine the inability to replicate security policy and process in the cloud, due to incompatibilities of infrastructure and software, with a less than robust security service offering in public cloud environments, it makes a lot of sense that “lack of perimeter defenses and/or network control” tops the list.

[Figure: CloudPassage survey results showing top concerns about public cloud security]

WHERE ARE WE GOING?

There are myriad surveys indicating that organizations are moving to public cloud computing despite these concerns, and one assumes this means they are finding ways to resolve these issues. Many organizations are turning back the clock and taking advantage of agent-based (host-deployed) solutions to secure their assets in public cloud environments, which affords much better protection than nothing at all, and still others are leveraging the tried-and-true “checklist” method: manually securing servers based on best practices and corporate policy.
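To make the “checklist” approach concrete, here is a minimal sketch in Python of a script that verifies a couple of baseline items on a Linux host. The specific checks, file paths, and pass criteria are hypothetical examples; a real checklist would be derived from corporate policy and published hardening guides.

```python
"""Illustrative checklist-style hardening check (hypothetical baseline items).

Assumes a Linux host; reading sshd_config and listing iptables rules may
require root privileges.
"""

import subprocess
from pathlib import Path


def root_login_disabled(sshd_config="/etc/ssh/sshd_config"):
    """Return True if the SSH daemon config explicitly disables root login."""
    try:
        lines = Path(sshd_config).read_text().splitlines()
    except OSError:
        return False
    return any(line.split() == ["PermitRootLogin", "no"] for line in lines)


def host_firewall_has_rules():
    """Return True if iptables reports at least one rule beyond the defaults."""
    result = subprocess.run(["iptables", "-S"], capture_output=True,
                            text=True, check=False)
    # "iptables -S" prints the three default chain policies before any rules.
    return result.returncode == 0 and len(result.stdout.splitlines()) > 3


if __name__ == "__main__":
    checks = {
        "SSH root login disabled": root_login_disabled,
        "Host firewall has rules loaded": host_firewall_has_rules,
    }
    for name, check in checks.items():
        print(f"[{'PASS' if check() else 'FAIL'}] {name}")
```

Scripts like this are better than nothing, but they are maintained by hand and drift quickly as the number of environments grows.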

Neither is optimal from an operational perspective. Neither is the use of services offered by cloud providers, such as Amazon security groups, because the result is a disjointed set of security policies across multiple environments. Policy languages and implementations, not to mention capabilities, vary widely from service to service. While the most basic of protections, firewalling, is easier to codify consistently, the actual policy language will still differ. These disconnects can lead to gaps in security policies that leave the organization’s assets open to attack. Inconsistent management and deployment processes spanning multiple environments leave open the possibility of human error and misconfiguration, an often-cited cause of outages and breaches in general.
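As a rough illustration of how the same intent ends up encoded in different policy languages, the Python sketch below expresses “allow inbound HTTPS from a given source” once as an iptables rule string and once as an AWS EC2 security group ingress permission via boto3. The security group ID, the source range, and the rule itself are placeholders; running the AWS portion would require valid credentials and a real group.

```python
"""One security intent, two policy languages (illustrative placeholders only)."""

import boto3

INTENT = {"protocol": "tcp", "port": 443, "source": "203.0.113.0/24"}

# Data center encoding: an iptables rule the host (or an agent) would apply.
iptables_rule = (
    f"iptables -A INPUT -p {INTENT['protocol']} "
    f"--dport {INTENT['port']} -s {INTENT['source']} -j ACCEPT"
)
print(iptables_rule)

# Public cloud encoding: the same intent as an EC2 security group permission.
ec2 = boto3.client("ec2")
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # placeholder group ID
    IpPermissions=[{
        "IpProtocol": INTENT["protocol"],
        "FromPort": INTENT["port"],
        "ToPort": INTENT["port"],
        "IpRanges": [{"CidrIp": INTENT["source"]}],
    }],
)
```

Even in this trivial case the two encodings share no syntax, which is exactly the gap that invites inconsistency when policies are maintained by hand in each environment.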

[Figure: CloudPassage survey results showing how organizations secure cloud servers today]

Where we are today is a disjointed set of options from which to choose, and the need to somehow cobble these disparate tools and services together into a comprehensive security strategy capable of consistently securing servers, applications, and other resources against attack, exploitation, and breach.

It is not really an inspiring view at the moment.

Vendors and providers need to work toward a common language and set of services that enable consistent replication, and thus enforcement, of the policies that govern access to and protection of all corporate resources, regardless of location. Whether that happens through standards initiatives, brokerage of APIs, or a better ability for organizations to deploy security solutions in both the data center and public cloud environments is not really the question. The question is how enterprises can better address the specific security-related concerns they have regarding public cloud deployments in a way that minimizes the risk of misconfiguration and gaps in policy enforcement, while providing operationally consistent processes that ensure the benefits of public cloud computing are not lost.

REVERSE INTEGRATION

One of the interesting trends we’re seeing is the demand for consistency in infrastructure across environments, and this will eventually drive demand for integration of what are today “cloud only” solutions back into data center components. Vendors like CloudPassage, whose cloud-focused offerings deliver host-based security coupled with a SaaS management model, will eventually need to consider integration with “traditional” enterprise solutions as a means to deliver the consistency necessary to maintain cloud-related operational benefits.

Right now we’re seeing a move toward preserving operational consistency through replication of policy from the data center out to the cloud. But as cloud-hosted solutions continue to mature and evolve, one would expect to see the ability to replicate policy in the other direction, from the cloud back into the data center. This is no trivial task, as it requires the SaaS management component of such solutions to become what might be considered a policy broker; that is, their system becomes the point of policy creation and management, and it is through integration with both cloud and data center infrastructure that such policies are deployed, updated, and managed.
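The sketch below is one hypothetical way to picture that broker in code: policy is authored once in a neutral form, and environment-specific adapters translate and push it. Every class, method, and translation here is invented for illustration; it is not how CloudPassage or any particular product works.

```python
"""Hypothetical policy broker: author once, deploy through per-environment adapters."""

from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class FirewallPolicy:
    """An environment-neutral allow rule."""
    protocol: str
    port: int
    source: str


class Adapter(Protocol):
    def deploy(self, policy: FirewallPolicy) -> None: ...


class DataCenterAdapter:
    def deploy(self, policy: FirewallPolicy) -> None:
        # Translate to the data center's native form (here, an iptables rule).
        print(f"iptables -A INPUT -p {policy.protocol} "
              f"--dport {policy.port} -s {policy.source} -j ACCEPT")


class CloudAdapter:
    def deploy(self, policy: FirewallPolicy) -> None:
        # Translate to a provider API call (stubbed out as a print statement).
        print(f"cloud API: authorize ingress {policy.protocol}/{policy.port} "
              f"from {policy.source}")


class PolicyBroker:
    """The single point of policy creation; publishing fans out to all adapters."""

    def __init__(self, adapters: List[Adapter]) -> None:
        self.adapters = adapters

    def publish(self, policy: FirewallPolicy) -> None:
        for adapter in self.adapters:
            adapter.deploy(policy)


if __name__ == "__main__":
    broker = PolicyBroker([DataCenterAdapter(), CloudAdapter()])
    broker.publish(FirewallPolicy(protocol="tcp", port=443, source="10.0.0.0/8"))
```

The hard part, of course, is not the fan-out loop but writing faithful adapters for every environment’s API and keeping them current, which is where API-enabled infrastructure comes in.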

This is why the notion of API-enabled infrastructure, a.k.a. Infrastructure 2.0, is so important. It’s not just about creating a vibrant and healthy ecosystem of solutions within the data center, but in the cloud and in between, as well. It is the glue that will integrate disparate systems and normalize policies across environments, and ultimately provide the market with a broader set of choices that can more efficiently and effectively address the specific security (and other operational) concerns that may be preventing organizations from fully embracing cloud computing.


