Cloud Security – Implementing a Secure Cloud Backup Case Study

Secure cloud backup is a scenario that is steadily gaining traction. It lets organizations implement an off-site backup while keeping costs to a minimum. In this blog post I would like to focus on a specific secure cloud backup use case. The system described here consists of an on-premise replication server, Porticor Cloud Security, and Amazon S3 as the final backup destination, all integrated by one of our fine cloud integrators.

Secure Cloud Backup – The Business Need
In this use case, an enterprise organization was struggling with an inefficient and costly offsite backup infrastructure meant to manage a steadily growing database. An offsite server farm was costly to operate and maintain, and the tape backup and recovery methods in use were time consuming. Furthermore, the company failed to meet regulatory requirements for data availability. To eliminate the complexity and cost of this backup methodology, one of our integrators, Emind Systems, deployed a dedicated onsite server that mirrored directories and volumes on the local network and replicated the data to an Amazon Web Services (AWS) S3 bucket. But a critical requirement was cloud data security and encryption; this is where Porticor comes in.
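To make the replication step concrete, here is a minimal sketch in Python using boto3, assuming a hypothetical bucket name and local path (the case study does not disclose the details of the actual Emind Systems implementation):

```python
import os
import boto3

# Hypothetical bucket name; the case study does not name the real one.
BUCKET = "example-offsite-backup"

s3 = boto3.client("s3")

def mirror_directory(local_root: str, prefix: str = "backup/") -> None:
    """Walk a local directory tree and upload each file to S3,
    preserving the relative path as the object key."""
    for dirpath, _dirnames, filenames in os.walk(local_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            key = prefix + os.path.relpath(path, local_root).replace(os.sep, "/")
            s3.upload_file(path, BUCKET, key)

mirror_directory("/var/backups/db")  # illustrative local path
```

In the actual deployment the mirroring was handled by a dedicated replication server rather than a script, but the principle is the same: walk the local tree and push each file to an S3 key that preserves its relative path.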

Cloud Security and Cloud Encryption Options Considered
One of the top concerns of enterprises deploying cloud encryption is data confidentiality: in other words, who controls the encryption keys and can therefore potentially access the data. Some cloud providers offer data encryption as part of their service, but since the provider manages and maintains your encryption keys, it can potentially see your data. Rich Mogull described this well in his blog post “How to Tell If Your Cloud Provider Can Read Your Data (Hint: They Can)”. One secure alternative is an on-premise key management server, but that is costly (in both operational and capital expenses) and severely limits cloud flexibility.
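To illustrate the distinction, here is a minimal client-side encryption sketch in Python using the cryptography library; it shows the principle, not Porticor's actual mechanism. Because the key is generated and held by the customer, the provider only ever stores ciphertext:

```python
from cryptography.fernet import Fernet

# The key is generated and stored by the customer; the cloud provider
# only ever receives ciphertext, so it cannot read the data.
key = Fernet.generate_key()   # keep this outside the provider's reach
cipher = Fernet(key)

plaintext = b"nightly database dump"
ciphertext = cipher.encrypt(plaintext)

# Only the ciphertext is uploaded; decryption requires the customer's key.
assert cipher.decrypt(ciphertext) == plaintext
```

Contrast this with provider-side encryption, where the same decrypt call could be made by anyone holding the provider's master keys, including the provider itself.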

Secure Cloud Backup – Mission Accomplished
To avoid these issues while maintaining information security, Porticor's Virtual Private Data was integrated into the backup scenario (for further reading, download the white paper here). The end result is a highly scalable, elastic, and secure backup solution. The onsite server mirrors the data and transfers it to a pre-configured AWS S3 bucket, and Porticor encrypts each object on its way to S3. Each object is encrypted with a unique encryption key, yet the customer maintains a single “project” key, which enables an automated key management cycle without sharing the encryption keys with anyone.
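The scheme described above, a unique key per object wrapped by a single project key, is commonly known as envelope encryption. Here is a sketch of the general pattern in Python with the cryptography library (Porticor's actual implementation is proprietary; this only illustrates the per-object key idea):

```python
from cryptography.fernet import Fernet

project_key = Fernet.generate_key()   # the single "project" key the customer holds
project_cipher = Fernet(project_key)

def encrypt_object(data: bytes) -> tuple[bytes, bytes]:
    """Encrypt one object with its own fresh key, then wrap that key
    with the project key. Returns (ciphertext, wrapped_object_key)."""
    object_key = Fernet.generate_key()        # unique key per object
    ciphertext = Fernet(object_key).encrypt(data)
    wrapped_key = project_cipher.encrypt(object_key)
    return ciphertext, wrapped_key

def decrypt_object(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """Unwrap the per-object key with the project key, then decrypt."""
    object_key = project_cipher.decrypt(wrapped_key)
    return Fernet(object_key).decrypt(ciphertext)

ct, wk = encrypt_object(b"backup object contents")
assert decrypt_object(ct, wk) == b"backup object contents"
```

The design choice is what makes automated key management possible: every object carries its own wrapped key, so only the single project key needs to be guarded and rotated by the customer.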

Ariel Dan is co-founder at Porticor Cloud Security
