Cloud Computing: Securing the Cloud

I wrote my own S3 client which uses strong encryption on the I/O stream as it leaves my computer

Jonathan Craven's Blog

I have already written about how I have enthusiastically adopted Amazon S3 as a solution for off-site backups, and for publishing content too heavy for my home server to handle, such as video. The other day one of the hosts of Buzz Out Loud mentioned that he didn't trust his personal data in the cloud just yet: he could see that it was the way of the future, but was not yet comfortable with the trust issues. Then this week John C. Dvorak echoed the same concerns on TWiT.

They are right, of course, and I don't trust Amazon with my personal data either. I have a lot of personal data to back up, such as every e-mail I wrote or received from 1998 to around 2005 (I've let GMail handle it since then, though I really ought to back it up via POP, but haven't...), not to mention other personally identifying data that I would not want in the wrong hands. It is not a question of trusting Amazon to abide by the terms of service; I do trust them as a company, but no company can be immune from a rogue employee or corporate espionage, and it is not easy to trust their security procedures unless you can audit them yourself at whim, which is a practical impossibility.

My solution to this problem is one that your average user, even a geek like Tom Merritt, probably can't do: I wrote my own S3 client which uses strong encryption on the I/O stream as it leaves my computer. Amazon thus stores a few gigabytes of what, without my key, is nothing but useless ones and zeroes, but when I download it with my special client it is decrypted on the fly back into the original files. Such a solution requires not only the knowledge of how to code one's own S3 client, but also enough knowledge of cryptography and computer security to judge whether a scheme is really secure, or whether it could be cracked by those with enough resources. I'm fortunate to be in a position to do this myself.
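For the curious, here is a minimal sketch of the idea in Python. It is not my actual client: it buffers whole files in memory rather than encrypting the stream in chunks, it leans on the boto3 and cryptography packages, and the bucket name, object key, and key-file path are all placeholders.

```python
# A minimal sketch of client-side encryption for S3 -- not my actual client.
import boto3
from cryptography.fernet import Fernet

BUCKET = "my-backup-bucket"  # placeholder bucket name

def load_key(path="s3.key"):
    # The key file is created once with Fernet.generate_key() and
    # never leaves this machine; lose it and the backups are gone.
    with open(path, "rb") as fh:
        return Fernet(fh.read())

def put_encrypted(local_path, s3_key):
    # Encrypt before boto3 ever sees the bytes; Amazon stores only ciphertext.
    f = load_key()
    with open(local_path, "rb") as fh:
        ciphertext = f.encrypt(fh.read())  # authenticated encryption (AES + HMAC)
    boto3.client("s3").put_object(Bucket=BUCKET, Key=s3_key, Body=ciphertext)

def get_decrypted(s3_key, local_path):
    # Download the ciphertext and decrypt it back into the original file.
    f = load_key()
    body = boto3.client("s3").get_object(Bucket=BUCKET, Key=s3_key)["Body"].read()
    with open(local_path, "wb") as fh:
        fh.write(f.decrypt(body))
```

The point is simply that the encryption happens before the bytes leave the machine: all Amazon ever stores is ciphertext, and the key file stays home.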

I'm sure that at some point there will be, and maybe there already is, a client program you can download to do this for you, where you set your own key phrase. But unless you audit the entire source code of that program, you can't be sure that it isn't sending your key out to some third party. An open source solution would allow you to check this, but frankly the time it would take to audit all the code would be longer than the time it takes to write your own (at least it was in my case). Until there is a widely audited and popularly acknowledged open source way of encrypting the stream before it leaves your computer, we'll never get beyond the issue of trusting the company you're giving your data to.
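If such a client did let you set your own key phrase, the sane way to turn a phrase into a key is a key-derivation function, never the raw phrase itself. A sketch, again assuming the cryptography package; the phrase and the salt handling here are placeholders:

```python
# Sketch: turning a user's key phrase into a Fernet key with PBKDF2.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_phrase(phrase, salt):
    # Many iterations make brute-forcing the phrase expensive.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return Fernet(base64.urlsafe_b64encode(kdf.derive(phrase.encode())))

salt = os.urandom(16)  # random but not secret; store it with the ciphertext
f = key_from_phrase("correct horse battery staple", salt)  # placeholder phrase
token = f.encrypt(b"secret data")
assert f.decrypt(token) == b"secret data"
```

The salt doesn't need to be kept secret; its job is to force an attacker to brute-force your phrase directly rather than use a precomputed table, so it can sit right next to the ciphertext.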

(The only problem now is keeping the source code to my client and my key file safe, since if I lose those I would be left unable to download my own backups!*)

* Don't worry, I have worked out a solution for this, but I'm not going to post it here!


More Stories By Jonathan Craven

Jonathan Craven is an American software engineer currently living and working in northern France.


