
Four Great Tips: Cloud Security for Big Data

The combination of cloud computing and big data is a match made in heaven

The combination of cloud computing and big data is a match made in heaven. Big data requires a flexible compute environment that can scale quickly and automatically to support massive amounts of data, and infrastructure clouds provide exactly that. But whenever cloud computing is discussed, the question comes up:

What about cloud security for big data?

When it comes to cloud security in a big data use case, the expectation is that any security solution will provide the same flexibility as the cloud without compromising the overall security of the implementation. When taking your big data to the cloud, the following four tips will enable you to achieve cloud flexibility paired with strict cloud security.

1. Encrypt sensitive data (seriously)

Data encryption creates the "virtual walls" for your cloud infrastructure. Deploying cloud encryption is considered a fundamental first step, but there is no "one size fits all" solution. Some encryption solutions require an on-premises encryption gateway, which does not work well in cloud big data scenarios. Other approaches, such as data encryption provided by the cloud provider itself, force the end user to trust someone else with the encryption keys, which is both risky and a compliance deal-breaker.

Recent encryption technologies, like split-key encryption, are tailored specifically to the cloud and leverage the best of both worlds by providing an infrastructure cloud solution while keeping the encryption keys safe and in the hands of the customer.

To achieve the best possible encryption for your big data scenario, use split-key encryption.
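To make the idea concrete, here is a minimal sketch of the split-key concept in Python. It uses a simple XOR-based two-way split, which is only an illustration of the principle (neither share alone reveals the key; both are needed to reconstruct it); the actual scheme, key sizes, and names used by any given product are not specified here.

```python
import os

def make_key_shares() -> tuple:
    """Split a fresh 256-bit data key into two random shares (XOR split).
    Neither share alone reveals anything about the key."""
    data_key = os.urandom(32)
    customer_share = os.urandom(32)  # kept by the customer, never by the provider
    cloud_share = bytes(a ^ b for a, b in zip(data_key, customer_share))
    return customer_share, cloud_share

def recombine(customer_share: bytes, cloud_share: bytes) -> bytes:
    """Reconstruct the data key; requires both shares to be present."""
    return bytes(a ^ b for a, b in zip(customer_share, cloud_share))
```

The design point is that the provider can store and serve one share at cloud scale while the customer retains the other, so no single party ever holds the complete key at rest.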

2. Look for cloud security solutions that can architecturally scale

In big data, each component of the architecture should scale, and the cloud security solution is no different. When selecting a cloud security solution, make sure it is available across all relevant cloud geo-locations. Furthermore, it must scale effectively with your big data infrastructure.

In practice, this means that hardware cannot be involved: Hardware Security Modules (HSMs) do not fit the big data use case because they cannot scale and flex to fit the cloud model.

To achieve the necessary scalability, use a cloud security solution that is designed for the cloud, but achieves security that is comparable to (or better than) hardware-based solutions.

3. Automate as much as possible

Big data cloud customers are often frustrated by the fact that their cloud security architecture does not scale easily (see tip #2). Traditional encryption solutions require an HSM (hardware) element, and, needless to say, a hardware implementation cannot be automated.

To be able to automate as much of your cloud security as possible, strive for a virtual appliance approach, not a hardware approach. Also, make sure that a usable API (ideally a RESTful API) is available as part of the cloud security offering.

A virtual appliance plus RESTful API will enable the required flexibility and automation needed in a cloud big data use case.
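As a hedged sketch of what such automation could look like, the snippet below provisions an encryption key through a RESTful API exposed by a virtual appliance. The endpoint, URL, payload, and token are all hypothetical; consult your vendor's API documentation for the real interface.

```python
import json
import urllib.request

API_BASE = "https://keymanager.example.com/v1"  # hypothetical appliance endpoint
API_TOKEN = "example-token"                     # placeholder; issued out of band

def build_key_request(project_id: str) -> urllib.request.Request:
    """Build a POST request asking the appliance for a new AES-256 key."""
    return urllib.request.Request(
        f"{API_BASE}/projects/{project_id}/keys",
        data=json.dumps({"algorithm": "AES-256"}).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def create_project_key(project_id: str) -> dict:
    """Send the request and return the appliance's JSON response."""
    with urllib.request.urlopen(build_key_request(project_id)) as resp:
        return json.load(resp)
```

Because the same call works from a script, a configuration-management tool, or an auto-scaling hook, key provisioning can happen as automatically as the big data nodes themselves spin up.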

4. Do not compromise on data security

Because cloud security is often complicated, we see "security shortcuts" in big data implementations. These shortcuts are usually taken to avoid complexity and keep the big data architecture intact.

Some customers use freeware encryption tools and keep the encryption key on disk (which is highly insecure and may expose the encrypted data to anyone with access to the virtual disk), while others simply do not encrypt. These shortcuts are certainly not complicated, but, obviously, they are also not secure.

When it comes to big data security, map your data according to its sensitivity and protect it accordingly. Not all big data infrastructure is secure; if the data at stake is regulated or sensitive, the consequences of a breach can be dramatic, and you may need to find a more secure alternative.
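One way to make that sensitivity mapping concrete is a simple classification table that drives the protection applied to each dataset. The tiers, field names, and policy values below are illustrative assumptions, not a compliance prescription.

```python
# Illustrative only: tiers and policy values are assumptions,
# not a compliance prescription.
SENSITIVITY_POLICY = {
    "public":    {"encrypt": False, "customer_held_keys": False},
    "internal":  {"encrypt": True,  "customer_held_keys": False},
    "regulated": {"encrypt": True,  "customer_held_keys": True},  # e.g. HIPAA, PCI
}

def protection_for(sensitivity: str) -> dict:
    """Return the protection controls required for a sensitivity tier."""
    return SENSITIVITY_POLICY[sensitivity]

# Map each dataset to its tier before it moves to the cloud.
datasets = [
    {"name": "clickstream_logs", "sensitivity": "internal"},
    {"name": "patient_records",  "sensitivity": "regulated"},
]
plan = {d["name"]: protection_for(d["sensitivity"]) for d in datasets}
```

Even a table this small forces the right question per dataset — what happens if this leaks? — instead of applying one shortcut to everything.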

Cloud security for big data is available

Big data can continue to enjoy the scalability, flexibility, and automation offered by cloud computing while maintaining the strictest security standards for the data. Encryption is considered a fundamental first step in protecting cloud (big) data, and new technologies such as split-key encryption and homomorphic key management should be leveraged to protect sensitive data and comply with regulations like HIPAA, PCI, and many others.

The post 4 Great Tips: Cloud Security for Big Data appeared first on Porticor Cloud Security.

More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor, is a pioneer of Cloud Computing. He built SaaS clouds for small and medium enterprises at SAP (as CTO, Small Business), contributing to several SAP products that reached more than 8 million users. He later created a consumer cloud at G.ho.st - a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people, and a variety of cloud-based applications. He is now CEO of Porticor, a leader in Virtual Privacy and Cloud Security.
