
@CloudExpo: Blog Feed Post

Four Great Tips: Cloud Security for Big Data

The combination of cloud computing and big data is a match made in heaven

The combination of cloud computing and big data is a match made in heaven. Big data requires a flexible compute environment, which can scale quickly and automatically to support massive amounts of data. Infrastructure clouds provide exactly that. But whenever cloud computing is discussed, the question comes up:


What about cloud security for big data?

When it comes to cloud security in a big data use case, the expectation is that any security solution will provide the same flexibility as the cloud without compromising the overall security of the implementation. When taking your big data to the cloud, the following four tips will enable you to achieve cloud flexibility paired with strict cloud security.

1.  Encrypt sensitive data (seriously)

Data encryption creates the “virtual walls” for your cloud infrastructure. Deploying cloud encryption is a fundamental first step, but there is no “one size fits all” solution. Some solutions require an on-premises encryption gateway, which does not work well in cloud big-data scenarios. Other approaches (for example, data encryption provided by the cloud provider itself) force the end user to trust someone else with the encryption keys, which is both risky and a compliance deal-breaker.

Recent encryption technologies, like split-key encryption, are tailored specifically to the cloud and leverage the best of both worlds by providing an infrastructure cloud solution while keeping the encryption keys safe and in the hands of the customer.

To achieve the best possible encryption for your big data scenario, use split-key encryption.
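To make the idea concrete, here is a minimal sketch of the split-key principle using simple XOR secret sharing: the master key is divided into two shares, one held by the customer and one stored in the cloud, and neither share alone reveals anything about the key. (This is an illustrative toy, not the vendor's actual implementation; production systems layer this onto real key-management and encryption machinery.)

```python
import os

def split_key(master_key: bytes):
    """Split a master key into two shares; both are required to recover it."""
    share_a = os.urandom(len(master_key))                        # held by the customer
    share_b = bytes(a ^ b for a, b in zip(master_key, share_a))  # stored in the cloud
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares into the original master key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master = os.urandom(32)  # 256-bit data-encryption key
customer_share, cloud_share = split_key(master)

assert join_key(customer_share, cloud_share) == master
assert customer_share != master and cloud_share != master
```

Because the cloud-side share is random relative to the key, a provider (or an attacker who compromises the provider) learns nothing without the customer-held share.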

2. Look for cloud security solutions that can architecturally scale

In big data, each component of the architecture should scale, and the cloud security solution is no different. When selecting a cloud security solution, make sure it is available across all relevant cloud geo-locations. Furthermore, it must scale effectively with your big data infrastructure.

In practice, this rules out hardware. Hardware Security Modules (HSMs) do not fit the big data use case because they cannot scale and flex to match the cloud model.

To achieve the necessary scalability, use a cloud security solution that is designed for the cloud, but achieves security that is comparable to (or better than) hardware-based solutions.

3. Automate as much as possible

Big data cloud customers are often frustrated by the fact that their cloud security architecture does not scale easily (see tip #2). Traditional encryption solutions require an HSM (hardware) element, and hardware implementations cannot be automated.

To be able to automate as much of your cloud security as possible, strive for a virtual appliance approach, not a hardware approach.  Also, make sure that a usable API (ideally a RESTful API) is available as part of the cloud security offering.

A virtual appliance plus RESTful API will enable the required flexibility and automation needed in a cloud big data use case.
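As a sketch of what that automation looks like, the snippet below builds a REST call that asks a key-management service to provision a new data-encryption key for a big data cluster. The endpoint, token, and JSON fields are hypothetical placeholders, not a real vendor API; the point is that key provisioning becomes a scriptable HTTP request rather than a manual hardware step.

```python
import json
import urllib.request

API_BASE = "https://keyserver.example.com/v1"  # hypothetical key-management endpoint
API_TOKEN = "example-token"                    # hypothetical credential

def make_key_request(project_id: str, algorithm: str = "AES-256"):
    """Build a REST request that provisions a new data-encryption key
    for a project. Field names are illustrative, not a real API."""
    payload = json.dumps({"project": project_id, "algorithm": algorithm}).encode()
    return urllib.request.Request(
        f"{API_BASE}/keys",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = make_key_request("bigdata-cluster-01")
# urllib.request.urlopen(req)  # in a real deployment this sends the call
```

The same pattern extends to key rotation and revocation: each operation is an API call, so the security layer can be driven by the same orchestration tools that scale the cluster itself.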

4. Do not compromise on data security

Because cloud security is often complicated, we see “security shortcuts” in big data implementations. These shortcuts are usually taken to avoid complexity and keep the big data architecture “unharmed.”

Some customers use freeware encryption tools and keep the encryption key on disk (which is highly insecure and may expose the encrypted data to anyone with access to the virtual disk), while others simply do not encrypt. These shortcuts are certainly not complicated, but, obviously, they are also not secure.
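The key-on-disk shortcut fails for a simple reason, illustrated below with a deliberately toy cipher: once the key sits on the same virtual disk as the data, anyone who can read the disk image has both pieces, and “decryption” is one function call. (The XOR cipher here is for illustration only and is not real encryption.)

```python
import os

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Toy cipher for illustration only -- NOT real encryption."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

key = os.urandom(16)
ciphertext = xor_stream(b"patient-record-42", key)

# The shortcut: the key is saved on the same virtual disk as the ciphertext.
# Anyone with access to the disk image now holds both, so recovery is trivial:
recovered = xor_stream(ciphertext, key)
assert recovered == b"patient-record-42"
```

The encryption algorithm is irrelevant here; the weakness is entirely in where the key lives.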

When it comes to big data security, map your data according to its sensitivity and protect it accordingly. Not all big data infrastructure is secure, and if the data at stake is regulated or otherwise sensitive, the consequences of a shortcut can be dramatic, and you may need to find a more secure alternative.
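One hedged way to sketch that mapping in practice: tag each data set with a sensitivity level, then derive the protection policy (whether to encrypt, and who holds the keys) from the strictest tag present. The levels and policy fields below are illustrative assumptions, not a standard.

```python
# Illustrative sensitivity map: classify each data set, then derive a policy.
SENSITIVITY_POLICY = {
    "public":    {"encrypt": False, "key_owner": None},
    "internal":  {"encrypt": True,  "key_owner": "provider"},
    "regulated": {"encrypt": True,  "key_owner": "customer"},  # HIPAA/PCI-class data
}

ORDER = ["public", "internal", "regulated"]  # least to most sensitive

def policy_for(dataset_tags):
    """Return the policy for the strictest sensitivity tag on a data set.
    Untagged data defaults to the strictest level, as a safe fallback."""
    known = [t for t in dataset_tags if t in ORDER]
    level = max(known, key=ORDER.index, default="regulated")
    return SENSITIVITY_POLICY[level]

print(policy_for(["internal", "regulated"]))  # regulated wins: customer-held keys
```

Defaulting unknown data to the strictest level is a deliberate choice: misclassifying sensitive data as public is far costlier than the reverse.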

Cloud security for big data is available

Big data can continue to enjoy the scalability, flexibility, and automation offered by cloud computing while maintaining the strictest security standards for the data.  Encryption is considered a fundamental first step in protecting cloud (big) data, and new technologies such as split-key encryption and homomorphic key management should be leveraged to protect sensitive data and comply with regulations like HIPAA, PCI, and many others.

The post 4 Great Tips: Cloud Security for Big Data appeared first on Porticor Cloud Security.


More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor is a pioneer of Cloud Computing. He has built SaaS Clouds for medium and small enterprises at SAP (CTO Small Business); contributing to several SAP products and reaching more than 8 million users. Recently he has created a consumer Cloud at G.ho.st - a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people and a variety of cloud-based applications. He is now CEO of Porticor, a leader in Virtual Privacy and Cloud Security.
