Four Great Tips: Cloud Security for Big Data

The combination of cloud computing and big data is a match made in heaven

The combination of cloud computing and big data is a match made in heaven. Big data requires a flexible compute environment that can scale quickly and automatically to support massive amounts of data. Infrastructure clouds provide exactly that. But whenever cloud computing is discussed, the question comes up:

What about cloud security for big data?

When it comes to cloud security in a big data use case, the expectation is that any security solution will provide the same flexibility as the cloud without compromising the overall security of the implementation. When you take your big data to the cloud, the following four tips will help you achieve cloud flexibility paired with strict cloud security.

1. Encrypt sensitive data (seriously)

Data encryption creates the “virtual walls” for your cloud infrastructure. Deploying cloud encryption is considered a fundamental first step, but there is no “one size fits all” solution. Some encryption solutions require an on-premises encryption gateway, which does not work well in cloud big-data scenarios. Other approaches (for example, data encryption powered by the cloud provider itself) force the end user to trust someone else with the encryption keys, which is both risky and a compliance deal-breaker.

Recent encryption technologies, like split-key encryption, are tailored specifically to the cloud and leverage the best of both worlds by providing an infrastructure cloud solution while keeping the encryption keys safe and in the hands of the customer.

To achieve the best possible encryption for your big data scenario, use split-key encryption.
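
To see the split-key idea in action, here is a minimal sketch in Python (an illustration of the concept only, not any vendor's actual implementation; the XOR split and the AES-GCM cipher are assumptions made for the example): a fresh data-encryption key is split into two shares, one held by the customer and one by the key service, so neither share alone can decrypt anything.

```python
# Minimal split-key encryption sketch (illustrative only).
# A data-encryption key is XOR-split into two shares: the customer keeps
# one, the cloud key service holds the other. Neither share alone reveals
# anything about the key.
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are needed to reconstruct it."""
    share_a = secrets.token_bytes(len(key))               # customer-held share
    share_b = bytes(x ^ y for x, y in zip(key, share_a))  # service-held share
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares into the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# Encrypt a record with a fresh 256-bit AES-GCM key, then split the key.
key = AESGCM.generate_key(bit_length=256)
customer_share, service_share = split_key(key)
nonce = secrets.token_bytes(12)
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive big-data record", None)
del key  # the whole key is never stored; only the two shares persist

# Decryption requires both shares, so no single party can read the data.
recovered = join_key(customer_share, service_share)
assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"sensitive big-data record"
```

The full key exists only momentarily, during an encryption or decryption operation; a stolen copy of either share, by itself, reveals nothing.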

2. Look for cloud security solutions that can architecturally scale

In big data, each component of the architecture should scale, and the cloud security solution is no different. When selecting a cloud security solution, make sure it is available across all relevant cloud geo-locations. Furthermore, it must scale effectively with your big data infrastructure.

In practice, this means that hardware cannot be involved. Hardware Security Modules (HSMs) do not fit the big data use case because they cannot scale and flex to fit the cloud model.

To achieve the necessary scalability, use a cloud security solution that is designed for the cloud, but achieves security that is comparable to (or better than) hardware-based solutions.
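
Purely as an illustration of what “designed for the cloud” buys you (the endpoint URLs below are hypothetical), a software-based key service scales out by simply adding instances to a pool, which is exactly what an HSM cannot do:

```python
# Illustrative only: spreading encryption traffic across a pool of
# virtual-appliance endpoints. Scaling out is just adding an instance;
# no hardware is involved, so it can happen automatically.
import threading

class AppliancePool:
    """Round-robin over virtual key-appliance endpoints (hypothetical URLs)."""

    def __init__(self, endpoints):
        self._endpoints = list(endpoints)
        self._index = 0
        self._lock = threading.Lock()

    def next_endpoint(self):
        with self._lock:
            endpoint = self._endpoints[self._index % len(self._endpoints)]
            self._index += 1
            return endpoint

    def scale_out(self, endpoint):
        """Adding capacity is just registering another software instance."""
        with self._lock:
            self._endpoints.append(endpoint)

pool = AppliancePool(["https://keys-us-east.example.com",
                      "https://keys-eu-west.example.com"])
pool.scale_out("https://keys-ap-south.example.com")  # elastic, per this tip
print(pool.next_endpoint())
```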

3. Automate as much as possible

Big data cloud customers are frustrated by the fact that their cloud security architecture does not easily scale (see tip #2). Traditional encryption solutions require an HSM (hardware) element, and needless to say, a hardware implementation cannot be automated.

To automate as much of your cloud security as possible, strive for a virtual appliance approach, not a hardware approach. Also, make sure a usable API (ideally a RESTful API) is available as part of the cloud security offering.

A virtual appliance plus RESTful API will enable the required flexibility and automation needed in a cloud big data use case.
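
As a hedged sketch of what that looks like in practice (the endpoint, paths, and fields below are invented for illustration and do not describe any particular product's API), provisioning a key becomes a single scriptable HTTP call that a deployment pipeline can run unattended:

```python
# Hypothetical RESTful key-management API: every step a human would click
# through becomes a scriptable HTTP call, so key provisioning can run
# inside an automated deployment pipeline. Paths and fields are examples.
import requests

BASE_URL = "https://keys.example.com/api/v1"  # hypothetical appliance endpoint
TOKEN = "api-token-from-secure-config"        # never hard-code real secrets

session = requests.Session()
session.headers.update({"Authorization": f"Bearer {TOKEN}"})

def provision_key(project: str) -> str:
    """Create a new encryption key for a project and return its ID."""
    resp = session.post(f"{BASE_URL}/keys",
                        json={"project": project, "algorithm": "AES-256"})
    resp.raise_for_status()
    return resp.json()["key_id"]

if __name__ == "__main__":
    key_id = provision_key("big-data-cluster")
    print(f"Provisioned key {key_id}; cluster nodes can request it at boot.")
```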

4. Do not compromise on data security

Because cloud security is often complicated, we see “security shortcuts” in big data implementations. These shortcuts are usually taken to avoid complexity and keep the big data architecture “unharmed.”

Some customers use freeware encryption tools and keep the encryption key on disk (which is highly insecure and may expose the encrypted data to anyone with access to the virtual disk), while others simply do not encrypt. These shortcuts are certainly not complicated, but, obviously, they are also not secure.

When it comes to big data security, map your data according to its sensitivity and protect it accordingly. In some cases the consequences of a shortcut are dramatic: not all big data infrastructure is secure, and if the data at stake is regulated or sensitive, you may need to find an alternative.
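
Mapping data by sensitivity can start with something as simple as an explicit classification policy that the ingestion pipeline consults before anything lands in the cloud. A minimal sketch, with example categories rather than any standard taxonomy:

```python
# A minimal data-classification map: each sensitivity level dictates the
# protection required before data may enter the cloud. The levels and
# rules are examples only; derive your own from HIPAA, PCI DSS, etc.
POLICY = {
    "public":     {"encrypt": False, "cloud_ok": True},
    "internal":   {"encrypt": True,  "cloud_ok": True},
    "regulated":  {"encrypt": True,  "cloud_ok": True, "customer_keys": True},
    "restricted": {"encrypt": True,  "cloud_ok": False},  # keep on-premises
}

def admit_to_cloud(record_class: str) -> bool:
    """Return True only if policy allows this class of data in the cloud."""
    return POLICY[record_class]["cloud_ok"]

for cls in ("public", "regulated", "restricted"):
    print(cls, "->", "cloud" if admit_to_cloud(cls) else "find an alternative")
```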

Cloud security for big data is available

Big data can continue to enjoy the scalability, flexibility, and automation offered by cloud computing while maintaining the strictest security standards for the data. Encryption is considered a fundamental first step in protecting cloud (big) data, and new technologies such as split-key encryption and homomorphic key management should be leveraged to protect sensitive data and comply with regulations such as HIPAA and PCI DSS.
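
As a taste of why homomorphic techniques matter here: with a partially homomorphic scheme such as Paillier, a service can compute on encrypted values without ever holding the private key. The sketch below uses the third-party python-paillier (`phe`) library to illustrate the principle; it is not a description of any product's homomorphic key management.

```python
# Additively homomorphic encryption with Paillier (pip install phe).
# The untrusted party can add encrypted numbers without the private key;
# the same principle lets a key service work with material it cannot read.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

enc_a = public_key.encrypt(17)
enc_b = public_key.encrypt(25)

enc_sum = enc_a + enc_b                    # computed entirely on ciphertexts
assert private_key.decrypt(enc_sum) == 42  # only the key owner can decrypt
```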

The post 4 Great Tips: Cloud Security for Big Data appeared first on Porticor Cloud Security.

More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor, is a pioneer of cloud computing. He has built SaaS clouds for medium and small enterprises at SAP (as CTO, Small Business), contributing to several SAP products and reaching more than 8 million users. More recently he created a consumer cloud at G.ho.st, a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people, and a variety of cloud-based applications. He is now CEO of Porticor, a leader in virtual privacy and cloud security.
