
@CloudExpo: Blog Post

Cloud Security: Five Lessons from the Yahoo Password Breach

This one-off incident should not dampen enterprise enthusiasm for a road map to hybrid computing adoption

Last week, one news item that attracted media attention was the theft of nearly 450,000 passwords from the Yahoo service called ‘Yahoo Voice'. Communications about the incident state that SQL injection was the primary technique the hackers used to extract the information from the databases and publish it.

As per further communications, the affected company is taking more precautions to ensure that security is its highest priority. Incidents like these also tend to shake cloud adoption at the enterprise level, where they can only increase the fear, uncertainty, and doubt in the minds of CIOs.

However, the following are best practices and guidelines that any enterprise should adopt when moving to hybrid cloud computing, and a one-off incident should not dampen their road map to hybrid computing adoption.

In other words, the incident is not really due to the adoption of cloud or a multi-tenant model, but rather to not following age-old best practices. This also underlines the significance of cloud brokers or intermediaries with substantial knowledge of the enterprise computing paradigm, who can play an increasing role in ensuring that enterprises adopt cloud computing securely.

Lessons Learned from the Yahoo Password Hacking

1. Not Following Security Fundamentals During Development
Whatever the changes in technology, the fundamentals of development and coding remain the same. SaaS vendors are often under heavy time-to-market pressure, which may at times tempt them to work around security fundamentals, and that is not correct. If you are accepting input from a user, it must always be validated before it is acted upon. Lack of input validation is the root cause behind attacks like cross-site scripting and SQL injection. In a multi-tenant world the risk of SQL injection is multiplied many fold, because a hacker can always enter the system as a legitimate user by creating a valid account and then start harvesting other customers' data.

I elaborated on SQL injection in SaaS in an earlier article back in 2010, and that thought process is still valid when developing SaaS and multi-tenant applications.
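The input-validation point above can be sketched with a parameterized query. This is a hypothetical illustration using Python's `sqlite3` module; the table and function names are made up for the example, not taken from any real product.

```python
import sqlite3

def find_user(conn, email):
    # Vulnerable pattern (commented out): string concatenation lets an
    # attacker inject SQL, e.g. email = "' OR '1'='1" returns every row.
    # query = "SELECT id, email FROM users WHERE email = '" + email + "'"

    # Safe pattern: a parameterized query; the driver treats the value
    # strictly as data, never as SQL text.
    cur = conn.execute("SELECT id, email FROM users WHERE email = ?", (email,))
    return cur.fetchall()

# Demo with an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO users (email) VALUES (?)",
                 [("alice@example.com",), ("bob@example.com",)])

print(find_user(conn, "alice@example.com"))  # the one matching row
print(find_user(conn, "' OR '1'='1"))        # injection attempt matches nothing
```

The same placeholder mechanism exists in every mainstream database driver, so there is rarely a good reason to build SQL by string concatenation.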

2. Not Encrypting at the Database Level
Encryption of key data is one of the most important security aspects of any database in general, and of a multi-tenant database in particular. However, enterprises often take the path of encrypting at the virtual machine or volume level, which means the entire disk is useless to a hacker even if it is physically stolen.

While this is a very useful feature, that level of encryption does not help when a hacker gains legitimate access to the virtual machine in which the database is hosted. So, additionally, database-level encryption, which further enforces encryption at the user level (i.e., only users who have been granted READ permissions on the database can view the data), provides an added layer of security.

In my earlier article on Protecting Data At Rest In Public Clouds, I compared middleware and RDBMS options for protecting data. As is evident there, a combination of both makes a multi-tenant database highly secure.

3. Exposing Too Much Metadata
How many times have you seen a database table storing credit card information named ‘CREDIT_CARD', or the password column in a user database named PIN or PASSWORD? While past best practices of database design favored descriptive column names and metadata such as database-level comments, in today's world these can be detrimental to sensitive data.

It is always advisable to strip excessive metadata from sensitive columns and keep it in the supporting documentation instead. There is no rule that says credit card information cannot be stored in a table named temp_t1, with your application alone knowing that temp_t1 is indeed the table containing credit card data.
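One simple way to realize this is a logical-to-physical name mapping held in the application (or in secured configuration), so the schema itself reveals nothing. The table and column names here are hypothetical, chosen only to mirror the temp_t1 example above:

```python
# Hypothetical mapping kept in application code or secured config;
# the database schema exposes only the meaningless physical names.
PHYSICAL_NAMES = {
    "credit_cards": "temp_t1",          # logical table -> physical table
    "credit_cards.card_number": "c1",   # logical column -> physical column
    "credit_cards.expiry": "c2",
}

def physical(logical):
    """Resolve a logical name to its obscure physical name."""
    return PHYSICAL_NAMES[logical]

query = "SELECT {} FROM {}".format(
    physical("credit_cards.card_number"), physical("credit_cards"))
print(query)  # SELECT c1 FROM temp_t1
```

An attacker browsing the catalog sees only temp_t1 with columns c1 and c2; the documentation that gives them meaning lives outside the database.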

4. Not Using Best of Features in the Databases
Today most software vendors are under pressure to make their products run against multiple databases, which gives them much wider market reach. While this is important from a business perspective, the restriction often results in products that use powerful RDBMS systems as mere data stores without utilizing their best security features. The result is reduced security at the database level, because none of those features is actually used.

In my earlier article on Implementing Multi Tenancy Using Oracle Virtual Private Database, I elaborated on how features like Oracle VPD, when properly applied, keep data invisible to a user unless the application sets the appropriate context. Similarly, these features can mask certain columns so they are not visible when queried directly, and features like fine-grained auditing provide extensive audit trails against database security breaches.

Also, if database-level security is properly applied, roles, grants, and privileges can be arranged so that database connections get only EXECUTE privileges on stored procedures and no direct READ access to the underlying tables. Design patterns of this kind protect the data further.

This points to the fact that product vendors should concentrate on the unique security features of databases like Oracle, SQL Server, and DB2 and build them into their designs alongside application-level security. The need to port an application to multiple databases should not reduce its security.
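To show the idea behind VPD without an Oracle instance, here is a rough application-tier analogue: every query is transparently filtered by a tenant context that callers cannot bypass. This is only an illustration of the pattern; real VPD enforces the predicate inside the database itself, which is strictly stronger. All names are hypothetical.

```python
import sqlite3

class TenantScopedDB:
    """Application-tier analogue of a VPD policy: every query is
    automatically filtered by the tenant context set for the session."""

    def __init__(self, conn, tenant_id):
        self.conn = conn
        self.tenant_id = tenant_id  # the "context", set once per session

    def fetch_orders(self):
        # The tenant predicate is always appended; callers cannot forget it.
        cur = self.conn.execute(
            "SELECT id, amount FROM orders WHERE tenant_id = ?",
            (self.tenant_id,))
        return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, tenant_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "acme", 10.0), (2, "globex", 99.0)])

print(TenantScopedDB(conn, "acme").fetch_orders())  # only acme's rows
```

With true VPD, even a stolen database login sees only the rows its context allows, which is exactly the property a multi-tenant SaaS needs.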

5. Not Masking the Data
Masking data means taking the data in its current form and converting it into a different format for storage. It differs slightly from encryption in that the data remains readable but makes no sense unless you know the de-masking algorithm. Traditionally, masking is done only when production data is ported to a test or development machine, or to an offshore development location.

However, we have not seen many instances of live production systems using masking techniques to protect sensitive information.

No golden rule states that a social security number must always be stored as a single, contiguous column of nine characters. What if it were stored in three different columns with the digits flipped, and the application reversed them and reassembled a proper social security number? This is just an example; the same principle can be applied to a credit card number, a password, or a PIN, so that only the application knows how to make sense of the masked sensitive data.
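The flipped-and-split scheme described above can be sketched in a few lines. This is a toy layout chosen purely to match the example, not a recommended production algorithm (in practice masking would be combined with the encryption and hashing techniques discussed earlier):

```python
def mask_ssn(ssn):
    # Split a 9-digit SSN into three parts and reverse each part's digits.
    # Only the application knows this layout; a raw table dump is meaningless.
    parts = (ssn[0:3], ssn[3:5], ssn[5:9])
    return tuple(p[::-1] for p in parts)

def unmask_ssn(parts):
    # Reverse each stored part and concatenate to recover the original.
    return "".join(p[::-1] for p in parts)

masked = mask_ssn("123456789")
print(masked)              # ('321', '54', '9876')
print(unmask_ssn(masked))  # 123456789
```

An attacker who dumps the three columns sees fragments in the wrong order with reversed digits, and has no indication that they even belong together.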

As indicated at the beginning of the article, there will be a few security breaches when so many new services are delivered over the public internet in a short time. However, enterprises need not worry, because these issues do not stem from the fundamental concepts of cloud or multi-tenancy themselves, but from the way they are implemented. Adopting good design patterns like the DAO access pattern, which abstracts database calls from the application, and performing proper validation along with the database-level protection techniques explained above, will prevent these incidents in the future.
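The DAO pattern mentioned above concentrates validation and parameterization in one place, so no other part of the application ever builds SQL. A minimal sketch, with hypothetical class and table names and a deliberately simple email check:

```python
import re
import sqlite3

class UserDAO:
    """Data Access Object: the rest of the application never touches SQL
    directly, so validation and parameterization live in exactly one place."""

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # illustrative check

    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS users (email TEXT)")

    def add_user(self, email):
        if not self.EMAIL_RE.match(email):   # validate before acting
            raise ValueError("invalid email")
        self.conn.execute("INSERT INTO users (email) VALUES (?)", (email,))

dao = UserDAO(sqlite3.connect(":memory:"))
dao.add_user("alice@example.com")            # accepted
try:
    dao.add_user("'; DROP TABLE users; --")  # rejected at validation
except ValueError as e:
    print("rejected:", e)
```

Because every caller goes through the DAO, fixing or tightening a rule here fixes it everywhere at once.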

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing digital technologies to enable the next-generation enterprise. Believes in enterprise transformation through the natives (cloud native and mobile native).

