Confronting Identity Theft Head-On with Multi-Factor Authentication

Methods of identity theft have outpaced popular security measures, necessitating a new standard in data defense

The online world has become a dangerous place. According to a Ponemon Research survey, 90 percent of all companies fell victim to a security breach in the last twelve months. Hacking and advanced persistent threats (APTs) have rendered the two-factor authentication token, now over 20 years old, essentially obsolete. Without question, a real need exists for a truly secure approach to real-time multi-factor authentication to combat today's threats.

Remote Access Spikes Security Risk
The use of online services has exploded in the last decade as enterprises have adopted remote access as the default way to reach systems and conduct business. As online access has become pervasive, the threat of identity theft has grown with stunning speed and complexity. Ponemon Research surveyed more than 500 corporations and found that 90 percent had been successfully hacked in the last twelve months - a finding that underscores the need for major enterprises to adopt stringent, effective security methods to protect against breaches. As a result, modern mobile phone-based multi-factor authentication is in high demand.

Advances in Hacking
In the same way that the remote access industry has evolved, so have threats and their complexity. In the early days of online services, usernames and passwords were typically the only form of authentication. To defeat them, hackers used "brute force" attacks, in which a computer tries every possible character combination, or "dictionary attacks," in which it cycles through lists of common words and likely passwords until access is granted.
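To make the distinction concrete, here is a minimal sketch of the dictionary-attack loop described above (purely illustrative; the try_login callback and the wordlist file are hypothetical stand-ins for a real login endpoint and password list):

    # Minimal dictionary-attack loop (illustrative sketch only).
    # try_login is a hypothetical callback that returns True when the
    # guessed password is accepted by the target system.
    def dictionary_attack(username, wordlist_path, try_login):
        with open(wordlist_path, encoding="utf-8") as f:
            for line in f:
                candidate = line.strip()
                if try_login(username, candidate):
                    return candidate  # password found
        return None  # wordlist exhausted

The loop succeeds only while the target allows unlimited guesses.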

Systems eventually evolved to block these attempts by locking the account after a few failed attempts, leading hackers to develop new techniques such as key loggers. Today, the most widely used attacks are pharming, phishing or a combination of the two - methods that lead users to a counterfeit website that looks just like the original and trick them into entering their username and password. Some of the more advanced attacks send the stolen information to the hackers in real time via a small instant-messaging program, compromising many popular two-factor authentication tokens. As an example, Zeus malware captures a user's credentials - even advanced time-based token codes - and sends the information to the hacker.

As if that weren't enough, newer and more sophisticated methods of intercepting user interactions with online services have emerged in recent years, including man-in-the-browser, man-in-the-middle and session hijacking. Even the most secure traditional two-factor authentication token devices can no longer secure a user's identity against these new, more insidious threats. Yet many organizations are unaware that traditional tokens can be compromised, posing a significant security risk.
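To see why real-time relay defeats even time-based tokens, consider a standard RFC 6238 TOTP code of the kind many tokens generate (a generic sketch, not SMS PASSCODE's or any specific vendor's algorithm). The code depends only on a pre-shared seed and the clock, so it stays valid for the full time step, typically 30 seconds, no matter which session submits it:

    import hmac, hashlib, struct, time

    def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
        # Standard RFC 6238 TOTP: HMAC-SHA1 over the current time step.
        counter = int(at // step)
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F  # RFC 4226 dynamic truncation
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    secret = b"shared-seed"            # hypothetical seed file contents
    now = int(time.time() // 30) * 30  # align to a time step for the demo
    victim_code = totp(secret, now)          # code the victim types into a fake site
    relayed_code = totp(secret, now + 5)     # what the real site expects 5 seconds later
    assert victim_code == relayed_code       # same step, so the stolen code still works

Because nothing in the code identifies the session it was meant for, a phishing proxy that forwards it within the window logs in as the victim.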

Many Security Technologies Fall Short
Today's ever-changing threat environment creates a never-ending battle in which organizations must constantly evaluate the right level of investment in security. Often, the best possible protection is not financially feasible, and a trade-off has to be made. To protect against identity theft schemes within budgetary constraints, organizations have sampled different technologies, including certificates, biometric scanning, identity cards, and hardware and software tokens, with tokens being the dominant technology. Certificates are often viewed as the ideal way to connect two devices with a secure, identifiable connection, but deploying and administering them is burdensome, they can be copied without the user's knowledge, and the certificate authority itself might be compromised.

Biometric scanning has also enjoyed some success and is often seen as a very secure alternative. However, the assumption that a functioning fingerprint or iris scanner is always at hand has proven impractical, and the scan itself produces a digital file that can be compromised. Another alternative, the identity card, often proves impractical in a world of Bring Your Own Device ("BYOD"), where users demand access from an ever-changing variety of devices. A new approach is therefore needed.

A Mobile Approach to Security
Many organizations have begun using multi-factor authentication based on mobile networks to address today's threats while meeting users' demand for easier, more flexible solutions.

Two elements drive the adoption of the new crop of multi-factor authentication: one, the need to deliver hardened security that anticipates novel threats; and two, the need to deploy this level of security easily and at a low cost. The device used in the authentication process also needs to be connected to the network in real time and be unique to the user in question.

If the authentication engine simply sends a regular token code via SMS, however, today's malware can steal the code easily. Organizations must therefore seek strategies that operate effectively in a message-based environment to defend against modern threats. Key elements can include:

  • One-time password: To get the highest possible level of security, the one-time password (OTP) must both be generated in real time and be specific (locked) to the particular session, as opposed to tokens that use seed files where the passcodes are stored (a sketch of this session-locking idea follows this list).
  • Minimal complexity: To minimize infrastructure complexity, the solution should plug into different login scenarios, such as Citrix, VMware, Cisco, Microsoft, SSL VPNs, IPsec VPNs and web logins. Other ways to minimize infrastructure overload include providing these logins in an integrated, session-based architecture.
  • Multiple defenses: To support real-time code delivery, the organization needs robust and redundant server-side architecture along with multiple delivery mechanism support, regardless of geographic location.
  • Easy management: The solution should be able to be managed easily within the existing user management infrastructure.
  • Context-specific: To maximize security, the company should leverage contextual information - such as geo-location and behavior patterns - to effectively authenticate the user.
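As a rough sketch of how the first bullet's session locking might look (an illustrative construction, not SMS PASSCODE's actual protocol), the server can generate the code only after a specific login session begins, bind it to that session ID, and expire it within seconds:

    import hmac, hashlib, secrets, time

    SESSIONS = {}  # session_id -> (code, expiry); in-memory store for the sketch

    def issue_code(session_id: str, server_key: bytes, ttl: int = 30) -> str:
        # Derive a one-time code bound to this login session alone.
        nonce = secrets.token_bytes(8)  # fresh randomness per login attempt
        mac = hmac.new(server_key, session_id.encode() + nonce, hashlib.sha256)
        code = str(int.from_bytes(mac.digest()[:4], "big") % 10 ** 6).zfill(6)
        SESSIONS[session_id] = (code, time.time() + ttl)
        return code  # delivered to the user's phone in real time

    def verify(session_id: str, submitted: str) -> bool:
        # The code is accepted once, and only for the session that requested it.
        code, expiry = SESSIONS.pop(session_id, (None, 0.0))
        return code is not None and time.time() < expiry \
            and hmac.compare_digest(code, submitted)

A code stolen from one session is useless in any other, and because it is generated at login time rather than derived from a pre-shared seed file, there is no seed for malware like Zeus to replay.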

The Security Horizon
The convenience of online services has brought with it the modern scourge of identity theft. Methods of identity theft have outpaced popular security measures, necessitating a new standard in data defense: session- and location-specific multi-factor authentication. Delivered in real time to a user's mobile phone, this kind of solution can provide the security organizations must have if they hope to protect their employees, users and data from modern online threats.

About the Author

Claus Rosendal is a founding member of SMS PASSCODE A/S, where he oversees product strategy and development in the role of Chief Technology Officer. Prior to founding SMS PASSCODE A/S, he was a co-founder of Conecto A/S, a leading consulting company in mobile computing and IT security solutions with special emphasis on Citrix, Blackberry and other advanced handheld devices. Before that, he headed up his own IT consulting company, where he was responsible for several successful ERP implementations at different companies (C5 / SAP). Claus holds a master's degree in computer science from the University of Copenhagen.
