The Impact of the Cloud on Digital Forensics - Part 1

Taking digital forensics beyond the traditional security perimeter into a cloud security perimeter.

Digital forensics is not an elephant; it is a process, and not just one process but a group of tasks and processes in an investigation. Examiners now perform targeted examinations using forensic tools and databases of known files, selecting specific files and data types for review while ignoring files of irrelevant type and content. Despite the application of sophisticated tools, the forensic process still relies on the examiner's knowledge of the technical aspects of the specimen and an understanding of the case and the law. - Mark Pollitt

As has been established in articles by various authors, including myself, this re-branded model of computing now called cloud computing offers benefits that can improve productivity, harness high-speed systems that can manage large data sets and systems implementations, and have a net positive impact on the operational budget (scaling, elasticity) of some small and midsized enterprises.

Of course, there is the possibility that a private cloud for a small enterprise may not justify its cost in comparison to harnessing the benefits of a public cloud offering.

For a larger enterprise with, say, multiple and/or international locations, a private cloud infrastructure can provide an added cost benefit: while not as cheap as a public cloud offering, it would offset that cost variance in terms of the risk profile of the systems being moved into a private cloud (e.g., critical databases, transactional and/or processing systems) as well as potential compliance concerns.

If, however, an enterprise chooses to utilize a public cloud offering, there will be added complications for information security from both procedural and legal standpoints. This leads us to the point that, with a public cloud system, we no longer have the traditional defined security perimeter.

This new cloud security perimeter can now be any place, on any device, from which people access an enterprise-provided network, its resources and systems.

With regard to digital forensics and the e-discovery process, this new cloud security perimeter, stemming from the trend in which data is now accessed via the Internet and housed and consumed on multiple systems and devices internationally, will pose serious legal and technical challenges with the potential to complicate a security investigation, e.g., defining incident response, access rules and policies governing access, as well as support processes.

Traditional network forensics metrics will not give a complete picture of what can occur within the cloud computing environment. For instance, there could be limitations from focusing only on data going into and out of systems to which an enterprise has access, and as we know, this visibility generally stops at the gateway into the cloud.

In terms of network forensics, packet capture and analysis are important; with the cloud ecosystem there is a real possibility of a vast increase in the amount of data that may need to be processed. This will only increase the workload on the digital investigator, who will most likely have more than a plateful of hex patterns, network metadata and logs to analyze, even more so than in a traditional system analysis.
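To make the volume problem concrete, below is a minimal sketch, assuming Python with the scapy library and a hypothetical capture.pcap file, of streaming flow extraction from a capture; streaming with PcapReader avoids loading a multi-gigabyte capture into memory at once.

    # Minimal sketch: stream a (potentially huge) pcap and tally TCP flows.
    # Assumes scapy is installed; "capture.pcap" is a hypothetical file.
    from collections import Counter

    from scapy.all import PcapReader
    from scapy.layers.inet import IP, TCP

    flows = Counter()

    # PcapReader yields packets one at a time instead of loading the
    # whole capture into memory, which matters at cloud-scale volumes.
    with PcapReader("capture.pcap") as pcap:
        for pkt in pcap:
            if pkt.haslayer(IP) and pkt.haslayer(TCP):
                ip, tcp = pkt[IP], pkt[TCP]
                flows[(ip.src, tcp.sport, ip.dst, tcp.dport)] += 1

    # Surface the busiest flows for the examiner to triage first.
    for flow, count in flows.most_common(10):
        print(flow, count)

Even this small triage step hints at why cloud-scale captures demand automation rather than manual review.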

This increased volume can severely cripple an investigation, more so if a forensic investigator does not completely understand the cloud ecosystem's architecture, the complex linkages that bridge cloud services and an enterprise's systems, and how these systems affect an enterprise in terms of potential ingress points that can lead to systems compromise.

The cloud, while a boon to enterprise CapEx/OpEx, is also a gold mine for crackers, who can set up systems for attack with as little as $50. For example, with Amazon Web Services (AWS), an Amazon Machine Image (AMI), either Linux or Windows, can run a virtual machine that can be set up to do whatever an end user wants with it, that is, within the confines of the virtualized world; this environment is owned by the end user (a cracker in this case) from the operating system up.
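As an illustration of how little friction is involved, the sketch below launches a single instance with boto3, the AWS SDK for Python; the AMI ID, key pair name and region are hypothetical placeholders, not values from any real deployment.

    # Minimal sketch: programmatically launch an EC2 instance with boto3.
    # The AMI ID, key name and region below are hypothetical placeholders.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical Linux AMI
        InstanceType="t2.micro",          # entry-level, pennies per hour
        KeyName="my-keypair",             # hypothetical SSH key pair
        MinCount=1,
        MaxCount=1,
    )

    # From here the owner controls everything from the operating system up;
    # the provider still owns the hypervisor, network and firewalls.
    print("Launched:", instances[0].id)

A handful of lines like these are all that stand between a newly created account and a fully controlled machine, which is precisely the dual-use concern raised above.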

Of course, the IaaS and other hardware systems, IDS/IPS and firewalls remain under the control of, and belong to, the cloud service provider.

With regard to, say, conducting a forensic investigation on a virtualized server, there is the potential loss of data relevant to an investigation once an image is stopped or a virtualized server is shut down, with minimal chance of retrieving a specific image from its virtualized server.
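One partial mitigation, sketched below under the assumption of an AWS EBS-backed instance and the boto3 SDK (the volume ID is a hypothetical placeholder), is to snapshot the suspect instance's volumes before it is stopped, so that at least the disk state is preserved even though memory and other volatile data are lost.

    # Minimal sketch: preserve the disk state of a suspect EBS-backed
    # instance by snapshotting its volume *before* shutdown. Volatile
    # data (RAM, live network state) is still lost; only disk survives.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.create_snapshot(
        VolumeId="vol-0123456789abcdef0",  # hypothetical volume ID
        Description="Forensic preservation prior to instance shutdown",
    )

    print("Snapshot started:", response["SnapshotId"])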

As mentioned, there are several merits to the case for adopting a cloud service; however, from a digital forensics point of view, the inherent limitations of such a system need to be clearly understood, and properly reviewed and scoped by an enterprise's IT security team, with regard to how such an implementation will adapt to their current security model. These metrics may vary based on the cloud provider the enterprise selects.

The gathered data can then assist enterprise security in mitigating the potential for compromise and other risks to the enterprise's operations stemming from this added environment. This in turn can potentially alleviate the pains of a digital forensics investigation with cloud computing overtures.

Digital forensics expert Nicole Beebe stated, "No research has been published on how cloud computing environments affect digital artifacts, and legal issues related to cloud computing environments."

Of note is the fact that with the top CSPs (Amazon, Rackspace, Azure), one can find common attributes from which a security manager can tweak the enterprise's security policies.

Some things of note that will impact a forensic investigation within the cloud ecosystem are:

  1. A network forensics investigator is limited to tools on the box rather than the entire network; however, if a proper ISO is made of the machine image, then all the standard information in the machine image's ISO should be available, as it would be with any other server in a data center (see the image-hashing sketch after this list).
  2. Lack of access to network routers, load balancers and other networked components.
  3. No access to large firewall installations.
  4. There are challenges in mapping known hops from instance to instance that will remain static across the cloud routing schema.
  5. System administrators can build and tear down virtual machines (VMs) at will. This can influence an enterprise's security policy and plans, as new rules and regulations will have to be implemented when working with cloud servers and services that are suspected of being compromised.
  6. An enterprise's threat environment should be treated with the same mindset for the cloud ecosystem as it would be for any exposed service offered across the Internet.
  7. With the cloud ecosystem, an advantage with regard to forensics is the ability for a digital investigator to store very large log files on a storage instance or in a very large database for easy data retrieval and discovery (a storage upload sketch follows this list).
  8. An enterprise has to be open to the fact that there will be a risk of data being damaged, accessed, altered, or denied by the CSP.
  9. Routing information that is not already on "the box" will be difficult to obtain within this ecosystem.
  10. For encrypted disks, wouldn't it be theoretically feasible to spin up "n" cloud instances to help crack the encryption? According to Dan Morrill, this can be an expensive process (a back-of-the-envelope cost sketch follows this list).
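On point 1, chain of custody still depends on proving that the acquired image has not been altered. Below is a minimal sketch, using only Python's standard library and a hypothetical image file name, of computing a verification hash in chunks so even very large images never have to fit in memory.

    # Minimal sketch: compute a SHA-256 digest of an acquired disk image
    # in fixed-size chunks. "server-image.iso" is a hypothetical name.
    import hashlib

    def image_digest(path: str, chunk_size: int = 1024 * 1024) -> str:
        sha256 = hashlib.sha256()
        with open(path, "rb") as image:
            # Read a megabyte at a time until end of file.
            while chunk := image.read(chunk_size):
                sha256.update(chunk)
        return sha256.hexdigest()

    # Record this value at acquisition time; recomputing it later
    # demonstrates the evidence has not changed in the interim.
    print(image_digest("server-image.iso"))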
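On point 7, centralizing logs in cloud storage is straightforward; a minimal sketch, assuming boto3 with hypothetical bucket, key and file names:

    # Minimal sketch: push a large log file into S3 for later retrieval
    # and discovery. Bucket, key and file names are hypothetical.
    import boto3

    s3 = boto3.client("s3")

    # upload_file transparently switches to multipart uploads for large
    # files, which suits the log volumes a cloud investigation generates.
    s3.upload_file(
        Filename="/var/log/suspect-host/syslog.gz",
        Bucket="forensics-evidence-archive",
        Key="case-0001/suspect-host/syslog.gz",
    )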
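On point 10, the expense Morrill alludes to is easy to estimate. The back-of-the-envelope sketch below uses hypothetical numbers (instance price, per-instance guess rate, keyspace size), not measured figures.

    # Back-of-the-envelope sketch: cost of brute-forcing a keyspace across
    # "n" rented instances. Every number here is a hypothetical example.
    keyspace = 2 ** 40           # toy keyspace: ~1.1 trillion candidates
    guesses_per_sec = 1_000_000  # assumed per-instance guess rate
    instances = 100              # "n" instances running in parallel
    price_per_hour = 0.10        # assumed hourly price per instance ($)

    hours = keyspace / (guesses_per_sec * instances) / 3600
    cost = hours * instances * price_per_hour

    print(f"~{hours:,.1f} hours of wall-clock time, ~${cost:,.2f} total")
    # Adding instances cuts wall-clock time but not total cost, and a
    # realistic keyspace (e.g., 2**128 for AES) is simply out of reach.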

As those of us who are students and practitioners within the field of digital forensics know, any advances in this area tend to be primarily reactionary in nature, and most likely developed to respond to a specific incident or subset of incidents. This can pose a major challenge with traditional systems; one can only imagine what can occur when faced with a distributed cloud ecosystem.

In terms of digital forensics, any tool that will make an examiner's job easier, improve results, reduce false positives and generate data that is relevant, pertinent and admissible in a court of law will be of value.

Being my firm's lead solutions researcher and consultant, I am always on the lookout for any new process, system or tool that will make my job, as well as that of my team, easier as we work with our clients. This led me to attend a webinar, "The Case for Network Forensics," from a company called Solera Networks ...continued in Part 2.

Special thanks to Mark Pollitt for his valuable insight.

References

  1. Pollitt MM. "Six Blind Men from Indostan." Digital Forensics Research Workshop (DFRWS); 2004.
  2. Nance K, Hay B, Bishop M. "Digital Forensics: Defining a Research Agenda." 2009; 978-0-7695-3450-3/09, IEEE.
  3. Morrill D. "10 Things to Think About with Cloud Computing and Forensics."

More Stories By Jon Shende

Jon RG Shende is an executive with over 18 years of industry experience. He commenced his career in the medical arena, then moved into the oil and gas environment, where he was introduced to SCADA and network technologies, also becoming certified in industrial pump and valve repairs. Jon gained global experience over his career working within several verticals, including pharma, medical sales and marketing services, as well as the technology services environment, eventually becoming the youngest VP of an international enterprise. He is a graduate of the University of Oxford, holds a Masters certificate in Business Administration, as well as an MSc in IT Security, specializing in Computer Crime and Forensics with a thesis on security in the Cloud. Jon, well versed in the technology startup and midsized venture ecosystems, has contributed at the C and Senior Director level for former clients.

As an IT Security Executive, Jon has experience with virtualization, strategy, governance, risk management, continuity and compliance. He was an early adopter of web services and web-based tools, and successfully beta tested remote assistance and support software for a major telecom. Within the realm of sales, marketing and business development, Jon earned commendations for turnaround strategies within the services and pharma industries. For one pharma contract he was responsible for bringing low-performing districts up to number 1 rankings for consecutive quarters, as well as outperforming quotas from 125% up to 314%. Part of this was achieved by working closely with sales and marketing teams to ensure message and product placement were on point.

Professionally he is a Fellow of the BCS Chartered Institute for IT, an HITRUST Certified CSF Practitioner, and holds the CITP and CRISC certifications. Jon Shende currently works as a Senior Director for a CSP. A recognized thought leader, Jon has been invited to speak for the SANS Institute, has spoken at Cloud Expo in New York, sat on a panel at Cloud Expo Santa Clara, and has been an Ernst and Young CPE conference speaker. His personal blog is located at http://jonshende.blogspot.com/view/magazine

"We are what we repeatedly do. Excellence, therefore, is not an act, but a habit."
