The Impact of the Cloud on Digital Forensics - Part 1

Taking digital forensics beyond the traditional security perimeter into a cloud security perimeter.

Digital Forensics is not an elephant, it is a process and not just one process, but a group of tasks and processes in investigation. Examiners now perform targeted examinations using forensic tools and databases of known files, selecting specific files and data types for review while ignoring files of irrelevant type and content. Despite the application of sophisticated tools, the forensic process still relies on the examiner's knowledge of the technical aspects of the specimen and understanding of the case and the law - Mark Pollitt.

As various authors, myself included, have established, this re-branded model of computing, now called cloud computing, promises benefits that can improve productivity, harness high-speed systems capable of managing large data sets and systems implementations, and have a net positive impact on the operational budget (scaling, elasticity) of some small and midsized enterprises.

Of course, there is the possibility that a private cloud for a small enterprise may not warrant its cost when compared with harnessing the benefits of a public cloud offering.

For a larger enterprise with, say, multiple and/or international locations, a private cloud infrastructure can provide an added cost benefit: while not as cheap as a public cloud offering, it offsets that cost variance through the improved risk profile of the systems moved into the private cloud, e.g., critical databases, transactional and/or processing systems, as well as potential compliance concerns.

If, however, an enterprise chooses to utilize a public cloud offering, there will be added complications for information security from both procedural and legal standpoints. This leads us to the point that, with a public cloud system, we no longer have the traditional, defined security perimeter.

This new cloud security perimeter can now be any place, on any device, where people access an enterprise-provided network, resources and systems.

With regard to digital forensics and the e-discovery process, this new cloud security perimeter, arising from the trend of data being accessed via the Internet and housed and consumed on multiple systems and devices internationally, will pose serious challenges (legal and technical) with the potential to complicate a security investigation, e.g., defining incident response, access rules and the policies governing access, as well as support processes.

Traditional network forensics metrics will not give a complete picture of what can occur within the cloud computing environment; for instance, they may be limited to the data going into and out of systems the enterprise has access to, and as we know, that visibility generally stops at the gateway into the cloud.

In terms of network forensics, packet capture and analysis are important; with the cloud ecosystem there is the real possibility of a vast increase in the amount of data that needs to be processed. This will only increase the workload on the digital investigator, who will most likely have more than a plate full of hex patterns, network metadata and logs to analyze, compared with a traditional system analysis.
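
To make the volume problem concrete, here is a minimal triage sketch in Python using the third-party scapy library (my choice of tooling, not one named in this article); the capture file name is a placeholder. Streaming the capture with PcapReader, rather than loading it whole, is one way to cope with captures of cloud scale.

```python
from collections import Counter

from scapy.all import IP, PcapReader  # third-party: pip install scapy

# Stream the capture instead of loading it all into memory;
# cloud-scale pcaps can easily exceed available RAM.
talkers = Counter()
with PcapReader("capture.pcap") as pcap:  # placeholder capture file
    for pkt in pcap:
        if IP in pkt:
            talkers[(pkt[IP].src, pkt[IP].dst)] += 1

# Surface the busiest conversations so the examiner can prioritise
# which flows merit full packet-level review.
for (src, dst), count in talkers.most_common(10):
    print(f"{src} -> {dst}: {count} packets")
```

A first pass like this narrows the haystack before any deep analysis begins.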

This increased volume can severely cripple an investigation, more so if a forensic investigator does not completely understand the cloud ecosystem's architecture, the complex linkages that bridge cloud services and an enterprise's systems, and how these systems affect the enterprise in terms of potential ingress points that can lead to systems compromise.

The cloud, while a boon to enterprise CapEx/OpEx, is also a gold mine for crackers, who can set up systems for attack with as little as $50. With Amazon Web Services (AWS), for example, an Amazon Machine Image (AMI), either Linux or Windows, can run a virtual machine that can be set up to do whatever an end user wants with it, within the confines of the virtualized world; this environment is owned by the end user (a cracker in this case) from the operating system up.
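
As a rough illustration of how little friction this involves, here is a minimal sketch using boto3, the AWS SDK for Python (my choice of illustration; the article does not prescribe any tooling). The AMI ID, region and instance type are placeholders.

```python
import boto3  # third-party: pip install boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# One API call stands up a fresh machine from an AMI.
response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",   # placeholder Linux or Windows AMI ID
    InstanceType="t2.micro",  # placeholder instance type
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; the caller owns this box from the OS up.")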

Of course, the IaaS and other hardware systems, IDS/IPS and firewalls remain under the control of, and belong to, the cloud service provider.

With regard to, say, conducting a forensic investigation on a virtualized server, there is the potential loss of data relevant to an investigation once an image is stopped or a virtualized server is shut down, with minimal chance of retrieving a specific image from its virtualized server.
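
One hedge against that loss, assuming an EBS-backed instance on AWS, is to snapshot the attached volumes before the suspect instance is stopped. The sketch below uses boto3; the instance ID and region are placeholders, and this is an illustrative precaution rather than a procedure the article prescribes.

```python
import boto3  # third-party: pip install boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region
instance_id = "i-0123456789abcdef0"                 # placeholder suspect instance

# Enumerate the EBS volumes currently attached to the suspect instance.
volumes = ec2.describe_volumes(
    Filters=[{"Name": "attachment.instance-id", "Values": [instance_id]}]
)["Volumes"]

# Snapshot each volume so a point-in-time copy survives shutdown.
for vol in volumes:
    snap = ec2.create_snapshot(
        VolumeId=vol["VolumeId"],
        Description=f"forensic preservation of {instance_id}",
    )
    print(f"Snapshot {snap['SnapshotId']} started for {vol['VolumeId']}")
```

Instance-store volumes, by contrast, vanish with the instance, which is precisely the exposure described above.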

As mentioned, there are several merits to the case for adopting a cloud service. From a digital forensics point of view, however, the inherent limitations of such a system need to be clearly understood, and properly reviewed and scoped by an enterprise's IT security team, with regard to how such an implementation will adapt to their current security model. These metrics may vary based on the cloud provider the enterprise selects.

The gathered data can then assist enterprise security in mitigating the potential for compromise, and other risks to the enterprise's operations, stemming from this added environment. This in turn can potentially alleviate the pains of a digital forensics investigation with cloud computing overtures.

Digital forensics expert Nicole Beebe stated, "No research has been published on how cloud computing environments affect digital artifacts, and legal issues related to cloud computing environments."

Of note is the fact that with the top CSPs (Amazon, Rackspace, Azure) one can find common attributes from which a security manager can tune the enterprise's security policies.

Some things of note that will impact a forensic investigation within the cloud ecosystem are:

  1. A network forensics investigator is limited to tools on the box rather than the entire network; however, if a proper ISO is made of the machine image, then all the standard information in the machine image's ISO should be available, as it would be with any other server in a data center (see the first sketch after this list).
  2. Lack of access to network routers, load balancers and other networked components.
  3. No access to large firewall installations.
  4. There are challenges in mapping which known hops from instance to instance will remain static across the cloud-routing schema.
  5. System administrators can build and tear down virtual machines (VMs) at will. This can influence an enterprise's security policy and plans, as new rules and regulations will have to be implemented for working with cloud servers and services that are suspected of being compromised.
  6. An enterprise's threat environment should be treated with the same mindset in the cloud ecosystem as it would be for any exposed service offered across the Internet.
  7. With the cloud ecosystem, one forensic advantage is the ability for a digital investigator to store very large log files on a storage instance or in a very large database for easy data retrieval and discovery (see the second sketch after this list).
  8. An enterprise has to be open to the fact that there will be a risk of data being damaged, accessed, altered, or denied by the CSP.
  9. Routing information that is not already on "the box" will be difficult to obtain within this ecosystem.
  10. For encrypted disks, wouldn't it be theoretically feasible to spin up "n" cloud instances to help crack the encryption? According to Dan Morrill, this can be an expensive process.
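
On point 1, the value of that image rests on being able to prove its integrity. Here is a minimal sketch of the standard verification step, hashing the acquired image in chunks and comparing against the hash recorded at acquisition; the file name and recorded digest are placeholders.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so arbitrarily large images fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

acquired = sha256_of("machine-image.iso")  # placeholder image file
recorded = "0000...placeholder...0000"     # digest noted at acquisition time
print("image verified" if acquired == recorded else "HASH MISMATCH - do not proceed")
```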
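
And on point 7, a sketch of shipping a large log file to an object store for later retrieval and discovery, here using AWS S3 via boto3 as one example; the bucket, key and file path are placeholders.

```python
import boto3  # third-party: pip install boto3

s3 = boto3.client("s3")

# Archive the log under a case-specific key so it can be located,
# retrieved and searched during discovery.
s3.upload_file(
    Filename="/var/log/suspect-app.log",    # placeholder log file
    Bucket="enterprise-forensic-evidence",  # placeholder bucket
    Key="case-001/suspect-app.log",
)
print("log archived for later retrieval and discovery")
```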

As those of us who are students and practitioners within the field of digital forensics know, advances in this area tend to be primarily reactionary in nature, most likely developed to respond to a specific incident or subset of incidents. This can pose a major challenge in traditional systems; one can only imagine what can occur when faced with a distributed cloud ecosystem.

In terms of digital forensics, any tool that makes an examiner's job easier, improves results, reduces false positives and generates data that is relevant, pertinent and admissible in a court of law will be of value.

As my firm's lead solutions researcher and consultant, I am always on the lookout for any new process, system or tool that will make my job, as well as that of my team, easier as we work with our clients. This led me to attend a webinar, "The Case for Network Forensics," from a company called Solera Networks ...continued in Part 2.

Special thanks to Mark Pollitt for his valuable insight.

References

  1. Pollitt, M.M. "Six Blind Men from Indostan." Digital Forensics Research Workshop (DFRWS), 2004.
  2. Nance, K., Hay, B., and Bishop, M. "Digital Forensics: Defining a Research Agenda." IEEE, 2009. 978-0-7695-3450-3/09.
  3. Morrill, D. "10 Things to Think About with Cloud Computing and Forensics."

