By Ashish Nanjiani
April 16, 2017 01:00 PM EDT
Disclaimer: I am an IT guy, and my knowledge of the human body is limited to my daughter's high school biology textbook and information obtained from search engines. So excuse me if any of the information below is not represented accurately!
The human body is the most complex machine ever created. With a complex network of interconnected organs, trillions of cells, and the most advanced processor, the human body is the most automated system on this planet. In this article, we will draw comparisons between the workings of the human body and those of a data center. We will learn how the self-defense and self-healing capabilities of our body are similar to the firewalls and intelligent monitoring capabilities in our data centers. We will draw parallels between human body automation and data center automation and explain the different levels of automation we need to drive in data centers. This article is divided into four parts, each covering one of the body's main functions and drawing parallels on automation.
Have you ever felt sick? How do you figure out that you are getting sick and need to call it a day? Can you control how fast your heart beats, or control your breathing at will? The human body is the most automated system we have in the entire universe. It is the most advanced machine, with the fastest microprocessor and a lightning-fast network, and it powers us every day. There is a lot to learn from how the architect of our body designed it, and how, using the same design principles, we should automate the data center of the future.
The fundamental principle of automation is to use data to do intelligent analytics that enable us to take action. When we are about to fall sick, our body gives us indicators (alerts) which tell us things are not going to plan and we need to act. Such indicators can take the form of fever or chills, feeling cold, or pain. Once we get these alerts, we either take action, i.e., take medication, or we let our body self-heal if the alert is nothing to worry about, e.g., a small cut.
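This alert-then-act loop can be sketched as a simple monitoring rule: compare a metric against a threshold, attempt automated remediation first, and escalate to a human only if self-healing fails. The thresholds, component names, and remediation action below are illustrative assumptions, not part of any specific monitoring product.

```python
# Illustrative sketch of an alert -> self-heal -> escalate loop.
# Thresholds and remediation actions are hypothetical examples.

def check_metric(name, value, threshold, self_heal):
    """Return a status string: 'ok', 'healed', or 'escalate'."""
    if value <= threshold:
        return "ok"              # no alert, nothing to do
    if self_heal(name):          # try automatic remediation first
        return "healed"          # like a small cut healing itself
    return "escalate"            # like going to the doctor

def restart_service(name):
    # Hypothetical remediation: assume a restart fixes most overloads,
    # but the database needs a human to look at it.
    return name != "database"

print(check_metric("web-server", 0.95, 0.80, restart_service))  # healed
print(check_metric("database", 0.97, 0.80, restart_service))    # escalate
print(check_metric("cache", 0.40, 0.80, restart_service))       # ok
```

The key design point mirrors the body: the default path is automatic, and the human is only the fallback.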
Our body, like our systems (compute, network, etc.), has a way to read these alerts and take appropriate action. In addition, our body has a tremendously advanced security system always working to defend us from various malicious attacks! For example, when a virus strikes the human body, it attacks the body's cellular structure and begins to destroy it. Our body's defense mechanism immediately sends white blood cells to attack the invading virus and tries to destroy it. All this happens 24x7, without us telling our body to do so! If the body fails to defend itself on its own, it signals for help, and that is when we go to a doctor for medicine or take some other external remedy to help our body. Now imagine if we could develop a similarly advanced security system to defend our data centers from all attacks. There are several things we can learn from how our body works and incorporate into creating the highly automated data center of the future. Let's examine each of the body's systems and see how we can leverage it for our benefit. While this is not a biology lesson, it is time to go back to your school days.
The Immune System
This is perhaps the most intelligent and automated system in our body, and the most relevant to the way we should automate our data center security. Our immune (security) system is a collection of structures and processes whose job is to protect against disease and other potentially damaging foreign bodies. These diseases and foreign bodies are the equivalent of the viruses, malware, and other types of security threats we see in our data center. Our immune system consists of various parts (hardware) and systems (software) which allow our body to self-defend and self-heal against attacks, 24x7.
There are six main components of our immune system.
- Lymph Nodes: These are small bean-shaped structures that produce and store cells that fight infection and disease. Lymph nodes contain lymph, a clear liquid that carries those cells to various parts of our body.
- Spleen: This is located on the left-hand side of your body, under your ribs and above your stomach. The spleen contains white blood cells that fight infection.
- Bone Marrow: The yellow tissue in the center of bones that produces white blood cells.
- Lymphocytes: These small white blood cells play a large role in defending the body against disease. The two types of lymphocytes are B-cells, which make antibodies that attack bacteria and toxins, and T-cells, which help destroy infected or cancerous cells.
- Thymus: Responsible for triggering and maintaining the production of antibodies.
- Leukocytes: These disease-fighting white blood cells identify and eliminate pathogens.
Together, all the above components make up our immune system. Think of these as the various security devices we deploy in our data center: physical access card readers, firewalls, anti-virus software, anti-spam, and other security mechanisms. The immune system can be further divided into two systems.
The Innate Immune System
The innate immune response is the first step in protecting our bodies from foreign particles. It is an immediate response that is "hard-wired" into our immune system. It is a generalized system that protects against any type of attack and is not tied to a specific immunity. For example, general barriers to infection include:
- Physical (skin, mucous, tears, saliva, and stomach acid)
- Chemical (specific proteins found in tears or saliva that attack foreign particles)
- Biological (microbiota or good bacteria in the gut that prevents overgrowth of bad bacteria)
The innate immune system is general, i.e., anything that is identified as foreign or non-self becomes a target for the innate immune system.
The Adaptive Immune Response
The innate immune response leads to the pathogen-specific adaptive immune response. While this response is more effective, it takes time to develop, generally about a week after the infection has occurred. This system is called adaptive because it is a self-learning system which adapts itself to new threats and creates a self-defense mechanism to neutralize such threats much faster in the future. A good example we all know from childhood is vaccination. We are injected with a weakened or dead virus so that our body learns how to defend against that particular type of virus. Our body then remembers this all its life and protects us 24x7 from that virus.
Thus, the immune system is both reactive and adaptive. It reacts when a pathogen enters our body and neutralizes it, and it is constantly learning and adapting to new threats. It is also intelligent enough to distinguish self (anything naturally present in the body, e.g., our own cells) from non-self (anything that is not naturally present in the body). The system reacts quickly and has an inbuilt messaging system which passes signals from one cell to another to act on an incoming threat, all at lightning speed. In addition, it is a layered security system, with multiple types of cells each playing a particular role in defense. Some cells are located at the entry points of our body, like the mouth, nose, and ears, and act as security guards; others are located in our circulatory system or bone marrow and are released as and when required.
Enough of biology. Let's get into our IT world. Imagine our data center having similar innate and adaptive capabilities. The innate, or generalized, security systems are our firewalls, email scanners, etc., which can neutralize generalized threats in our data center. They are not tied to specific threats like a DoS attack or a Dirty COW-type OS vulnerability. These systems continuously watch for threats and neutralize them once they find known and familiar ones, e.g., email spam filters, anti-virus software, etc. Much like our body has physical, chemical, and biological defense layers, our data center needs different security layers to protect us from various types of attack. At a minimum, we need four levels of security in our data center: physical security (access card readers, security guards), network security (DNS, DMZ/internal, firewalls), component-level security (compute, storage), and application-level security (email, OS, Java, Oracle, etc.). There are a lot of technologies available today which provide these various layers of security, including those provided by industry leaders like Cisco.
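The four levels above can be pictured as a defense-in-depth pipeline: a request must clear every layer, just as a pathogen must get past skin, chemical, and biological barriers. The layer names follow the article; the rules, IP addresses, and request fields below are simplified assumptions for illustration.

```python
# Illustrative defense-in-depth sketch: a request must clear every layer.
# Rules and request fields are hypothetical examples, not real products.

LAYERS = [
    ("physical",    lambda req: req.get("badge_valid", False)),
    ("network",     lambda req: req.get("source_ip") not in {"203.0.113.9"}),
    ("component",   lambda req: req.get("os_patched", False)),
    ("application", lambda req: "<script>" not in req.get("payload", "")),
]

def admit(request):
    """Return (allowed, layer_that_blocked)."""
    for layer, rule in LAYERS:
        if not rule(request):
            return False, layer      # stopped at this barrier
    return True, None                # cleared all four layers

ok_req = {"badge_valid": True, "source_ip": "198.51.100.7",
          "os_patched": True, "payload": "GET /index"}
bad_req = dict(ok_req, payload="<script>steal()</script>")

print(admit(ok_req))   # (True, None)
print(admit(bad_req))  # (False, 'application')
```

As in the body, a threat that slips past one barrier still has to face the next one.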
While we have innate defense capabilities, what we need to protect us against the increasing sophistication of attacks are adaptive self-defense capabilities. The system should learn signatures and patterns from past attacks and automatically create self-healing code (white blood cells) to defend against new threats. In other words, systems should be able to heal themselves. Such a system will create new defense signatures based on previous attacks and adapt to new types of attack.
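A toy sketch of this adaptive idea: learn character trigrams from known-bad payloads (the "vaccination") and flag new payloads that look similar, even when the exact signature differs. The similarity threshold and payload strings are assumptions chosen for illustration; real systems use far richer features.

```python
# Toy "adaptive signature" sketch: remember features of past attacks
# and flag close variants. Threshold is an assumed tuning knob.

def trigrams(s):
    return {s[i:i + 3] for i in range(len(s) - 2)}

class AdaptiveFilter:
    def __init__(self, threshold=0.5):
        self.known_bad = []          # learned signatures (trigram sets)
        self.threshold = threshold

    def learn(self, payload):
        # Like a vaccination: remember what this attack looked like.
        self.known_bad.append(trigrams(payload))

    def is_suspicious(self, payload):
        grams = trigrams(payload)
        for sig in self.known_bad:
            overlap = len(grams & sig) / max(len(sig), 1)
            if overlap >= self.threshold:
                return True          # close enough to a past attack
        return False

f = AdaptiveFilter()
f.learn("DROP TABLE users; --")
print(f.is_suspicious("DROP TABLE accounts; --"))  # True: a variant
print(f.is_suspicious("SELECT name FROM menu"))    # False: benign
```

The point is not the specific features but the behavior: the filter catches a variant it was never explicitly given a signature for.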
Humans should intervene only when the system fails to do its job. Let's take an example. Assume a new type of virus is released; it is an enhanced version of a previously known virus, so its signature is different. If the virus pattern is not known, humans have to develop anti-virus signatures and then update the anti-virus software to fix the exposure. This is like taking an external dose of antibiotics to heal your body. It can take days if not weeks to get the updated software from the vendor and apply it across all vulnerable systems. Now, what if we had systems that could create the required antibiotics on their own and try to fix the exposure? Such systems, much like our body, would learn from previous attacks, modify their current software to adapt to the new threat, and try to defend themselves, all without human intervention! Seems unreal? Our body is capable of doing this with a 75% or higher success rate. Can we aim for 80%?
Another capability we need in our data center is self-healing. Much like the human body detects abnormalities and attacks the problem without asking for your permission, data center security mechanisms as well as fault detection systems should work in a similar way. Imagine your body waiting for your instruction to defend against an invading virus! What if you were sleeping? When an abnormality is detected in the data center, we need to act immediately. Today, while many data center security products are designed to detect malicious attacks and take appropriate action without human intervention, we need to extend this into every component (compute/storage/network) in the data center. We should have intelligence at every layer to protect against increasing forms of attack, and everything needs to be connected together. An endpoint device which detects a threat can alert the security components at all layers about the incoming threat. Each system notifies the other systems of the status of the threat, and there is constant communication between firewalls, compute, and storage systems based on the type and level of attack.
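The "everything connected" idea can be sketched as a shared messaging bus: when any endpoint detects a threat, every subscribed layer is notified at once, like cell-to-cell signaling. The component names and responses below are illustrative assumptions.

```python
# Minimal sketch of a threat-notification bus between infrastructure
# components. Component names and responses are hypothetical examples.

class ThreatBus:
    def __init__(self):
        self.subscribers = {}        # component name -> handler

    def subscribe(self, name, handler):
        self.subscribers[name] = handler

    def publish(self, threat):
        # Fan the alert out to every layer simultaneously.
        return {name: handler(threat)
                for name, handler in self.subscribers.items()}

bus = ThreatBus()
bus.subscribe("firewall", lambda t: f"blocking traffic from {t['source']}")
bus.subscribe("compute",  lambda t: "quarantining affected hosts")
bus.subscribe("storage",  lambda t: "snapshotting volumes before damage")

responses = bus.publish({"type": "worm", "source": "203.0.113.9"})
for component, action in responses.items():
    print(component, "->", action)
```

A production system would use a real message broker, but the shape is the same: one detection event, many coordinated reactions.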
As an example, imagine we discover a new super-critical vulnerability in our operating system which allows an ordinary authorized user to gain root privileges. Today, in most enterprises, it takes days if not weeks to detect and remediate the vulnerability. In tomorrow's world, the system should be smart enough to detect such gaps and apply the fix immediately. Why wait, when we know waiting can have an adverse impact on our business? And yes, did I mention it should be done without downtime to the business? After all, your body does not need downtime to fix YOU.
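A detect-and-remediate loop for such a vulnerability might look like the sketch below: compare installed versions against a vulnerability feed and apply fixes without waiting for a human. The package names, versions, and hosts are made up, and the "patch" is simulated; a real implementation would drive a configuration-management or live-patching tool.

```python
# Sketch of an auto-remediation loop against a hypothetical
# vulnerability feed. All names and versions are illustrative.

VULN_FEED = {                    # package -> first fixed version
    "openssl": (1, 0, 2),
    "kernel":  (4, 9, 0),
}

INVENTORY = {
    "web-01": {"openssl": (1, 0, 1), "kernel": (4, 9, 3)},
    "db-01":  {"openssl": (1, 0, 2), "kernel": (4, 8, 0)},
}

def remediate(inventory, feed):
    """Return the list of (host, package) pairs that were auto-patched."""
    patched = []
    for host, packages in inventory.items():
        for pkg, version in packages.items():
            if pkg in feed and version < feed[pkg]:
                packages[pkg] = feed[pkg]    # simulate a live patch
                patched.append((host, pkg))
    return patched

patched = remediate(INVENTORY, VULN_FEED)
print(patched)  # web-01 needed openssl, db-01 needed the kernel
```

Version comparison here relies on tuple ordering; real package version schemes need a proper comparison function, but the loop structure is the point.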
To summarize, we need the following capabilities for our data center security:
- A multi-layered, inter-connected security system. There should be a common messaging bus between the different infrastructure components to detect and notify the status of threats
- It should be both innate and adaptive, to react to different types of threats
- Self-learning with self-healing capabilities. It should continuously learn and adapt to new threats
- The ability to react at the speed of light
In the next article, we will focus on the body's nervous system, which is the most complex but also the most intelligent sensor system on the planet.
Until next time....