By Terell Jones
September 2, 2012 12:00 PM EDT
I have been asked to moderate a cloud computing discussion at Green Gov 2012. The title of the session is “Cloud Computing: The Next Generation of Computing and Sustainable IT”. It is a great honor to be selected as moderator; I believe this is my second go-around. As National Director of Cloud Services at Core BTS, Inc., it is my job to articulate the value of cloud computing. I have been pondering the title a bit, and before we can discuss the next generation of cloud, we have to identify the current situation. The cloud has gone well beyond Google Mail and Salesforce (CRM) into other areas like cloud security, cloud storage, and cloud backup. Furthermore, we must define what we mean by cloud computing and sustainable IT, because not everyone is on the same page.
What Is Cloud Computing?
NIST defines cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” My own definition is more to the point: I consider cloud computing to be Information Technology as a Utility Service. To be clear, I find cloud computing no different from managed services. It doesn’t matter whether you use software as a service, platform as a service, or infrastructure as a service; the idea is to treat IT as a utility service to reduce overall costs.
What Is Sustainable IT?
I define Sustainable IT as energy-efficient computing from the desktop to the data center, from hardware to software, from the network to the virtual cloud. Today I will focus mainly on cloud computing. For all intents and purposes, cloud computing is Sustainable IT. How can I say that? It’s simple math: cloud computing, done right, can save an organization 50% to 80% in total cost of ownership (TCO). The timing could not be better. With a struggling economy, corporations are looking for ways to cut costs. Once you get past the internal politics and the cloud hype cycle and take a deep dive into the total cost of running an IT shop, you will be enlightened.
A unique thing has occurred in the past four years at the intersection of sustainability and IT: CEOs and CFOs have been getting involved with IT budgets. Server sprawl and data center energy costs have become a major factor in the cost of doing business. A big mistake C-level execs make is the fuzzy math used to calculate TCO for the enterprise. There is a strong tendency to calculate hardware and software costs only. To get an accurate TCO, you must take into consideration the following items:
- Power & Cooling
When all is said and done, you may pay only a third of the cost of running your own IT shop. A classic example is Google saving the General Services Administration (GSA) $15M over a five-year period. GSA had 17,000 employees using Lotus Notes. Imagine the upgrade path if they had not considered going with Gmail; it would have been a logistical nightmare, requiring several skill sets that are, most likely, obsolete. Nevertheless, they managed to cut their email budget in half across the entire agency. Because of the new technology Google offers, they were able to integrate video chat and document-sharing capabilities, as well as mobile devices. The USDA reduced its per-user cost for email from $150 to $100, and the Department of Homeland Security (DHS) cut its per-user cost for email from $300 to $100.
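The per-user arithmetic above is easy to sketch. The $150-to-$100 (USDA) and $300-to-$100 (DHS) figures come from the examples cited; the 10,000-user headcount below is a hypothetical placeholder, since the agencies' actual mailbox counts are not given.

```python
# Back-of-the-envelope sketch of per-user email savings.
# Per-user costs are from the article; the user count is hypothetical.
def annual_savings(users, old_per_user, new_per_user):
    """Annual savings when the per-user email cost drops."""
    return users * (old_per_user - new_per_user)

print(annual_savings(10_000, 150, 100))  # USDA-style reduction -> 500000
print(annual_savings(10_000, 300, 100))  # DHS-style reduction  -> 2000000
```

Even at modest headcounts, the savings compound quickly across a multi-year contract.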
Just with email we start to see significant savings in the cloud. So what’s next?
Next Generation Cloud Computing
We are currently seeing industry-specific applications moving to the cloud. Cloud commoditization is creeping up and down the stack and into different industries, driving a great deal of collaboration. Forrester Research predicts all cloud markets will continue to grow, with the total cloud market reaching about $61B by the end of 2012. With this continual increase in cloud usage, we will run into cloud sprawl. This has me excited about my position here at Core BTS. We specialize in two key areas that every organization on the planet will need in order to meet compliance: one is security, the other disaster recovery. Cyber attacks are a fact of life in today’s world, and natural disasters, terrorist attacks, and system failures are commonplace.
What are the biggest predictions for information security? We will need more of it. Just think about all the areas that prompt a call to action: cloud sprawl, mobile devices, social media, malware, wireless. Information security is no longer a niche market; it is a must-have. It has to go mainstream because the market demands it. Larger organizations will purchase boutique firms to shore up their share of the market. We partner with Trustwave, which allows us to offer four compelling solutions:
- Managed Security Services
- Unified Security
Just keeping up with compliance is a monumental task. Our partnership allows us to help our clients build a strong strategy to address their regulatory requirements, such as PCI, HIPAA, SOX, GLBA, FISMA, ISO, and DLP. The demand for information security governance has prompted a document called the 20 Critical Security Controls for Effective Cyber Defense: Consensus Audit Guidelines. This guideline alone should be all the more reason to put your security in the cloud: the cost to manage information security and the following 20 Critical Security Controls on your own is staggering. You would need specialized hardware, software, people, and infrastructure.
- Critical Control 1: Inventory of Authorized and Unauthorized Devices
- Critical Control 2: Inventory of Authorized and Unauthorized Software
- Critical Control 3: Secure Configurations for Hardware and Software on Laptops, Workstations, and Servers
- Critical Control 4: Continuous Vulnerability Assessment and Remediation
- Critical Control 5: Malware Defenses
- Critical Control 6: Application Software Security
- Critical Control 7: Wireless Device Control
- Critical Control 8: Data Recovery Capability
- Critical Control 9: Security Skills Assessment and Appropriate Training to Fill Gaps
- Critical Control 10: Secure Configurations for Network Devices such as Firewalls, Routers, and Switches
- Critical Control 11: Limitation and Control of Network Ports, Protocols, and Services
- Critical Control 12: Controlled Use of Administrative Privileges
- Critical Control 13: Boundary Defense
- Critical Control 14: Maintenance, Monitoring, and Analysis of Security Audit Logs
- Critical Control 15: Controlled Access Based on the Need to Know
- Critical Control 16: Account Monitoring and Control
- Critical Control 17: Data Loss Prevention
- Critical Control 18: Incident Response Capability
- Critical Control 19: Secure Network Engineering
- Critical Control 20: Penetration Tests and Red Team Exercises
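To make the scope of even the first control concrete, here is a minimal sketch of the core idea behind Critical Control 1: comparing the devices actually observed on the network against an authorized inventory. The MAC addresses are hypothetical examples, and a real implementation would of course draw the observed set from network scans or switch tables rather than a hard-coded list.

```python
# Sketch of Critical Control 1 (device inventory): flag devices seen on
# the network but not on the authorized list, and vice versa.
# All MAC addresses below are hypothetical placeholders.
authorized = {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"}
observed = {"aa:bb:cc:00:00:01", "de:ad:be:ef:00:99"}

unauthorized = observed - authorized   # on the wire, not on the list
missing = authorized - observed        # on the list, not seen

print(sorted(unauthorized))  # ['de:ad:be:ef:00:99']
print(sorted(missing))       # ['aa:bb:cc:00:00:02']
```

Multiply this kind of continuous bookkeeping across all twenty controls and the staffing and tooling burden of doing it in-house becomes clear.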
According to National Defense Magazine, we may be on the verge of a cyber war in 2012. There have been numerous, almost daily, reports about China and other adversaries penetrating U.S. networks. Indeed, cyber security has been gaining lots of media attention. Targeted, zero-day attacks will be the norm, and cybercriminals will adapt to new cloud-based protections, looking for new ways to exploit networks. It’s a never-ending battle. Smartphones will be a target simply because they are connected; rogue Android and iPhone apps are just the beginning. Cyber security is here to stay.
Cloud Back Up & Disaster Recovery
If you have sat in front of a computer in a corporate environment as long as I have, chances are you have felt the panic or frustration of systems going down: wondering whether you lost customer information, or whether that draft document you were working on was saved. It doesn’t have to be an event brought on by Mother Nature; it can be something as simple as a server crashing. Disaster recovery (DR) is changing to adapt to the overall changes in IT. IT as a commodity is fast becoming the de facto standard, so merely backing up data is not enough; we need to secure it and make it readily available, and we have to do that in the most secure and effective way. In the past, DR was a very costly measure to keep systems up and running: we had to duplicate existing hardware, which was expensive, and we had to test the DR plan, which was time-consuming.
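One small but essential piece of any DR plan, verifying that a restored copy actually matches the original, can be sketched as a simple digest comparison. The data here is a hypothetical placeholder; in practice you would stream real backup files through the hash rather than hold them in memory.

```python
# Sketch of restore verification: compare SHA-256 digests of the
# original data and the restored copy. Data is a hypothetical example.
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"customer records"
restored = b"customer records"  # what came back from the DR site

if digest(original) == digest(restored):
    print("restore verified")
else:
    print("restore verification FAILED")
```

Automating checks like this is exactly the kind of routine DR testing that used to be time-consuming when it was done by hand.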
Our partnership with EVault lets us help our clients back up data to a DR site without violating standards for privacy and security. The HIPAA regulations governing the security of digitally stored information are complex and difficult to follow; outsourcing this function to the cloud helps you meet compliance while saving on cost.
In summary, the next generation of cloud computing will bring an increase in clouds for vertical markets, an increase in cloud services up and down the stack, and growing market demand for cloud security and cloud disaster recovery.