Cloud Computing: The Next Generation of Computing & Sustainable IT

The next generation of cloud computing will be the increase in clouds for vertical markets

I have been asked to moderate a cloud computing discussion at Green Gov 2012. The title of the session is “Cloud Computing: The Next Generation of Computing and Sustainable IT.” It is a great honor to be selected to participate as moderator; I believe this is my second go-around. As National Director of Cloud Services with Core BTS, Inc., it is my job to articulate the value of cloud computing. I have been pondering the title a bit, and before I can discuss the next generation of cloud, we have to identify the current situation. The cloud has gone well beyond Google Mail and Salesforce (CRM) into other areas like Cloud Security, Cloud Storage, and Cloud Backup. Furthermore, we must define what we mean by cloud computing and sustainable IT; not everyone is on the same page.

What Is Cloud Computing?
NIST defines cloud computing as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. My own definition is more to the point: I consider cloud computing to be Information Technology as a Utility Service. To be clear, I find cloud computing no different from managed services. It doesn’t matter whether you utilize software as a service, platform as a service, or infrastructure as a service; the idea is to treat IT as a utility service to reduce overall costs.

What Is Sustainable IT?
I define Sustainable IT as energy-efficient computing from the desktop to the data center, from hardware to software, from the network to the virtual cloud. Today I will focus mainly on cloud computing. For all intents and purposes, cloud computing is sustainable IT. How can I say that? It’s simple math: cloud computing, done right, can save an organization 50% to 80% in total cost of ownership (TCO). The timing could not be better. With a struggling economy, corporations are looking for ways to cut costs. When you get past the internal politics and the cloud hype cycle and take a deep dive into the total cost of running an IT shop, you will be enlightened.

Something truly unusual has occurred in the past four years with sustainability and IT: CEOs and CFOs have been getting involved with IT budgets. Server sprawl and data center energy costs have become a major factor in the cost of doing business. A big mistake C-level execs make is the fuzzy math used to calculate TCO for the enterprise. There is a strong tendency to calculate hardware and software costs only. To get an accurate TCO, you must take the following items into consideration (a rough cost sketch follows the list):

  • Hardware
  • Software
  • Maintenance
  • People
  • Facilities
  • Power & Cooling
  • Redundancy
  • Storage
  • Bandwidth
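
As a minimal sketch of that “simple math,” the snippet below (Python) simply sums an annual figure for each category on the list. Every dollar figure is a hypothetical placeholder, not a benchmark; the point is the discipline of counting every line item, not just hardware and software.

```python
# Illustrative TCO sketch: sum an annual cost for every category the
# article lists. All dollar figures are hypothetical placeholders.

ANNUAL_COSTS = {
    "hardware": 120_000,
    "software": 80_000,
    "maintenance": 40_000,
    "people": 350_000,
    "facilities": 60_000,
    "power_and_cooling": 45_000,
    "redundancy": 30_000,
    "storage": 25_000,
    "bandwidth": 20_000,
}

def total_cost_of_ownership(costs: dict) -> float:
    """Return the total annual cost across every category."""
    return sum(costs.values())

if __name__ == "__main__":
    tco = total_cost_of_ownership(ANNUAL_COSTS)
    print(f"On-premises TCO (hypothetical): ${tco:,.0f} per year")
    # The 50%-80% savings claim would put a comparable cloud bill
    # somewhere between these two figures:
    print(f"Cloud estimate at 50% savings:  ${tco * 0.5:,.0f}")
    print(f"Cloud estimate at 80% savings:  ${tco * 0.2:,.0f}")
```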

When all is said and done, you may pay only a third of the cost of running your own IT shop. A classic example is Google saving the General Services Administration (GSA) $15M over a five-year period. GSA had 17,000 employees using Lotus Notes. Imagine the upgrade path if they had not considered going with Gmail; it would have been a logistical nightmare, requiring several skill sets that are, most likely, obsolete. Nevertheless, they managed to cut their budget in half for email across the entire agency. Because of the new technology Google offers, they were able to integrate video chat and document-sharing capabilities, as well as mobile devices. The USDA reduced its per-user cost for email from $150 to $100. The Department of Homeland Security (DHS) cut its per-user cost for email from $300 to $100.
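
As a back-of-the-envelope check on those figures, the per-employee arithmetic works out as follows. The calculation uses only the numbers quoted above; the helper name is mine.

```python
# Back-of-the-envelope check on the savings figures quoted above.

def annual_savings_per_user(total_savings: float, years: int, users: int) -> float:
    """Spread a multi-year savings figure across users, per year."""
    return total_savings / years / users

# GSA: $15M saved over five years across 17,000 employees.
gsa = annual_savings_per_user(15_000_000, 5, 17_000)
print(f"GSA: roughly ${gsa:,.0f} saved per employee per year")  # ~$176

# USDA and DHS per-user email costs, before and after the move.
print(f"USDA: ${150 - 100} saved per user")  # $50
print(f"DHS:  ${300 - 100} saved per user")  # $200
```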

With email alone we start to see significant savings in the cloud. So what’s next?

Next Generation Cloud Computing
We are currently seeing industry-specific applications going to the cloud. Cloud commoditization is creeping up and down the stack and into different industries, driving a great deal of collaboration. Forrester Research predicts all cloud markets will continue to grow, and the total cloud market will reach about $61B by the end of 2012. With this continual increase in cloud usage, we will run into cloud sprawl. This has me excited about my position here at Core BTS. We specialize in two key areas that every organization on the planet will need in order to meet compliance: security and disaster recovery. Cyber-attacks are a fact of life in today’s world, and natural disasters, terrorist attacks, and system failures are commonplace.

Cloud Security
What are the biggest predictions for information security? We will need more of it. Just think about all the areas that prompt a call to action: cloud sprawl, mobile devices, social media, malware, wireless. Information security is no longer a niche market; it is a must-have. It has to go mainstream because the market demands it. Larger organizations will purchase boutique firms to shore up their share of the market. We partner with Trustwave, which allows us to offer four compelling solutions:

  1. Compliance
  2. Managed Security Services
  3. SpiderLabs
  4. Unified Security

Just keeping up with compliance is a monumental task. Our partnership allows us to help clients build a strong strategy to address regulatory and compliance requirements such as PCI, HIPAA, SOX, GLBA, FISMA, ISO, and DLP. The demand for information security governance has prompted a document called 20 Critical Security Controls for Effective Cyber Defense: Consensus Audit Guidelines. This guideline alone should be all the more reason to put your security in the cloud. The cost to manage information security and the following 20 Critical Security Controls is staggering: you would need specialized hardware, software, people, and infrastructure (a rough ownership sketch follows the list of controls below).

20 Critical Security Controls – Version 3.1

  • Critical Control 1: Inventory of Authorized and Unauthorized Devices
  • Critical Control 2: Inventory of Authorized and Unauthorized Software
  • Critical Control 3: Secure Configurations for Hardware and Software on Laptops, Workstations, and Servers
  • Critical Control 4: Continuous Vulnerability Assessment and Remediation
  • Critical Control 5: Malware Defenses
  • Critical Control 6: Application Software Security
  • Critical Control 7: Wireless Device Control
  • Critical Control 8: Data Recovery Capability
  • Critical Control 9: Security Skills Assessment and Appropriate Training to Fill Gaps
  • Critical Control 10: Secure Configurations for Network Devices such as Firewalls, Routers, and Switches
  • Critical Control 11: Limitation and Control of Network Ports, Protocols, and Services
  • Critical Control 12: Controlled Use of Administrative Privileges
  • Critical Control 13: Boundary Defense
  • Critical Control 14: Maintenance, Monitoring, and Analysis of Security Audit Logs
  • Critical Control 15: Controlled Access Based on the Need to Know
  • Critical Control 16: Account Monitoring and Control
  • Critical Control 17: Data Loss Prevention
  • Critical Control 18: Incident Response Capability
  • Critical Control 19: Secure Network Engineering
  • Critical Control 20: Penetration Tests and Red Team Exercises
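
To make the staffing-and-tooling argument concrete, here is a rough, hypothetical sketch of how a subset of these controls might be split between a managed/cloud security provider and in-house staff. The ownership assignments are my own illustrative assumptions, not part of the guideline or of any vendor’s offering.

```python
# Hypothetical ownership checklist for a subset of the 20 Critical
# Controls. The owner assignments are illustrative assumptions only.

COVERAGE = {
    "1. Inventory of Authorized and Unauthorized Devices": "provider",
    "4. Continuous Vulnerability Assessment and Remediation": "provider",
    "5. Malware Defenses": "provider",
    "9. Security Skills Assessment and Training": "in-house",
    "12. Controlled Use of Administrative Privileges": "shared",
    "14. Maintenance, Monitoring, and Analysis of Audit Logs": "provider",
    "18. Incident Response Capability": "shared",
    "20. Penetration Tests and Red Team Exercises": "provider",
}

def summarize(coverage: dict) -> dict:
    """Count how many controls fall to each owner."""
    counts = {}
    for owner in coverage.values():
        counts[owner] = counts.get(owner, 0) + 1
    return counts

if __name__ == "__main__":
    for owner, n in sorted(summarize(COVERAGE).items()):
        print(f"{owner:>9}: {n} of {len(COVERAGE)} sampled controls")
```

Even in this simplified view, most of the sampled controls can be shifted to a provider, which is the economic argument for cloud-based security services.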

According to National Defense Magazine, we may be on the verge of a cyber-war in 2012. There have been numerous, almost daily, reports about China and other adversaries penetrating U.S. networks. Indeed, cyber security has been gaining lots of media attention. Targeted, zero-day attacks will be the norm. Cybercriminals will adapt to the new cloud-based protections, looking for new ways to exploit networks. It’s a never-ending battle. Smartphones will be a target simply because they are connected; rogue Android and iPhone apps are just the beginning. Cyber security is here to stay.

Cloud Back Up & Disaster Recovery
If you have sat around a computer in a corporate atmosphere as long as I have, chances are you have suffered panic or frustration when systems go down, wondering whether you lost customer information or whether that draft document you were working on was saved. It doesn’t have to be an event brought on by Mother Nature; it can be something as simple as a server crashing. Disaster recovery (DR) is changing to adapt to the overall changes in IT. IT as a commodity is fast becoming the de facto standard, so merely backing up data is not enough: we need to secure it and make it readily available, and we have to do that in the most secure and cost-effective way. In the past, DR was a very costly measure to keep systems up and running. We had to duplicate existing hardware, which was expensive, and we had to test the DR plan, which was time-consuming.
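
As a minimal sketch of the “back it up, secure it, verify it” idea in that paragraph, the snippet below archives a directory, encrypts the archive, and records a checksum so the copy can be verified later. The paths are hypothetical placeholders and the transfer to a DR site is deliberately omitted; this is not EVault’s product or API.

```python
# Minimal backup sketch: archive, encrypt, and fingerprint a directory
# before shipping it off-site. Paths and the upload step are placeholders.
import hashlib
import tarfile
from pathlib import Path

from cryptography.fernet import Fernet  # third-party: pip install cryptography

def make_encrypted_backup(source_dir: str, out_prefix: str, key: bytes) -> str:
    """Create an encrypted archive of source_dir and return its SHA-256 digest."""
    archive = Path(out_prefix + ".tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)

    ciphertext = Fernet(key).encrypt(archive.read_bytes())
    encrypted = Path(out_prefix + ".tar.gz.enc")
    encrypted.write_bytes(ciphertext)
    archive.unlink()  # keep only the encrypted copy locally

    return hashlib.sha256(ciphertext).hexdigest()

if __name__ == "__main__":
    key = Fernet.generate_key()  # store this key somewhere safe and separate
    digest = make_encrypted_backup("/var/data/crm", "/backups/crm-weekly", key)
    print(f"Encrypted backup written; SHA-256 = {digest}")
    # Shipping the .enc file to the DR site (SFTP, a provider API, etc.)
    # would happen here; it is omitted because it depends on the provider.
```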

Our partnership with EVault lets us help clients back up data to the DR site without violating standards for privacy and security. The HIPAA regulations regarding the security of digitally stored information are complex and difficult to follow. Outsourcing this function to the cloud helps you meet compliance while saving on cost.

In summary, the next generation of cloud computing will bring an increase in clouds for vertical markets, an increase in cloud services up and down the stack, and growing market demand for Cloud Security and Cloud Disaster Recovery.

More Stories By Terell Jones

Mr. Jones is the National Director of Cloud Services with Core BTS, Inc., a $180M corporation. He is based out of Fairfax, VA and handles the eastern region for cloud computing. After serving in the first Gulf War in the U.S. Navy, Mr. Jones entered the IT field in 1995. He has over 17 years in Information Technology in the fields of Green IT, Cloud Computing, Virtualization, and Managed Services. He is internationally known as “the Green IT Guy,” specializing in energy-efficient computing from the desktop to the data center, from hardware to software, from the network to the virtual cloud. He has served as the Deputy Director at the Green IT Council since 2010.
