By Patrick Kerpan
October 18, 2012 07:30 AM EDT
Cloud computing is so alluring. The public cloud economizes infrastructure resources and creates a scalable, on-demand source for compute capacity. Additionally, the cloud can be a strategic asset for enterprises that know how to migrate, integrate and govern deployments securely.
Apple co-founder Steve Wozniak recently said, "A lot of people feel 'Oh, everything is really on my computer,' but I say the more we transfer everything onto the web, onto the cloud, the less we're going to have control over it."
In fact, over 70% of IT professionals worry about security, according to an IDG Enterprise Cloud Computing Study.
Boiled down, security, access and connectivity are really issues of control.
Like any prudent cloud user's deployment, the application has its own security features, such as disk encryption and port filtering. But do these layers of security features overlap or conflict? What happens to ownership after migration? Do solutions really have to be architected before and after deployment?
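The overlap question can be made concrete. Below is a minimal sketch, with purely hypothetical port sets, of how stacked filtering layers interact: traffic only flows through ports every layer permits, and ports one layer opens but another blocks are latent conflicts worth auditing.

```python
# Hypothetical illustration: each security layer admits a set of TCP ports.
# Effective reachability is the intersection of all layers; any port that
# appears in some layers but not all is a potential conflict in ownership.

def effective_ports(*layers: set) -> set:
    """Ports reachable only because every layer (cloud edge, security
    group, OS port filter) permits them."""
    return set.intersection(*layers)

def conflicts(*layers: set) -> set:
    """Ports some layer opens but at least one other layer blocks --
    often a sign that ownership of the rule is unclear."""
    return set.union(*layers) - set.intersection(*layers)

cloud_edge = {22, 80, 443}        # provider-owned, provider-controlled
security_group = {80, 443, 8080}  # provider-owned, user-controlled
os_filter = {443, 8080}           # app-owned, app-controlled

print(effective_ports(cloud_edge, security_group, os_filter))  # {443}
print(conflicts(cloud_edge, security_group, os_filter))        # {22, 80, 8080}
```

Here only port 443 is actually reachable, while three other ports are opened somewhere in the stack but silently blocked elsewhere: exactly the kind of gap an application-focused audit surfaces.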
Take an application-focused approach to security from the beginning. The application-controlled, application-owned security layers will ease the decision to deploy, test, and develop in the cloud and save on IT training and time along the road.
Control of Security: Who Has It?
Part of the "magic" cloud providers and vendors supply is wrapped up in layers of ownership and control in the form of firewalls, isolation, and the cloud edge. Most enterprise application owners hope these layers will cover possible gaps in security after migration. Unfortunately, most enterprises need security controls they can attest to, yet providers ultimately own and control these features.
The needs and concerns of the cloud service provider are distinctly different from those of the enterprise cloud service user (the application topology deployed to the cloud and its owner). Security loopholes can exist in the gaps between the areas users and providers control and own. The boundary between what the cloud user can view and control and what the cloud provider can view and control is the root source of enterprise executives' concerns with the public cloud.
The provider-owned, provider-controlled features (the cloud edge, cloud isolation), the provider-owned, user-controlled features (the multi-tenant, API-controlled router/hypervisor), and the app-owned, app-controlled features (OS port filtering and disk encryption) can be configured in an overlay network to give the user ultimate control of security.
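One way to make this ownership/control split auditable is to model it as data. The sketch below is illustrative only; the layer names and fields are assumptions for the example, not any provider's real API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityLayer:
    name: str
    owner: str       # who owns the feature: "provider" or "app"
    controller: str  # who can change its configuration: "provider", "user", or "app"

# The three spheres described above, as an attestable inventory.
LAYERS = [
    SecurityLayer("cloud edge / isolation", owner="provider", controller="provider"),
    SecurityLayer("multi-tenant router/hypervisor API", owner="provider", controller="user"),
    SecurityLayer("OS port filtering & disk encryption", owner="app", controller="app"),
]

def attestable(layers):
    """Layers the enterprise can actually attest to: those it controls."""
    return [layer.name for layer in layers if layer.controller in ("user", "app")]

print(attestable(LAYERS))
# ['multi-tenant router/hypervisor API', 'OS port filtering & disk encryption']
```

An inventory like this makes the audit question explicit: any control the enterprise cannot configure itself is one it must take on trust from the provider.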
Application-to-cloud migration and software-defined networking (SDN) capabilities now on the market offer additional, overlapping layers of control and security that span the spheres of the traditional cloud layers.
In order for cloud projects to succeed, IT executives need methods and tools they can attest to and can pass audit. Understanding the perimeter of access, control, and visibility between the application layer and the cloud provider layers is the first step to a less painful cloud migration. With this knowledge enterprises can then design a migration process that fits their use-case to deploy application topologies to the public cloud in a secure and controlled fashion.
Three Migration Rules We Recommend Breaking
Today's migration "rules" create more hurdles than solutions. Rapid industry changes, lack of standard security approaches, and the confusion on the proper steps to cloud deployment cause enterprises to overlook the issues of application-level control.
In fact, application-centric concerns are not even being addressed. Popular migration advice urges enterprises to tackle huge hurdles before and during migration, including deploying all at once, re-architecting before migration, and postponing the cost benefits of using the cloud.
Break the following three migration rules and it is possible to renovate more efficiently, capitalize on the cloud's economies of scale, and quickly, easily, and securely control enterprise networks and applications in the cloud.
Rule 1: Deploy all at once or not at all
Unlike the proverbial lemmings leaping off a cliff en masse, most enterprises require time to analyze and adjust to new technologies before committing serious time and effort. Employees, customers, and shareholders would not be happy if companies jumped into new technologies without first proving value. Thankfully, enough enterprises, organizations and governments have already seized the benefits of the cloud's flexibility, cost savings, and connectivity.
Now, the challenge for IT professionals is to find the cloud architecture and provider(s) that fit their enterprise's needs without having to reinvent the cloud to do so. With proven solutions in the market, enterprises can skip the bare-metal-to-virtual-to-test-cloud development life cycle. Simply deploy directly to any cloud environment; then develop, test, and release to speed time to market.
Rule 2: Re-architect before migration
Most providers and brokers want enterprises to spend time and effort re-building IT systems, and consequently re-learning and re-training, before migration. Advice articles list migration steps of parsing applications, virtualizing, re-architecting, and then migrating. Cloud pundits advise IT professionals to be wary of cloud security and to take valuable time to renovate before migrating, which slows down the process and postpones or even wipes out the financial benefits of the cloud.
The traditional datacenter has too much knowledge flowing vertically, from application to infrastructure and infrastructure to application. Migrating to the cloud before the renovate, design, or innovate steps cuts down on upfront hassle by removing the burden of re-architecting and re-learning skills before migration. Saving time and IT resources and forgoing arduous re-training speed up the migration and, ultimately, how quickly the organization capitalizes on the cloud's flexibility.
Rule 3: Pay upfront for design and renovation costs
Why stop with the cloud's physical economies of scale when there are also savings to be had on IT overhead? The same time and effort that goes into capturing "design economies of scale" can be applied to major overhead costs too. A single migration, rather than the process of backup, re-architecture, and then migration, is more cost-effective. Why wait until after migration for cost savings when there is an option to realize faster deployment and speed to market?
The added customization and control needed to migrate in a logical set of steps puts the control and security solidly back into the application layer.
Enterprises will likely face a long, slow migration to the cloud but, with the tools to capture the efficiency of migrating through logical steps before designing, the process can be significantly less painful. Application-controlled, application-owned security layers ease that decision and save on IT training and time along the way.
Conventional wisdom misses the importance of application-layer security and control in the cloud. So only one migration question remains: why take the stairs when you can take the elevator?