By Christian Buckley
January 21, 2013 03:15 PM EST
For many companies, the business benefits that cloud computing promises are too compelling to ignore: improved agility, lower costs, better resource allocation, and fewer operational issues. As a result, organizations have been moving commodity infrastructure and services to cloud-based offerings managed by some of the world's leading technology companies - including Office 365, Microsoft's primary offering for business productivity in the cloud.
Several new developments are making Office 365 even more enticing:
- New release of SharePoint: Extranets and public-facing websites are expensive in SharePoint 2010. However, with pricing changes and new web content management (WCM) functionality for SharePoint 2013, many organizations are beginning to take a second look at the cloud for some work streams as Office 365 gets updated in early 2013 with the latest SharePoint version.
- Broad Office 365 adoption: According to Kurt DelBene, President of the Office Division at Microsoft, Office 365 is on target to become the fastest-selling server product in Microsoft's history, outpacing all analyst expectations.
- Additional cost savings: With Office 365, organizations pay a monthly fee per user and gain access to ongoing maintenance and expertise to manage servers. That saves them from a huge upfront capital expense. (We should point out, however, that because SharePoint Online is more of a product and service than a platform, it has more limited capabilities than the on-premises version, so long-term cost implications are not yet known.)
Given these gains, no company should ignore a move to the cloud. However, a full jump to the public cloud without careful consideration is ill-advised. Some companies can't move everything to the cloud because they have compliance, regulatory, or government restrictions that limit where data can be stored and who can have access to it. But many companies shouldn't move everything to the cloud, because there is simply not parity between the online and on-premises versions of SharePoint. What makes SharePoint compelling for many enterprises is the ability to extend, customize, and integrate with other enterprise systems, much of which is impossible on the Office 365 platform. Until there is parity, certain work streams should stay in their current environments.
What's needed now is a thoughtful, strategic approach to cloud computing that is based on the needs of your business. Understanding which aspects of your organization's business systems can be moved, and should be moved, to the cloud is an important discussion that leaders must undertake. Although some work streams can be moved easily, many others require customization and management that only an on-premises deployment can support. That's why a hybrid model - comprising on-premises, private cloud, and/or public cloud components - offers organizations the best way to ease into the cloud computing paradigm and leverage the promise of SharePoint Online.
Where to start? As part of readiness planning for cloud services adoption, companies must address seven critical success factors:
1. What are the business requirements?
As a first step, organizations must get their arms around the underlying business requirements of the environment, the key use cases, and the key work streams. For example, it may be possible to build out a lightweight customer relationship management solution that acts as a portal for sales, marketing, and support to connect with customers. But it may not make sense to move development team activities to the cloud, due to integration issues with case management or configuration management systems. Outline your key work streams, and then map each one to your on-premises and online models to see which activities can be improved.
2. What are the business implications?
Companies must understand the implications of moving each use case and each work stream into the cloud. You must know the answers to these key questions: Can current functionality, security, audits, and reporting be replicated in the cloud? If key functions cannot be offered and supported, what are the risks? Say you have a ticketing system with SharePoint acting as the front end. Without a full understanding of the solution's architecture and how data is shared between the ticketing platform and SharePoint, it may be difficult to gauge the true cost implications of moving to the cloud. You also need to understand the performance and cost impact of the large number of web service calls the platform may make within a pure cloud environment. Depending on the volume of data moved, how it is moved, and the timeframe for moving this important business system to the cloud, it may make financial sense to maintain an on-premises version of SharePoint for your product and support organizations.
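As a rough illustration of the web service call concern above, consider a back-of-the-envelope sketch in Python. Every number in it (calls per ticket, ticket volume, latencies) is an invented assumption for illustration, not a measurement from any real deployment:

```python
# Back-of-the-envelope estimate (all figures are illustrative assumptions):
# added latency when a ticketing integration that made local web service
# calls starts calling SharePoint in the cloud over the WAN instead.
calls_per_ticket = 12      # assumed service calls per ticket update
tickets_per_day = 4_000    # assumed daily ticket volume
lan_latency_s = 0.005      # ~5 ms round trip on the local network
wan_latency_s = 0.120      # ~120 ms round trip to the cloud

added_seconds_per_day = (
    calls_per_ticket * tickets_per_day * (wan_latency_s - lan_latency_s)
)
print(f"Added wait time per day: {added_seconds_per_day / 3600:.1f} hours")
```

Even with modest assumptions like these, the cumulative wait time adds up quickly - which is why call volume belongs in any cost analysis.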
3. What are the management ramifications?
Companies must understand the management implications of each work stream. It's not just a matter of "can we move it?" but "should we move it?" In some instances, a move to the cloud may result in added administrative effort and costs. Case in point: In one of the most common hybrid scenarios, a business that uses Office 365 as its extranet while maintaining an on-premises or dedicated-cloud SharePoint environment may find that managing permissions, storage, and usage/activity is an extremely manual and time-consuming process. That's because there are no native tools for managing these functions across disparate SharePoint environments. Therefore, it's critical to look at your current metrics and KPIs for managing SharePoint, and understand the implications of duplicating these metrics within a new cloud environment - as well as combining and normalizing this data across all systems.
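Because there are no native tools for cross-environment reporting, administrators often end up merging exported permission reports by hand. The sketch below shows one way such a merge could be scripted in Python; the record fields and environment names are assumptions for illustration, not a real SharePoint export format:

```python
# Hypothetical sketch: merge permission exports from two SharePoint
# environments into one per-user view. Field names are assumed, not
# taken from any actual SharePoint report format.
def merge_permission_reports(*reports):
    """Map each user to the set of (environment, site, level) grants."""
    merged = {}
    for env_name, rows in reports:
        for row in rows:
            merged.setdefault(row["user"], set()).add(
                (env_name, row["site"], row["level"]))
    return merged

# Toy exports, as if pulled from each environment separately.
on_prem = [{"user": "alice", "site": "/sites/hr", "level": "Full Control"}]
online = [{"user": "alice", "site": "/sites/extranet", "level": "Read"},
          {"user": "bob", "site": "/sites/extranet", "level": "Contribute"}]

report = merge_permission_reports(("on-prem", on_prem), ("O365", online))
for user, grants in sorted(report.items()):
    print(user, sorted(grants))
```

Without tooling like this, answering "what can alice access, everywhere?" means checking each environment separately - exactly the manual overhead described above.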
4. What about the end-user experience?
We can't state this strongly enough: Companies must understand what the end-user experience will look like. If a hybrid environment adds effort or decreases productivity for end users, what is the cost? Consider these factors: Access to your enterprise platforms probably begins with a single sign-on experience - you log in once to get to all the tools and systems you need to do your business tasks. Your organization may have made significant investments to brand your internal platforms and put processes in place to maintain consistency across team sites and business unit portals. But, if you add another external system to the mix, what happens to the end-user experience? If permissions are separate, how does that affect end-user productivity? Your imperative is to understand how your primary users will conduct their business, thinking about the end-to-end experience, not just whether key functionality is being met through the new system. Remember: the more difficult a system is to use, the less likely people will be to use it.
5. How will we move?
If parts of your organization, and key work streams, are moved to the cloud, what is the plan for the move? Will you move all at once? In waves? What about training? Migration and onboarding strategies vary widely. Your strategy should be based on critical-path business use cases, helping those who rely on the new system before the masses. One strategy is to follow the 80/20 rule: concentrate your planning around the 20% of users who use SharePoint most heavily, giving them 80% of your time, while spending 20% of your time with the remaining (mostly casual) users. However you decide to time moving end users and work streams, you must involve end users as you formulate and communicate your plan. The more you involve people in the process, the more likely they are to support that process.
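The 80/20 wave planning described above can be sketched in a few lines of Python. The user names, activity counts, and the simple top-20%-by-activity cutoff are all illustrative assumptions:

```python
# Hypothetical sketch of 80/20 wave planning: put the top 20% of users
# (by activity) in the first migration wave, everyone else in the second.
# Activity numbers below are invented for illustration.
def plan_waves(activity_by_user):
    ranked = sorted(activity_by_user, key=activity_by_user.get, reverse=True)
    cutoff = max(1, len(ranked) // 5)  # top 20% of users, at least one
    return ranked[:cutoff], ranked[cutoff:]

activity = {"alice": 340, "bob": 25, "carol": 12, "dave": 8, "erin": 310,
            "frank": 5, "grace": 4, "heidi": 3, "ivan": 2, "judy": 1}
wave1, wave2 = plan_waves(activity)
print("Wave 1 (heavy users):", wave1)   # ['alice', 'erin']
print("Wave 2 (casual users):", wave2)
```

In practice, the real wave assignments would also weigh critical-path use cases, not raw activity alone.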
6. How will we measure success?
Companies must have the right metrics in place to track performance across the entire environment. Companies also need to think about whether content needs to be synchronized between environments, or whether these use cases can be maintained separately. Most companies fail at this today because they don't have sufficient visibility into their SharePoint environments to generate and track adequate metrics. Moving to a hybrid model is a great opportunity to correct this. One strategy is to begin by identifying three key metrics across both systems and build from there: for example, Top 10 Most Active Sites, Top 10 Most Active Users, and Most Active Content Databases. Based on this data, you will gain a much better understanding of who is actually using SharePoint and where, allowing you to better allocate your time and resources to support the sites and users who are most active on the platform.
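A minimal sketch of those three starter metrics, assuming usage records exported from each environment into a simple dictionary format (the field names and site paths are invented for illustration, not a real audit-log schema):

```python
# Hypothetical sketch: combine usage records from on-premises and
# Office 365 environments, then compute the three starter metrics
# named above. Record format and data are illustrative assumptions.
from collections import Counter

def top_n(records, key, n=10):
    """Return the n most frequent values of `key` across all records."""
    return Counter(r[key] for r in records).most_common(n)

# Toy usage data, as if exported from each environment's activity logs.
on_prem = [
    {"site": "/sites/sales", "user": "alice", "db": "WSS_Content_01"},
    {"site": "/sites/sales", "user": "bob", "db": "WSS_Content_01"},
    {"site": "/sites/hr", "user": "alice", "db": "WSS_Content_02"},
]
online = [
    {"site": "/sites/extranet", "user": "carol", "db": "SPO"},
    {"site": "/sites/extranet", "user": "alice", "db": "SPO"},
]

combined = on_prem + online
print("Most active sites:", top_n(combined, "site", 3))
print("Most active users:", top_n(combined, "user", 3))
print("Most active content DBs:", top_n(combined, "db", 3))
```

The point of combining the exports first is that a "top users" list computed per environment can miss someone whose activity is split across both.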
7. How will we enforce governance?
Ask yourself: Do we have a defined change management process? Do we have our roles and accountabilities defined? Are we actively reviewing and taking action on new requests? Are we giving end users and admins visibility into the changes being made and the priorities of those requests? Having a governance body in place is crucial. Without automation, manual governance practices (policies, documentation, metrics) need to be extended or duplicated, with appropriate roles defined and assigned. Best practices include running through the document lifecycle across environments and identifying where current policies break. Focus your attention on the governance policies that manage risk - compliance, regulations, retention, or any other legal requirements of a hybrid system. Then define what it will take to maintain minimum security levels, and create a plan for automating and simplifying.
Some companies may be drawn into the cloud by perceived cost savings despite the risks and their customization or integration needs. This is a recipe for failure. Companies that successfully make the move to a hybrid model are those that understand which business activities can be offloaded to the cloud to benefit from its scale and cost advantages.
The beauty of a strategic, hybrid model is that it's not "all or nothing." By addressing the seven critical success factors outlined above, companies take a holistic approach rather than making a blind technology decision. By focusing on specific work streams, and building out only those that can be supported, your company will end up with a strategic hybrid model that supports the needs of your business.