By Patrick Burke
November 16, 2012 10:00 AM EST
Hurricane Sandy brought with it a heavy dose of disaster, but the cloud helped businesses mitigate what could have been even greater disruption.
"Overall, I think cloud does help," said Stephanie Balaouras, an analyst at Forrester Research. "Tier one cloud and SaaS providers such as Google and Amazon operate their cloud services from multiple data centers and can simply shift workloads to other locations as needed. They are also able to deliver a level of availability that many organizations could never achieve themselves. This includes the resiliency of the data center infrastructure itself to the resources that they invest in high availability and disaster recovery capabilities," Balaouras said, according to an article on SearchCloudSecurity.com.
"Cloud computing can absolutely help in BC/DR operations," said Kevin O'Shea, information security practice lead at engineering, construction and technical services firm URS Corporation. "For example, we saw several large webhost providers switch to alternate locations when their primary data centers went offline in New York City. However, businesses must be organized in such a way as to be able to offload critical applications and data to a cloud provider," he said.
O'Shea added that virtualization, identification and segregation of critical business processes are important steps that get organizations prepped for cloud services. "But rarely do we see businesses that have identified all their critical business processes, identified the critical cyber assets that support those processes, and virtualized and replicated key servers and data," he said. "Like many other problems in information security and BC/DR, gaining significant leverage from cloud services is not solely a technology issue. It involves a coordinated effort across the business processes, infrastructure and culture."
Where Is Cloud Computing Heading in 2013?
Analysts see a future in which cloud computing is moving IT from on-premise to off-premise and gaining a slice of business spending.
With cloud computing maturing this year, more organizations will start to move their IT infrastructure from on-premise to off-premise, according to IDC head of research Matthew Oostveen.
"2012 was the year that we all got tired of cloud; there's cloud fatigue," he said, according to an article in ComputerworldUK.com.
"What is certain is we are watching a migration taking place where on-premise computing is moving to off-premise computing. It may start incrementally, where we see an uptake of co-location services, and obviously the co-location services are being supported by the influx of new data centers in the marketplace."
Telsyte senior analyst Rodney Gedda's predictions are similar to Oostveen's. "Cloud computing [services] will mature next year and continue to be procured as a replacement to on-premise infrastructure and as an option for service delivery," Gedda said.
He said there will be more innovation in application testing and development in 2013, with new cloud applications coming out to market.
"We can expect to see new types of applications delivered out of the cloud, more options for where data is hosted (and the type of infrastructure it's hosted on) and more services offering enterprise-grade application hosting," he said.
"We can also expect to see very strong enterprise-grade services coming out to market," according to Deloitte Consulting technology leader Robert Hillard.
"File sharing, document sharing and collaboration will very quickly gain enterprise strength and much greater support within the enterprise's traditional governance," he said.
Why 'Cloud' Is No Longer Sufficient
"Cloud" means many things to many people, so it's important to find out which type of cloud is being discussed in order to leverage it within an enterprise, according to an article on Wired.com.
Is it a public, off-premise, commodity, IaaS cloud? Or is it a private, on-premise, enterprise, PaaS cloud? Or another cloud model that combines various attributes?
Why would a company consider implementing a private cloud? Because it can still deliver many of the benefits of the cloud model. Implementing one is a great opportunity to improve IT services, introduce flexibility, decrease server implementation and reclamation times, develop chargeback and/or showback capabilities to demonstrate value, and improve the reputation of IT in general, Wired says.
Additionally, you can pursue private cloud while leveraging your existing investments in both infrastructure and people skills. And because you are building your private cloud on-premise, within the internal network and on the existing virtual server environment, you immediately negate a whole host of concerns around application incompatibilities. Perhaps most important, the idea of maintaining complete control over compute resources and valuable data on premise is appealing.
Throwing around the term "cloud" is no longer sufficient. IT cloud computing models of different types serve different purposes, and they all have their proper place in the enterprise. Different cloud environments have the potential to provide enterprises with scalable and flexible IT platforms that add business value; the trick is finding which cloud computing models work best for your business.