DevOps and SDDC Among Top 10 Strategic Technology Trends for 2014

Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years

Gartner, Inc. on Tuesday highlighted the top ten technologies and trends that will be strategic for most organizations in 2014. Analysts presented their findings during Gartner Symposium/ITxpo, being held here through October 10.

Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.

A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses. It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years. These technologies impact the organization's long-term plans, programs and initiatives.

“We have identified the top 10 technologies that companies should factor into their strategic planning processes,” said David Cearley. “This does not necessarily mean adoption and investment in all of the listed technologies, but companies should look to make deliberate decisions about them during the next two years.”

Mr. Cearley said that the Nexus of Forces (the convergence of four powerful forces: social, mobile, cloud and information) continues to drive change and create new opportunities, creating demand for advanced programmable infrastructure that can execute at web-scale.

The top ten strategic technology trends for 2014 include:

Mobile Device Diversity and Management
Through 2018, the growing variety of devices, computing styles, user contexts and interaction paradigms will make "everything everywhere" strategies unachievable. The unexpected consequence of bring your own device (BYOD) programs is a doubling or even tripling of the size of the mobile workforce. This is placing tremendous strain on IT and finance organizations. Enterprise policies on employee-owned hardware usage need to be thoroughly reviewed and, where necessary, updated and extended. Most companies only have policies for employees accessing their networks through devices that the enterprise owns and manages. Set policies that define clear expectations around what employees can and can't do, and balance flexibility with confidentiality and privacy requirements.
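In practice, such a policy takes the form of rules that each employee-owned device is checked against before it is granted network access. The sketch below is purely illustrative; the device attributes, policy fields and thresholds are hypothetical, not any real MDM product's API.

```python
# Hypothetical BYOD access-policy check: evaluate an employee-owned
# device against enterprise rules before granting network access.
# All attribute and policy names here are illustrative assumptions.

def evaluate_device(device, policy):
    """Return (allowed, violations) for an employee-owned device."""
    violations = []
    if device["os_version"] < policy["min_os_version"]:
        violations.append("outdated OS")
    if policy["require_encryption"] and not device["encrypted"]:
        violations.append("storage not encrypted")
    if device["jailbroken"]:
        violations.append("jailbroken/rooted device")
    return (not violations, violations)

policy = {"min_os_version": 7, "require_encryption": True}
device = {"os_version": 6, "encrypted": True, "jailbroken": False}
allowed, reasons = evaluate_device(device, policy)
print(allowed, reasons)  # False ['outdated OS']
```

A real deployment would pull these attributes from an MDM agent, but the shape of the decision (explicit rules, explicit reasons for denial) is the point: it makes expectations clear to employees.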

Mobile Apps and Applications
Gartner predicts that through 2014, improved JavaScript performance will begin to push HTML5 and the browser as a mainstream enterprise application development environment. Gartner recommends that developers focus on creating expanded user interface models, including richer voice and video, that can connect people in new and different ways. Apps will continue to grow while applications will begin to shrink: apps are smaller and more targeted, while a larger application is more comprehensive. Developers should look for ways to snap together apps to create larger applications. Building application user interfaces that span a variety of devices requires an understanding of fragmented building blocks and an adaptable programming structure that assembles them into optimized content for each device. The market for tools to create consumer- and enterprise-facing apps is complex, with well over 100 potential tools vendors. For the next few years no single tool will be optimal for all types of mobile applications, so expect to employ several. The next evolution in user experience will be to leverage intent, inferred from emotion and actions, to motivate changes in end-user behavior.
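The "snap together apps" idea can be pictured as composition: each small, targeted app does one job and hands its result to the next. A minimal sketch, with entirely hypothetical app names standing in for real mobile app modules:

```python
# Illustrative only: small, targeted "apps" modeled as functions that
# each enrich a shared context, snapped together into a larger
# application via composition. The app names are made up.

from functools import reduce

def capture_photo(ctx):   # small app 1: produces raw input
    ctx["photo"] = "raw-bytes"
    return ctx

def annotate(ctx):        # small app 2: enriches the data
    ctx["caption"] = "Expo keynote"
    return ctx

def share(ctx):           # small app 3: publishes the result
    ctx["shared"] = True
    return ctx

def compose(*apps):
    """Snap individual apps together into one larger application."""
    return lambda ctx: reduce(lambda acc, app: app(acc), apps, ctx)

photo_app = compose(capture_photo, annotate, share)
print(photo_app({}))
```

The larger application is just the ordered composition; swapping one small app for another (say, a different sharing target) does not disturb the rest.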

The Internet of Everything
The Internet is expanding beyond PCs and mobile devices into enterprise assets such as field equipment, and consumer items such as cars and televisions. The problem is that most enterprises and technology vendors have yet to explore the possibilities of an expanded Internet and are not operationally or organizationally ready. Imagine digitizing the most important products, services and assets. The combination of data streams and services created by digitizing everything creates four basic usage models: Manage, Monetize, Operate and Extend. These four basic models can be applied to any of the four "internets" (people, things, information and places). Enterprises should not limit themselves to thinking that only the Internet of Things (i.e., assets and machines) has the potential to leverage these four models. Enterprises from all industries (heavy, mixed, and weightless) can leverage these four models.

Hybrid Cloud and IT as Service Broker
Bringing together personal clouds and external private cloud services is an imperative. Enterprises should design private cloud services with a hybrid future in mind and make sure future integration/interoperability is possible. Hybrid cloud services can be composed in many ways, varying from relatively static to very dynamic. Managing this composition will often be the responsibility of something filling the role of cloud service broker (CSB), which handles aggregation, integration and customization of services. Enterprises that are expanding into hybrid cloud computing from private cloud services are taking on the CSB role. Terms like "overdrafting" and "cloudbursting" are often used to describe what hybrid cloud computing will make possible. However, the vast majority of hybrid cloud services will initially be much less dynamic than that. Early hybrid cloud services will likely be more static, engineered compositions (such as integration between an internal private cloud and a public cloud service for certain functionality or data). More deployment compositions will emerge as CSBs evolve (for example, private infrastructure as a service [IaaS] offerings that can leverage external service providers based on policy and utilization).
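The broker role described above can be sketched in a few lines: route each provisioning request to the internal private cloud, and burst to an external public provider only when policy says private utilization would run too high. The provider names and the 80 percent threshold below are assumptions for illustration, not a real CSB product.

```python
# Hypothetical cloud service broker (CSB) sketch: static policy-based
# placement with "cloudbursting" to a public provider once projected
# private-cloud utilization exceeds a configured threshold.

class Broker:
    def __init__(self, private_capacity, burst_threshold=0.8):
        self.capacity = private_capacity  # units of private capacity
        self.used = 0
        self.threshold = burst_threshold  # policy: burst past 80% use

    def place(self, size):
        projected = (self.used + size) / self.capacity
        if projected <= self.threshold:
            self.used += size
            return "private-cloud"
        return "public-provider"          # cloudbursting per policy

broker = Broker(private_capacity=100)
print(broker.place(70))  # private-cloud
print(broker.place(20))  # public-provider (would exceed 80% utilization)
```

Early hybrid services are exactly this static: one policy, one external target. The dynamic compositions Gartner anticipates would replace the fixed threshold with live utilization, cost and data-residency policies evaluated per request.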

Cloud/Client Architecture
Cloud/client computing models are shifting. In the cloud/client architecture, the client is a rich application running on an Internet-connected device, and the server is a set of application services hosted in an increasingly elastically scalable cloud computing platform. The cloud is the control point and system of record, and applications can span multiple client devices. The client environment may be a native application or browser-based; the increasing power of the browser is available to many client devices, mobile and desktop alike. Robust capabilities in many mobile devices, the increased demand on networks, the cost of networks and the need to manage bandwidth use create incentives, in some cases, to minimize the cloud application computing and storage footprint, and to exploit the intelligence and storage of the client device. However, the increasingly complex demands of mobile users will drive apps to demand increasing amounts of server-side computing and storage capacity.
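A minimal sketch of that division of labor, assuming a toy in-process "cloud" in place of a remote service: the cloud stays the system of record, while the rich client exploits its own storage to cut bandwidth and server-side load.

```python
# Cloud/client sketch: the cloud service is the system of record; the
# rich client caches locally so repeated reads never touch the network.
# CloudService here is a stand-in dict, not a real cloud API.

class CloudService:                  # server side: system of record
    def __init__(self):
        self.records = {"profile/42": {"name": "Ada"}}
        self.requests = 0            # counts round trips to the cloud

    def fetch(self, key):
        self.requests += 1
        return self.records[key]

class RichClient:                    # client side: exploits local storage
    def __init__(self, cloud):
        self.cloud = cloud
        self.cache = {}

    def get(self, key):
        if key not in self.cache:    # go to the cloud only on a miss
            self.cache[key] = self.cloud.fetch(key)
        return self.cache[key]

cloud = CloudService()
client = RichClient(cloud)
client.get("profile/42")
client.get("profile/42")
print(cloud.requests)  # 1 -- the second read was served from the client cache
```

The trade-off in the paragraph above is visible here: the more the client caches and computes, the smaller the cloud footprint, but the cloud must remain authoritative when clients sync.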

The Era of Personal Cloud
The personal cloud era will mark a power shift away from devices toward services. In this new world, the specifics of devices will become less important for the organization to worry about, although the devices will still be necessary. Users will use a collection of devices, with the PC remaining one of many options, but no one device will be the primary hub. Rather, the personal cloud will take on that role. Access to the cloud and the content stored or shared from the cloud will be managed and secured, rather than solely focusing on the device itself.

Software Defined Anything
Software-defined anything (SDx) is a collective term that encapsulates the growing market momentum for improved standards for infrastructure programmability and data center interoperability, driven by the automation inherent to cloud computing, DevOps and fast infrastructure provisioning. As a collective, SDx also incorporates various initiatives like OpenStack, OpenFlow, the Open Compute Project and Open Rack, which share similar visions. As individual SDx technology silos evolve and consortiums arise, look for emerging standards and bridging capabilities to benefit portfolios, but challenge individual technology suppliers to demonstrate their commitment to true interoperability standards within their specific domains. While openness will always be a claimed vendor objective, different interpretations of SDx definitions may be anything but open. Vendors of SDN (network), SDDC (data center), SDS (storage) and SDI (infrastructure) technologies are all trying to maintain leadership in their respective domains while deploying SDx initiatives to aid market adjacency plays. Vendors who dominate a sector of the infrastructure may therefore be reluctant to abide by standards that could lower margins and open broader competitive opportunities, even when the consumer would benefit from simplicity, cost reduction and consolidation efficiency.
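The programmability at the heart of SDx boils down to a simple loop: operators declare desired state, and a controller reconciles the actual infrastructure toward it instead of configuring devices by hand. A generic sketch, with made-up resource names (this is the pattern, not any specific SDN or OpenStack API):

```python
# Declarative infrastructure sketch behind the SDx idea: compare
# declared (desired) state with actual state and emit the provisioning
# actions needed to converge. Resource names are illustrative.

desired = {"web-tier": 3, "db-tier": 2}  # declared configuration
actual = {"web-tier": 1}                 # current infrastructure

def reconcile(desired, actual):
    actions = []
    for resource, want in desired.items():
        have = actual.get(resource, 0)
        if have < want:
            actions.append(f"provision {want - have} x {resource}")
        elif have > want:
            actions.append(f"decommission {have - want} x {resource}")
        actual[resource] = want          # record the converged state
    return actions

print(reconcile(desired, actual))
# ['provision 2 x web-tier', 'provision 2 x db-tier']
```

The interoperability question in the paragraph above is precisely whether different vendors' controllers will accept the same declared state, or each insist on their own dialect.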

Web-Scale IT
Web-scale IT is a pattern of global-class computing that delivers the capabilities of large cloud service providers within an enterprise IT setting by rethinking positions across several dimensions. Large cloud services providers such as Amazon, Google and Facebook are re-inventing the way in which IT services can be delivered. Their capabilities go beyond scale in terms of sheer size to also include scale as it pertains to speed and agility. If enterprises want to keep pace, they need to emulate the architectures, processes and practices of these exemplary cloud providers. Gartner calls the combination of all of these elements Web-scale IT. Web-scale IT looks to change the IT value chain in a systemic fashion. Data centers are designed with an industrial engineering perspective that looks for every opportunity to reduce cost and waste. This goes beyond re-designing facilities to be more energy efficient to also include in-house design of key hardware components such as servers, storage and networks. Web-oriented architectures allow developers to build very flexible and resilient systems that recover from failure more quickly.
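A common building block of the resilient, failure-recovering architectures mentioned above is retrying a flaky remote call with exponential backoff. A minimal sketch; the flaky_service below simulates a transient outage rather than calling anything real.

```python
# Resilience sketch: retry a transient failure with exponential backoff,
# a staple of web-scale architectures that expect components to fail.

import time

def with_retries(fn, attempts=4, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise                            # give up after last try
            time.sleep(base_delay * (2 ** attempt))  # 10ms, 20ms, 40ms...

failures = {"left": 2}  # simulate two transient failures, then recovery

def flaky_service():
    if failures["left"] > 0:
        failures["left"] -= 1
        raise ConnectionError("transient outage")
    return "ok"

print(with_retries(flaky_service))  # ok -- succeeded on the third attempt
```

The design assumption is that failures are transient and retries are safe (the call is idempotent); web-scale systems layer timeouts, jitter and circuit breakers on top of this same idea.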

Smart Machines
Through 2020, the smart machine era will blossom with a proliferation of contextually aware, intelligent personal assistants, smart advisors (such as IBM Watson), advanced global industrial systems and public availability of early examples of autonomous vehicles. The smart machine era will be the most disruptive in the history of IT. New systems that begin to fulfill some of the earliest visions for what information technologies might accomplish, doing what we thought only people could do and machines could not, are now finally emerging. Gartner expects individuals will invest in, control and use their own smart machines to become more successful. Enterprises will similarly invest in smart machines. Consumerization versus central control tensions will not abate in the era of smart-machine-driven disruption. If anything, smart machines will strengthen the forces of consumerization after the first surge of enterprise buying commences.

3-D Printing
Worldwide shipments of 3D printers are expected to grow 75 percent in 2014, followed by a near doubling of unit shipments in 2015. While very expensive "additive manufacturing" devices have been around for 20 years, the market for devices ranging from $50,000 down to $500, with commensurate material and build capabilities, is nascent yet growing rapidly. The consumer market hype has made organizations aware that 3D printing is a real, viable and cost-effective means to reduce costs through improved designs, streamlined prototyping and short-run manufacturing.

More Stories By Elizabeth White

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.
