The Lost Stepping Stone to the Cloud

We propose that ITAM is on the cloud radar and in direct view straight ahead

IT Asset Management, or ITAM if you prefer, is the lost link to the cloud. There, now we've said it; do you feel better?

Before we can decide whether this is true, we need to understand what ITAM really is, of course.

Sometimes called EAM (Enterprise Asset Management), asset management is a broad term for the practice of auditing, and subsequently overseeing, the full inventory of IT assets inside a company.

Encompassing hardware, but these days predominantly focused on software, asset management allows IT managers to improve cost control, schedule maintenance, manage upgrades and look downstream toward decommissioning and replacement.

This last element of ITAM, i.e., ‘decommissioning and replacement’, probably (more than any other aspect) comes from the traditional view of asset management, where software, servers, desktops and mobile devices might be retired once they can be said to be obsolete.
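To make the lifecycle idea concrete, here is a minimal sketch, in Python, of how an ITAM tool might track an asset from procurement through to decommissioning. The state names and transition rules are illustrative assumptions, not a model the article prescribes.

```python
from enum import Enum, auto


class AssetState(Enum):
    """Hypothetical ITAM lifecycle states; real tools define their own."""
    PROCURED = auto()
    DEPLOYED = auto()
    IN_MAINTENANCE = auto()
    UPGRADED = auto()
    DECOMMISSIONED = auto()


# Which transitions this (illustrative) lifecycle allows.
ALLOWED = {
    AssetState.PROCURED: {AssetState.DEPLOYED},
    AssetState.DEPLOYED: {AssetState.IN_MAINTENANCE,
                          AssetState.UPGRADED,
                          AssetState.DECOMMISSIONED},
    AssetState.IN_MAINTENANCE: {AssetState.DEPLOYED,
                                AssetState.DECOMMISSIONED},
    AssetState.UPGRADED: {AssetState.DEPLOYED},
    AssetState.DECOMMISSIONED: set(),
}


class Asset:
    def __init__(self, tag: str):
        self.tag = tag
        self.state = AssetState.PROCURED

    def transition(self, new_state: AssetState) -> None:
        if new_state not in ALLOWED[self.state]:
            raise ValueError(
                f"{self.tag}: {self.state.name} -> {new_state.name} not allowed")
        self.state = new_state


server = Asset("srv-0042")
server.transition(AssetState.DEPLOYED)
server.transition(AssetState.DECOMMISSIONED)  # retirement: the traditional end state
```

The point of the traditional model is that decommissioned is a terminal state; the next section looks at what happens when the cloud blurs that.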

From Retirement, to Upgrade
In the new service-based world of computing typified by the cloud model, we move to a point where retirement becomes a less tangible, less visible process, as services from our hosting provider of choice are simply upgraded. At the user end, there is often no visible difference (at either the datacenter or the desktop) apart from an improvement in performance.

But asset management goes beyond simple questions of system upgrade. With ITAM we can analyze the productive contribution made by every element of our IT stack and feed it into a higher-level business calculation. Inside this calculation we can correlate and cross-correlate every IT asset against a financial model that encompasses contracts, profitability and trading figures right across the business.
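As a rough illustration of that cross-correlation (a minimal Python sketch; the field names and figures are hypothetical, since the article prescribes no particular model), an ITAM calculation might join an asset register against contract and revenue data to estimate each asset's net contribution:

```python
# Toy asset register and financial model; all names and figures are illustrative.
assets = [
    {"tag": "srv-0042", "annual_cost": 12_000, "contract": "C-101"},
    {"tag": "db-0007",  "annual_cost": 30_000, "contract": "C-102"},
]

# Revenue attributed to the workloads each contract supports (hypothetical).
contract_revenue = {"C-101": 48_000, "C-102": 25_000}

for asset in assets:
    revenue = contract_revenue[asset["contract"]]
    contribution = revenue - asset["annual_cost"]
    print(f'{asset["tag"]}: net contribution ${contribution:,} per year')
    # db-0007 comes out negative: a candidate for upgrade, renegotiation
    # or decommissioning in the ITAM review.
```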

...and Then Comes the Cloud
Analyst firm Gartner held its second IT Financial, Procurement & Asset Management Summit 2012 in London this October to discuss the impact of ITAM in light of the convergence of cloud, virtualization and bring-your-own-device (BYOD) technologies.

So the connection starts to become clear. A migration to the cloud should be undertaken to improve computing flexibility and agility, yes, but it should also be done with a view to optimizing costs and moving to a new virtualized computing framework where buying power is also improved.

Put simply, we have a new way of buying our IT assets in the form of services, so this means that we also have a new responsibility for managing those assets. In this scenario, we need ITAM to get us to the cloud, we need ITAM to manage us in the cloud and we need ITAM to keep us productively and profitably housed within the cloud as well.

What the Industry Says
Quoted on the dedicated IT asset management website The ITAM Review, Jason Keogh, CTO and founder of iQuate, argues that asset management and the cloud must now focus on each other if we are to negotiate this missing stepping stone to virtual computing efficiency. Keogh describes ITAM and its place upon the stepping stones to cloud migration:

"A proper understanding of the risks and costs involved in moving expensive, datacenter and server [level] software into shared services and cloud-based infrastructures will save large enterprises millions of dollars, while reducing risk. If you don't agree strongly with that sentence, let me invert it for you: migrating expensive, server software to shared services and cloud-based infrastructure without a proper understanding of the risks involved will cost enterprises millions of dollars. Do you agree now?"

Cloud vendors and hosting providers largely concur, and so advocate a diligent approach to asset management analysis to oversee (and help manage on an ongoing basis) the movement to a services-based computing structure.

"The ability to locate, audit, quantify and qualify the total scope of any firm's IT stack is a critical part of any progressive cloud adoption program," says Garry Prior, senior product manager at Rackspace.

"When firms have a clear understanding what it takes to transition their IT assets to the cloud and move to a utility billing model, there opens up a real opportunity to provision only the resources needed and avoid the future expense of depreciating or upgrading IT hardware. Adapting IT management for the cloud is central to our customers' deployments of virtualized hosted solutions and we welcome the burgeoning and strengthening of offerings in this sector."

Where Can I Buy Cloud ITAM?
Cloud ITAM offerings come in many shapes and sizes, with Gartner defining many of these solutions as coming from the so-called ‘megavendors’. HP, for its part, offers HP Asset Manager software to automate best practices across the IT asset lifecycle and integrate with firms' IT processes.

According to the company's website, "With HP Asset Manager you can implement analytics to optimize the value of IT assets and services. You can improve audit compliance by controlling where assets are and who is using them throughout the life cycle. You can control purchasing and chargeback, and compare the performance of your vendors. HP Asset Manager provides IT service cost information along with the data and governance necessary for all your IT processes to function optimally."

Perhaps the fact that the aforementioned Gartner conference is only in its second year should speak volumes to us in this space. The analyst firm doesn't need much encouragement to pump out its seemingly limitless market predictions. Equally, the firm can get a conference together to discuss emerging trends pretty fast, but we're only in year two!

We propose that ITAM is on the cloud radar and in direct view straight ahead.

•   •   •

This post first appeared on Enterprise CIO Forum.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development, as well as all related aspects of software engineering, project management and technology as a whole.


