By Adrian Bridgwater
November 3, 2012 01:00 PM EDT
IT Asset Management, or ITAM if you prefer, is the lost link to the cloud. There, now we've said it, do you feel better?
Before we can decide whether this is true, we need to understand what ITAM really is, of course.
Sometimes called EAM (Enterprise Asset Management), asset management is a broad term describing the process of auditing, and subsequently overseeing, the full inventory of IT assets inside a company.
Encompassing hardware, but predominantly focused (these days) on software, asset management allows IT managers to improve cost control, perform maintenance scheduling, manage upgrades and look downstream towards decommissioning and replacement.
This last element of ITAM, i.e., 'decommissioning and replacement', probably comes, more than any other aspect, from the traditional view of asset management, in which software, servers, desktops and mobile devices might be retired once they can be said to be obsolete.
From Retirement, to Upgrade
In the new service-based world of computing typified by the cloud model, we move to a point where retirement becomes a less tangible or visible process: services from our hosting provider of choice are simply upgraded. At the user end there is often no visible difference (at either the datacenter or the desktop) apart from an improvement in performance.
But asset management goes beyond simple questions of system upgrade. With ITAM we are able to analyze the productive contribution made by every element of our IT stack to a higher-level business algorithm. Inside this calculation we can correlate and cross-correlate every IT asset against a financial model that encompasses contracts, profitability and trading figures right across the business.
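As an illustrative sketch only, the kind of cost-versus-contribution correlation described above might look something like the following. The asset names, figures and the single-ratio "business algorithm" are all hypothetical; a real ITAM tool would draw on contracts, depreciation schedules and trading data.

```python
# Hypothetical sketch: correlating IT assets against a simple financial model.
# All asset records and figures below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    annual_cost: float        # licence, maintenance and hosting spend
    revenue_supported: float  # revenue attributed to workloads on this asset

def contribution_ratio(asset: Asset) -> float:
    """Revenue supported per dollar of annual cost."""
    return asset.revenue_supported / asset.annual_cost

assets = [
    Asset("CRM suite", annual_cost=120_000, revenue_supported=900_000),
    Asset("Legacy file server", annual_cost=40_000, revenue_supported=15_000),
]

# Rank assets by financial contribution; anything returning less than a
# dollar per dollar spent is a candidate for decommissioning or for
# migration to a cheaper cloud-hosted service.
for a in sorted(assets, key=contribution_ratio, reverse=True):
    flag = "review" if contribution_ratio(a) < 1.0 else "keep"
    print(f"{a.name}: {contribution_ratio(a):.2f} ({flag})")
```

The point of such a model is not the arithmetic, which is trivial, but that every asset in the stack appears in one place where financial and operational data can be cross-correlated.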
...and Then Comes the Cloud
Analyst firm Gartner held its second IT Financial, Procurement & Asset Management Summit 2012 in London this October to discuss the impact of ITAM in light of the convergence of cloud, virtualization and bring your own device (BYOD) technologies.
So the connection starts to become clear. A migration to the cloud should be undertaken to improve computing flexibility and agility, yes, but it should also be done with a view to optimizing costs and moving to a new virtualized computing framework in which buying power is also improved.
Put simply, we have a new way of buying our IT assets in the form of services, so this means that we also have a new responsibility for managing those assets. In this scenario, we need ITAM to get us to the cloud, we need ITAM to manage us in the cloud and we need ITAM to keep us productively and profitably housed within the cloud as well.
What the Industry Says
Quoted on the dedicated IT asset management website The ITAM Review, Jason Keogh, CTO and founder of iQuate, argues that asset management and the cloud must now be considered together if we are to negotiate this missing stepping stone to virtual computing efficiency. Keogh describes ITAM's place on the stepping stones to cloud migration:
"A proper understanding of the risks and costs involved in moving expensive, datacenter and server [level] software into shared services and cloud-based infrastructures will save large enterprises millions of dollars, while reducing risk. If you don't agree strongly with that sentence, let me invert it for you: migrating expensive, server software to shared services and cloud-based infrastructure without a proper understanding of the risks involved will cost enterprises millions of dollars. Do you agree now?"
Cloud vendors and/or hosting providers largely concur and so advocate a diligent approach to asset management analysis to oversee (and help manage on an onward basis) the movement to a services-based computing structure.
"The ability to locate, audit, quantify and qualify the total scope of any firm's IT stack is a critical part of any progressive cloud adoption program," says Garry Prior, senior product manager at Rackspace.
"When firms have a clear understanding what it takes to transition their IT assets to the cloud and move to a utility billing model, there opens up a real opportunity to provision only the resources needed and avoid the future expense of depreciating or upgrading IT hardware. Adapting IT management for the cloud is central to our customers' deployments of virtualized hosted solutions and we welcome the burgeoning and strengthening of offerings in this sector."
Where Can I Buy Cloud ITAM?
Cloud ITAM offerings come in many shapes and sizes, with Gartner defining many of these solutions as coming from the so-called 'megavendors.' HP, for its part, offers HP Asset Manager software to automate best practices across the IT asset lifecycle and integrate with firms' IT processes.
According to the company's website, "With HP Asset Manager you can implement analytics to optimize the value of IT assets and services. You can improve audit compliance by controlling where assets are and who is using them throughout the life cycle. You can control purchasing and chargeback, and compare the performance of your vendors. HP Asset Manager provides IT service cost information along with the data and governance necessary for all your IT processes to function optimally."
Perhaps the fact that the aforementioned Gartner conference is now only in its second year should speak volumes to us in this space. The analyst firm doesn't need much encouragement to pump out its seemingly limitless market predictions, and it can get a conference together to discuss emerging trends pretty fast - yet we're only in year two.
We propose that ITAM is on the cloud radar and in direct view straight ahead.
• • •
This post first appeared on Enterprise CIO Forum.