Dimension Data Launches New Technology Lifecycle Management Assessment

Holistic View of Technology Assets Enables Cost Containment, Improved Planning and Tighter Security

NEW YORK, Sept. 24 /PRNewswire/ -- Dimension Data, a global IT solutions and services provider, today announced the release of its next-generation technology lifecycle assessment service. The company's new Technology Lifecycle Management Assessment (TLM Assessment) enables organizations to contain costs, tighten security, and achieve greater automation and accuracy in network asset planning.

Dimension Data's TLM Assessment provides organizations with a high level of visibility into their networks. With a growing number of applications being deployed on today's corporate networks, there's an increasingly wide array of technology to be monitored and maintained. By automating asset identification and the evaluation of asset status, organizations can accurately identify basic security, configuration and end-of-life network device issues so they can be proactively addressed. This holistic view gives companies the time, knowledge and insight to plan for upgrades, make informed decisions on prolonging the use of certain assets, and budget for those that need upgrades or spares.
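As a rough illustration of the kind of automated asset evaluation described above, the sketch below flags a single discovered device against end-of-life and vulnerability reference data. The device model, dates and checks are invented purely for illustration and do not represent Dimension Data's actual tooling.

    # Illustrative sketch only: the reference data and checks are hypothetical,
    # not Dimension Data's tooling.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Device:
        hostname: str
        model: str
        os_version: str
        issues: list = field(default_factory=list)

    # Hypothetical reference data an automated assessment might maintain.
    END_OF_LIFE = {"switch-model-a": date(2009, 1, 31)}   # model -> end-of-support date
    KNOWN_VULNERABLE = {"12.1"}                           # OS versions with published vulnerabilities

    def evaluate(device, today):
        """Flag basic end-of-life and security issues for one discovered device."""
        eol = END_OF_LIFE.get(device.model)
        if eol and eol <= today:
            device.issues.append(f"end-of-support since {eol}")
        if device.os_version in KNOWN_VULNERABLE:
            device.issues.append(f"known vulnerabilities in OS version {device.os_version}")
        return device

    switch = evaluate(Device("edge-sw-01", "switch-model-a", "12.1"), date(2009, 9, 24))
    print(switch.hostname, switch.issues)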

In addition to cost savings, Dimension Data's TLM Assessment can also help companies reap the "green" benefits associated with improved network asset planning. With a holistic view of technology assets, organizations can determine which equipment to replace, or which technologies, processes and practices to rationalize in order to improve their environmental impact. An accurate inventory of devices empowers organizations to reduce their carbon footprint without incurring the cost of new hardware or software.

As reported in Dimension Data's 2009 Network Barometer Report, most organizations currently have multiple network vulnerabilities putting them at risk:

  • 73% of networking devices are running with known security vulnerabilities. This exposes a business to both external and internal security attacks and breaches, and could seriously jeopardize an organization's ability to maintain regulatory compliance.

  • Network devices carry an average of 30 configuration issues each, despite widely published standards and recommendations designed to safeguard against these problems. The financial services sector has the highest average, at 36 configuration issues per device.

  • Almost half of all network devices have entered the obsolescence cycle, putting organizations at risk for extended downtime and unplanned, forced expenditure to regain business continuity. Some industry analysts calculate that system downtime can cost as much as $42,000 per month in direct IT expenses and up to 3% of an organization's annual revenue.
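To put those downtime figures in rough perspective, the back-of-envelope calculation below applies them to a hypothetical organization; the revenue figure is invented purely for illustration.

    # Back-of-envelope illustration of the downtime cost figures cited above.
    annual_revenue = 250_000_000               # hypothetical $250M annual revenue
    direct_it_cost_per_month = 42_000          # "as much as $42,000 per month in direct IT expenses"
    revenue_at_risk = 0.03 * annual_revenue    # "up to 3% of an organization's annual revenue"

    print(f"Direct IT expense per year: ${direct_it_cost_per_month * 12:,.0f}")   # $504,000
    print(f"Annual revenue at risk:     ${revenue_at_risk:,.0f}")                 # $7,500,000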

"The great irony here is that every one of these problems is avoidable through thoughtful lifecycle management," said Rich Schofield, global business development manager, Network Integration at Dimension Data. "The growth of Web 2.0, Software as a Service (SaaS), video, voice and mobility applications will put even more pressure on corporate networks. To minimize risk, organizations need to conduct regular assessments via an automated service that constantly evolves and matures along with the technologies themselves. Organizations that don't get the help they need now could find they're being held hostage by their networks in the future."

Dimension Data's new TLM Assessment comprises three steps:

  • Discover: Dimension Data creates a list of network assets based on business and technical reviews with key stakeholders as well as onsite electronic discovery, ensuring that relevant information is collected and that lifecycle, security and configuration issues are identified.

  • Assess: Using automated tools, the asset list is analyzed against security, configuration and end-of-life databases. Based on this analysis, Dimension Data creates a technology roadmap, prioritizing and outlining configuration remediations, as well as security and maintenance recommendations.

  • Recommend: Dimension Data specialists and key stakeholders discuss the findings and determine how to act based on risk, cost and strategic factors. An action plan is developed.
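Taken together, the three steps amount to a discover, assess and recommend pipeline. The skeleton below is a minimal sketch of that flow, with invented device names, findings, risk scores and costs; it is not Dimension Data's implementation.

    # Illustrative skeleton of a discover -> assess -> recommend workflow; all data is invented.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        device: str
        issue: str
        risk: int      # 1 (low) .. 10 (high)
        cost: int      # estimated remediation cost in USD

    def discover():
        """Stand-in for electronic discovery plus stakeholder reviews."""
        return ["core-rtr-01", "edge-sw-01", "branch-fw-07"]

    def assess(devices):
        """Stand-in for checking devices against security, configuration and end-of-life data."""
        issue_db = {
            "core-rtr-01": ("known security vulnerability", 9, 12_000),
            "edge-sw-01": ("past end-of-support", 7, 8_000),
            "branch-fw-07": ("configuration drift from standard", 4, 1_500),
        }
        findings = [Finding(d, *issue_db[d]) for d in devices if d in issue_db]
        return sorted(findings, key=lambda f: f.risk, reverse=True)   # roadmap ordered by risk

    def recommend(findings, budget):
        """Select the highest-risk remediations that fit the available budget."""
        plan, spent = [], 0
        for f in findings:
            if spent + f.cost <= budget:
                plan.append(f)
                spent += f.cost
        return plan

    for item in recommend(assess(discover()), budget=15_000):
        print(f"{item.device}: {item.issue} (risk {item.risk}, est. ${item.cost:,})")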

About Dimension Data

Dimension Data (LSE: DDT), a specialist IT services and solutions provider, helps clients plan, build, support and manage their network and IT infrastructures. Dimension Data applies its expertise in networking, security, operating environments, storage and contact center technologies and its unique skills in consulting, integration and managed services to create customized client services. For more information: Call 866-DIDATA-US or visit www.dimensiondata.com/na.

    Media Contacts:
    Lisa Grimes                           Laura Sexton
    Dimension Data                        Davies Murphy Group
    (703) 217-2692                        (781) 418-2417
    [email protected]             [email protected]

SOURCE Dimension Data

Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
