Neebula Promotes Service-Centric IT Management to the Cloud

Previews SaaS-Optimized Neebula ServiceWatch to automatically discover, map and manage business services in the datacenter

Neebula Systems, a provider of business-level service modeling, management, and automated full-stack discovery and dependency mapping solutions, invites customers to preview the Neebula ServiceWatch solution in the cloud. For the first time, IT managers will be able to use a Software-as-a-Service (SaaS) product to quickly and effectively discover and map the IT resources - hardware and software - that make up a specific business service, eliminating the long, labor-intensive process of installing on-premises software and then manually discovering and mapping IT resources.

Neebula ServiceWatch is a top-down, business-level discovery and dependency mapping product that leverages patented technology to automate the entire service modeling process with no manual intervention. ServiceWatch then uses the resulting service models to help IT administrators manage the availability of business services. Legacy solutions can take months or years to do what Neebula's automated approach accomplishes in days.

Neebula ServiceWatch is now available for a free 30-day preview at http://www.neebula.com/landing/preview-servicewatch/.

Neebula ServiceWatch provides all the benefits of modern SaaS offerings, including self-service configuration, low entry costs, reduced infrastructure investment, increased accessibility, ease of navigation, and straightforward implementation. It starts the discovery process at the entry point to the business service (e.g., a URL, MQ request, or Citrix client), automatically discovers and maps all IT infrastructure components - hardware and software - on which the business service depends, and gives IT administrators a single-pane dashboard view into the health of critical services. This top-down, service-centric approach frees IT managers from needing detailed knowledge of server, storage, network, and application infrastructure, and from the manual, cost-intensive, and often fruitless effort of mapping and maintaining the dependencies between these items.
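To illustrate the general idea, the following minimal Python sketch shows what a top-down walk from a service's entry point to its dependencies might look like. It is not Neebula's implementation; the entry-point URL, component names, the DEPENDS_ON table, and the map_business_service function are all hypothetical stand-ins for what live discovery probes would return.

```python
# Illustrative sketch only: top-down dependency discovery that starts at a
# business service's entry point and walks outward through the components
# it depends on, collecting them into a single service map.
from collections import deque

# Hypothetical, hardcoded discovery data standing in for live probe results.
DEPENDS_ON = {
    "https://orders.example.com": ["web-lb-01"],
    "web-lb-01": ["app-server-01", "app-server-02"],
    "app-server-01": ["orders-db"],
    "app-server-02": ["orders-db", "mq-broker"],
    "orders-db": [],
    "mq-broker": [],
}

def map_business_service(entry_point: str) -> dict[str, list[str]]:
    """Breadth-first walk from the service entry point; returns the dependency map."""
    service_map: dict[str, list[str]] = {}
    queue = deque([entry_point])
    while queue:
        node = queue.popleft()
        if node in service_map:
            continue  # already mapped this component
        deps = DEPENDS_ON.get(node, [])
        service_map[node] = deps
        queue.extend(deps)
    return service_map

if __name__ == "__main__":
    for component, deps in map_business_service("https://orders.example.com").items():
        print(f"{component} -> {deps}")
```

The point of the sketch is the direction of the walk: discovery begins with the service the end-user touches and only then fans out to the infrastructure beneath it, rather than inventorying the whole datacenter and stitching relationships together afterward.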

The next generation of Neebula ServiceWatch is an evolution of the same proven application deployed at global enterprises, Fortune 5000 companies, and government and education customers in Europe and North America, including Amdocs, Bechtel, Ceva, EL AL Airlines, and leading firms in the financial services market. ServiceWatch targets dynamic, growing organizations that want a modern, limited-footprint, turnkey service-centric IT management solution while still demanding enterprise-class capabilities, flexibility, and scalability.

"Neebula ServiceWatch empowers IT administrators by allowing them to focus first on the business service, which is what their end-user consumes and cares about most," said Ariel Gordon, VP Products and co-founder of Neebula. "No more frustrating 'boil the ocean' activities to discover and manually create relationships between IT assets that are immediately out-of-date.  The SaaS-delivered nature of ServiceWatch reinforces Neebula's ability to deliver immediate and actionable results by removing the onus of upfront capital expenditures, complex installations, services-assisted configurations, and the headaches of managing yet another tool on-premise."

IT Management Evolved: Service-Centric IT Management
Neebula ServiceWatch fills a major IT management gap by discovering and matching IT resources, such as hardware and software, with the business services they support. A business service represents what end-users in an IT environment actually use, as opposed to the individual hardware and software that make that service possible. For example, an inventory management system may be made up of myriad applications, databases, servers, and routers, but the person using it sees only what is on the screen - the business service itself. Defining these relationships gives IT managers the ability to make decisions and take actions that decrease time-to-implement, enhance productivity, increase datacenter efficiency, and substantially reduce overall costs.
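As a rough sketch of what "matching resources to a business service" means in practice, the short Python example below models a business service as the set of components it depends on and rolls their health up into one service-level status. The class names, the inventory-management example, and the component names are hypothetical and are not drawn from the ServiceWatch product.

```python
# Illustrative sketch only: a business service modeled as the components it
# depends on, with component health rolled up into a single service status.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    healthy: bool

@dataclass
class BusinessService:
    name: str
    components: list[Component]

    def status(self) -> str:
        # The service is only as healthy as the components it depends on.
        down = [c.name for c in self.components if not c.healthy]
        return "OK" if not down else f"DEGRADED (impacted by: {', '.join(down)})"

# Hypothetical inventory-management service built from discovered components.
inventory = BusinessService("Inventory Management", [
    Component("inventory-app", True),
    Component("inventory-db", False),
    Component("core-router-7", True),
])
print(f"{inventory.name}: {inventory.status()}")
# -> Inventory Management: DEGRADED (impacted by: inventory-db)
```

Viewed this way, a failed database or router is reported in terms of the business service it degrades, which is what the end-user actually experiences.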

The traditional approach to business service modeling leverages the output of discovery tools to manually construct relationships between hardware and software in the datacenter. Creating this model and keeping it up to date is an enormous task requiring extensive manual labor. Datacenters are also becoming more complex through the adoption of virtualization and private, public, and hybrid cloud architectures, which makes manually building and maintaining a constantly changing service map even more difficult.

More Stories By Glenn Rossman

Glenn Rossman has more than 25 years of communications experience at IBM and Hewlett-Packard, along with the startup StorageApps and the agencies Hill & Knowlton and G&A Communications. His experience includes media relations, industry and financial analyst relations, executive communications, intranet and employee communications, and producing sales collateral. In technology, his career spans channel partner communications, data storage technologies, server computers, software, PC and UNIX computers, and industry initiatives in manufacturing, medical, and finance. Before his latest stint in technology, Glenn did business-to-business public relations for the DuPont Company's specialty polymers products and for the largest steel companies in North America in an initiative focused on automakers.


