Mainframe: A Resilient Model for the Modern Cloud

The emerging cloud-based model of computing requires systems that can provide fast response times to huge volumes of requests

Technology is moving at a blistering pace. In today's era of data-centric, complex environments, where the lines between business and technology are increasingly blurred, organizations are moving beyond virtualization to cloud computing to meet new challenges and keep up with the pace of change. Critical investments are needed to keep companies competitive, and chief among them is cloud computing. In fact, Gartner expects cloud computing to become the bulk of new IT expenditure by 2016. The bottom line: if you're not already looking at cloud as an essential investment, you're risking your survival in the next era of computing.

The emerging cloud-based model of computing requires systems that can provide very fast response times to huge volumes of requests. Mission-critical services in healthcare, finance, transportation, public utilities, and other industries also require very high levels of availability, security, and other industrial-strength capabilities. Those attributes, qualities, and requirements make the mainframe the ideal platform for mission-critical cloud-based workloads.

Cloud computing is a modern extension of a concept first developed nearly 50 years ago with the mainframe. The spirit behind mainframe-based computing was to serve users in remote locations at the same time, on a pay-as-you-go basis. The mainframe was introduced as the most robust, scalable system ever built, and with continued innovation it has maintained its status as one of the platforms of choice for today's complex workloads, including sophisticated public, private and hybrid cloud computing environments. At its core, the mainframe was designed around three key traits - virtualization, standardization and provisioning. Not coincidentally, these are the foundational requirements for a true cloud implementation.

Most enterprises today started their cloud journey with low-risk applications that have high agility requirements. This approach allows customers to ease into cloud computing, learn and adjust how they manage the cloud, and build the confidence to introduce more demanding applications. These applications tend to use web technologies and architectures that can be scaled on commodity infrastructure using load balancing and service cloning. Batch workloads that fit commodity infrastructure are another popular cloud workload.
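As a rough illustration of that pattern (not from the article, with purely hypothetical names), the sketch below shows "service cloning" behind a simple round-robin load balancer: identical instances are added as demand grows, and requests are spread across them.

```python
# Minimal sketch of service cloning plus load balancing; all names are
# illustrative assumptions, not part of any specific cloud product.
import itertools


class ServicePool:
    """Round-robin load balancer over a set of cloned service instances."""

    def __init__(self, instances):
        self._instances = list(instances)
        self._cycle = itertools.cycle(self._instances)

    def clone(self, instance_factory):
        """Scale out by adding another identical instance."""
        new_instance = instance_factory()
        self._instances.append(new_instance)
        self._cycle = itertools.cycle(self._instances)
        return new_instance

    def handle(self, request):
        """Dispatch a request to the next instance in round-robin order."""
        return next(self._cycle)(request)


if __name__ == "__main__":
    # Two cloned "worker" services behind one balancer, then one more clone
    # added under load.
    make_service = lambda: (lambda req: f"processed {req}")
    pool = ServicePool([make_service(), make_service()])
    pool.clone(make_service)
    print(pool.handle("policy-verification-42"))  # -> processed policy-verification-42
```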

For private, public or hybrid clouds, the mainframe can meet the following key requirements:

  • Scalability - users need to scale quickly and efficiently, both up and down, with complete confidence and no loss of availability (see the sketch after this list).
  • Reliability - a cloud computing environment that is always accessible, with guaranteed application performance, little to no downtime, and provisions for rapid recovery from failure.
  • Multi-Tenancy - multiple users can access software applications on the same system, concurrently and securely. This is critical for cloud service providers hosting many organizations on a single cloud infrastructure, and for enterprises deploying private clouds that must host multiple companies, for example to manage growth through acquisitions.
  • Cost Efficiency - consolidating a distributed x86 cloud environment onto one mainframe creates a simplified, more efficient environment, with reductions in floor space and power requirements and a higher return on investment over the life of the platform.
  • Security - the mainframe offers unmatched system security, with ensured isolation and protection of each virtual server environment.
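
As referenced in the Scalability item above, here is a minimal sketch of the kind of utilization-based rule an autoscaler might apply when scaling a cloud workload up and down without taking it offline. The thresholds and limits are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative autoscaling rule: double capacity under pressure, halve it when
# idle, and always stay within configured bounds.
def desired_instances(current: int, cpu_utilization: float,
                      minimum: int = 2, maximum: int = 64) -> int:
    """Return how many instances to run next, given average CPU utilization."""
    if cpu_utilization > 0.75:      # under pressure: scale out
        target = current * 2
    elif cpu_utilization < 0.25:    # idle capacity: scale in
        target = max(current // 2, 1)
    else:                           # within the comfort band: hold steady
        target = current
    return min(max(target, minimum), maximum)


if __name__ == "__main__":
    print(desired_instances(8, 0.90))  # 16 - burst of demand
    print(desired_instances(8, 0.10))  # 4  - quiet period
    print(desired_instances(8, 0.50))  # 8  - no change
```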

Companies across various industries are gaining these advantages and efficiencies by consolidating cloud environments on a mainframe. For example:

Nationwide Insurance consolidated onto a mainframe-based private cloud that replaced thousands of standalone servers supporting daily business activities such as policy verification, claims processing, and customer quotations, saving 80% in energy and facility costs. The consolidation saved the company roughly $15 million over three years and will continue to keep costs down in the future. The solution also gives Nationwide the capacity, processing speed and reliability to increase the pace of innovation across its products and channels as it continues to grow.

By leveraging the cloud capabilities offered by the mainframe, Marist College was able to extend its business analytics technology to its academic community, including researchers and students, while extracting even more value from its IT investments. By delivering its analytics technology via the cloud, the college has been able to expose analytics tools to a wide variety of programs - technical disciplines as well as business, liberal arts and communications - so students learn how to apply them to their fields of study. Marist has also realized significant financial benefits, saving roughly $350,000 by using the cloud to support the college's ERP system.

The mainframe, with its shared platform, integration, and secure design, combined with continuous innovation, has enabled organizations to stay ahead of changing market dynamics with a platform that embodies efficiency, economics and agility - a resilient model for today's cloud environment.

More Stories By Jose Castano

Jose Castano is the Director for the System z Growth Initiatives in IBM’s Systems & Technology Group. He has over 25 years of experience within IBM and has held multiple key positions in System z during this tenure.

Jose has worldwide responsibility for driving new workloads on System z, including Cloud, Analytics, Mobile, and Security. He sets the business and technical strategy and direction for the System z platform, and drives coordination and collaboration across the System z ecosystem - marketing, sales, business partners, consultants, and, most importantly, customers - leading the platform through an evolution that maintains leadership and meets customer and industry requirements.

Jose's team is made up of workload and industry architects (who focus on business trends and market and industry requirements, and develop solutions/offerings), offering managers (who are responsible for the go-to-market for those solutions/offerings), and ISV managers (who work with the ecosystem to support new and existing workloads). Together, these teams are responsible for researching, designing, building and maintaining the new workload strategy and its roadmap for IBM System z, driving the plans for the next 3-5 years.
