The Evolution of Computer Hardware

Where does it take us?

On a cutting-edge PC in 1990, the processor, the heart of any computer, would run at a paltry 33 megahertz.

The first Pentium processor from Intel appeared in 1993, performing calculations at a then-quick clock speed of 66 MHz. The race to build faster, more capable processors continued unabated, with milestones such as 1 GHz and 3 GHz reached along the way. Today, the emphasis is on building multiple processing cores in a single chip, with Intel's second-generation Core i7 offering four cores running at 3.4 GHz.

For memory, that 1990 computer would probably have relied on a mere 4 megabytes of RAM to run Windows 3.0. With each new release of Windows, RAM requirements increased. Windows 98 required 16MB, while Windows XP demanded 64MB of RAM, and for best results, Microsoft recommended 128MB. Windows 7 needs at least 1 gigabyte of RAM for the 32-bit version, and 2GB for the 64-bit operating system.

By the standards of the early 1990s, a 200MB hard drive was considered ample. Today, a typical digital camera user can fill that much space in a few minutes' worth of shooting. By 1996, 6GB drives were available, although it was common to hear people wonder how they would ever fill that much space. Cheap, tiny USB drives now offer more storage than hard drives of the mid-1990s.
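As a rough back-of-envelope sketch (the per-photo size and shooting pace below are assumptions chosen for illustration, not figures from this article), the arithmetic behind that claim looks like this:

# Back-of-envelope sketch: how fast photos fill a 200MB drive (Python).
# The per-photo size and shooting pace are assumed values for illustration.
DRIVE_MB = 200
PHOTO_MB = 5          # assumed size of a typical compressed photo
SECONDS_PER_SHOT = 4  # assumed casual shooting pace

photos = DRIVE_MB // PHOTO_MB
minutes = photos * SECONDS_PER_SHOT / 60
print(f"{photos} photos fill the drive in about {minutes:.1f} minutes")
# Output: 40 photos fill the drive in about 2.7 minutes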

The first 750GB hard drive appeared in 2006, and today consumers can buy 3-terabyte drives.

The 1990 PC's bulky, beige 14-inch CRT monitor offered a maximum resolution of 1,024-by-768 pixels, compared to today's typical 1,920-by-1,080 resolution, with greater color depth.
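To put those resolutions in perspective, a quick calculation shows the modern display drives well over twice as many pixels (the resolutions come from the comparison above; the arithmetic is simply spelled out here):

# Compare the total pixel counts of the two resolutions above (Python).
old_pixels = 1024 * 768    # 786,432 pixels on the 1990 CRT
new_pixels = 1920 * 1080   # 2,073,600 pixels on a modern full-HD panel
print(f"{new_pixels / old_pixels:.2f}x more pixels")
# Output: 2.64x more pixels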

For years, most home users were limited to CRT monitors of 19 inches or less due to the size and expense of larger displays. In the past five years or so, LCDs have taken over the display market, dropping rapidly in price even as the displays grew larger. Today, many users employ thin, bright 22- or 23-inch LCD screens.

It wasn't until the mid-1990s that 3-D graphics entered the mainstream computer market, the first step toward the increasingly photo-realistic images of today's computer games. When dedicated 3-D graphics cards first appeared, 16MB of video memory was high-end. Today's video cards contain their own graphics processing units (GPUs) and draw on as much as 2GB of video memory. Instead of the single-display support of years past, most video cards can drive at least two monitors, and many offer HDMI connections to monitors and HDTVs.

For portable storage access, the 1990 PC relied on its floppy drive, which accepted 1.44MB diskettes. By 1994, computer makers were touting the arrival of CD-ROM drives and "multimedia PCs."

Today, few computer users bother installing a floppy drive, instead tapping the vastly greater capacity and usefulness of portable storage such as SD cards, external hard drives, USB "thumb" drives, and of course, DVD-RW drives, which have been a PC staple since the late 1990s.
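For a sense of that gap (the 16GB thumb-drive capacity below is an assumed common size, not a figure from this article), a single inexpensive USB drive holds the contents of thousands of diskettes:

# Rough ratio of an assumed 16GB USB thumb drive to a 1.44MB floppy (Python).
FLOPPY_MB = 1.44
THUMB_GB = 16               # assumed common capacity, not from the article
thumb_mb = THUMB_GB * 1024
print(f"~{thumb_mb / FLOPPY_MB:,.0f} floppies per thumb drive")
# Output: ~11,378 floppies per thumb drive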

Computer cases, too, have evolved from the days of the plain beige box. Today, cases come in a variety of colors and eye-catching designs, many offering tool-less access that makes swapping out a hard drive or optical drive a quick and painless process. Case ventilation, a key factor in keeping a computer running smoothly, has also come a long way, with multiple fans of various sizes placed around the case to provide maximum air flow.

While the keyboard and mouse remain common input devices, they have become far more versatile. Wireless keyboards and mice are preferred by many users. Solar-powered keyboards are available, and the mouse ball has given way to optical and laser mice that are far more precise. Touch screens are also making their way into the PC market.

More Stories By Anne Lee

Anne Lee is a freelance technology journalist, a wife and a mother of two.
