The Evolution of Computer Hardware

Where does it take us?

On a cutting-edge PC in 1990, the processor, the heart of any computer, would run at a paltry 33 megahertz.

The first Pentium processor from Intel appeared in 1993, performing calculations at a then-quick clock speed of 66 MHz. The race to build faster, more capable processors continued unabated, with milestones such as 1 GHz and 3 GHz reached along the way. Today, the emphasis is on building multiple processing cores in a single chip, with Intel's second-generation Core i7 offering four cores running at 3.4 GHz.
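As a rough illustration of that progression, the quick calculation below compares the raw clock rates quoted above. It deliberately ignores per-clock efficiency improvements in modern processors, so it understates the real performance gap:

```python
# Rough comparison of the clock speeds mentioned above.
# Raw clock rate ignores per-cycle efficiency gains, so this
# understates the true performance difference.
mhz_1990 = 33        # cutting-edge PC, 1990
mhz_pentium = 66     # first Pentium, 1993
ghz_core_i7 = 3.4    # second-generation Core i7
cores = 4

single_core_factor = (ghz_core_i7 * 1000) / mhz_1990
print(f"Clock speed alone: about {single_core_factor:.0f}x the 1990 rate")
print(f"Across {cores} cores: roughly {single_core_factor * cores:.0f}x the raw cycles per second")
```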

For memory, that 1990 computer would probably have relied on a mere 4 megabytes of RAM to run Windows 3.0. With each new release of Windows, RAM requirements increased. Windows 98 required 16MB, while Windows XP demanded 64MB of RAM, and for best results, Microsoft recommended 128MB. Windows 7 needs at least 1 gigabyte of RAM for the 32-bit version, and 2GB for the 64-bit operating system.
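Laid out side by side, those baseline requirements tell the story at a glance. This short sketch simply tabulates the figures quoted above and computes the overall growth:

```python
# Minimum RAM requirements quoted above, by Windows release (MB).
ram_mb = {
    "Windows 3.0 (1990)": 4,
    "Windows 98": 16,
    "Windows XP": 64,
    "Windows 7, 32-bit": 1024,
    "Windows 7, 64-bit": 2048,
}
for version, mb in ram_mb.items():
    print(f"{version}: {mb} MB")

growth = ram_mb["Windows 7, 64-bit"] / ram_mb["Windows 3.0 (1990)"]
print(f"Baseline requirement grew about {growth:.0f}x in two decades")  # 512x
```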

By the standards of the early 1990s, a 200MB hard drive was considered ample. Today, a typical digital camera user can fill that much space in a few minutes' worth of shooting. By 1996, 6GB drives were available, although it was common to hear people wonder how they would ever fill that much space. Cheap, tiny USB drives now offer more storage than hard drives of the mid-1990s.
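To see how quickly modern photos swallow a 200MB drive, here is a back-of-the-envelope sketch. The 5MB-per-photo figure is an illustrative assumption; actual file sizes vary widely by camera and settings:

```python
# How quickly a modern camera fills an early-1990s-sized drive.
drive_mb = 200   # an "ample" early-1990s hard drive
photo_mb = 5     # assumed average JPEG size (illustrative only)

photos = drive_mb // photo_mb
print(f"A {drive_mb}MB drive holds about {photos} photos")  # ~40 shots
```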

The first 750GB hard drive appeared in 2006, and today consumers can buy 3-terabyte drives.

The 1990 PC's bulky, beige 14-inch CRT monitor offered a maximum resolution of 1,024-by-768 pixels, compared to today's typical 1,920-by-1,080 resolution, with greater color depth.
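A quick pixel count makes the comparison concrete:

```python
# Pixel counts for the two resolutions mentioned above.
old_w, old_h = 1024, 768     # typical 1990 CRT maximum
new_w, new_h = 1920, 1080    # typical modern display

old_px = old_w * old_h
new_px = new_w * new_h
print(f"1990: {old_px:,} pixels")    # 786,432
print(f"Today: {new_px:,} pixels")   # 2,073,600
print(f"Roughly {new_px / old_px:.1f}x the pixels")  # ~2.6x
```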

For years, most home users were limited to CRT monitors of 19 inches or less due to the size and expense of larger displays. In the past five years or so, LCDs have taken over the display market, dropping rapidly in price even as screens grew larger. Today, many users employ thin, bright 22- or 23-inch LCD screens.

It wasn't until the mid-1990s that 3-D graphics entered the mainstream computer market, the first step toward the increasingly photo-realistic images of today's computer games. When 3-D graphics cards appeared, 16MB of dedicated video memory was high-end. Today's video cards contain their own graphics processing units and can draw on up to 2GB of video memory. Instead of the single-display support of years past, most video cards can support at least two monitors, and many offer HDMI connections to monitors and HDTVs.

For portable storage access, the 1990 PC relied on its floppy drive, which accepted 1.44MB diskettes. By 1994, computer makers were touting the arrival of CD-ROM drives and "multimedia PCs."

Today, few computer users bother installing a floppy drive, instead tapping the vastly greater capacity and usefulness of portable storage such as SD cards, external hard drives, USB "thumb" drives, and of course, DVD-RW drives, which have been a PC staple since the late 1990s.

Computer cases, too, have evolved from the days of the plain beige box. Today, cases come in a variety of colors and eye-catching designs, many offering tool-less access that makes swapping out a hard drive or optical drive a quick and painless process. Case ventilation, a key factor in keeping a computer running smoothly, has also come a long way, with multiple fans of various sizes placed around the case to provide maximum air flow.

While the keyboard and mouse remain common input devices, they have become far more versatile. Wireless keyboards and mice are preferred by many users. Solar-powered keyboards are available, and the mouse ball has given way to optical and laser mice that are far more precise. Touch screens are also making their way into the PC market.

More Stories By Anne Lee

Anne Lee is a freelance technology journalist, a wife and a mother of two.
