This series of articles (After Internet) aims to pay a small tribute to these pioneers and their innovations. Through their stories, we'll explore how the first computers, often colossal and limited in capacity, laid the groundwork for the technology we now take for granted. It's a journey into the past to understand the present and appreciate the human ingenuity that brought us here.
Every robot operating today, every AI algorithm processing data, every IoT connection being established, exists thanks to these pioneers who laid the foundations of the digital world.
The Foundation of the Digital Era
In 1971, Intel released the 4004, a 4-bit microprocessor running at 740 kHz and containing 2,300 transistors. It cost around $200 and had less processing power than a modern pocket calculator. Yet this tiny chip, no bigger than a fingernail, would forever alter the computing landscape. It marked the first step of an era that would turn offices full of typewriters into digital workplaces and, eventually, give rise to the artificial brains we know today.
Early business computers were room-filling beasts. The IBM System/360, launched in 1964, required industrial air conditioning and specialized technical staff. Its central processor worked with 32-bit words, but at speeds that would seem glacial today. Memory was measured in kilobytes, not gigabytes, and each byte cost several dollars. Hard drives the size of washing machines stored barely 5 megabytes, scarcely enough for a couple of today's digital photos.
The arrival of the Intel 8008 in 1972 and then the 8080 in 1974 began the democratization of computing. The 8080, an 8-bit processor, ran at 2 MHz and could address up to 64 KB of memory. Those figures seem laughable now, but they represented a quantum leap in accessibility. For the first time, small businesses could afford to own a computer. The Altair 8800, built around the 8080, sold for around $400 as a build-it-yourself kit, and enthusiasts spent entire weekends soldering its components.
The breakthrough came in 1978 with the Intel 8086, a 16-bit processor that established the x86 architecture still dominating personal computers today. It ran at 5 MHz, could address 1 MB of memory, and executed roughly 0.33 million instructions per second. IBM chose the 8088, a variant of this chip with an 8-bit external bus, for its Personal Computer in 1981, a decision that would catapult Intel to global leadership. That base IBM PC, with just 16 KB of RAM and no hard drive, cost $1,565 at the time, equivalent to over $5,000 today.
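Those figures also say something concrete about the chip itself: at 5 MHz and roughly 0.33 million instructions per second, the 8086 averaged about 15 clock cycles per instruction. A minimal back-of-the-envelope sketch in Python, where the MIPS figure and the ~3.4x inflation multiplier from 1981 dollars are rough assumptions rather than exact values:

```python
# Back-of-the-envelope figures for the Intel 8086 and the original IBM PC.
# Assumptions: 5 MHz clock, ~0.33 MIPS throughput, and a rough ~3.4x
# multiplier to convert 1981 dollars to today's (approximate, for illustration).

clock_hz = 5_000_000          # 8086 clock frequency
mips = 0.33                   # quoted instructions per second, in millions
cycles_per_instruction = clock_hz / (mips * 1_000_000)
print(f"Average cycles per instruction: {cycles_per_instruction:.1f}")   # ~15

ibm_pc_price_1981 = 1565      # launch price of the base IBM PC
inflation_multiplier = 3.4    # rough 1981 -> mid-2020s factor (assumption)
print(f"Approximate price today: ${ibm_pc_price_1981 * inflation_multiplier:,.0f}")  # ~$5,300
```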
Hard drives from that era were true marvels of mechanical engineering. The first commercial hard drive, the IBM 350 unit of the 1956 IBM 305 RAMAC system, weighed about a ton and stored 5 MB across fifty 24-inch platters spinning at 1,200 RPM. In the early 1980s, hard drives for personal computers held 10 to 20 MB and cost a few thousand dollars. The characteristic sound of a hard drive spinning up, with its clicks and whirs, was music to the ears of system administrators of the era.
RAM was a precious commodity. In 1981, one megabyte cost around $1,000. Typical computers shipped with 64 KB or 128 KB, and expanding memory was a serious investment. DRAM (Dynamic RAM) chips required constant refreshing and were mounted on large boards that occupied much of the computer's internal space. The arrival of SIMMs (Single In-line Memory Modules) in the 1980s simplified installation, but prices remained prohibitive for most users.
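To put that $1,000-per-megabyte figure in perspective, here is a quick sketch of what a present-day memory configuration would have cost at 1981 prices; the 16 GB baseline is just an illustrative modern laptop configuration, not a figure from the era:

```python
# What today's RAM would have cost at 1981 prices (~$1,000 per megabyte).
# The 16 GB configuration is an arbitrary modern baseline for illustration.

price_per_mb_1981 = 1_000          # dollars per megabyte in 1981
modern_ram_gb = 16                 # a typical laptop configuration today
modern_ram_mb = modern_ram_gb * 1024

cost_at_1981_prices = modern_ram_mb * price_per_mb_1981
print(f"16 GB of RAM at 1981 prices: ${cost_at_1981_prices:,}")  # $16,384,000
```

In other words, the memory in an ordinary laptop today would have cost roughly as much as an office building.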
The leap to the Intel 80286 in 1982 brought protected mode, enabling hardware-supported multitasking and addressing of up to 16 MB of memory. It ran at 6 to 12 MHz and could keep several programs running at once, something previously unthinkable on a personal computer. Companies began to see the real potential of distributed computing, where each employee could have their own workstation instead of sharing terminals connected to a central mainframe.
The late 1980s and early 1990s were dominated by the Intel 80386 (1985) and 80486 (1989), 32-bit processors running at 16 to 100 MHz. These chips could manage gigabytes of virtual memory and run more advanced operating environments such as Windows 3.1 and the first Unix systems for the PC. Hard drives reached capacities of hundreds of megabytes, and RAM became more affordable, though still expensive. A complete computer cost between $3,000 and $5,000, but it could already run complex spreadsheets, advanced word processors, and the first graphic design programs.
Intel's first Pentium processor, launched in 1993, marked the beginning of the modern personal computing era. It ran at 60 to 200 MHz, included a much faster floating-point unit, and could execute two instructions per clock cycle. Pentium machines popularized multimedia, enabling CD-ROM playback, digital audio, and the first video on home computers. RAM began to be measured in megabytes instead of kilobytes, and hard drives passed the one-gigabyte mark.
Compared with today, a modern smartphone has more processing power than a room full of 1980s computers. A current processor executes billions of instructions per second, works with gigabytes of RAM, and offers storage measured in hundreds of gigabytes or even terabytes. Transistor counts have gone from 2,300 in the 4004 to over 50 billion in modern processors. Moore's Law, which predicted the doubling of transistors every two years, was followed almost religiously for decades, taking us from those modest beginnings to machines capable of running complex artificial intelligence.
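The numbers in that comparison line up surprisingly well with Moore's prediction: going from 2,300 transistors to 50 billion takes about 24 doublings, which at one doubling every two years spans roughly half a century, almost exactly the gap between the 4004 in 1971 and today's chips. A small sketch of that arithmetic, using only the figures quoted above:

```python
import math

# How many doublings separate the Intel 4004 from a modern ~50-billion-transistor
# chip, and how long Moore's Law (one doubling every ~2 years) says that should take.

transistors_4004 = 2_300
transistors_modern = 50_000_000_000
years_per_doubling = 2

doublings = math.log2(transistors_modern / transistors_4004)
print(f"Doublings: {doublings:.1f}")                                     # ~24.4
print(f"Implied span: {doublings * years_per_doubling:.0f} years from 1971")  # ~49 years
```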
This evolution of basic hardware laid the groundwork for everything that followed: networks, the internet, and finally, the artificial intelligence and robotics that define current technology. Without those pioneers who soldered the first chips and programmed the first operating systems, there would be no machine learning algorithms processing petabytes of data or robots that will automate the future.