The microprocessor, also known as the CPU, is the brain behind everything from the humble calculator to the smartwatch. What we now know as the CPU was born surprisingly recently, in the 1970s. Before the transistor, computers were built from large vacuum tubes and tended by men in white coats. It wasn't until the first mainframe computers were built that the story of the modern CPU began.
Origin of The CPU
When the 1970s began, computers were monster machines kept hidden in large air-conditioned rooms and attended by scientists and technicians. A mainframe's Central Processing Unit (CPU) typically occupied a steel cabinet bigger than a refrigerator, packed full of circuit boards carrying individual transistors. At the time, the idea that a CPU could ever be reduced to a chip the size of your fingernail sounded like science fiction.
The idea of a stored-program digital computer goes back as far as the 1940s, and it was conceived not by engineers alone but by mathematicians, including John von Neumann and John Mauchly, together with electrical engineer J. Presper Eckert. Until then, computers were programmed by rewiring their circuits to perform one calculation, then rewiring them again to perform another.
Memory gets a CPU
The insight was that if a machine were given a memory, a program of instructions could be stored once and then performed over and over again without rewiring. Something was needed to fetch those instructions from memory and carry them out, and that device was the CPU. Because the memory and the input and output devices all had to connect to it, it sat at the center of operations, which is where the first part of its name originated. And because it executed instructions and performed calculations, it was named the Processing Unit.
Inside the CPU sits a program counter that points to the next instruction due to be executed. The CPU runs in a cycle: it fetches the instruction the program counter points to from memory, retrieves any data the instruction needs, performs the necessary calculation, and stores the result back into memory. The program counter is then incremented to point at the next instruction, and the cycle starts over.
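The fetch-execute cycle described above can be sketched in a few lines of code. The tiny instruction set here (LOAD, ADD, STORE) is hypothetical, invented purely for illustration; real CPUs decode binary opcodes rather than tuples.

```python
# A minimal sketch of the fetch-execute cycle, with a made-up
# three-instruction machine. "memory" holds the program; the
# program counter (pc) points at the next instruction.

memory = [
    ("LOAD", 5),     # load the constant 5 into the accumulator
    ("ADD", 3),      # add 3 to the accumulator
    ("STORE", None), # write the accumulator to the output cell
]
output = [None]      # stands in for a result location in memory

def run(memory):
    pc = 0           # program counter: index of the next instruction
    acc = 0          # accumulator: holds intermediate results
    while pc < len(memory):
        op, arg = memory[pc]    # fetch the instruction pc points at
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "STORE":
            output[0] = acc     # store the result back into memory
        pc += 1                 # increment pc; repeat the cycle
    return acc

run(memory)
print(output[0])  # 8
```

Each pass through the loop is one turn of the cycle: fetch, execute, increment the program counter.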
Birth of the Microprocessor
While big-iron mainframes were still all the rage in 1971, a small Silicon Valley company called Intel was contracted to design an integrated circuit for a business calculator manufactured by Busicom. Intel took the unique approach of creating a chip that could be programmed to perform almost any calculation, instead of hardwiring the calculations as the other calculator chips on the market did.
Designing a hardwired chip had been an expensive and time-consuming process, and the flexible microprocessor that replaced it became known as the 4004. Its instructions were stored in a separate Read-Only Memory (ROM) chip. This concept of the general-purpose CPU grew up to become the heart of every computer and smartphone.
4 Bit Limit
The 4004 CPU handled its data and instructions in 4-bit chunks. Four bits give you 16 possible values, which was good enough for standard decimal arithmetic on a calculator. The limitation showed up in the other work a stored-program computer has to do: it must also address memory to find its instructions, and it must calculate memory positions to process the branch instructions in the programs it runs.
The 4004 required 640 bytes of memory to handle the calculator functions. That is 5,120 bits, so the 4-bit chip was already straining against its own word size. This constant battle to catch up with a microprocessor's potential is what still drives computer design today.
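The figures above fall straight out of powers of two, and a quick check also shows why a 4-bit word struggles with addressing: reaching any one of 640 memory locations takes a 10-bit address, far wider than the 4-bit values the 4004 computed with.

```python
import math

word = 4
print(2 ** word)          # 16 distinct values in a 4-bit word

mem_bytes = 640
print(mem_bytes * 8)      # 5120 bits of calculator memory

# Addressing 640 locations needs ceil(log2(640)) = 10 bits,
# so memory addresses could not fit in a single 4-bit word.
print(math.ceil(math.log2(mem_bytes)))  # 10
```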
8 Bit Frontier
In 1972 Intel delivered the much-improved 8008, with double the number of bits. It was the first of many microprocessors that would begin the revolution in home computing. The 8008 was limited to 16 kilobytes of address space, but this was not much of a limit in practice, since few people at the time could afford that much RAM.
It took only another two years before Intel rolled out its 8080 microprocessor, which could address a massive 64 kilobytes of memory. Around the same time Motorola entered microprocessor manufacturing and released its 6800, with capabilities similar to the 8080. The 8080 went on to become the core of the serious microcomputers of the day, and Intel's improved 8086/8 later served as the foundation for the IBM PC. The 6800's design lived on in the MOS Technology 6502, a low-cost derivative created by former Motorola engineers, which powered the Apple II personal computer.
16 Bit Revolution
The end of the 1970s brought a new revolution in microprocessor technology. In 1978 Intel delivered its 16-bit 8086, followed in 1979 by the 8088, and these chips heralded the computer's move from techie toy in the garage to fully functional business tool. The great advantage of the 8086/8 was its 1 megabyte of address space, the equivalent of roughly a million bytes. With that capacity the spreadsheet came of age, and large documents could now be held in RAM for fast access and updates, bringing IBM and Intel millions of orders around the globe as business seized the opportunity.
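The 1 megabyte figure comes from the 8086/8's 20-bit physical address. Since its registers were only 16 bits wide, the chip built each address from a 16-bit segment and a 16-bit offset, combined as segment * 16 + offset; the segment and offset values below are arbitrary, chosen only for illustration.

```python
# 20 address lines give 2**20 addressable bytes: 1 megabyte.
addr_bits = 20
print(2 ** addr_bits)         # 1048576 bytes = 1 MB of address space

# The 8086/8 formed that 20-bit address from two 16-bit values:
#   physical = segment * 16 + offset
segment, offset = 0xB800, 0x0010   # arbitrary example values
print(segment * 16 + offset)       # 753680
```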
With microprocessor cores getting ever faster, memory had a hard time keeping up. Due to technical limitations, the large, low-power memory modules could not run as fast as smaller, higher-power RAM chips. To keep CPUs operating at full speed, engineers began to insert a small amount of this fast memory between the large main RAM and the CPU.
The job of this smaller memory was to hold the instructions the CPU executed repeatedly, along with frequently accessed data. It was named cache RAM, and it is what allows the CPU to keep operating at full speed. When the CPU hits an instruction whose data must be fetched from normal RAM, it has to stall while the fetch completes, whereas data already sitting in the cache is available almost immediately. Cache RAM has since been split into two levels, known as L1 and L2.
CPU technology has continued to develop along the same lines as in the 1980s, with ever more transistors squeezed onto each chip. We have long since passed the 32-bit revolution and are now in the 64-bit era, roughly following Moore's Law, the observation that transistor counts double about every two years. The next installment in CPU technology may come through quantum computing: researchers at Google have already demonstrated working quantum processors built from qubits.
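Moore's Law compounds quickly, as a rough back-of-the-envelope run shows. Starting from the 4004's approximately 2,300 transistors in 1971 and doubling every two years lands near 37,000 by 1979, the same order of magnitude as the roughly 29,000 transistors of the 8086; the exercise is illustrative only, since real chips never tracked the curve exactly.

```python
# Doubling the 4004's ~2,300 transistors every two years,
# per Moore's Law, from 1971 to 1979. Illustrative arithmetic only.
transistors, year = 2300, 1971
while year < 1979:
    transistors *= 2
    year += 2
print(year, transistors)  # 1979 36800
```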