Until late last night, I had forgotten that the Central Processing Unit (CPU) had turned 40. However, a rare phone call from my lecturer at the Royal Melbourne Institute of Technology refreshed my memory about Intel Corp's milestone. The CPU came to the fore in 1971, and very few computer users have taken a keen interest in the fact that it is now four decades into existence. There are various types of CPU, and they are the brains behind the likes of traffic lights, calculators, computers, mobile phones and the Internet, to mention but a few. Although mainframe computers existed before the CPU, they were made up of a mass of wires and vacuum tubes, which meant that computers took up entire rooms, consumed tremendous amounts of power and cost an astronomical amount of money to run.
However, all that changed 40 years ago, when Intel Corp (the name is short for Integrated Electronics), then a producer of semiconductor memory chips, was contracted by a company called Busicom to build an integrated circuit for a range of calculators. Instead of making a simple integrated circuit hard-coded for performing calculator functions, Intel's engineers created the first complete CPU on one chip: a multipurpose, programmable processor that could perform a variety of functions depending on the instructions it was given.
Thus, instead of the chip being hard-coded with a limited number of specific functions, a separate read-only memory (ROM) chip contained the instructions that told the CPU what to do. This meant that instead of having to build a whole new chip to add new functions to the calculator, the company could simply flash a new set of instructions into the ROM chip. According to technology history books, the first Intel CPU was called the 4004, and it was a 4-bit processor, which meant it handled data in four-bit chunks, giving it the ability to represent 16 different values.
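The idea is easier to see in a toy sketch. The Python snippet below is a hypothetical illustration, not the 4004's real instruction set: a single general-purpose execution loop whose behaviour is determined entirely by the instruction table ("ROM") it is fed, with 4-bit registers limited to 2^4 = 16 distinct values.

```python
# Toy illustration of an instruction-driven CPU (not the actual 4004 ISA).
# The "hardware" is the fixed loop below; what it computes depends only on
# the instruction table it is given, just as the 4004 depended on its ROM.

MASK_4BIT = 0xF  # a 4-bit register holds 2**4 = 16 distinct values (0..15)

def run(rom, x, y):
    """Execute a list of (opcode, operand) pairs against two 4-bit inputs."""
    acc = 0
    for op, arg in rom:
        if op == "LOAD":   # load one of the two inputs into the accumulator
            acc = (x if arg == 0 else y) & MASK_4BIT
        elif op == "ADD":  # add an input, wrapping around at 4 bits
            acc = (acc + (x if arg == 0 else y)) & MASK_4BIT
        elif op == "SUB":  # subtract an input, wrapping around at 4 bits
            acc = (acc - (x if arg == 0 else y)) & MASK_4BIT
    return acc

# "Flashing" a different ROM turns the same CPU into an adder or a subtractor.
adder_rom      = [("LOAD", 0), ("ADD", 1)]
subtractor_rom = [("LOAD", 0), ("SUB", 1)]

print(run(adder_rom, 7, 5))       # 12
print(run(subtractor_rom, 7, 5))  # 2
```

Swapping the instruction list changes what the "chip" does without touching the execution loop, which is exactly the flexibility that made the general-purpose CPU such a departure from hard-wired calculator circuits.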
The 4004 had a then whopping 2,300 transistors, followed by some 3,500 transistors in its successor, the 8008, the first 8-bit microprocessor. Intel's latest second-generation Core processor has about 1.48 billion transistors, giving more than 350,000 times the performance of the original processor. The 8080, an improved version of the 8008, was the microprocessor that took the CPU from powering calculators to being used in the first consumer microcomputer, the MITS Altair 8800. Interestingly, the high cost of the 8080 was part of what prompted Steve Wozniak and Steve Jobs to opt for the MOS Technology 6502 microprocessor to power the first Apple computer, as it could be had for a fraction of the 8080's price.
By the 1990s, a CPU's performance was commonly expressed by its clock speed, measured in megahertz (MHz) and later gigahertz (GHz). However, while it is true that a microprocessor's clock speed does contribute to performance, many other factors besides clock speed determine how well a CPU really performs. AMD, Intel's long-time rival, developed its range of Athlon microprocessors with a model-naming convention that expressed a notional clock rate the company considered equivalent to higher-clocked Intel Pentium 4 microprocessors.
During the same decade, microprocessor designers were said to have reached a limit as to how far a microprocessor's clock speed could be pushed while keeping power consumption and heat at a manageable level. The media branded the fixation on clock rates the "megahertz myth", because many companies, including Intel, were locked in a battle to produce microprocessors with higher and higher clock speeds. Once the megahertz wars were over, microprocessor designers turned to a new idea: instead of having a single core do more and more work at ever higher speeds, they would put several cores on one chip and share the work among them.
We are now in the midst of a multicore revolution in microprocessors, which started with just two cores in Intel's Core Duo chip and is now moving to four cores, with eight not far behind. Intel has already shown in its labs that a microprocessor with up to 50 cores is possible, which suggests there may be little practical limit to how many cores a chip can hold. My prediction is that we are going to get more cores in our microprocessors, and that power consumption will keep coming down with each subsequent generation, helped by advances in miniaturisation and power efficiency.
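To see what the extra cores buy software, here is a minimal Python sketch (the task and names are invented for illustration) that spreads independent, CPU-bound work across however many cores the machine reports:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """A stand-in CPU-bound task: sum the first n squares."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    cores = os.cpu_count()  # 2 on a Core Duo; 4, 8 or more on newer chips
    jobs = [2_000_000] * cores
    # Each job runs in its own process, so it can occupy its own core.
    with ProcessPoolExecutor(max_workers=cores) as pool:
        results = list(pool.map(busy_work, jobs))
    print(f"ran {len(results)} jobs across {cores} cores")
```

The point of the sketch is simply that workloads which can be split into independent pieces scale with core count, which is why chipmakers shifted from chasing clock speed to adding cores.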
All images by Alison Keys, San Francisco.