The Microprocessor

 

 

The microprocessor is the heart and brain inside every personal computer. This tiny chip of silicon determines the speed and power of the entire computer by handling most, if not all, of the data processing in the machine.

 

All personal computers and a growing number of more powerful machines are based on a special type of electronic circuit called the microprocessor. Often termed "a computer on a chip," today's microprocessor is a masterpiece of high-tech black magic. It starts as a single slice of silicon that has been carefully grown as an extremely pure crystal, sawed thin with great precision, and then subjected to high temperatures in ovens containing gaseous mixtures of impurities that diffuse into the silicon and change its electrical properties. This alchemy turns sand to gold, creating an electronic brain as capable as that of your average arthropod.

 

As with insects and crustaceans, your PC can react, learn, and remember. Unlike higher organisms bordering on true consciousness, however, the microprocessor doesn't reason and is not self-aware. Clearly, although computers are often labeled as "thinking machines," what goes through their microprocessor minds is far from your thought processes and stream of consciousness. Or maybe not. Some theoreticians believe that your mind and a computer work fundamentally in the same way, although neither they nor anyone else knows exactly how the human mind actually works.

 

The operating principles of the microprocessor, on the other hand, are well understood. After all, microprocessor hardware was designed to carry out specific functions, and silicon semiconductor technology was harnessed to implement them. Nothing about what the microprocessor does is magic.

 

In fact, a microprocessor need not be made from silicon; scientists are toying with advanced semiconducting materials that promise higher speeds. A microprocessor also does not need to be based on electronics. A series of gears, cams, and levers or pipes, valves, and pumps could carry out all the logical functions exactly the same way as does one of today's most advanced microprocessors. Mechanical and hydraulic computers have, in fact, been built.

 

The advantages of electronics and the microprocessor are speed and size. Electrical signals travel at nearly the speed of light; microprocessors carry out their instructions at rates up to several million per second. Without that speed, elaborate programs would never have been written. Executing one with a steam-driven computing engine might have taken lifetimes, not to mention warehouses full of equipment. The speed of the microprocessor and its small, portable size make it the miracle that it is.

 

The advantage of silicon is familiarity. An entire industry has arisen to work with silicon. The technology is mature; fabricating silicon circuits is routine, and the results are predictable. Familiarity also breeds economy. Billions of silicon chips are made each year. Although the processes involved are precise and exotic, the needed equipment and materials are readily available.

 

 

How Microprocessors Work

 

Reduced to its fundamental principles, the workings of a modern silicon-based microprocessor are not difficult to understand. They are the electronic equivalent of a knee-jerk. Every time you hit the microprocessor with an electronic hammer blow (the proper digital input), it reacts by doing a specific something, always the same thing for the same input, kicking out the same response. The complexity of the microprocessor and what it does arises from the wealth of inputs it can react to and the interaction between successive inputs.

 

Although the microprocessor's function is precisely defined by its input, the output from that function varies with what the microprocessor had to work upon, and that depends on previous inputs. For example, the result of carrying out a specific command, "Simon says lift your left leg," will differ dramatically depending on whether the previous command was "Simon says sit down" or "Simon says lift your right leg."
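
That same dependence on prior inputs is easy to see in code. Below is a minimal sketch in C, modeled on no real chip, of a one-register processor: the identical subtract-five command triggers the identical reflex every time, yet each run produces a different result because the register still holds whatever earlier commands left behind.

    #include <stdio.h>

    /* A minimal sketch, not any real chip's design: a one-register
       "processor" that reacts to a single SUBTRACT-5 command. The
       reaction is always the same; the result depends on prior state. */

    typedef struct {
        int accumulator;   /* working value left behind by earlier inputs */
    } Processor;

    void subtract_five(Processor *p) {
        p->accumulator -= 5;   /* identical reflex for every identical input */
    }

    int main(void) {
        Processor p = { .accumulator = 20 };

        subtract_five(&p);     /* same command... */
        printf("after first command:  %d\n", p.accumulator);  /* 15 */

        subtract_five(&p);     /* ...same command again... */
        printf("after second command: %d\n", p.accumulator);  /* 10: a different result */
        return 0;
    }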

 

 

Instruction Sets

 

The simple silicon circuits that microprocessors are made from don't understand English commands, however. They react to electronic signals. With today's microprocessors, each command is coded as the presence or absence of electrical signals at the pins of the microprocessor's package. These signals, each one representing a digital information bit that can be coded as a zero or a one, together make a bit pattern.

 

Certain bit patterns are given specific meanings by the designers of a microprocessor and thus become microprocessor instructions. For example, the bit pattern 0010110 is the instruction that tells Intel 8086-family microprocessors to subtract.
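
To make the idea concrete, here is a small illustrative C sketch of recognizing that pattern in an opcode byte. The seven-bit value 0010110 comes from the text; on the actual 8086, a low-order eighth bit selects a byte or word operation, which yields the two opcodes 0x2C and 0x2D. The decoder below is a toy, not real 8086 circuitry.

    #include <stdio.h>

    /* Illustrative only: test whether an opcode byte begins with the
       bit pattern 0010110, which the text identifies as the 8086-family
       subtract instruction. (On a real 8086 the remaining low bit selects
       a byte or word operation, giving opcodes 0x2C and 0x2D.) */

    #define SUBTRACT_PATTERN 0x16   /* 0010110 in binary */

    int is_subtract(unsigned char opcode) {
        return (opcode >> 1) == SUBTRACT_PATTERN;  /* compare the top 7 bits */
    }

    int main(void) {
        unsigned char bytes[] = { 0x2C, 0x2D, 0x90 };
        for (int i = 0; i < 3; i++)
            printf("opcode 0x%02X: %s\n", (unsigned)bytes[i],
                   is_subtract(bytes[i]) ? "subtract" : "something else");
        return 0;
    }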

 

The entire repertoire of commands that a given microprocessor model understands and can react to is called that microprocessor's instruction set or its command set. Different microprocessor designs recognize different instruction sets.

 

Instruction sets can be incredibly rich and diverse. For example, a simple command to subtract is not enough by itself. The microprocessor also needs to know what to subtract from what, and it needs to know what to do with the result. 

 

The microprocessors used in PCs are told what numbers to subtract by variations of the subtract instruction, of which there are about seven, depending on what you count as a subtraction. Each variation tells the microprocessor to take numbers from different places and to find the difference in a slightly different manner. Microprocessor registers handle some of these duties.
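
The text doesn't list the seven variants, so the following C sketch invents three hypothetical subtract instructions purely to show the principle: each finds its second operand somewhere different, whether in another register, in memory, or coded into the instruction itself.

    #include <stdio.h>

    /* A hypothetical instruction set fragment, not the actual 8086 list:
       three subtract variants that differ only in where the second
       operand comes from. */

    enum Opcode {
        SUB_REG,    /* subtract one register from another */
        SUB_MEM,    /* subtract a value fetched from memory */
        SUB_IMM     /* subtract a number coded into the instruction itself */
    };

    int execute_sub(enum Opcode op, int acc, int reg_b, int *memory, int operand) {
        switch (op) {
        case SUB_REG: return acc - reg_b;            /* operand ignored */
        case SUB_MEM: return acc - memory[operand];  /* operand is an address */
        case SUB_IMM: return acc - operand;          /* operand is the number */
        }
        return acc;
    }

    int main(void) {
        int memory[4] = { 9, 7, 5, 3 };
        int acc = 100, reg_b = 40;

        printf("%d\n", execute_sub(SUB_REG, acc, reg_b, memory, 0)); /* 60 */
        printf("%d\n", execute_sub(SUB_MEM, acc, reg_b, memory, 2)); /* 95 */
        printf("%d\n", execute_sub(SUB_IMM, acc, reg_b, memory, 8)); /* 92 */
        return 0;
    }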

 

 

Registers

 

Before the microprocessor can work on numbers or any other data, it must first know what numbers to work on. The most straightforward method of giving the chip the variables it needs would seem to be supplying more coded signals at the same time the instruction is given. This simple method has its shortcomings, however. Somehow, the proper numbers must be routed to the right microprocessor inputs. 

 

Either the microprocessor and the computer circuitry would have to do the routing, or you would be stuck with manually loading the numbers into different inputs. The job is best left to the microprocessor (you wouldn't, for example, want to have to designate where to put each intermediate result in a lengthy calculation). All that signal routing would substantially complicate the external circuitry leading to the microprocessor.

 

Instead of working directly with two inputs simultaneously, today's microprocessors take one input at a time. The first input pattern is loaded into a special area called a register. A register functions both as memory and as a workbench. It holds bit patterns until they can be worked upon or output. The register is also connected to the processing circuits of the microprocessor so that the changes ordered by the instructions actually appear in the register. Most microprocessors have several registers, some dedicated to specific functions (remembering which step in a function the chip is currently carrying out, for example) and some designed for general purposes.
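
A rough C sketch can make the arrangement concrete. The layout below, a few general-purpose registers plus a dedicated instruction pointer and flags register, is illustrative rather than a portrait of any actual chip; the point is that the result of an operation lands right back in the register that held the operand.

    #include <stdint.h>
    #include <stdio.h>

    /* A sketch of a register file, loosely modeled on the idea in the
       text rather than any real chip: general-purpose workbenches plus
       dedicated registers that track the chip's own progress. */

    typedef struct {
        uint16_t general[4];          /* general-purpose workbenches */
        uint16_t instruction_pointer; /* dedicated: which step comes next */
        uint16_t flags;               /* dedicated: facts about the last result */
    } Registers;

    int main(void) {
        Registers r = {0};

        r.general[0] = 1000;          /* load a value into a register... */
        r.general[0] -= 250;          /* ...and the result lands right back in it */
        r.instruction_pointer++;      /* the chip also remembers where it is */

        printf("register 0 now holds %u\n", (unsigned)r.general[0]);  /* 750 */
        return 0;
    }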

 

Other microprocessor instructions tell the chip to put numbers in its registers to be worked on later and to move information from a register someplace else, to memory or an output port, for example. Some microprocessor instructions require a series of steps to be carried out. For example, the subtraction instruction given previously tells the microprocessor to subtract an immediate number (one held in memory as part of the instruction itself) from another number in the microprocessor's accumulator, a particular register favored for calculations.
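
Here is a toy fetch-and-execute loop in C built around that idea. The encoding is assumed for illustration only, an opcode byte of 0x2C followed by a one-byte immediate value in memory; decoding on a real 8086 involves considerably more.

    #include <stdint.h>
    #include <stdio.h>

    /* A toy fetch-and-execute loop, assumed encoding only: opcode 0x2C
       is followed in memory by one immediate byte, which is subtracted
       from the accumulator. */

    int main(void) {
        uint8_t memory[] = { 0x2C, 0x07, 0x2C, 0x03 }; /* subtract 7, then 3 */
        uint8_t accumulator = 50;
        size_t ip = 0;                                  /* instruction pointer */

        while (ip < sizeof memory) {
            uint8_t opcode = memory[ip++];              /* step 1: fetch the opcode */
            if (opcode == 0x2C) {
                uint8_t immediate = memory[ip++];       /* step 2: fetch the operand */
                accumulator -= immediate;               /* step 3: do the work */
            }
        }
        printf("accumulator = %u\n", (unsigned)accumulator);  /* 40 */
        return 0;
    }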

 

Everything that the microprocessor does consists of nothing more than a series of these one-step-at-a-time instructions. Simple subtraction or addition of two numbers may take dozens of steps, including the conversion of the numbers from decimal to the binary (ones and zeros) notation that the microprocessor understands. Computer programs are complex because they must reduce processes that people think of as a single step (adding numbers, typing a letter, moving a block of graphics) into a long and complex series of tiny, incremental steps.
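
A short C sketch suggests how much machinery hides inside one of those "human" steps. Adding two numbers typed as decimal text first requires converting each string of digits to binary, one digit at a time; only then can the single addition people picture actually take place. This is a sketch of the principle, not of any particular chip's arithmetic.

    #include <stdio.h>

    /* One "human" step, adding two numbers typed as decimal text, broken
       into the many tiny steps a processor actually performs: convert
       each digit string to binary, add, then convert back for display. */

    int decimal_to_binary(const char *s) {
        int value = 0;
        while (*s >= '0' && *s <= '9') {
            value = value * 10;         /* shift previous digits up one place */
            value = value + (*s - '0'); /* fold in the next digit */
            s++;
        }
        return value;
    }

    int main(void) {
        int a = decimal_to_binary("38");   /* many small steps... */
        int b = decimal_to_binary("17");   /* ...and many more... */
        int sum = a + b;                   /* the one step people picture */
        printf("%d\n", sum);               /* 55, converted back to decimal by printf */
        return 0;
    }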