Friday, February 25, 2005

General principles

Computers can work through the movement of mechanical parts, electrons, photons, quantum particles, or any other well-understood physical phenomenon. Although computers have been built out of many different technologies, nearly all computers today are electronic.
Computers may directly model the problem being solved, in the sense that the problem is mapped as closely as possible onto the physical phenomenon being exploited. For example, electron flows might be used to model the flow of water in a dam. Such analog computers were common in the 1960s but are now rare.
In most computers today, the problem is first translated into mathematical terms by rendering all relevant information into the binary (base-two) numeral system, i.e. ones and zeros. Next, all operations on that information are reduced to simple Boolean algebra.
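As a minimal sketch of that translation (the numbers and operations below are our own illustration, not drawn from any particular machine), consider how Python exposes the base-two digits of ordinary integers and applies Boolean operations to them bit by bit:

    # Encode ordinary numbers as base-two digits (bits).
    x, y = 6, 3
    print(format(x, '04b'))      # '0110' -- six, written in ones and zeros
    print(format(y, '04b'))      # '0011' -- three

    # Bitwise operators apply Boolean algebra to each pair of bits.
    print(format(x & y, '04b'))  # AND -> '0010'
    print(format(x | y, '04b'))  # OR  -> '0111'
    print(format(x ^ y, '04b'))  # XOR -> '0101'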
Electronic circuits are then used to represent Boolean operations. Since almost all of mathematics can be reduced to Boolean operations, a sufficiently fast electronic computer is capable of attacking the majority of mathematical problems (and the majority of information processing problems that can be translated into mathematical ones). This basic idea, which made modern digital computers possible, was formally identified and explored by Claude E. Shannon.
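To make that reduction concrete, here is a small sketch (an illustration of the general idea, not Shannon's own construction; the function names are ours) of a one-bit full adder built entirely from the Boolean operations AND, OR, and XOR. Chaining such adders is, in essence, how electronic circuits add whole numbers:

    def full_adder(a, b, carry_in):
        # All inputs and outputs are single bits (0 or 1).
        s = a ^ b ^ carry_in                        # sum bit (XOR)
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit (AND/OR)
        return s, carry_out

    def add(x_bits, y_bits):
        # Add two numbers given as bit lists, least significant bit first.
        carry, result = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return result

    # 6 ([0,1,1] least-significant-first) + 3 ([1,1,0]) = 9 ([1,0,0,1])
    print(add([0, 1, 1], [1, 1, 0]))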
Computers cannot solve all mathematical problems, however. Alan Turing identified which problems could and could not be solved by computers, showing in particular that no program can decide in general whether an arbitrary program will ever halt, and in doing so founded theoretical computer science.
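Turing's argument can be sketched in code (a pseudo-proof: the oracle halts() below is hypothetical, and the whole point is that it cannot actually be written):

    # Suppose, for contradiction, that a total halting oracle existed.
    def halts(program, input_data):
        """Hypothetical: True if program(input_data) eventually finishes."""
        raise NotImplementedError("no such procedure can exist")

    def paradox(program):
        # Do the opposite of whatever the oracle predicts.
        if halts(program, program):
            while True:
                pass  # loop forever

    # Does paradox(paradox) halt? Either answer contradicts the oracle,
    # so halts() cannot exist: the halting problem is undecidable.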
When the computer has finished its calculation, the result must be presented to the user as output through output devices such as light bulbs, LEDs, monitors, and printers.
Novice users, especially children, often have difficulty grasping the important idea that the computer is only a machine, and cannot "think" or "understand" the words it displays. The computer simply performs a mechanical lookup in preprogrammed tables of lines and colors, which the output device then translates into arbitrary patterns of light. It is the human brain that recognizes those patterns as letters and numbers and attaches meaning to them. From the computer's point of view, if a computer could be said to have one, all it "sees" are electrons that are logically equivalent to ones and zeros.
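The "mechanical lookup" described above can be made concrete with a toy example (our own; the 5x5 bit patterns below are an invented font, not a real character table). The machine merely copies stored bits to the display; only the human viewer sees a letter in the lit pixels:

    # Invented toy font: each character maps to five rows of pixels,
    # each row stored as five bits.
    FONT = {
        'H': [0b10001,
              0b10001,
              0b11111,
              0b10001,
              0b10001],
    }

    def render(char):
        # Copy bits to the "display": lit pixel '#', dark pixel ' '.
        for row in FONT[char]:
            print(''.join('#' if (row >> col) & 1 else ' '
                          for col in range(4, -1, -1)))

    render('H')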