Computer, digital




The digital computer is a programmable electronic device that processes numbers and words accurately and at enormous speed. It comes in a variety of shapes and sizes, ranging from the familiar desktop microcomputer to the minicomputer, mainframe, and supercomputer. The supercomputer is the most powerful of these and is used by organizations such as NASA (National Aeronautics and Space Administration) to process upwards of 100 million instructions per second.

The impact of the digital computer on society has been tremendous; in its various forms, it is used to run everything from spacecraft to factories, health-care systems to telecommunications, banks to household budgets.

The story of how the digital computer evolved is largely the story of an unending search for labor-saving devices. Its roots go back beyond the calculating machines of the 1600s to the pebbles (in Latin, calculi) that the merchants of Rome used for counting and to the abacus of the fifth century B.C. Although none of these early devices were automatic, they were useful in a world where mathematical calculations performed by human beings were full of human error.

The Analytical Engine

By the early 1800s, with the Industrial Revolution well underway, errors in mathematical data had grave consequences. Faulty navigational tables, for example, were the cause of frequent shipwrecks. English mathematician Charles Babbage (1791–1871) believed a machine could do mathematical calculations faster and more accurately than humans. In 1822, he produced a small working model of his Difference Engine. The machine's arithmetic functioning was limited, but it could compile and print mathematical tables with no more human intervention needed than a hand to turn the handles at the top of the model.

Babbage's next invention, the Analytical Engine, had all the essential parts of the modern computer: an input device, a memory, a central processing unit, and a printer.

Although the Analytical Engine has gone down in history as the prototype of the modern computer, a full-scale version was never built. Even if the Analytical Engine had been built, it would have been powered by a steam engine, and given its purely mechanical components, its computing speed would not have been great. In the late 1800s, American engineer Herman Hollerith (1860–1929) made use of a new technology—electricity—when he submitted to the United States government a plan for a machine that was eventually used to compute 1890 census data. Hollerith went on to found the company that ultimately became IBM.

A Bit-Serial Optical Computer (BSOC), the first computer to store and manipulate data and instructions as pulses of light. (Reproduced by permission of Photo Researchers, Inc.)

Mammoth modern versions

World War II (1939–45) marked the next significant stage in the evolution of the digital computer. Out of it came three mammoth computers. The Colossus was a special-purpose electronic computer built by the British to decipher German codes. The Mark I was a gigantic electromechanical device constructed at Harvard University. The ENIAC (Electronic Numerical Integrator and Computer) was a fully electronic machine, much faster than the Mark I.

The ENIAC operated on some 18,000 vacuum tubes. If its electronic components had been laid side by side two inches apart, they would have covered a football field. The computer could be instructed to change programs, and the programs themselves could even be written to interact with each other. For coding, Hungarian-born American mathematician John von Neumann proposed using the binary numbering system, 0 and 1, rather than the 0 to 9 of the decimal system. Because 0 and 1 correspond to the on or off states of electric current, computer design was greatly simplified.
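The binary idea described above can be illustrated with a short sketch (the function name and code are illustrative, not part of the historical record): any decimal number can be rewritten using only the digits 0 and 1, with each digit corresponding to an "off" or "on" electrical state.

```python
def to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to its binary digits."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder on division by 2 is the next bit
        n //= 2                  # shift to the next place value
    return "".join(reversed(bits))

# The decimal number 13 becomes four on/off states:
print(to_binary(13))  # 1101, i.e. on-on-off-on
```

Because each digit is only ever 0 or 1, a circuit needs to distinguish just two states of current, which is what made von Neumann's proposal such a simplification for computer design.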

Since the ENIAC, advances in programming languages and electronics—among them, the transistor, the integrated circuit, and the microprocessor—have brought about computing power in the forms we know today, ranging from the supercomputer to far more compact personal models.

Future changes to so-called "computer architecture" are directed at ever greater speed. Ultra-high-speed computers may run by using superconducting circuits that operate at extremely cold temperatures. Integrated circuits that house hundreds of thousands of electronic components on one chip may be commonplace on our desktops.

[See also Computer, analog; Computer software]


