Computer Software

Computer software is a package of specific instructions (a program) written in a defined order that tells a computer what to do and how to do it. It is the "brain" that tells the "body," or hardware, of a computer what to do. "Hardware" refers to all the visible components in a computer system: electrical connections, silicon chips, disk drives, monitor, keyboard, printer, etc. Without software, a computer can do nothing; it is only a collection of circuits and metal in a box.

History

The first modern computers were developed by the United States military during World War II (1939–45) to calculate the paths of artillery shells and bombs. These computers used vacuum tubes that acted as on-off switches, and the settings had to be reset by hand for each new operation.

These very early computers used the familiar decimal digits (0 to 9) to represent data (information). Computer engineers found it difficult to work with 10 different digits. John von Neumann (1903–1957), a Hungarian-born American mathematician, decided in 1946 to abandon the decimal system in favor of the binary system (a system using only 0 and 1; "bi" means two). That system has been used ever since.

How a computer uses the binary system

Although computers perform seemingly amazing feats, they actually understand only two things: whether an electrical "on" or "off" condition exists in their circuits. The binary numbering system works well in this situation because it uses only the digits 0 and 1, the binary digits (later shortened to bits). Binary 1 represents on and binary 0 represents off. Program instructions are sent to the computer by combining bits in groups of six or eight. This takes care of the instructional part of programs.
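
To make this concrete, here is a minimal sketch in Python (a modern language, used here purely for illustration; the function name to_bits is our own invention) that converts an ordinary decimal number into the kind of eight-bit on/off pattern described above:

    def to_bits(value, width=8):
        """Return a non-negative integer's binary digits as a string,
        padded to a fixed group size (here, eight bits)."""
        bits = ""
        while value > 0:
            bits = str(value % 2) + bits  # remainder 1 -> "on", 0 -> "off"
            value //= 2
        return bits.rjust(width, "0")

    for n in (0, 1, 5, 13):
        print(n, "->", to_bits(n))
    # 0 -> 00000000
    # 1 -> 00000001
    # 5 -> 00000101
    # 13 -> 00001101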

Words to Know

Computer hardware: The physical equipment used in a computer system.

Computer program: Another name for computer software, a series of commands or instructions that a computer can interpret and execute.

Computer codes

Data—in the form of decimal numbers, letters, and special characters—also has to be available in the computer. For this purpose, the EBCDIC and ASCII codes were developed.

EBCDIC (pronounced EB-see-dick) stands for Extended Binary Coded Decimal Interchange Code. It was developed by IBM Corporation and is used in most of its computers. In EBCDIC, eight bits are used to represent a single character.

ASCII (pronounced AS-key) stands for American Standard Code for Information Interchange. ASCII is a seven-bit code developed in a joint effort by several computer manufacturers to create a standard code that could be used on any computer, regardless of who made it. ASCII is used in most personal computers today and has been adopted as a standard by the U.S. government.
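
As a small illustration (a Python sketch of our own, not part of the original article), Python's built-in ord function returns a character's ASCII code, which always fits in the seven bits mentioned above:

    for ch in "A", "a", "0", "?":
        code = ord(ch)                        # the character's ASCII code
        print(ch, code, format(code, "07b"))  # the same code as seven bits
    # A 65 1000001
    # a 97 1100001
    # 0 48 0110000
    # ? 63 0111111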

The development of computer languages

The first modern computer was named ENIAC, for Electronic Numerical Integrator And Computer. It was completed in 1946. Most programming was done by and for military and scientific users. That began to change after Grace Hopper, an American computer scientist and naval officer, developed FLOW-MATIC, one of the first "high-level" computer languages. Instead of requiring programmers to write the zeroes and ones of machine language directly, it let them write instructions using short English-like words; those instructions are in turn translated back into machine language when the program is run. This was an important step toward developing "user-friendly" computer software.

Soon, other high-level computer languages were developed. By 1957, IBM had created FORTRAN (FORmula TRANslation), a language specifically designed for scientific and engineering work involving complicated mathematical formulas. It became the first high-level programming language to be used by many computer users. COBOL (COmmon Business Oriented Language) was developed in 1959 to help businesses organize records and manage data files.

During the first half of the 1960s, two professors at Dartmouth College, John Kemeny and Thomas Kurtz, developed BASIC (Beginner's All-purpose Symbolic Instruction Code). This was the first widespread computer language designed for and used by nonprofessional programmers. It was extremely popular throughout the 1970s and 1980s. Its popularity was increased by the development and sale of personal computers, many of which came with BASIC already programmed into their memories.

Types of computer software

The development of high-level languages helped to make computers common objects in workplaces and homes. Computers, of course, must have high-level language commands translated into machine language before they can act on them. The programs that perform this translation are called translator programs. They represent another type of computer software.
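
To make the idea concrete, here is a deliberately tiny translator sketched in Python (our own illustration: the command ADD and the instruction names below are inventions, and real translator programs are vastly more elaborate). It turns one high-level command into a list of made-up low-level instructions:

    def translate(line):
        """Translate a toy high-level command like "ADD 2 3" into
        invented low-level load/add instructions."""
        op, a, b = line.split()
        if op == "ADD":
            return [f"LOAD R1, {a}", f"LOAD R2, {b}", "ADD R1, R2"]
        raise ValueError("unknown command: " + op)

    print(translate("ADD 2 3"))
    # ['LOAD R1, 2', 'LOAD R2, 3', 'ADD R1, R2']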

Operating system software is yet another type of software that must be in a computer before it can read and use commercially available software packages. Before a computer can use application software, such as a word-processing or game-playing package, it must run the instructions through the operating system software. The operating system contains many built-in instructions, so that each piece of application software does not have to repeat simple tasks, such as telling the computer how to print something out. DOS (Disk Operating System) is a popular operating system for many personal computers used today.
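
As a rough illustration (a greatly simplified Python sketch of our own, not from the article), a convenient high-level output call is ultimately carried out by asking the operating system to write raw bytes to a device:

    import os
    import sys

    # A convenient, high-level call provided by the language...
    print("hello")
    sys.stdout.flush()  # hand any buffered text to the system first

    # ...is ultimately carried out by the operating system's low-level
    # write routine. Application software never repeats these low-level
    # steps itself; it simply asks the operating system.
    os.write(sys.stdout.fileno(), b"hello\n")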

Application software. Once some type of operating system software is loaded into a computer, the computer can load and understand many other types of software. Software can tell computers how to create documents, to solve simple or complex calculations for business people and scientists, to play games, to create images, to maintain and sort files, and to complete hundreds of other tasks.

Word-processing software makes writing, rewriting, editing, correcting, arranging, and rearranging words convenient. Database software enables computer users to organize and retrieve lists, facts, and inventories, each of which may include thousands of items. Graphics software lets you draw and create images.

Desktop publishing software allows people to arrange photos, pictures, and words on a page before any printing is done. With desktop publishing and word-processing software, there is no need for cutting and pasting layouts. Entire books can be written and formatted by the author. The printed copy, or even just a computer disk with the file, can be delivered to a traditional printer without the need to reenter all the words on a typesetting machine.

Software for games can turn a computer into a spaceship, a battlefield, or an ancient city. As computers get more powerful, computer games get more realistic and sophisticated.

Communications software allows people to send and receive computer files and faxes over phone lines. Transferring files, sending and receiving data, using data stored on another computer, and electronic mail (e-mail) systems that allow people to receive messages in their own "mailboxes" are some common uses of communications software.

The Y2K hubbub

As the end of the 1990s approached, the world became preoccupied, perhaps even obsessed, with the coming of the year 2000, nicknamed "Y2K" (Y for year and 2K for 2,000, K being a standard designation for a thousand). Many feared that at the stroke of midnight between December 31, 1999, and January 1, 2000, computers and computer-assisted devices would come crashing down.

The so-called Y2K bug was a fault built into computer software because early developers of computer programs were uncertain that computers would even have a future. To save on memory and storage wherever possible, these developers built in standardized dates with two digits each for the day, month, and year. For instance, January 2, 1961, was read as 010261. However, this short form could also mean January 2, 1561, or January 2, 2161.
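
The ambiguity, and one common style of repair, can be sketched in a few lines of Python (our own illustration; the "windowing" pivot of 50 below is an arbitrary example, not a universal standard):

    def expand_year(two_digit_year, pivot=50):
        """One common repair, often called 'windowing': two-digit years
        below the pivot are assumed to be 20xx, the rest 19xx."""
        return 2000 + two_digit_year if two_digit_year < pivot else 1900 + two_digit_year

    date = "010261"                     # MMDDYY: January 2 of ...which century?
    month, day, yy = date[:2], date[2:4], int(date[4:])
    print(month, day, expand_year(yy))  # 01 02 1961
    print(expand_year(5))               # 2005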

By the mid-1970s, programmers were beginning to recognize the potential obstacle. They began experimenting with plugging 2000-plus dates into their systems and software; they quickly found the dates did not compute. However, it was not until 1995 that the U.S. Congress, the media, and the public all seemed to "discover" that the end was drawing near. As of 1999, 1.2 trillion lines of computer code needed to be fixed. Left uncorrected, the Y2K bug could have fouled computers that controlled power grids, air traffic, banking systems, and phone networks, among other systems. In response, businesses and governments around the world spent over $200 billion to reprogram and test vulnerable computers.

[Image: The use of many varied forms of computer software makes the computer an indispensable tool. (Reproduced by permission of Photo Researchers, Inc.)]

When the year 2000 became a reality, the anticipated computer glitches never materialized: power plants kept working, airplanes kept flying, and nuclear missiles stayed on the ground. Problems that did arise were minor and were quickly fixed with hardly anyone noticing. The money and time spent on the Y2K problem also brought other benefits: businesses and governments upgraded their computers and other equipment. With the help of the World Bank and other Y2K funders, poorer countries were given machines and Internet connections they were allowed to keep. Many U.S. businesses weeded out older machines, combined similar systems, and catalogued their software and computers. In the end, individuals, businesses, and countries learned to work together to overcome a common problem.

[See also CAD/CAM; Internet]


