
Computers

The last quarter of the 20th century witnessed a revolution in computing technology that substantially changed the face of American society, especially the sectors involved in research, production, commerce, defense, communication, and entertainment. Though simple mechanical calculating devices had existed since the 17th century, the first electronic calculating machines did not appear until the 1930s. In the years immediately preceding World War II, computing technology developed slowly through a combination of private-sector and academic experimentation. In 1946 the Moore School of Electrical Engineering at the University of Pennsylvania completed the first successful electronic digital computer, the Electronic Numerical Integrator and Computer (ENIAC). It weighed 60,000 pounds, measured 18 feet high and 80 feet long, and required 500 miles of wiring to link its nearly 18,000 vacuum tubes. It consumed 180,000 watts of electrical power to produce a computing speed of 100,000 operations per second. One of its practical applications was computing the feasibility of the hydrogen bomb. The cost of these giant computers led to a cooperative relationship among land-grant universities, major corporations, and government research agencies.

The first significant advance beyond these large machines came in 1947, when physicists John Bardeen, Walter H. Brattain, and William B. Shockley at Bell Telephone Laboratories invented the transistor, which replaced the costly and fragile vacuum tubes that had limited the capacity of computers. In 1958 Jack Kilby of Texas Instruments and, independently a year later, Robert Noyce of Fairchild Semiconductor designed the first true integrated circuits, or “chips,” which permitted the giant computers developed during and immediately after World War II to be scaled down by placing transistors, resistors, and capacitors on a single semiconductor wafer (germanium in Kilby’s prototype, silicon in Noyce’s). In 1971 American engineer Marcian E. “Ted” Hoff of Intel successfully incorporated hundreds of components onto a single small silicon chip, which he called a “microprocessor.” The reduction in the size and expense of complex computing power opened the way for a gradual shift of computing technology from the large mainframe computers of research agencies to personal computers for small businesses and homes.

In 1974 Micro Instrumentation Telemetry Systems (MITS) introduced the Altair 8800, which became the first truly personal computer to come to market, even though

[Photo: Personal computers on display in a shopping center (Getty Images)]