Science and technology

Since the 1970s, personal technology has transformed America in what many have called a “computer revolution.” Thousands of small electronic innovations contributed to the technological advances of the 1970s and 1980s, but some of the most important arose from discoveries in electricity and light.

In 1958 Jack Kilby invented the smaller, more durable “integrated circuit,” and 13 years later Marcian E. Hoff combined the functions of hundreds of integrated circuits on a single silicon “chip,” which he called a “microprocessor.” These inventions serve as the basis for all subsequent computing and electronic technologies. During the same period, scientists at Columbia University invented the “maser” in 1954, which used ammonia to produce coherent microwave radiation, and in 1960 Theodore H. Maiman, of Hughes Research Laboratories, built the first laser (light amplification by stimulated emission of radiation). Lasers have since been adapted and enhanced for a multitude of applications, including nuclear physics, medicine, guidance systems, and mass storage. By the 1970s, the basic components were ready for a technological revolution in home appliances, information processing, and communication.

By the 1960s many consumers already owned high-fidelity phonograph systems, but by the 1970s the list of common electronic equipment included magnetic tape-based eight-track and cassette decks, transistor-based stereo receivers, color televisions, and digital alarm-clock radios. Sony introduced the first Betamax home-use VCR in 1975, which inspired JVC to introduce the competing VHS format a year later. As prices dropped, the units became so widespread that the Supreme Court was called upon to decide whether home recordings of televised programming constituted a copyright violation; in Sony v. Universal City Studios (1984), the Court held that they did not.

The prevalence of electronics in the home encouraged a similar trend in specialty appliances for the family kitchen; Mr. Coffee introduced the first automatic drip coffee maker in 1972. Though Percy Spencer, of Raytheon Corporation, invented the Radarange microwave in 1946, it was not until 1967 that Amana introduced the first countertop model for home use. By 1975, sales of microwave ovens exceeded sales of gas ranges, and by the end of the decade microwaves were more commonly owned than dishwashers. Throughout the 1980s and 1990s, appliances based on microprocessor chips appeared in the American home, including compact disc (CD) players, introduced in 1982, and digital video disc (DVD) players, introduced in 1997. Other common devices include analog and digital video cameras, 5.1-channel Dolby Surround Sound speaker systems, and satellite cable connections. Digital thermostats, smoke and carbon monoxide detectors, and alarm systems have transformed even the home’s infrastructure. Many of these advances are only decades old, and yet most have become so common that few Americans notice their presence.

[Photo: Satellite dishes and the satellites they communicate with have revolutionized the flow of information around the world. (Boyle/Getty Images)]

Advances in information processing first developed in the late 1960s among government agencies and large corporate businesses; computers often filled entire rooms and required large staffs of programmers to operate. By the 1980s, however, personal computers entered the home, and by 2000 more than half of American households owned at least one. The first home computer, the Altair 8800, was sold through hobby magazines starting in January 1975. Two years later, Stephen Wozniak and Steve Jobs introduced the Apple II computer, which legend says they first built in Jobs’s family garage. In 1981 IBM introduced the IBM PC (for personal computer), which was so successful that the computer industry adopted the PC moniker as the generic term for all subsequent computers, no matter who produced them. New companies arose to augment existing computing systems with peripherals that have since become standard attachments, including the hard disk drive, graphics and sound cards, inkjet and laser printers, and combination fax and modem cards.

Throughout the 1980s, the rate of innovation in computing technology grew so fast that many consumers hesitated to buy, fearing their model would become obsolete before year’s end. The resulting struggle for common standards forced some of the initial pioneers of home computing, like Tandy, Commodore, and Texas Instruments, to redesign their product lines to ensure 100 percent IBM compatibility, because consumers relied on the PC standard to judge the quality of their software and hardware purchases. Despite these fast changes, standards emerged based on the Microsoft Windows operating system and the Intel-compatible microprocessor. Apple computers remained the only significant alternative, with a market share hovering between 3 and 5 percent. The consolidation of standards made the introduction of new peripherals much easier, and by 2000, common peripherals included rewritable CD and DVD drives, full-page color scanners, video-capture cards, and digital cameras. With so many options for capturing information, very little in the 21st century will pass undocumented.

The prevalence of home computers during the 1980s paved the way for similar advances in communication. Bell Laboratories invented fiber optics in 1977, when it experimented with thin strands of glass to transmit pulses of light between telephone exchange centers. Fiber optic connections permit digital transmission with far greater bandwidth than traditional analog connections. Though telephone companies installed fiber optic connections throughout the 1980s, private consumer demand remained limited because there was little need for large bandwidth at home. This changed, however, after the World Wide Web (WWW) was released over the Internet in 1991.
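
The practical meaning of bandwidth is easy to see with simple arithmetic: the time to move a file is its size in bits divided by the line rate. The Python sketch below makes the comparison concrete; the link speeds and file size are illustrative assumptions, not figures from this article.

```python
# A minimal sketch comparing ideal transfer times over two kinds of
# links. The speeds and file size are illustrative assumptions, not
# figures from this article.

LINKS_BPS = {
    "56 kbps dial-up modem": 56_000,
    "100 Mbps fiber optic line": 100_000_000,
}

def transfer_seconds(size_bytes: int, bits_per_second: int) -> float:
    """Ideal transfer time, ignoring protocol overhead and latency."""
    return size_bytes * 8 / bits_per_second

photo = 2 * 1024 * 1024  # a 2 MB photograph
for name, speed in LINKS_BPS.items():
    print(f"{name}: {transfer_seconds(photo, speed):.2f} s")
```

On these assumed rates the same photograph takes roughly five minutes over the modem and a fraction of a second over fiber, which is why consumer demand for bandwidth followed the Web rather than preceding it.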

Though the arrival of powerful personal computers during the 1980s made private access to these networks more feasible, the lack of simple navigation software inhibited widespread public access. The situation improved when Tim Berners-Lee, while working at the Conseil Européen pour la Recherche Nucléaire (CERN), developed information management software called the World Wide Web. It stored information using random associations called “links,” based on Hypertext Markup Language (HTML), to provide an easier and more systematic way of accessing the various information threads over the Internet. In 1993 Marc Andreessen, a 22-year-old student at the University of Illinois, developed the Mosaic browser, which attached a graphical interface to the Web, allowing visitors to navigate using a few mouse clicks. Shortly thereafter, the popular version, called Netscape Navigator, was released to the public, creating an immediate demand for Internet access.
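
At its core, an HTML link is simply a tag that embeds the address of another document; a browser such as Mosaic renders the tag and fetches the target when it is clicked. The sketch below, a rough illustration using only Python's standard library and an invented sample page, extracts link targets the way a simple crawler might.

```python
# A minimal sketch of the idea behind hypertext "links": an HTML page
# embeds references to other pages, and software follows them. The
# sample page is invented for illustration.

from html.parser import HTMLParser

SAMPLE_PAGE = """
<html><body>
  <p>See the <a href="http://info.cern.ch/">first web server</a> or
  read about <a href="/History.html">the project's history</a>.</p>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed(SAMPLE_PAGE)
print(parser.links)  # ['http://info.cern.ch/', '/History.html']
```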

In 1992 there were only 50 pages on the Internet; by 1993 the number had grown to 341,000, and within three years, the number of pages had grown so great that accurate figures were no longer possible. The consumer demand for Internet access produced an equally great demand for faster communication from home, including fiber optic connections and even cable and satellite options. Americans quickly grew accustomed to instant communication with relatives, friends, and colleagues around the world through e-mail, and later, through instant messaging.

Not surprisingly, the demand for portable communications grew by similar proportions. The Federal Communications Commission repealed restrictions on mobile phones in 1983, which allowed Bell Telephone to introduce the first commercial model later that year, though high prices initially restricted mobile phones to businesses. The 1996 Telecommunications Act deregulated local telephone service, opening the market to increasing competition between telephone providers. By 1995, 32 million Americans made up more than a third of the mobile phone users worldwide. By 2001 the number had increased to 115 million.

The creation of wireless local area networks marked the next advance in home and business communications. Since the inception of such networks in 1997, people have used the term “Wi-Fi” to describe the generic wireless interface of mobile computing devices such as laptops. A person with a Wi-Fi-enabled device, such as a personal computer or cell phone, can connect to the Internet when in the proximity of any access point, called a “hotspot.” Hotspots can range in size from a single room to many square miles. Wi-Fi also enables devices to connect directly with each other, a capability especially useful in consumer electronics and gaming applications. Free Wi-Fi grew in popularity and by 2007 was available at more than 100,000 locations in the United States; businesses such as Starbucks, McDonald’s, and a variety of hotels offer paid Wi-Fi, a growing trend at businesses with high customer turnover.

Internet access became available through wireless telephones with the development of devices such as the BlackBerry and the iPhone. These devices, called “smart phones,” came equipped with keyboards that let users communicate much as they would on a computer. With this technology, users gained instant access to communications, e-mail, news, and music without being confined to specific hotspots.

As personal technology becomes more common in American homes, the size of the nation and the world appears to shrink. When Boeing introduced the 747 jumbo jet on January 21, 1970, observers reacted with a mixture of confidence and awe at the extent of human achievement; the New York Times wrote, “The 747 will make it possible for more and more people to discover what their neighbors are like on the other side of the world.” The new jet could travel 4,600 nautical miles without refueling and could carry 490 people, almost four times as many as its closest rival, the Boeing 707. It also forced existing airports to modernize their runways and terminals to accommodate the giant aircraft, which had the positive effect of improving safety throughout the airline industry. Unfortunately, the jumbo jet also made accidents much more tragic; one of the worst aviation disasters in history occurred when two 747s collided on a runway on March 27, 1977, killing more than 570 people. Despite such incidents, air travel remains the safest form of transportation in the United States, and the 747 significantly improved traffic between coasts and overseas; it also provided farmers and manufacturers with a fast alternative to trucking, thereby expanding their available markets to a global scale. Though the basic style of the jumbo jet has remained unchanged since 1970, the interior has been routinely adapted to reflect advances in information processing and communication. By 2000, most American jets included telephones and entertainment consoles for each passenger.

More traditional forms of transportation have tried to maintain relative parity with airlines. Though the United States has lagged far behind European and Asian nations in its high-speed railroads, automobiles have benefited greatly from the computing revolution. As early as 1966, William Lear, founder of the Lear Jet Aviation Company, invented the eight-track tape deck as an option in luxury cars. By the mid-1980s automobile electronics expanded far beyond the stereo system. As fuel efficiency became a growing concern for consumers, most cars included microprocessors to regulate fuel management and suspension systems. Later innovations included computer-monitored airbags, digital speedometers, and complex alarm systems. In 1996, Cadillac introduced the OnStar system, which provided motorists with remote access to their vehicle, emergency roadside assistance, and a computerized navigation system that used Global Positioning System (GPS) technology to pinpoint the vehicle’s exact location. By 2000, CD players, keyless entry, and automatic door lifts became common options, while some minivans included miniature televisions, with VCR and DVD players, and laser-guided reverse warning systems. Though no company has introduced a driverless automobile, many of the necessary components are already in production.
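
One computation behind GPS navigation can be shown directly: once a receiver has fixed the vehicle's latitude and longitude, distances to other points follow from spherical geometry. The Python sketch below computes the great-circle (“haversine”) distance between two fixes; the coordinates are illustrative examples, not data from this article.

```python
# A minimal sketch of one computation behind GPS navigation: the
# great-circle ("haversine") distance between two latitude/longitude
# fixes. The coordinates (roughly Detroit and Chicago) are illustrative.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

vehicle = (42.3314, -83.0458)   # Detroit
destination = (41.8781, -87.6298)  # Chicago
print(f"{haversine_km(*vehicle, *destination):.0f} km")  # ~382 km
```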

One of the most directly relevant applications of computing technology occurred in the field of medicine. Computer-aided microscopes with attached robotic lasers that cut with near-cellular precision enabled physicians in the 1990s to successfully perform surgery on neonatal arteries the size of a toothpick. In 2001 Abiomed combined microprocessors with a long-lasting power source in the AbioCor, the first fully self-contained artificial heart, which Dr. Laman Gray of Jewish Hospital in Louisville, Kentucky, implanted in Robert Tools; its success promises a future in which artificial organs will be cheaper and more commonly used. Researchers in nanotechnology work with tiny machines that can be as small as three atoms wide; some projected applications draw on information gleaned from the complete map of the human genome, which might enable scientists to develop a direct interface between computer technology and human biology. The prospect of cybernetic organs or “intelligent” plastic surgery promises to extend the human lifespan.

Some technology advocates hope that science will provide solutions for some ancient social problems, such as disease, old age, poverty, and even crime. Others, however, are not as optimistic; some religious leaders fear that an influx of technology, without an equivalent emphasis on philosophical and religious values, may result in a breakdown of familial and personal relationships, which form the basis for civic responsibility. Other social critics worry that the continuous introduction of new technology creates an artificial demand that only encourages a materialistic society. One point, however, seems clear—the true impact of the computer revolution will not be fully understood for many decades to come.

See also computers; Human Genome Project; Internet; space policy.

Further reading: Charles Flowers, A Science Odyssey: 100 Years of Discovery (New York: Morrow, 1998); Edward Nathan Singer, 20th Century Revolutions in Technology (Boston, Mass.: Nova Science Publishers, 1998); Don Tapscott, Grown Up Digital: How the Net Generation Is Changing Your World (New York: McGraw-Hill, 2008).

—Aharon W. Zorea and Stephen E. Randoll



 
