
Medicine

Throughout the 20th century, the field of medicine developed at a steady pace. Though technological advances tied to the computing revolution of the century’s last three decades provided more sophisticated tools for research and diagnosis, the midcentury innovations in DNA research, vaccination, and psychotropic medication remain unsurpassed as groundbreaking discoveries. But the single greatest breakthrough, the Human Genome Project, may determine the direction of medical research far into the 21st century.



Americans enjoyed a gradual decrease in the death rate throughout the 20th century, with a marked improvement in the 1970s. Per 100,000 residents, the death rate fell from 760.9 in 1960 to 714.3 in 1970, after which the pace of decline almost tripled: 583.8 in 1980 and 479.1 in 1997. Although a variety of reasons unrelated to health care could explain the general decline, including fewer wars and safer working conditions, indicators such as maternal death rates and infant mortality rates reveal the impact of increased access to health care for a larger portion of the American public. In 1950, 2,960 women died giving birth, and the infant mortality rate was 29.2 deaths for every 1,000 live births. Ten years later, in 1960, the infant mortality rate had dropped only three points, but the number of maternal deaths had fallen to 1,579, and then to 803 in 1970. Throughout the 1980s and 1990s, the number fluctuated between 270 and 340 per year. In addition, the infant mortality rate dropped six points in the 1960s and eight points in the 1970s. In 1990 there were 9.2 infant deaths per 1,000 live births, and in 2005 only 6.9.
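The claim that the pace of decline nearly tripled can be checked directly from the death rates quoted above; the short Python sketch below is illustrative only and uses just those three figures.

    # Back-of-the-envelope check of the "rate of decline almost tripled" claim,
    # using the overall death rates per 100,000 residents quoted in the text.
    rates = {1960: 760.9, 1970: 714.3, 1980: 583.8}
    drop_1960s = rates[1960] - rates[1970]    # 46.6 points over the 1960s
    drop_1970s = rates[1970] - rates[1980]    # 130.5 points over the 1970s
    print(round(drop_1970s / drop_1960s, 1))  # ~2.8, close to a threefold increase in pace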



Generally speaking, medicine and health care gradually consumed a greater portion of the gross domestic product (GDP), becoming the nation’s largest economic sector. The number of physicians nearly doubled between 1960 (260,484) and 1980 (467,679) and continued to grow, reaching 756,710 in 1997, while the number of health service professionals grew from 76 million people in 1970 to 131 million in 1998. The amount of money spent on health research grew from $2.8 billion in 1970 to $13.5 billion in 1985, to $35.8 billion in 1995, and to $95 billion in 2005. As a percentage of GDP, health expenditures rose from 5.1 percent in 1960 to 9.4 percent in 1980, 13.5 percent in 1997, and 16 percent in 2005.



As health care assumed a more prominent role in American society, it generally became more specialized, with greater support from the private sector and financing from the public sector. Though all age groups tended to visit their doctors more frequently in the 1990s than they had in the 1960s, the number of contacts per person increased most for Americans aged 65 and older and for children aged 14 and younger. Those visits were also more often made to a specialist than to a general family practitioner. In 1970 less than 1 percent of active physicians were specialists; by 1980 the percentage had increased to 3.5, and by 1990 to 5. In 1970, 63.4 percent of health professionals worked in hospitals, only 11.2 percent in offices and clinics, and 12 percent in nursing homes. By 1998 the share of health professionals working in hospitals had fallen to 44.5 percent. Improvements in medicine, techniques, and technologies lessened the need for general hospital space; there were 1.4 million beds in 7,156 hospitals in 1975, but only slightly more than 1 million beds in 6,097 hospitals in 1997. Over the same span, the percentage of occupied beds also fell, from 76 to 65.



During the 1980s, the focus of health care and research shifted from government sponsorship to the private sector. This shift was visible even in the decline in the number of hospitals: while the number of privately owned, for-profit hospitals remained relatively constant (775 in 1975 compared with 797 in 1997), the number of state and federal hospitals declined by a third over the same period (2,143 to 1,545), and the number of private, nonprofit hospitals declined by 10 percent (3,339 to 3,000).




Note: Average life expectancy in 2004 was 77.8 years. Source: “National Vital Statistics Report: Fast Facts A to Z,” October 2006, National Center for Health Statistics.



In 1970 private industry accounted for only $795 million of that research funding, compared with $1.6 billion from the federal government. The relative ratio remained about the same in 1980 ($2.4 billion to $4.7 billion) but began to shift dramatically in 1985, when private industry invested $5.3 billion compared with $6.7 billion by the federal government; by 1995 the ratio had reversed, at $18.6 billion to $13.4 billion. The public money that had previously been devoted to research was reallocated to pay for increased insurance coverage. In 1970 the federal government paid $6.6 billion, while private insurance paid out $30.9 billion. In 1985 the federal government assumed a considerably larger share, paying $174.2 billion against $254.5 billion paid privately, and by 1997 the amounts were almost even: $507 billion and $585 billion, respectively.
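The reversal in the research-funding ratio is easier to see as private industry’s share of the combined total; the following Python sketch, using only the research figures quoted above, computes that share for each year.

    # Private industry's share of combined health-research funding,
    # computed from the amounts (in billions of dollars) cited in the text.
    funding = {1970: (0.795, 1.6), 1980: (2.4, 4.7), 1985: (5.3, 6.7), 1995: (18.6, 13.4)}
    for year, (private, federal) in funding.items():
        share = private / (private + federal)
        print(year, f"{share:.0%}")   # 1970: 33%, 1980: 34%, 1985: 44%, 1995: 58%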



The investment in health care was not without its benefits. After World War II, scientists redirected their attention toward another, longer war against disease. The subsequent development of effective vaccines not only improved American health but also reduced select contagions on a global scale; diseases that had plagued society for centuries were all but eliminated in the following decades. Vaccines for diphtheria, whooping cough, and polio had a significant effect on the health of America. The annual number of diphtheria cases dropped from 5,796 in 1950 to 918 in 1960, 435 in 1970, and only three or four a year in the 1980s and 1990s. Whooping cough afflicted more than 120,000 people in 1950, but was reduced to 14,809 cases in 1960, 4,249 cases in 1970, and only 1,730 in 1980. Similarly, polio cases fell from 33,300 in 1950 to 3,190 in 1960, to only 33 in 1970, and to fewer than a dozen a year in the 1980s and 1990s. The number of measles cases actually increased during the 1950s, jumping from 319,124 cases in 1950 to 441,703 in 1960. A new vaccine, however, cut the rate by 90 percent; there were only 47,351 cases in 1970, and 13,506 cases in 1980. The incidence of mumps, another common childhood disease, also fell, from 104,943 cases in 1970 to 8,576 in 1980 and 5,292 in 1990.



Vaccinations provide a better alternative to antibiotics, which have been linked to global increases in the number of drug-resistant pathogens. Antibiotics are designed to kill the bacteria that cause illness. Unfortunately, bacteria evolve and adapt to their environment at a remarkable rate. Bacterial cells that survive antibiotic treatments can often acquire new genes that make them resistant to future doses of the same antibiotic. For example, in the late 1980s hospitals began seeing more frequent occurrences of vancomycin-resistant enterococci (VRE), which are only minimally responsive to existing antibiotics. Similarly, penicillin was successful against pneumococci nearly 100 percent of the time in the 1960s, but by the 1990s the success rate had fallen to 70-80 percent, and in some parts of Europe to 50 percent. In 2001 the World Health Organization addressed the problem of bacterial resistance with the first Global Strategy for Containment of Antimicrobial Resistance. In contrast to antibiotics, vaccinations avoid the problem of resistance by preparing the human immune system before pathogens have a chance to adapt. In 1975 the possibility of side effects was significantly reduced by the discovery of monoclonal antibodies, which enabled doctors to target specific disease cells. This innovation allowed scientists of the 1990s to focus on vaccination as a primary method of controlling disease. Initial trials included an experimental AIDS vaccine, announced in 1999, and other prototype vaccines for malaria, hepatitis C, and tuberculosis. Biomedical engineers are also working to develop strains of potatoes, tomatoes, and bananas that mature with vaccines already present. Research arising from the full map of the human genome may lead to customized cancer vaccines tailored to a patient’s exact DNA makeup.
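One simplified aspect of the resistance problem described above, the way repeated antibiotic exposure favors the few cells that already withstand the drug, can be shown with a toy numerical sketch in Python; the kill rates, population sizes, and regrowth rule are invented for illustration and are not clinical data.

    # Toy simulation of selection for antibiotic resistance: each course kills
    # most susceptible bacteria but few resistant ones, and survivors regrow
    # to a fixed carrying capacity while keeping their proportions.
    susceptible, resistant = 1_000_000.0, 10.0     # hypothetical starting population
    KILL_SUSCEPTIBLE, KILL_RESISTANT = 0.999, 0.10
    CAPACITY = 1_000_010.0

    for course in range(1, 6):
        susceptible *= 1 - KILL_SUSCEPTIBLE        # one antibiotic course
        resistant *= 1 - KILL_RESISTANT
        total = susceptible + resistant            # regrowth to carrying capacity
        susceptible *= CAPACITY / total
        resistant *= CAPACITY / total
        print(course, f"resistant fraction = {resistant / CAPACITY:.1%}")
    # The resistant fraction climbs from under 1% to nearly 100% within a few courses.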



Over the course of the 20th century, life expectancy in the United States increased by nearly 30 years, from 47.3 years to 76.5 years. In 2004 life expectancy was estimated at 77.9 years. While the numbers of all age groups grew steadily, older Americans became significantly more numerous after World War II than they had been in previous decades. In 1950 there were only 577,000 people over 85 years old; by 1960 the number had increased to 929,000, with similar increases every decade thereafter: 1.5 million in 1970, 2.2 million in 1980, and 3 million in 1990. The number of nursing home residents similarly increased, jumping from 961,500 to 1.3 million between 1974 and 1985. As the number of American seniors continued to grow, the number of prescription drugs available to combat cardiovascular disease, high cholesterol, memory loss, and cancer similarly increased. Americans spent more than $12 billion on prescription drugs in 1980, nearly double that amount in 1985 ($21 billion), and by 1993 the figure had reached $48 billion. The American Association of Retired Persons (AARP) lobbied heavily for reforming Medicare to include universal coverage for prescription medicine for senior citizens. Lawmakers responded, and in December 2000 Congress passed an amendment to Title XVIII of the Social Security Act to provide coverage for outpatient prescription drugs under the Medicare program.



Mental health also became a priority in the postwar years. Psychotropic, or mind-altering, drugs were first used for therapeutic purposes during the 1950s and 1960s. The Food and Drug Administration (FDA) classified some experimental drugs under its Investigational New Drug (IND) regulations; these included lysergic acid diethylamide (LSD) and thalidomide. After a rash of birth defects around the world was linked to the use of thalidomide in 1962, the FDA tightened its IND regulations and banned the use of thalidomide and LSD.



A high-speed CAT scan machine (Darren McCollester/Getty Images)



Though the FDA restricted the experimental use of psychotropic drugs, it did not limit their therapeutic use altogether. Psychiatrists had long relied on electroconvulsive therapy (ECT), more commonly known as electroshock, to treat symptoms of depression and mental illness. In the late 1960s and early 1970s, the public came to see ECT as emblematic of the inhumane conditions in many mental hospitals, prompting most psychiatrists to pursue psychotropic alternatives, including antidepressants, mood stabilizers such as lithium, and antipsychotic drugs such as chlorpromazine, haloperidol (Haldol), and clozapine. Chlorpromazine was so successful in treating hallucinations and delusions that it was championed as “the drug that emptied the state mental hospitals” during the 1970s and 1980s.



In 1980 public attention shifted from the state hospitals to childhood behavior, after the Diagnostic and Statistical Manual of Mental Disorders added Attention Deficit Disorder (ADD) and, later, Attention Deficit Hyperactivity Disorder (ADHD) to its list of diagnoses. Family counselors relied on methylphenidate, a stimulant related to amphetamine and sold under the brand name Ritalin, as the most commonly prescribed treatment for ADD and ADHD. In 1987 selective serotonin reuptake inhibitors (SSRIs) were introduced under the brand name Prozac and used as antidepressants for both children and adults. Alternative varieties of these medications quickly followed, and many educators worked with doctors to promote medication as a popular solution for student behavior problems; between 1990 and 1997, 10 percent of children aged 6-14 were using Ritalin, as were 1 percent of preschool children aged 1-5. A third of those children were also prescribed SSRI treatments such as Zoloft, Luvox, and Paxil. Some parents and pediatricians became concerned about the side effects of these drug combinations, which can include mania, seizures, anorexia, toxic psychosis, and cardiovascular failure. These critics met staunch opposition from education and counseling groups, many of which received funding directly from pharmaceutical corporations; one such lobbying group, Children and Adults with ADD (CHADD), received 10 percent of its funding from the makers of Ritalin. In 1998 and 1999, state school boards in Texas and Colorado passed resolutions discouraging educators from promoting the drugs, and in 2000 Connecticut passed legislation forbidding the practice. Proponents of psychotropic medication argued that the new drugs allow millions of people with chemically based mental illness to lead normal lives in the mainstream of American society. Opponents argued that the prevalence of such licit drugs unavoidably promotes a large subculture of illicit drug use by training people to seek pharmaceutical solutions to problems that have social and interpersonal causes.



For internal medicine, innovative drug treatments permitted great advances in organ transplantation. In 1959 scientists used the drug Imuran to suppress the human immune system, which allowed foreign organs to be transplanted into a host body. The first successful pancreas transplant occurred in 1966 and was followed the next year by the first successful heart and liver transplants. In 1981 doctors performed the first successful combined heart-lung transplant. Scientists then developed a new antirejection drug, cyclosporine, which aided in the first successful lung transplant in 1983. Advances in microprocessing technologies during the same period permitted doctors to attempt the first artificial heart implant; although the device functioned for only 112 days, it set a precedent for continued research in artificial organs. In 2001 doctors implanted the first fully self-contained artificial heart in Robert Tools. Similar advances included artificial cochlear (inner-ear) implants, as well as artificial eyes, muscle, blood, skin, kidneys, lungs, pancreases, and livers. Currently, artificial organs serve mainly as a bridge to transplantation, helping patients who may have to wait long periods before an acceptable donor organ becomes available. Pharmacologists of the 1990s focused on methods of preventing transplant rejection at the cellular and molecular level; their goal was to find a safe and cost-effective way to allow multiple transplants of natural or artificial organs for less than life-threatening reasons. The completion of the human genome map may lead to further innovations in this area.



Other aspects of internal medicine benefited from the computer revolution. In 1972 Godfrey Hounsfield recorded hundreds of X-ray images at slightly varying angles and combined them on a computer to produce a three-dimensional computed tomography (CT) scan. The procedure proved invaluable to doctors needing precise views of an area that would otherwise be too dangerous to reach through invasive surgery. In 1984 Raymond Damadian augmented the CT scan with magnetic resonance imaging (MRI), which provides more precise images of soft tissues; MRI became especially useful in examining brain and spinal cord injuries. The later development of radioisotope tracers in the 1990s provided physicians with a means of tracking metabolism and the functions of enzymes and hormones by injecting material that can be recorded by computerized sensors. Advances in medical techniques also decreased the time needed for recovery. During the 1990s, minimally invasive laparoscopic surgery, which uses very small incisions, often reduced the length of hospital stays by 10 percent, in some cases cutting recovery time from four to six weeks to two to seven days. Robotic lasers guided by computer-aided microscopes cut with near-cellular precision; with these tools, physicians in the 1990s successfully performed surgery on neonatal arteries the size of a toothpick. When robotic surgery is combined with innovations in telecommunication, doctors can perform long-distance operations for patients who are unable to travel.
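The tomographic principle behind Hounsfield’s scanner, in which many one-dimensional X-ray projections taken at different angles are combined into a cross-sectional image, can be illustrated with a toy unfiltered back-projection. The Python sketch below shows the general idea only, not the historical algorithm; the disc phantom, the angle count, and the use of NumPy and SciPy are assumptions made for the example.

    # Toy demonstration of tomographic reconstruction by unfiltered back-projection.
    # A simple disc "phantom" stands in for a slice of the body; each projection is
    # the column-wise sum of the rotated slice, mimicking X-ray attenuation measured
    # from one direction. Real CT applies a ramp filter before back-projecting.
    import numpy as np
    from scipy.ndimage import rotate

    n = 128
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    phantom = (x**2 + y**2 < 0.15).astype(float)

    angles = np.linspace(0.0, 180.0, 90, endpoint=False)
    projections = [rotate(phantom, a, reshape=False, order=1).sum(axis=0) for a in angles]

    recon = np.zeros_like(phantom)
    for a, p in zip(angles, projections):
        smear = np.tile(p, (n, 1))                  # spread each 1-D profile across the slice
        recon += rotate(smear, -a, reshape=False, order=1)
    recon /= len(angles)

    # The reconstructed slice is brightest where the dense disc sits.
    print(recon[n // 2, n // 2] > recon[5, 5])      # True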



In 1998 a breakthrough occurred when scientists succeeded in isolating stem cells from human embryos and growing those cells in a laboratory. This research quickly generated controversy, specifically over the use of embryonic stem cells. On August 9, 2001, President George W. Bush attempted a compromise by allowing federal funding for research using embryonic stem cell lines derived from embryos that had already been destroyed, while prohibiting federal funding for research that created new lines from embryos after that date. In the November 2, 2004, general election, California voters approved $3 billion to fund stem cell research without the Bush administration’s restrictions, and four other states followed California’s lead. When the U. S. Congress passed legislation in 2006 allowing research on embryonic stem cells derived from embryos at in vitro fertilization clinics, President Bush vetoed the bill on July 18, 2006, the first veto of his presidency. Opponents of embryonic stem cell research supported the president’s veto, arguing that they opposed only embryonic stem cell research while favoring other kinds of stem cell research. Supporters of embryonic stem cell research, who included a majority of Americans in recent polls, argued for its potential to lead to cures for debilitating diseases such as Parkinson’s. Stem cell researchers themselves downplayed the emphasis on cures, arguing that the real potential of embryonic stem cell research lay in creating a new model for understanding disease processes, from which new therapies could then be developed. In much the same manner, the Human Genome Project holds great potential for medicine in the 21st century.



The human genome map promises a host of solutions for many common ailments, including cancer and congenital disorders. Though Oswald Avery discovered in 1943 that DNA carried genetic information, it took nearly 30 years, until 1972, for scientists Paul Berg, Stanley Cohen, and Herbert Boyer to develop a technique that allowed researchers to decipher that information. By splicing fragments of DNA from one organism into the DNA of another, researchers could design controlled experiments that isolate and reveal the functions of specific DNA segments. It took another 10 years before Eli Lilly & Company used the new technique to produce the first genetically engineered human insulin in 1982.
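As a loose illustration of the splicing idea, and not a laboratory protocol, the short Python sketch below cuts a hypothetical “vector” sequence at a restriction-enzyme recognition site and inserts a foreign fragment at the cut point; the sequences and the choice of the EcoRI site are invented for the example.

    # Toy model of recombinant-DNA splicing: find a restriction site in a vector
    # sequence and insert a foreign fragment at the cut point so that its effect
    # can then be studied in the host. All sequences are hypothetical.
    ECORI = "GAATTC"   # EcoRI recognition site; the enzyme cuts after the first G

    def splice(vector: str, fragment: str) -> str:
        site = vector.find(ECORI)
        if site == -1:
            raise ValueError("no EcoRI site found in the vector")
        cut = site + 1                       # cut between G and AATTC
        return vector[:cut] + fragment + vector[cut:]

    vector = "ATGCCGAATTCTTAGGC"             # hypothetical plasmid
    fragment = "AAATTTCCCGGG"                # hypothetical gene fragment of interest
    print(splice(vector, fragment))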



Throughout the 1970s, researchers conducted ad hoc clinical trials according to the specific needs of their research; a systematic mapping of the human genome seemed an impossible task. By the mid-1980s, however, with the advent of powerful information processors, Charles DeLisi of the Office of Health and Environmental Research became convinced that a systematic map was possible. In 1990, after years of interagency discussion, the U. S. Department of Energy joined the National Institutes of Health in a 15-year effort to identify each of the estimated 30,000 genes in human DNA. The program was called the Human Genome Project, and Congress committed $3 billion toward its completion. The Human Genome Project completed its final DNA sequence in 2003, two years ahead of schedule. Scientists expect the specific benefits to include improved diagnosis and early detection of diseases; more rational drug designs, leading to individually customized drugs and effective gene therapy; better understanding of the vulnerabilities underlying specific genetic diseases; enhanced protection from biological warfare; and significantly improved forensic capabilities. Significant advances have already been made in the development of effective treatments for breast cancer, as well as in gene therapy for cystic fibrosis, Duchenne muscular dystrophy, and sickle cell anemia.



See also COMPUTERS; narcotics; science and technology.



Further reading: Edward I. Alcamo, DNA Technology: The Awesome Skill (San Diego, Calif.: Academic Press, 2001); John Duffy, From Humors to Medical Sciences: A History of American Medicine (Urbana: University of Illinois Press, 1993).



—Aharon W. Zorea and Stephen E. Randoll



Meese, Edwin C., III (1931- ) government official

Edwin Meese served as U. S. attorney general in Ronald W. Reagan’s administration from 1985 to 1988. Born in 1931, Meese received his B. A. from Yale University and his LL. B. from the University of California Law School. He became deputy district attorney of Alameda County in northern California, where he undertook the successful prosecution of student protesters at the University of California at Berkeley and of Oakland Black Panthers. After Reagan was elected governor of California in 1966, he appointed Meese executive assistant and chief of staff, a position Meese held from 1967 to 1974. From 1977 to 1981 Meese was a professor of law at the University of San Diego.



During the 1980 presidential campaign, Meese served as chief of staff and senior issues adviser for the Reagan-Bush committee and, following the election, headed the Reagan transition team. He served as counselor to the president from 1981 to 1985, functioning as chief policy adviser. As a member of the cabinet, Meese was responsible for the administration of the cabinet, policy development, and planning and evaluation.



As U. S. attorney general, he tried to roll back some of the constitutional interpretations issued by the Supreme Court under Chief Justices Earl Warren and Warren E. Burger. He denounced judicial activism and called for a return to the “original meaning” of the Constitution as he believed the Founding Fathers understood it, a position Associate Justice William J. Brennan, Jr., criticized as “arrogance cloaked as humility.” Meese disregarded such criticism and urged Reagan to appoint federal judges and Supreme Court justices who shared his views.



During his seven and a half years in the White House and the Justice Department, Meese was investigated for a number of alleged ethics violations. In 1984, before he became attorney general, his failure to report reimbursements for more than 30 trips taken as counselor to the president led the Office of Government Ethics to conclude that he had violated conflict-of-interest rules. A larger scandal emerged after he became U. S. attorney general, when independent prosecutor James C. McKay investigated Meese’s efforts to help a friend, E. Robert Wallach, secure U. S. government backing for an oil pipeline from Iraq to Jordan. As this scandal was breaking, Deputy Attorney General Arnold Burns and Assistant Attorney General William Weld resigned their positions at the Justice Department, complaining that morale had plummeted under Meese because of the scandal. After 14 months of investigation, McKay concluded in July 1988 that Meese had “probably violated criminal law” on four occasions but decided that Meese had not been motivated by “personal gain.” Though Meese believed the investigation had vindicated him, he resigned his office the next month.



After leaving office, Meese worked briefly in the aerospace industry. Today he serves on a number of boards of conservative organizations.



See also independent counsel.



Further reading: Lou Cannon, President Reagan: The Role of a Lifetime (New York: Simon & Schuster, 2000); Edwin Meese, With Reagan: The Inside Story (Washington, D. C.: Regnery, 1992).



 
