From 1929 to 1945, significant advances occurred in medical knowledge, treatment, and technology, in employer-sponsored health insurance programs, and in the role of the federal government in medical research and health care. World War II played an especially important role in spurring such progress in medicine and public health, which contributed not only to increased life expectancy (from 57.1 years to 65.9 years between 1929 and 1945) but also to enhanced quality of life.
A number of important breakthroughs came in medical equipment, surgical procedures, and pharmaceuticals. In 1929, Philip Drinker invented the iron lung, an airtight metal tank that used electrically powered bellows to provide artificial respiration for patients unable to breathe because of illnesses such as polio or a variety of injuries. X-ray machines enabled improved diagnosis and surgery. To reduce mortality from blood loss, Dr. Bernard Fantus of Chicago’s Cook County Hospital established the first hospital repository to store and preserve donated blood in 1937 and called it a “blood bank.” Then, in 1938, Dr. Charles Drew established a system to collect blood plasma, which the American Red Cross later relied upon to organize massive wartime blood drives and to distribute the blood needed for the war effort at home and abroad.
In cancer research, the most striking discovery of the era came from Dr. Charles B. Huggins, who in 1941 demonstrated the role of the endocrine system in the functioning of the prostate gland. This pioneering research enabled Huggins to treat prostate cancer successfully through hormone therapy in some patients, proved a milestone in the use of chemotherapy for prostate and breast cancer, and led to a significant drop in the mortality rate for these types of cancers.
In the early 1940s, Dr. Alfred Blalock and technician Vivien Thomas developed a shunt to bypass obstruction of the aorta. Using this technique, Dr. Blalock performed a pioneering heart operation in 1944 at the Johns Hopkins University Hospital in Baltimore, Maryland, on a 15-month-old baby girl (who successfully recovered from “blue baby” syndrome), thus opening the door to using the technique for other heart diseases.
During the World War II era, the U.S. made significant contributions in the production, packaging, and delivery of medical services, which saved countless lives throughout the world. The combination of wartime necessity and governmental support led to marked progress and innovation in many areas, particularly sanitation, infection control, pain management, evacuation procedures, blood collection and storage, and rehabilitative medicine. The progress in sanitation came in part as a response to the high incidence of malaria in the Pacific islands. Antimalaria efforts included aggressive prevention and treatment measures. Because quinine, the traditional treatment, was unavailable (it was made from the bark of trees found on some Pacific islands, which were held by the Japanese), the United States used Atabrine, a substitute that sometimes caused violent side effects in the troops.
In the area of infection control, U.S. GIs carried sulfa powder to use on open wounds, which greatly reduced the rate of infection during the war. To fight bacterial infections, the government in 1941 challenged pharmaceutical companies to develop methods to mass-produce penicillin, the highly effective antibacterial drug discovered by the British bacteriologist Sir Alexander Fleming in 1928. Leading the way, Pfizer devoted three years to improving penicillin production using a deep-tank fermentation method, and the company produced 90 percent of the penicillin held by the Allies in 1944. The availability of penicillin, considered the first “wonder drug,” helped save thousands of lives. In 1943, the microbiologist Selman Waksman discovered the natural antibiotic streptomycin in soil, which was effective against two debilitating and often fatal diseases of the day—tuberculosis and meningitis.
The United States also made great strides in alleviating the suffering of the wounded. The American pharmaceutical company Squibb developed a syrette for medics to use to quickly administer morphine, an addictive but potent painkiller. Because of new rapid evacuation techniques (primarily airlifting the wounded), almost all war casualties who were evacuated from northwest Europe recovered. Important advances also came in rehabilitative medicine, and in 1944 President Franklin D. Roosevelt officially gave federal support to rehabilitation medicine for war casualties. Building upon rehabilitation work done in World War I, Dr. Howard Rusk promoted an aggressive, “whole person” model of “reconditioning” that included acute care, physical and psychological rehabilitation, and vocational training designed to return the injured to civilian life as productive citizens.
With respect to more general government involvement in health care from 1929 to 1945, the federal government at first reduced and then expanded its role in public health, research, and health care and became embroiled in controversy over the most hotly debated health care issue of the day—health insurance. Although important ground was covered in extending benefits to the poorest segments of society, the notion of government-sponsored, universal health care for all citizens has remained controversial down to the present.
The onset of the Great Depression proved something of a watershed and a catalyst in the history of American health care, both in the magnitude of the problems encountered and in the scope of the governmental responses it engendered. Economic dislocation and loss of income were manifested in the inability of many middle- and working-class families to afford medical care, which was often dropped altogether. This was particularly true throughout the rural South, long the nation’s poorest region in terms of disposable income, while cash-strapped urban centers also slashed expenditures for health and preventive sanitation. Consequently, large segments of society began contracting chronic illnesses usually associated with the poorest classes. Their plight was further complicated by another factor, malnutrition, exacerbated when many former middle- and working-class people refused to avail themselves of food and relief programs, apparently out of shame. This malady was also highly visible among rural schoolchildren, but it was not until the ascent of Franklin D. Roosevelt to the White House in March 1933 that the government began bringing resources to bear on this and other medical problems.
In health policy, partly because of criticism from the American Medical Association (AMA) that it was socialistic, the 1921 Sheppard-Towner Act, which provided funds for health care education for mothers and children, was allowed to expire in 1929. Soon after Roosevelt assumed the presidency in 1933, Congress restored provisions of the Sheppard-Towner Act and, in the Federal Emergency Relief Act, made very limited provision for assistance with medical expenses for people on relief. Such assistance was expanded in 1935 by the Social Security Act, which gave grants-in-aid to states for maternity and child health care as well as for several categories of health-impaired groups. Other governmental programs focused on expanding infrastructure by building and improving hospitals, sanatoriums, and clinics nationwide, with many new buildings rising in previously unserved regions.
Poster promoting sanitary facilities as a means of halting the spread of disease (Library of Congress)
In 1937, Congress established the National Cancer Institute, which awarded grants for cancer research. In 1938, Congress established a grants-in-aid program to help states study and investigate venereal disease, and that same year the Food, Drug, and Cosmetic Act extended the authority of the Food and Drug Administration (FDA) to monitor pharmaceuticals.
Because rural areas were the hardest hit, they continued to attract the most attention from governmental health programs. Between 1935 and 1946, the Resettlement Administration and the Farm Security Administration fostered improved care in rural regions for farmers and migrants alike through the creation and maintenance of medical cooperatives. Concentrated throughout the South and West, these cooperatives served some of the poorest counties in the nation and brought a wide range of drugs, obstetrical care, emergency surgery, and hospitalization to the most needy. Doctor participation in the program remained voluntary, but the nurses involved received increased responsibilities in education and prevention among indigent migrants and wielded clinical and administrative responsibilities they might never otherwise have assumed. The end result proved both cost-effective in terms of outlays and pathbreaking in extending treatment to poor Americans.
One government-funded study, however, became perhaps the most unethical and immoral program in the history of U.S. medical research. In 1932, the U.S. Public Health Service began a study of how syphilis would affect the cardiovascular and neurological systems of African Americans. This experiment, which included some 400 black men in advanced stages of syphilis, was conducted at the Tuskegee Institute in Alabama. The men were told that they would be given medical treatment for “bad blood,” yet they were denied penicillin, a known cure, so that the experiment could continue. By 1972, when the study ended, 128 of the men had died of syphilis or its complications, and 40 wives and 19 children had become infected because the participants had not received treatment.
As World War II approached, the National Institute of Health (NIH) directed its attention to war-related issues. For the military, the NIH reported on categories of medical deferments, studied the effects of high-altitude flying, prepared vaccines, and developed first-aid, burn, and trauma therapies for the injured. For the war effort at home, the agency researched hazardous substances and their effects on war-industry workers in order to improve working conditions. In 1944, Congress passed the Public Health Service Act, which included provisions to expand the National Cancer Institute’s research grants program and made it part of the National Institute of Health. (In 1948, when Congress authorized the National Heart Institute and made it part of the NIH, it formally changed the agency’s name to the National Institutes of Health to reflect the several research institutes it included.)
Perhaps the most controversial areas of health policy during this period involved national health insurance and prepaid health plans. The primary issue was whether the federal government or employers should sponsor or supplement health insurance. Employer-sponsored health insurance for hospitalization made a giant leap forward in 1929, when schoolteachers signed up for the first Blue Cross program, as did insurance for medical care in 1939, when the first Blue Shield plan was founded in California. In 1938, the industrialist Henry J. Kaiser enrolled 6,500 workers on the Grand Coulee Dam in his employer-sponsored prepaid health plan, called Permanente, a forerunner of today’s HMOs. During the war buildup, more than 200,000 workers at Kaiser’s shipbuilding plants in California and Washington enrolled in the plan.
Interest in government-funded national health insurance increased during the Great Depression and World War II, particularly as advances in medicine brought corresponding increases in the cost of medical care. During the depression, President Roosevelt seemed to support national health insurance, at least in principle, but it was not included in the bill sent to Congress in January 1935 that became the Social Security Act. In 1944, FDR spoke of the right to adequate medical care as part of the Economic Bill of Rights, but again he did not send Congress any legislation for national health insurance. Stronger support at the federal level came from such liberal Democrats in Congress as New York senator Robert F. Wagner, and subsequently from President Harry S. Truman. In 1935, 1939, 1943, and 1945, Democrats introduced bills for federal subsidies of state medical care and for tax-funded national health insurance. In 1948, Truman, a strong and vocal supporter of national health insurance, proposed a comprehensive program funded by a 4 percent increase in the Social Security tax. All these proposals sparked strong conservative opposition, led by the AMA, which vehemently argued that passage of national health insurance would be an “end to freedom” and would introduce socialized medicine. (The AMA also strongly opposed prepaid health plans and corporate medicine, both forerunners of HMOs.) National health insurance bills did not get through Congress, and the tradition of employer-sponsored health insurance in the United States continued.
By the late 1940s, the United States found itself a healthier nation, due to new discoveries, advances in technology, government sponsorship of medical research, and expanded private health insurance. In the postwar era, the government began to address new problems, such as the need for more medical facilities and personnel as demand for doctors, nurses, hospitals, and equipment increased because of the postwar baby boom, the increased life span of Americans, and rising expectations about medical care.
Further reading: Emily K. Abel, Tuberculosis and the Politics of Exclusion: A History of Public Health and Migration to Los Angeles (New Brunswick, N.J.: Rutgers University Press, 2007); Michael R. Grey, New Deal Medicine: The Rural Health Programs of the Farm Security Administration (Baltimore: Johns Hopkins University Press, 2002); Davis W. Houck, FDR’s Body Politics: The Rhetoric of Disability (College Station: Texas A&M University Press, 2003); John E. Murray, Origins of American Health Insurance: A History of Industrial Sickness Funds (New Haven, Conn.: Yale University Press, 2007).
—Michelle M. Hall and John C. Fredriksen