When humans go without salt in the diet, or lose it because of illness, the major symptoms are apathy, weakness, fainting, anorexia, low blood pressure, and, finally, circulatory collapse, shock, and death. Sir William Osler (1978: 121-2), observing dehydrated cholera patients in the late nineteenth century, provided a classic description of the condition:
[P]rofuse liquid evacuations succeed each other rapidly. . . there is a sense of exhaustion and collapse. . . thirst becomes extreme, the tongue white: cramps of great severity occur in the legs and feet. Within a few hours vomiting sets in and becomes incessant. The patient rapidly sinks into a condition of collapse, the features are shrunken, the skin of an ashy gray hue, the eyeballs sink in the sockets, the nose is pinched, the cheeks are hollow, the voice becomes husky, the extremities are cyanosed, and the skin is shriveled, wrinkled and covered with a clammy perspiration. . . . The pulse becomes extremely feeble and flickering, and the patient gradually passes into a condition of coma.
Many cholera patients in the past could have been saved with rehydration therapy, and it is a central tenet of modern medical treatment that lost body fluids should be replaced with others of the same composition. Replacing a salt loss by giving water, or a water loss by giving salt, can be fatal. Although a history of the illness and an examination of the patient can provide clues to the type of loss, the best method is to test the blood and urine chemically - a method that became possible only in the 1930s, the most useful test being one that determined the amount of chloride in urine. Accomplished by simply mixing 10 drops of urine with one drop of an indicator and then adding silver nitrate, a drop at a time, until the end point was reached, this was called the “Fantus test” after Dr. Bernard Fantus at the University of Chicago.
This test proved so useful in treating salt- and water-depleted British soldiers in India and Southeast Asia during the mid-1930s that Dr. H. L. Marriott, in his classic text on fluid replacement therapy, stated: “It is my belief that the means of performing this simple test should be available in all ward test rooms and in every doctor’s bag” (Marriott 1950: 56).
Most early studies of sodium depletion in humans were prompted by diseases. One was Addison’s disease (in which the adrenal gland that makes the sodium-retaining hormone for the body stops working), and another was diabetes (in which a high level of blood glucose forces the excretion of large amounts of water by the kidneys). Such studies were also conducted in cases of extreme depletion brought on by starvation or acute diarrhea. In the 1930s, however, R. A. McCance (1935-6) published his report of a series of experiments that established a baseline on the clinical nature and physiology of sodium depletion in humans.
To induce salt depletion, McCance (1935-6) employed a sodium-free diet combined with sweating. (Because laboratory animals do not sweat, he used humans as his test subjects.) There was no “research-quality” kitchen available, the food was prepared in the McCance home, and the subjects of the experiment - all volunteers - slept and ate there. The diet consisted of sodium-free “casein” bread, synthetic salt-free milk, sodium-free butter, thrice-boiled vegetables, jam, fruit, homemade sodium-free shortbread, and coffee. During recovery periods, the volunteers ate weighed quantities of high-sodium foods (such as anchovies and bacon) and small, weighed amounts of sodium chloride. Sweating was induced by placing the subjects in a full-length radiant heat bath - for two hours with the heat on and then 10 minutes with the heat off. Their sweat was collected in rubber sheets, and a final washing of each subject with distilled water ensured that even small amounts of lost sodium would be accounted for. The subjects’ average sweat loss was 2 liters, and they commented that the washing procedure was “not uncomfortable” after 2 hours in the hot bath (McCance 1935-6).
By reducing sodium in the diet, along with inducing sodium losses through sweating, McCance and his colleagues found that only a week was required to make healthy subjects seriously sodium depleted. They maintained 4 volunteers in this condition for an additional 3 to 4 days, so that the total period of deprivation lasted about 11 days.
Detailed measurements of intake (food and water) and output (sweat, urine, and feces) showed that the subjects lost 22.5 g of sodium and 27.2 g of chloride - or about 50 g of salt. Their body weights dropped by about 1 kilogram (kg) per day, and sodium excretion averaged 3,400 mg of sodium per day for the first 4 days. Weights then stabilized, but sodium loss continued.
As the deficiency progressed, the volunteers all experienced feelings of physical fatigue, anorexia, nausea, difficulty in urinating, and extremely weak pulses. Muscle spasms and cramps - especially cramps in the fingers - were common. The subjects’ faces became drawn and “ill-looking,” and they slowed mentally, becoming dull and apathetic. McCance was struck by the similarity of a number of these symptoms to those of Addison’s disease, but the symptoms and signs all rapidly cleared up when the volunteers resumed consumption of sodium (McCance 1935-6).
Both before and during World War II, as many in the Allied armed forces were severely disabled by heat- and water-related illnesses, there was intense interest in understanding the mechanics of water and salt metabolism and the effects of heat on the human body. Research was even undertaken to see how long a man could survive on a raft in the ocean, or in the desert, so that restrictions could be placed on certain military activities (such as limiting the duration of searches for lost aviators, who, after the specified survival time had passed, might reasonably be presumed dead). Other studies examined the conditions that servicemen could be forced to work under and defined safe limits. For example, after 5 hours of marching in the heat, with full packs and no water, even well-conditioned marines could not continue (Ladell 1949).
During and after World War II, there was also interest in the effects of diarrhea. J. L. Gamble (1945) showed that intestinal secretions contained more sodium than chloride, and D. A. K. Black (1946) reported a series of experiments on 10 men with blood pressures averaging 94 mm Hg systolic/59 mm Hg diastolic, who were victims of tropical sprue (a disease characterized by chronic diarrhea). The patients were bedridden, listless, and incapable of exertion. But these symptoms disappeared - and blood pressure rose to normal - with sodium supplementation (Black 1946).
Excess Sodium
Humans, as noted, have evolved complex “redundant systems” to regulate sodium and other essential minerals. For marine animals, deriving sodium from the sea was a relatively easy matter. As evolution progressed, however, satisfaction of sodium needs became a more complicated task. Land dwellers had first to locate sources of sodium, then ingest the mineral, and, further, conserve it within their bodies. To achieve this, physiological and behavioral mechanisms evolved that were designed primarily to protect against a life-threatening deficit of sodium, such as can occur with vomiting, sweating, diarrhea, or kidney malfunction.
But although the body’s systems are reasonably effective against sodium deficit, evolution did not do as well in protecting humans against an excessive intake of the mineral. There are two different kinds of excessive sodium intake: (1) acute ingestion of salt without water, or of very salty water (such as seawater or other briny water); and (2) chronic ingestion of toxic levels of sodium in the food supply.
It seems likely that the former was never a major problem in the history of humankind and probably occurred only when people felt forced to drink seawater or the water from salt springs or salt lakes. Chronic ingestion of excess salt in food, however, is both a recent and a very real problem. Until the past few centuries, salt intake was primarily determined by the amount a person chose to add to food (“active intake”). Increasingly, however, as foods were preserved in salt, and especially today with foods processed with salt, the great majority of salt intake has become “passive,” meaning that food processors and manufacturers - not consumers - decide the quantity of salt to be included in the foods they produce.
Indeed, it has been estimated that in prehistoric times, the daily human intake of sodium was about 690 mg, with 148 mg derived from vegetables and 542 mg from meat (Eaton and Konner 1985). By contrast, today the U.S. Food and Drug Administration (FDA) recommends keeping dietary intake to 2,400 mg of sodium per day (the amount contained in 6 g - or about 1 teaspoon - of table salt). This is roughly the same amount accepted by an advocacy group that promotes a low-salt diet, Consensus Action on Salt and Hypertension (CASH). Needless to say, even such a “low-salt” diet still delivers 3.5 times the amount of sodium provided by the meat and vegetables of the Paleolithic diet.
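These figures can be cross-checked with a few lines of arithmetic. The only value assumed here, rather than taken from the text, is the sodium mass fraction of table salt (NaCl), roughly 0.393, computed from the atomic weights of sodium (22.99) and chlorine (35.45):

```python
# Cross-check the sodium figures quoted in the text.
NA_FRACTION = 22.99 / (22.99 + 35.45)   # sodium's share of NaCl by mass, ~0.393

# The FDA recommendation: 6 g of table salt per day.
fda_salt_g = 6.0
fda_sodium_mg = fda_salt_g * NA_FRACTION * 1000
print(round(fda_sodium_mg))             # ~2360 mg, which the text rounds to 2,400 mg

# Estimated prehistoric intake (Eaton and Konner 1985).
paleo_sodium_mg = 148 + 542             # vegetables + meat
print(paleo_sodium_mg)                  # 690 mg

# Ratio of the modern "low-salt" recommendation to the Paleolithic intake.
print(round(2400 / paleo_sodium_mg, 1)) # ~3.5 times
```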
Although there is also concern over high salt intake by children, food processors only relatively recently halted their practice of adding sodium to baby food. (Presumably, the mineral was meant to improve the food’s flavor for parents.) This change for the better followed the observation by L. K. Dahl, M. Heine, G. Leitl, and L. Tassinari (1970) that young rats with a high sodium intake became hypertensive and remained so for the rest of their lives. But despite this discovery, the average sodium intake by two-year-olds in the United States remains higher than the amount recommended by the FDA and by CASH (Berenson et al. 1981).
Moreover, the average intake throughout much of the industrialized world today is about 10 g of table salt (3,900 mg of sodium) per person per day (James, Ralph, and Sanchez-Castillo 1987; USDA/USDHHS 1995). But only about 10 percent of the sodium consumed occurs naturally in foods; another 15 percent is added by consumers (“active intake”), and the remaining 75 percent is added to food by manufacturers and processors. Therefore, based upon the average industrial diet, only 390 mg of sodium is naturally occurring, 585 mg is added by the consumer, and a substantial 2,925 mg is derived (“passive intake”) from sodium added during processing (James et al. 1987).
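The decomposition of the modern intake works out as the text states. Again, the sodium mass fraction of NaCl (about 0.393) is the only value assumed beyond the figures already quoted:

```python
# Decompose the average industrialized-world intake quoted above.
NA_FRACTION = 22.99 / (22.99 + 35.45)    # sodium's share of NaCl by mass, ~0.393

salt_g_per_day = 10.0                    # average daily table-salt intake
sodium_mg = salt_g_per_day * NA_FRACTION * 1000
print(round(sodium_mg))                  # ~3934 mg, rounded in the text to 3,900 mg

# Apportion the rounded 3,900 mg figure among its sources.
total_mg = 3900
shares = {
    "naturally occurring": 0.10,         # in the food itself
    "added by consumer":   0.15,         # "active intake"
    "added in processing": 0.75,         # "passive intake"
}
for source, frac in shares.items():
    print(source, round(total_mg * frac))  # 390, 585, and 2,925 mg respectively
```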
Obviously, then, a low-salt diet like that of our ancient ancestors seems an impossible goal; increasingly, the population is at the mercy of the food industry. Yet, that industry gains several advantages by using excessive salt in food processing. Salt, added to food, improves flavor and palatability, especially for those who have become addicted to its taste. Salt increases the “shelf-life” of food (although, with today’s effective packaging and refrigeration technology, there is little need for the presence of near-toxic levels of sodium in the food supply), and salt adds very inexpensive weight to the final product (by helping to retain water), thereby increasing profit margins. Finally, increased salt consumption stimulates thirst and thus increases consumption of the beverages sold by many of the major food companies (McGregor 1997).