1
The Nature of Medical Knowledge
In diabetes . . . the chief difficulty lies in the fact that the danger is one of the future. This is the insidious peculiarity of diabetes. We do not at all disturb for the present the general well-being of the diabetic if we treat him badly and overweight his weakened functions; yes, we may even improve by psychical influence his momentary well being if we permit a more liberal diet. We are, however, playing a dangerous game. We are thinking only of the present and forgetting the future, the fortunes of which depend upon the vigilance of the practitioner.
—Carl von Noorden, New Aspects of Diabetes: Pathology and Treatment, 1912
In the writer’s experience, there is nothing more disturbing than the diabetic who acquires the disease in childhood; who apparently is a picture of robust health—who looks and feels perfectly well—but whose blood vessels have been degenerating insidiously for years; who, in the early 20’s or 30’s and probably married and with a family, is beginning to feel the effects of the degenerative changes, either because of a progressive hypertension, kidney failure, disturbance of sight due to retinitis, or a sudden attack of coronary thrombosis. . . . To prevent such cases, or at least reduce their occurrence, is the purpose of this report.
—Israel Rabinowitch, Canadian Medical Association Journal, 1944
“A 32-year-old white male was seen at the [Mayo] Clinic in July, 1921, by Dr. [Russell] M. Wilder. His symptoms were polyuria, polyd[i]psia, polyphagia, weakness and loss of weight.” The patient, a farmer from “the hinterland of Montana,” had severe diabetes. The diagnosis was not difficult to make. He was urinating abundantly (polyuria), had an unquenchable thirst (polydipsia), and was constantly hungry (polyphagia). But no matter how much he ate, he was losing weight.
The remarkable aspect of the case, though, isn’t what happened at the time of diagnosis, or even eighteen months later, when the patient was started on insulin therapy, but what happened when he reappeared at the Mayo Clinic in June 1950, twenty-nine years later. “Since 1921,” wrote the two Mayo Clinic physicians who reported on the case at the staff meetings of the clinic, “he had faithfully and strictly followed a diabetic diet which by modern standards seems almost unbelievable.” The situation, they said, was “unique” in their experience. That’s why they were writing it up as a case study.
When the patient had initially been hospitalized, the Mayo staff, led by Wilder, began its procedures for treating the diabetes. They fasted the patient for “several days” until all signs of sugar disappeared from his urine. “Desugarizing the urine,” as these physicians called the procedure, was the primary goal in therapy. Then the patient was served very small amounts of carbohydrate foods daily to establish how much he could metabolize without the sugar reappearing. Once his doctors had established that level, they added protein and fat to his diet. When the combination of protein, fat, and carbohydrates led to the appearance of ketones (technically ketone bodies) in his urine, it was seen by physicians at the time as a sign of imminent danger.
The Mayo doctors assumed that the patient had now reached the limit of how much fat he could eat safely. His protein consumption was kept at the minimum considered necessary for a healthy man of his size and weight. If sugar reappeared in his urine, he would be fasted again. If ketones appeared, “all fat was omitted from the diet,” which meant most of the food he was allowed to eat, and he would restart the process. The Mayo physicians hoped by this approach to make the load on the patient’s “weakened sugar-using function as light as possible in order to rest it and thus favor its restitution.” Restitution, in fact, had rarely, if ever, been reliably documented, but that was their hope.
When the patient was released from the hospital after a month of these dietary manipulations, he was allowed to eat 15 grams of carbohydrates a day (the amount in a single thin slice of bread, although bread was not among the foods his diet allowed), 45 grams of protein (the protein in about half a pound of lean ground beef), and 150 grams of fat. This added up to a total of 1,590 calories a day, a meager ration for a hardworking farmer. He was also instructed to fast one day a week. Complicating matters further, the carbohydrate-containing foods he could eat—green vegetables—had to be boiled three times before serving to remove most of the digestible carbohydrates. He was allowed to eat bran muffins made from specially purchased bran, also boiled three times. “It was calculated,” according to his doctors at the Mayo Clinic, that “there was no food value in the muffin prepared in this manner.” This was life for a diabetic patient before the discovery of insulin.
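For reference, that total follows from the standard nutritional conversion factors of roughly 4 calories per gram of carbohydrate and protein and 9 calories per gram of fat (the case report does not say which factors the Mayo staff used, so these are assumed here):

$$15 \times 4 + 45 \times 4 + 150 \times 9 = 60 + 180 + 1{,}350 = 1{,}590 \text{ calories}$$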
The rigid diet did not restore the Montana farmer’s health. When he returned to the Mayo Clinic a year and a half later, in December 1922, insulin had become available. He would be among the first patients at the Mayo Clinic to receive it. He would also be the beneficiary of the recent work of two physicians at the University of Michigan, Louis Newburgh and Phil Marsh, who had experimented with a high-calorie, high-fat diet for diabetic patients and reported that they fared remarkably well. They didn’t have to live on a near-starvation regimen but rather could eat to satiety. This approach had been embraced by Russell Wilder at the Mayo Clinic. Now he and his colleagues taught their patient how to inject himself with 30 units of insulin before breakfast every morning. They also instructed him on the new diet, allowing him to eat more than 2,700 calories a day. He was still boiling the green vegetables three times and eating his food-value-free bran muffins, but over 80 percent of his calories came from fat. To eat that much fat, the Mayo physicians reported, it was “necessary for him to consume large amounts of rich cream and butter.” He religiously followed that diet, along with his insulin, as though living in a time capsule, for very nearly three decades. “He had never tasted such common foods as potatoes or bread,” they wrote. “He still fasted from a half to one day a week.”
By 1950, when the Montana farmer returned to the Mayo Clinic, the context in which his physicians understood diabetes had changed dramatically. They had come to realize that they were in the midst of a new kind of diabetes epidemic: not just an “appalling increase” in diabetes prevalence, as the already-legendary diabetologist Elliott Joslin described it that year, but one in which the diabetic patients confronting them were different. The discovery of insulin and the initiation of insulin therapy in the early 1920s had changed things. On average, Joslin’s patients in Boston were living three times longer. “With insulin,” Joslin had written prophetically in 1923, in the third edition of his famous textbook (this edition dedicated to Banting and Best, the University of Toronto researchers who had discovered insulin shortly before), “we shall learn the remote rather than the acute results of the disease.” And they had.
Now diabetic patients were living long enough to die of the complications of the disease itself, and because it had struck many of them in their childhood or adolescence, they were still quite young when they died, the damage to their veins and arteries caused, at least in part, by their bodies’ inability to control their blood sugar. By 1934, Wilder and his Mayo Clinic colleagues were reporting lesions of the retina, hemorrhages that obscured vision, in young diabetic patients who had been on insulin for ten years. By 1936, Harvard pathologists were reporting a new type of kidney disease in patients with diabetes; by 1950, Joslin’s colleagues were reporting that 80 percent of their patients who had been using insulin for at least twenty years had hypertension and more than 20 percent had kidney disease. “Of those who died between 1944 and 1950,” as the British diabetologist and medical historian Robert Tattersall reported it, “more than half had advanced kidney disease.” In one 1949 study from the University of Minnesota, kidney lesions were a hundred times more common in diabetics than in nondiabetics.
Coronary artery disease in patients with diabetes had by then become so common that Joslin was suggesting to cardiologists that they should study these young diabetics on insulin therapy to understand the fundamental cause of the disease. Those patients who developed diabetes as adults, who had the less severe, chronic form of the disease, type 2, were living long enough to suffer heart attacks, kidney failure, blindness, strokes, nerve damage, and gangrene. Physicians were reporting the “extraordinary frequency” of lesions in the arteries of the heart, the legs, and the kidneys of those with diabetes, “especially when the diabetes was of long duration.”
This was what the Mayo Clinic physicians expected to confront in the Montana farmer when he returned to the clinic in June 1950 for an examination. The fact that he was still alive after nearly thirty years with the disease would have been reason enough for his doctors to want to know more. The most important question in the field of diabetes had become whether these tragic complications could be prevented or minimized if diabetic patients made the necessary effort to control their blood sugar.
The results of the patient’s examination provided the second remarkable aspect of his case. The results were normal. The patient appeared to be thriving. He had maintained a healthy weight (he was five feet eleven inches and 143 pounds). His blood pressure was normal. “Neurologic and ophthalmoscopic findings were normal,” his physicians noted. His blood vessels were “open.” Normal. The urine showed a little sugar but was otherwise normal. There was no sign of the protein albumin, which would be symptomatic of kidney lesions. His cholesterol level and blood fats were normal. The Mayo physicians x-rayed his pelvis and discovered some minor calcification in the femoral arteries, but “no more than could be expected without diabetes at age 61.” The patient had no sign of vitamin deficiencies despite the fact that boiling his green vegetables three times before eating them would have effectively removed all the vitamins they contained along with the carbohydrate calories.
So how to make sense of it? The Montana farmer had religiously complied with the advice he had received in 1922—but that was very much not the advice the Mayo Clinic physicians were giving in 1950. “A diet such as he maintained,” they wrote, “containing very little carbohydrate and protein and large amounts of fat, is diametrically opposed to much of the current feeling which proposes a low fat diet for diabetes. . . . Whether his diet had any part to play in the freedom from complications is, of course, unknown.”
Despite the fact that the patient was thriving, the Mayo physicians would now change his treatment. Once insulin therapy had become standard practice, the high-fat, high-calorie diet for diabetics had been discarded. Patients were instead instructed to eat carbohydrate-rich foods, on a strict schedule, to prevent the insulin from causing low blood sugar—insulin shock—which also could be fatal.
As the epidemic of patients with crippling diabetic complications started to fill waiting rooms, physicians and diabetologists argued for more and more carbohydrates in the diabetic diet and less and less fat. The patients wanted to eat the starchy and sugary foods they had always eaten, and their physicians, including Joslin and his colleagues at his clinic in Boston, worried that the fat their patients had consumed pre-insulin was making them fat and causing the heart disease they were seeing ten, twenty, and thirty years later. They also worried that the fat might precipitate coma, because their pre-insulin patients on fat-rich diets had died in a coma more often than not. This is why the Mayo physicians told the farmer he could eat far more carbohydrates, covered by higher doses of insulin. And he’d have to eat far less fat to balance out the calories. “For the first time in twenty-nine years he ate a normal meal,” his physicians reported in the Mayo Clinic case study. Had they lengthened his life by changing what he ate, when his previous diet may have been responsible for keeping him so remarkably healthy? Had they done no harm? They had no idea. They had followed the prevailing fashion.
At the heart of any medical progress is a question that is simple only in the asking: How does the physician know enough to decide which therapy to use or prescribe? If the patient’s ailment is acute or the patient is on the brink of death, then anything that can restore relative good health would seem to be a good thing. In the years immediately before and after the discovery of insulin and the introduction of insulin therapy, anything that could prevent diabetic coma for an indefinite period could seem worth the risk. “When I was a student and young doctor” in the 1920s, as the University of Edinburgh diabetologist Derrick Dunlop (later Sir Derrick) recalled thirty years later, “we were entirely occupied, so far as diabetes was concerned, with endeavoring to keep the patient alive for a while . . . we were taught little of its ultimate complications, for relatively few patients had by then lived long enough to develop them.”
If an intervention restores the patient to health, but is associated with premature death or disease months or years later from side effects or complications, the treatment might still be readily justified. This was the case with insulin therapy for type 1 diabetes, as it is with many cancer therapies today. But the “ultimate complications” cannot be ignored. If the physician has a choice of two treatments that will restore the patient to health in the short run, then the long-term effects must be weighed before a choice is made.
For a chronic disease or disorder, one like heart disease or type 2 diabetes that disables and kills prematurely but does so only years or decades in the future, physicians had been forced to speculate on whether what they were doing would do more good than harm. But with the invention of the modern clinical trial in the late 1940s—technically known as a randomized controlled trial—these doctors benefited from one of the great advances in medical science. For the first time, they had a means to assess the long-term risks and benefits of medical interventions and to compare interventions to establish—for an idealized, average patient—what would most likely be the safest, most effective one. The proliferation of clinical trials by the 1980s had launched the era of evidence-based medicine.
With its reliance on clinical trials to dictate accepted practice, medicine left behind the notion of basing therapeutic decisions on clinical experience and observations. But by the time diabetologists started to embrace the notion of an “evidence base” for their beliefs about the nature of a healthy diet, they had already succumbed to the biases formed decades earlier.
Copyright © 2024 by Gary Taubes. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.