But at my back I always hear
Time’s wingèd chariot hurrying near.
And yonder all before us lie
Deserts of vast eternity.

Andrew Marvell

For most of us in the Western world, premature death is no longer imminent. The death of infants is unusual, the death of children rare, and the death of young adults so improbable that it must be removed from the realistic possibilities of young life.

There are few new things in our world with such wide personal, social, and cultural meaning as the loss of the imminence of premature death, a phenomenon which is unique to the generation born since the 1940’s. For young men’s lives, “Time’s winged chariot” no longer hurries, and the picking of rosebuds may be done at one’s leisure. In essence, the young now have “limitless” time.

But this dramatic change, won by the control of disease—primarily infectious disease—has a further implication: if disability or dysfunction occurs in an individual now, the cause will more probably arise from within the man than from disease striking from without.

I do not think that the meaning of this change in the pattern of disease and the availability of time can be overemphasized. To document it, let us look at what has happened to the patterns of disease in the last few decades, and sharpen the point by contrasting our present disease pattern with that of the so-called “developing” nations.

Before the 1930’s, the chance that a patient would have his disease improved decisively by a visit to a physician was very small. Except for the surgeon, whose basic tools have changed little, the physician’s bag held more comfort than cure. The drug formulary contained a few specific cardiac drugs, thyroid extract, insulin (in 1925), something for gout, a few other specifics, and then a thousand things for the relief of symptoms, some effective, some not. Men made reputations as pneumonia doctors for their ability to sit through the night at the patient’s bedside, waiting for the crisis to pass. Ears drained endlessly, the mastoid scar was commonplace, and sinus trouble held the danger of meningitis and brain abscess. Postoperative pneumonia was a dread complication, and infection and the real limitations of anesthesia tied the hands of surgeons. Even in the 1930’s, syphilis was still the great mimic, and the majority of urban adults had evidence of past tuberculosis, although both diseases were already on their way out. Bellevue at that time had a whole ward for erysipelas (a disease most people have never even heard of today).

In short, the everyday burden of illness was considerable and an expected commonplace of life. The role of the physician was to comfort, to relieve, to diagnose, and to prognosticate. There was, however, a close correspondence between what patients expected of the physician and what he could in fact deliver.

Today, by contrast, the physician can help as never before. The chances are overwhelming that the course of a patient’s disease will be decisively affected by a visit to a physician. From antibiotics that banish infections to tranquilizers that banish straitjackets, an aura of effectiveness has been created. But in this time of greatest accomplishment, the widespread expectation of cure has created an oddly paradoxical gap between what is expected of physicians and what they can deliver. We shall return to this point later.

The incidence of infant and child mortality in a society provides a useful criterion for the comparative study of disease patterns. In many developing nations, the death of children is pitifully commonplace. In the United Arab Republic, for example, one out of five children dies between the ages of one and fifteen. In Mexico, one out of thirty will die in the same period. But in the United States, only one child in 130 dies between the ages of one and fifteen.

What are the causes from which these children die? When a child dies in the United Arab Republic and Mexico, the death is generally due to an infectious disease (bronchitis, intestinal infections and diarrhea, and pneumonia). When a child dies in the United States, he generally dies of an accident, a congenital malformation, or cancer. We have been made aware of infant mortality in this country because we are distressed that ours is not the lowest rate in the world. The infant death rate for the United States as a whole would be lower were it not for the fact that among the indigent it is much higher than among the remainder of the population. Thus here, as in other examples of disease, it is possible to have an underdeveloped area exist as an island within a highly developed society. (A striking example of such backwardness existing amid advance is the shocking and primitive disease rate of the American Indian.) It is equally possible, of course, to have islands of advance amid general backwardness, and that is why it should be noted that in discussing premature mortality, we are talking about the Western world wherever it may exist—in the United States, England, or even among the wealthy in India.

_____________

The difference in causes of death between countries is also seen in the United States between generations. Our daily experience tells us how different things are from what they were before. Although the infection itself is still common among both children and adults, who can remember somebody dying recently of a “strep” throat? There is some evidence that, while streptococcal infection may occur in all classes of the population, serious complications arising from it are virtually limited to the indigent. The mortality statistics for pneumonia are similarly revealing: in 1935, the death rate from pneumonia in children ages one through four was one in 1,100; in 1961, the death rate in the same age group was one in 7,000.

Not only have the infectious diseases become less common, but where they remain, their fatal potential has often been diminished. Measles is an excellent example. In 1964, the United Arab Republic reported 14,000 cases of measles with almost 8,000 deaths, a fatality rate of more than one in two. In the same year, the United States reported 458,000 cases of measles with only 451 deaths, roughly one in a thousand. (A note of caution is necessary before such data can be taken too literally. The difference in numbers of cases is due partly to population differences, and the relationship between actual numbers of cases and reported cases tends to be less meaningful where public health services are not highly developed. Nonetheless, the trend and the lesson are clear.)

It is generally believed that differences in mortality rates from diseases such as measles are primarily due to nutritional differences. Certainly, the child suffering from protein-calorie malnutrition has a much greater risk of dying or suffering serious complications from the common contagious diseases of childhood than do our children. But it is important to realize that other factors are at work here too. The recent epidemic of “Hong Kong” influenza makes the point most sharply. There are many now alive who remember the influenza epidemic of 1918-1919. They would have no difficulty recalling young friends or relatives who died during its ravages. But what young person (and for the purposes of this argument, we can define “young” as under forty) does any of us know who died during the present epidemic? The fact that we have to search our minds for an answer is itself an indication of how radically things have changed. Those few people who did die during the 1968 epidemic were probably already suffering from some other condition to which influenza was terminally appended. Statistics confirm this observation: the death rate from influenza in the period 1921 through 1925 was one out of every 1,000 population; in the period 1956 to 1960 (encompassing the last previous Asian influenza epidemic) the death rate was one out of 50,000.

Although there is some dispute over the matter, in general scientists do not believe that the 1918 variety of influenza virus was more deadly than the present variant. Rather, in the post-World War I setting, which included crowding, poor hygiene, and other factors that seem to make disease worse, influenza infection prepared the way for bacterial complications. In the days before antibiotics, those bacterial complications (pneumonia, bronchitis, ear infections, meningitis, for example) were not infrequently fatal. It is hard to document which feature of society was most responsible for the high fatality of the “flu” of that era, because so many things in our lives contribute to our pattern of illness. Although it sounds unscientific to say so, it appears to be true that present times are simply healthier than past times. Health is a concatenation of effects, with good health promoting good health and disease promoting disease. At this juncture in history, our society’s “resistance” to infectious diseases is high.

Still, while our illness pattern has changed drastically, and the change comes from the way we live rather than from any single factor in our lives (including antibiotics—though they have certainly made a big difference), it should be stressed that the change is reversible. In a sense, like “the antiseptic baby and the prophylactic pup,” we live in a large “astrodome” with the infectious diseases kept out by an invisible shield. The shield, however, is as fragile as our social stability. The infectious diseases lie in wait for war, natural disaster, or starvation to prepare the way for their inevitable return.

The changes that have occurred in the patterns of illness and death are not limited to the young. For the aged, too, things have changed. That we all die remains true, but the when and the how have shifted.

What is “old” now? A generation ago, the seventy-year-old man was a relatively uncommon creature, considered old by himself and his family. The life expectancy of the aged has increased considerably in the last generation. A person of sixty can now expect to live eighteen more years. A person of seventy can expect to live to almost eighty-two. Although the span of life itself has not been significantly extended, more people are living longer than before.

“Pneumonia may well be called the friend of the aged,” Sir William Osler once said. “Taken off by it in an acute, short, not often painful illness, the old man escapes those ‘cold gradations of decay’ so distressing to himself and to his friends.” Pneumonia is still the old man’s friend, but the man is now older, and the pneumonia is a late complication of some other disease, not the unexpected sword of fate. Here again, the death rates serve to make the point. In 1935, the death rates from pneumonia for ages twenty-five through forty-four were 39 per 100,000; for ages forty-five through sixty-four, 94 per 100,000; for ages sixty-five through seventy-four, 242 per 100,000. In 1961, at ages twenty-five through forty-four, six per 100,000 population died from pneumonia; at ages forty-five through sixty-four, the rate was 24 per 100,000; at ages sixty-five through seventy-four, the rate was 95 per 100,000.

Further, and perhaps more important, the extension of life has meant the extension of useful life, functional life. The old among us may have many diseases (they generally do), but they often have little real illness. They may walk more slowly, or even with a cane, but they walk—and frequently to work. They may have glasses for their eyes, amplifiers for their ears, but they see and hear and function. They work and play for a long time. Indeed, perhaps the most disabling thing to happen to the old in recent times has been compulsory retirement at age sixty-five. Previously functional individuals are laid aside and frequently stagnate, perhaps even die, from their own uselessness. They have not been trained to combat the stress of retirement or to meet the demand for new resources of creativity and growth. Biologically speaking, however, there is no such thing as standing still; there is only function or atrophy.

_____________

With the change in disease patterns has come a change in the meaning of disability. We no longer equate illness with disability, nor do we equate the disability of an organ with the disability of the whole man. Organ disability, in other words, does not mean functional disability. We replace the arm with pincers, and the leg with a fancy stump. If a man cannot work at his original job, we retrain him. We train the blind to work in darkness and the deaf to transcribe stenotyping. As a culture, we have become oriented toward function.

This does not mean that disability has altogether disappeared. On the contrary, it is as widespread as ever. But there has been a decisive change, all too seldom recognized as such, in the direction from which it comes. Disability is now more likely to arise from within the man or from his society (or his interaction with it) than from exogenous disease. It is now primarily emotional, genetic (birth defects, etc.), or social.

For who are the disabled if not the non-functional? And who is less functional than the “misfit”—the alienated person, the addict, the alcoholic? Perhaps even our fabled “Corporation Man”—tightly jacketed by a set of rules and procedures which control the direction of his creative growth—might be called a disabled man. Creativity and growth are tender things. They emerge slowly and tentatively as we age, and wither quickly if discouraged or stifled. But they are the stuff of which an era is made.

It would not be too difficult to show that the most common disabling diseases of our time—heart disease, certain cancers, and automobile accidents—are in themselves born of our culture. Coronary heart disease, our number-one cause of death and disability, seems to have risen to that prominence because of the habits of our society: a diet rich in fat and refined sugars; lack of physical exercise; some psychological factors (quickly accepted by laymen, but difficult to prove scientifically); and cigarette smoking. Automobile accidents, which are the leading cause of death among young adults, have now been shown to have their roots primarily in alcoholism and in the social and emotional ills of drivers. (In a fatal automobile accident, the overwhelming probability is that one of the drivers will have been drinking and that further, he—because it is usually a male—will have had previous contact with social agencies, either because of alcohol or past misdemeanors.)

This is surely a new kind of disability, whose roots are intertwined with the changes that have been wrought in the disease patterns of our generation. We have seen how the threat of imminent death among the young has ceased to be a meaningful probability, and how the whole fabric of our culture is involved in the change. Old and young alike have become the beneficiaries of a new gift made possible by the changes in the patterns of disease. That gift is time, time in which to live. But the gift of time has not been without its price. The threat of a limited life imposes a corresponding demand, or need, to perform; the expectation of death imposes a sense of control and restraint. In this era, the price exacted by the gift of time has often been a loss of that sense of control, and a consequent anxiety, or—even worse—a loss of the ability to function creatively.

It is interesting in this regard to contrast the present with the late 18th and early 19th centuries, when death at a young age was frequent. Tuberculosis, one of the most common causes of early death at that time, was greatly romanticized. The literature of the era is pervaded by a sense of the beauty of early death, and the pale, thin, languorous form of the tuberculosis patient became the fashion ideal of beauty—often copied, one might add, by today’s adolescent.1 But what is of greatest interest is the fact that sufferers of tuberculosis were thought to be endowed with some special genius. Their drive to produce seemed enormous, as though a strong wind blew within, hastening them to their artistic task and fanning the flame of their creativity. But they all knew of their impending death (including some who didn’t die, almost to their disappointment), and I think that knowledge imposed the final deadline of disease, against which they all worked so feverishly.

I know a young graduate student, by contrast, who had been pursuing his career in a dilatory manner. Then he was found to have a fatal disease. When he was told, he suddenly began to perform and to finish his work—to become something in the limited time left to him. The demand to perform was imposed by the diagnosis of imminent death. How many of us have not played the game of “What would you do if you knew you were going to die in—?” Generally, rather than playing out their lives in the pursuit of futile games, as a popular television program would suggest, people feel the need for accomplishment in the short time left to them.

_____________

A sense of time develops as we grow up. Children seem to have little or no such sense. A child’s total concern seems to be with the moment. An hour is meaningless; next week may be forever away. A real sense of time begins to appear in adolescence. Adolescence is also the period when fantasies of early death occur. “I know that I’m going to die young” is so commonly heard among adolescents that it must be taken as an expression of group feeling rather than of individual experience. As a cultural phenomenon, of course, this feeling stems from the knowledge that in earlier periods it has happened: the list of talented men who died young is long. Its attraction for today’s youth, however, lies more in its beauty as an idea: a short, post-adolescent catharsis, then death. One has done one’s work—given the world one’s talents, and then, before being challenged by continued existence, one expires. What would one do if one had to go on—and on and on?

The fantasy of premature death in today’s adolescents may be seen as a response to the tensions arising from the loosening of parental control. It is also, in my view, a response to the perception that premature death is in fact among the unlikeliest of eventualities.

One might stop at this point and ask whether the young know that they are freed from the threat of death. I think they do, because despite the stories of their parents and the literature, as they look around them, their friends are all alive. Some have been sick (and some very sick) but virtually all are alive. Whether or not we wish to acknowledge them consciously, the facts of life are known to us and influence our behavior. The absence of death is one of the facts of life to the young. (This awareness of the absence of death should not be confused with that wonderful feeling of omnipotence in men that denies the possibility of their own death. Omnipotence is a necessary psychological mechanism that may operate most strongly in situations such as war where death is common.)

Now, while the gift of time must surely be marked as a great blessing, the perception of time as stretching out endlessly before us is somewhat threatening. Many of us function best under deadlines, and tend to procrastinate when time limits are not set. Time limits are controls, and some controls seem to make people more comfortable. Thus this unquestioned boon, the extension of life and the removal of the threat of premature death, carries with it an unexpected anxiety: the anxiety of an unlimited future.

In the young, the sense of limitless time has apparently imparted not a feeling of limitless opportunity, but increased stress and anxiety, in addition to the anxiety which results from other modern freedoms: personal mobility, a wide range of occupational choice, and independence from the limitations of class and familial patterns of work. Scientists have a certain abhorrence of unitary theories of causality, but it is tempting to look at the rebellion now open and active among the young all over the world and suggest that, in part, it has its origins in the anxieties born of the new freedom from death and the consequent need to choose how to live. Many observers of the young radical movement have noted that its members tend to be more prepared to offer criticism of the present than to put forward a program for the future. My own contact with many individual young adults leads me to the same conclusion, especially for the males among them. A certain aimlessness (often ringed around with great social consciousness) characterizes discussions about their own aspirations. The future is endless, and their inner demands seem minimal. Although it may appear uncharitable to say so, they seem to be acting in a way best described as “childish”—particularly in their lack of a time sense. They behave as though there were no tomorrow, or as though the time limits imposed by the biological facts of life had become so vague for them as to be nonexistent. To be sure, there are, as there have always been, young people whose own inner drive and goals require no outside demands. But they are, as they have always been, in the minority. For the others, the classical biosocial structure has failed to provide those limits that in the past have brought forth function.

All this is probably less true for women, because the reproductive aspects of the biosocial structure have not changed. The criteria for femininity, that is, seem more easily met at present than the criteria for masculinity. Nevertheless, women provide the most clear-cut example of the way in which the pattern of our lives has been influenced by the changes in the pattern of our diseases in past decades. Women have gained time directly, by the extension of their own lives, and indirectly, from the gains of their children. It may be said that one of the major social changes in this century has been the emergence of women—an emergence that might not have been possible were it not for the survival of the young. Because it no longer takes seven full-term pregnancies to produce five living children, and five living children to produce three adults, but merely 3.1 pregnancies to produce three adults, the time required for women to discharge their biological function has been markedly reduced. They are free to return to their own non-maternal functions very much more quickly. Late marriage becomes feasible (not merely as a birth-control device as in mainland China) because there will still be adequate time for childbearing. But late marriage also means individual development—college and careers—and that means aspirations apart from marriage. When frustrated (or “unperceived”), those aspirations mean boredom and unhappiness and the panoply of problems for the modern healthy woman that have come under popular discussion in recent years.

The difficulty in handling the newly won time is best demonstrated by young women because they are assaulted by the advance from all sides. An illegitimate pregnancy is an unnecessary absurdity in the 1960’s because of the easy availability of effective birth control. Despite this, and to the continuing despair of physicians, illegitimate pregnancy among the comfortable—people who know how to avoid it—remains a problem. I observed before that modern disability most commonly arises from within the individual, and in that sense, many of these pregnancies can be regarded as self-induced disability—as accidents, that is to say, only in the loosest sense of the term. They serve to remind women of their (no longer) “secondary” role and frequently even to cripple their aspirations through guilt.

Legitimate pregnancies, too, can be utilized to avoid the promise and decrease the threat of an unlimited future. Coming too soon, or too close together, they can convince a woman (in denial of fact) that her biological function must necessarily override her own personal (or even marital) creative needs. Similarly, a woman who raises her children in constant fear of their impending death from every little fever (despite the informed boredom of her pediatrician about the same fever) is brought further along in life without having given heed to herself as a person. Now, if only some convenient female disease could occupy her remaining years, all would be unthinking bliss. Unfortunately, for the most part, gynecological disability has gone the way of the infectious diseases: increasingly rare, avoidable, or easily treatable. Thus the children, few in number and healthy, grow up, leaving her healthy, young (by modern standards)—and frequently useless.

The aged again serve to clarify the issues I have been discussing, primarily because they have been freed by the passage of time, and by their own aging, from some of the problems that face the young. Oddly, they do not have such problems because they do not perceive them as problems. They know they are going to die, and it doesn’t seem to bother them much. I have had conversations with old people in which they discussed their approaching death with great calm, while their children seated nearby nearly fainted in distress. Being a burden—that is truly frightening; being an invalid, and being unable to do for oneself, causes real and well-based anxiety. But the fears of unlimited time have been removed. Now, great blessing, there isn’t enough time. The aged have terrible disability problems. Disability is a real threat—but even so, the aged are apparently safe from themselves. When disability comes, it will come from outside themselves. (Unfortunately, of course, that is not always true. The aged are still their own worst enemy, along with the rest of us. Some are disabled by diseases that allow their fellows to carry on. But at least, unlike the young, they have more outside themselves on which to place blame.)

_____________

I have used the phrase “changes in patterns of disease” in speaking of what many people would refer to as changes in our general cultural climate. I have done so, of course, because I believe that in some measure our new cultural ills can be traced precisely to the prodigious successes which the medical profession has achieved in our lifetime. I think it fair, then, since physicians have been an intimate part—if by no means the only one—of the revolution in disease, that we look to them for some recognition of our new problem, and some indication of the direction in which we must all move toward its solution.

In medicine, there are, and always have been, two basic priorities. The first is the defense against imminent death. The second is the defense against disability.

Priority One thinking, the defense against imminent death, operates properly in the period immediately following a heart attack, during and right after surgery, immediately after serious trauma, and during war or similar situations. The excitement of Priority One thinking was the basis for those Ben Casey-type television shows, so popular a few years ago: the camera panning from the cardiac monitor bearing its (hopefully) reassuring message to the nurse’s hands passing an instrument from her tray to the deft fingers of the surgeon (whose character was drawn as though the surest defense against death were arrogance). The same kind of thinking pervades emergency rooms, volunteer ambulance services, and all those situations where man seems separated from death by the softness of one heartbeat. And Priority One thinking is in the dreams of every boy who wants to be a doctor—the medical expression, perhaps, of the universal Walter Mitty fantasy.

Now, the basic threat in Priority One thinking, death, has not changed over the years; it has simply become less probable, and among the young, uncommon indeed. Nevertheless, death is death—final. And because nothing can change the awesome finality which is the meaning of the threat of death, it has become possible to maintain artificially the belief that the importance of Priority One thinking is just as great as it ever was. This in turn allows such thinking to be carried over into an area like that of the infectious diseases, where the threat of death has in fact become virtually nonexistent.

Thus, we frequently see among physicians, as well as among patients, an overreaction to minor infectious diseases, apparently based on the fear that the “minor” will become “major”—that a cold, say, will become pneumonia. But does it really matter (except in the aged) if a cold does become pneumonia? There is an old story about a patient with a cold who says to his doctor, “What if it turns into pneumonia?” The doctor is said to have answered, “Better if it were pneumonia. That, I can cure.” If there were no antibiotics, it would be different. If there were no hospitals and operating rooms and surgeons and ambulances—but for most of us in the Western world, there are all these things. Just as the complex whole of our society seems to have changed our disease pattern, so the complex whole of our society has changed the meaning of potentially fatal diseases. (Again, in discussing the change in the meaning of disease, just as in discussing the change in the pattern of disease, concepts like the ones I have been using are justified only so long as the society remains stable.)

As a physician, I must admit that there is a certain romance in Priority One defense-against-death beliefs. There is a certain excitement in seeing the red streaks that run up the arm of a patient whose hand infection has begun to spread (a morbid excitement, perhaps, but that is a doctor’s work). And to the patient, there is a very real fear: fear of illness and pain and even death. But the excitement all ends at the drugstore, when simply purchased tablets promptly end the threat in the majority of instances. (Untreated, the disease is dangerous, but then so is a car without brakes.)

_____________

Priority Two of medical care is the defense against disability. We have seen how the direction of disability for the young has been changed by the conquest of the infectious diseases and other recent advances. But curiously, in this most difficult task in medicine—defense against disability—the burden has increased. The young who do not die grow old and suffer the diseases for which we still have no cure: arthritis, diabetes, heart disease, cancer. Children with previously fatal diseases now live disabled lives, needing continuing care.

This priority—defense-against-disability—contains the classical functions of physicians: to comfort and to relieve. Both are curiously inadequate for patient and doctor alike in the present era of the expectation of cure. It is far easier to escape to simpler, more basic, fears, to reduce the anxiety and danger that time has given us by maintaining the pretense that premature death is imminent and beyond our control.

The “romance” of acute disease, stubbornly maintained in the face of steadily mounting fact, can be explained in several ways. For the patient, the sense of an outward “threat” saves him from having to recognize that the real threat is life and that the real source of disability is within himself. And as for the physician, he, too, is a person with all the same fears as his patient. (Judging by the increased frequency of depression, suicide, and divorce among physicians, they know more than they let on about the real facts of life.) But there are other reasons why physicians need to keep the compact with their patients, and maintain the “romance” of acute disease, the importance of Priority One defense-against-death thinking, in the face of all evidence to the contrary. All the long and continuing training of physicians has been, and (sadly) still is, oriented toward solutions to Priority One problems: the defense against imminent death, the treatment of acute disease, and the protection against imminent danger. But as we have seen, these are no longer the greatest dangers facing us, or those for which we require most help. Dealing with life is difficult and painful, and that is most often where our physicians fail to help us. They have solutions, but not to our most pressing problems. If one has no solutions to the real problems of this world, it is easier to continue to maintain, no matter how artificially, the primacy of the problems for which one does have solutions.

Let us look at two areas in which inappropriate solutions have been advanced to new problems. Pediatrics is the specialty where the conquest of the infectious diseases has produced the greatest changes. Many pediatricians are plainly bored, and some even question the need for their specialty. Where Priority One problems previously demanded their time and their skill, now such problems can often be handled by telephone. Pediatricians have responded to this new situation in various ways. Some have sought to place the burden of everyday care in the hands of nurses or other paramedical personnel, reserving for themselves the truly challenging disease problems. Others have attempted to make the ordinary interesting by a complex search for basic biochemical dynamics in the simplest malady. Still others have tried the harder, but more appropriate, route of dealing with the behavioral problems of growth; they are striking out on new paths for which their original training ill-suited them.

To take a second, more striking example of inappropriate solutions: organ transplants (particularly heart transplants) are clearly not the answer to the disease problems which they attack. If all the cardiac surgeons, with an unlimited supply of donors, stood side by side and operated constantly, they could not keep up with the need generated by the amount of arteriosclerotic heart disease of our times. Clearly, the answer must come from preventive medicine, as preventive medicine has answered most of our great health needs in the past. Nonetheless, transplants continue to capture the imagination, to say nothing of the research funds. They have a special attraction, despite their obvious inapplicability to our problems; they allow physicians and patients alike to perpetuate the compact that the real threat lies in acute disease. (In the continuation of artificial organ and transplant research and development, to the virtual exclusion of equivalent efforts in preventive medicine, we are also witnessing a certain technical overshoot that is not unique to medicine. We have come, as a society, to do things simply because we know how to do them, not because they need to be done.)

_____________

There is, then, a need for change in the role of the physician. During a period in which the automobile industry has moved from the Model A Ford to the Mustang, medicine has moved from the horse and buggy to the jet airplane. But while the automobile industry has in the process transformed entire concepts of production, marketing, and transportation, the basic changes that might have been expected in the underlying concepts of the physician’s role, therapeutic goals, and the delivery of medical care have failed to come about.

It has become necessary for physicians to examine what they really do. The trouble is that physicians do not act in a vacuum, but rather in a framework determined by the interlocking complex of man, his society, and his prevalent diseases. Thus, our very notion of cure is itself derived from the threat which infectious disease used to represent. Disease has a start, runs a course, and has an end. Such a view of disease is appropriate to pneumonia or meningitis; it drives physicians to act to terminate the disease, and to define the goals of cure and treatment in terms of the disease, not the man. But infectious diseases have become less common and more easily treated, and we are now presented with more and more diseases which do not fit the familiar patterns—diseases which have social and cultural determinants. To apply the usual criteria of onset, course, and termination is hardly useful in trying to define the proper function of physicians with respect to arteriosclerotic heart disease, stroke, automobile accidents, or some of the biosocial and psychosocial disabilities I discussed earlier. When, after all, does arteriosclerotic heart disease start? Autopsies on soldiers have shown hardening of the coronary arteries even at their young age. And a heart attack, which might appear at first glance to be a disease with a classical history of onset and course, is actually only one adverse episode in the course of an underlying disease, arteriosclerosis.

Because of the extension of medical effectiveness, it has become more important than ever to define what physicians really do, and always have done, apart from their technology. For despite the lack of a basic change in the structure of medicine, there has occurred a basic change in the expectation of patients. People with diseases may now expect to be cured, and their expectation is based on the successes that have been achieved in the treatment of bacterial diseases and the control of disease symptoms. However, although physicians do commonly control such diverse manifestations of disease as vomiting, irregularities of heart rhythm, water retention, tremor, anxiety, depression, inflammation, blood clotting, etc., it would be an error to confuse this control of symptoms with the “cure” of disease.

Physicians comfort, relieve, diagnose, and cure. But threaded through all these things is another process—nonverbal—that is also part of the classic mantle of the physician, whether he knows it or not, and whether he likes it or not. Physicians represent the force of life within us, the force for health and the return to function. Actively or unconsciously—acting with them or against them—they mobilize these forces in patients; in their persons, they represent an almost unfulfillable demand for health. Every culture has its healers, and in every culture they heal. But in every culture, the meaning of disease, health, and function is defined in terms of the culture. The changes in the patterns of disease that have resulted in limitless time for the young have both changed our culture and changed the definition of health and function. They must, therefore, change the function of the “healer” as well.

1 Tom Moore, in his diary for February 1828, reports a conversation between himself and Byron:

“I look pale,” said Byron, looking in the mirror, “I should like to die of a Consumption.”

“Why?” asked Moore.

“Because the ladies would all say, ‘Look at that poor Byron, how interesting he looks in dying.’” (See “Consumption and the Romantic Age” by René and Jean Dubos in Curiosities of Medicine, edited by Berton Roueché.)
