Drugs and youth—each, taken separately, is the occasion of illusion, and taken together, they multiply our illusions. Drugs make for illusion in two ways: they are vehicles of self-deception, providing false but gratifying visions of the self and its prospects, visions of peace, excitement, grandeur, transcendence. Such are the illusions of the user. They also induce myth-making among those observing the users of drugs—among researchers, pundits, moralists, all of whom provide information but beyond that give meaning to what we see around us.
Indeed, the drug crisis of today has its origins in yesterday’s “meanings,” in the many errors generated in and by the 60’s. Here is Fay Weldon, on the first page of her recent novel The Hearts and Lives of Men:
Back in the Sixties! What a time that was! When everyone wanted everything, and thought they could have it, and what’s more had a right to it. Marriage, and freedom within it. Sex without babies. Revolution without poverty. Careers without selfishness. Art without effort. Knowledge without learning by rote. A dinner, in other words, and no dishes to clean up afterwards.
To which one can add, drugs without consequences: alcohol without hangovers, accidents, or cirrhosis; LSD without bad trips or flashbacks; cocaine without addiction or depression or coronary death; marijuana without anything save deepened spiritual insight. The drug illusions were provoked by that sensibility, that lurching “out of earnestness into frivolity” (Weldon again).
Illusions about the young were even more widespread than illusions about drugs, so much so that false beliefs have become the rule rather than the exception, and in fact seem to be accepted more frequently by experts in mental health than by the public at large. These beliefs are of long standing and hold on stubbornly, resisting all efforts at correction. The most common error is to be found in the casual use of such terms as “youth,” “the young,” “adolescents,” and the like. Strictly speaking there are no such entities, the young varying in age, gender, class, religion, ethnicity, intelligence, accomplishment, ambition, social and political attitudes, etc., etc. One would not want to belabor so obvious a point were it not that so much current discourse treats adolescents as though they were uniform in outlook, feeling, and condition. That tendency reached its peak during the Vietnam years, when much was made of a (putative, fictive) generation gap in values and politics, wherein the young were seen to stand for a higher level of moral sensibility. The tendency persists: the best current example is the antinuclear movement, which speaks of the young globally, and pictures them as terrified and demoralized as they contemplate the nuclear Armageddon—presumably unlike the rest of us.
We have here, in short, an ideological portrayal of the young, wherein they represent or reflect elements of a larger social drama. These tacit depictions fall into three major groups:
- Youth enragé. Probably the most common image of the young, stressing rebelliousness and opposition. Adolescence is seen as a desocializing period; the aim of the social system is to contain the impulse to disorder. In another variation, adolescent aggression is not so much inherent as provoked by the deprivations imposed by the system—poverty, restricted opportunity, inequality, and so on.
- Youth dégagé. The stresses of adolescence, from within or without, lead the youngster to a profound emotional detachment. He may be indifferent, cynical, isolated, narcissistic, reclusive, anhedonic, moody, depressed. He rejects the social system, which may seek to recapture his interest and affection.
- Youth engagé. The most sanguine image of the young, which emphasizes movement toward a closer, more gratifying tie to others and to the larger social system. Interests and affections expand beyond the family, ultimately to social institutions. In this development, the youngster does not flee the family or turn against it and its values; to the contrary, benevolent internalized “others” survive, and guide the relationship to the world.
These last are the “healthy-minded,” in William James’s memorable phrase describing one of the various modes of religious affiliation. As we will see in a moment, theirs is the most common pattern of adolescence (just as healthy-mindedness is and was the most common mode of religious engagement). But we would not think so in reading much of the learned commentary, which is based on the persistent though mistaken belief that adolescence is normally a period marked by psychic storms.
That view has its origins in G. Stanley Hall’s emphasis on upheaval as the normative reaction to adolescence. Its current popularity derives from Anna Freud’s writings on adolescents, and those of her heirs, above all Peter Blos. Their argument—their assumption, really—is that the adolescent years are normally marked by a regressive movement, a return of the repressed, wherein the ghosts of the past reemerge, in an effort to resolve the issues of early childhood once and for all. The drives lying dormant during the latency period reappear, the defenses are under strain, archaic forms of relation to others—primitive dependencies, incestuous temptations—are evoked and must be fought off. Adjustments are fragile, equilibrium hard to achieve. The youngster keeps himself afloat by extreme measures—profound withdrawals, defiant acts of independence, and the like. It is an extraordinarily trying time for adolescents themselves, and for those who must deal with them. Little wonder, then, that the period is marked by a high degree of emotional disturbance.
_____________
So runs the view of adolescence usually held by experts on the young and their troubles. A clever study by Daniel Offer and his associates has suggested how conventional a view it is, and how misleading. They asked mental-health professionals to complete a personality test designed for adolescents, answering as they imagined a typical teenager would. The simulated scores were higher (more pathological) than those achieved by genuinely disturbed teenagers. In short, the “normal” adolescent is imagined as more deviant than even disturbed youngsters are in reality.
That remains the dominant view, though it has begun to give way recently, as empirical findings become available and as they, ever so slowly, penetrate professional awareness. Interestingly enough, the most effective critique has come not from academic psychology—which was indifferent to the issue for many years—but from researchers whose affiliations have been with clinical psychology and psychiatry. In most cases the investigator’s original intention has been to support and extend the Freud-Blos theory of adolescence, and the methods have been chosen to capture pathology when present. Yet in every study, using whatever instruments—questionnaires, projective tests, interviews—the same picture emerges: genuinely disturbed youngsters constitute a minority—only about 20 percent—of the total. Although the “normals” are by no means paragons of mental health, they are able to make their way through the stresses of adolescence without serious symptoms.
Equally important is the recognition these studies provide that most youngsters maintain amiable and even admiring feelings toward their parents, that the period is not at all marked by the fear and loathing so often thought to be normative. On important questions youngsters value their parents’ advice above that of their peers. They tend to choose the education, the training, the vocations their parents suggest or approve—not that American families are normally coercive in these matters. There is not now nor has there ever been a generation gap, not for most youngsters, and not on the significant issues; to the contrary, we find a substantial degree of continuity in values, politics, religious sentiments, and so on.
There is another continuity worthy of note. Much of the newer research has been longitudinal in design, an effort being made to capture the evolution of behavior from adolescence onward. These findings tell us quite clearly that early tendencies persist into young adulthood. Disturbed teens evolve into disturbed adults—not invariably, but on the whole—and the placid adolescent will most likely become a placid adult. Studies which examine trends from childhood to adolescence also show a strong degree of continuity—for example, adolescent delinquency is prefigured by disruptive school behavior earlier in childhood. As these findings accumulate, they are eroding the earlier—tacit—assumption of adolescent exceptionalism, the tendency to see the teen years as set off in important ways from the rest of the life cycle—more brittle, or more sensitive, or more intense, or more idealistic, or more explosive. The new look stresses the continuity through the life span, in temperament, emotional stability, intellectual talents and habits, traits of character.
That new look is in fact a revival of an older look which had fallen victim to a collective amnesia. We have known for some years that there are high correlations between traits of personality and temperament measured in adolescence and in late middle age. Most of the newer studies which follow samples over time report roughly similar trends, some of them of startling import, as in the recent discovery by Christopher Peterson and his colleagues that pessimism, measured by content analysis of statements made in young adulthood, is a predictor of illness and premature mortality later in life.
So we have had a “normalizing” of the theory of adolescence. The teens are not as a group significantly more disturbed; they are not in revolt against the family, or against conventional values, or against social institutions. We see an evolution of given dispositions, most of which will persist into adulthood and old age.
Yet having debunked the (now-fading) view of the young as, let us say, normally abnormal, we ought not to promote its Panglossian opposite. Some forms of pathology are in fact more common among the young. The acting-out disorders—crime and delinquency in particular—are associated with age (and gender) to a marked degree, and in all societies for which we have reliable records. There is a very strong relationship between antisocial behavior and high levels of illegal drug use, and though the causal directions are not entirely clear, there is little question that each intensifies the other. Disturbed behavior among the young is perhaps more troublesome than later in life, since it so often interferes with learning and the acquisition of skills. The youngster’s development may be held back, or disabled, or distorted.
Moreover, the antisocial youngster consumes a large share of the community’s resources, in policing, special education, and the like, and has a disproportionate effect on such institutions as the high school. And bear in mind that the 20-percent figure translates into an enormous number of individuals, literally millions, most of whom have a significant impact on others—their families, friends, teachers, and so on. For these reasons alone it makes good sense to give much of our attention to adolescent disorder, its effects, and its remediation, while not forgetting that this disorder is atypical, and not a problem characterizing “the age.”
_____________
Let me offer a cautionary example, which I choose because it occurs in a very good article on drug policy—John Kaplan’s essay, “Taking Drugs Seriously,” in the Summer 1988 issue of the Public Interest. Kaplan writes skeptically about whether education will do much to reduce illegal drug usage among the young, then offers these observations:
Young people are notoriously resistant to their elders’ efforts to get them to live less risky, more forward-looking lives. Well into adolescence they tend to retain what psychiatrists refer to as “remnants of infantile omnipotence.” Moreover, entirely apart from the question of risk, young people often take pleasure in things that adults tell them are bad. They are constantly told that they should avoid things (like sex or junk food) that they enjoy. At best they tend to disregard such advice—when they do not actually seek out occasions to disobey their elders’ counsel. Indeed, the real mystery is how we have managed to convince significant numbers of youths in inner-city school systems . . . to avoid taking drugs, since at least in the beginning they would find drug use so enjoyable.
One ventures to say that most readers—even the sophisticated—would peruse Professor Kaplan’s article without pausing to question this paragraph. Yet every one of the statements contained in it is highly dubious. As we have already seen, there is little reason to believe that “the young” are quite that resistant to their elders’ importunings—if they were, we would not see so few of them now smoking cigarettes, nor would so many of them follow traditional paths in education, social life, and the like. There is even less reason to believe that they choose risky or self-destructive options for the sheer joy of disobedience. That is an idea, commonly accepted, for which there is absolutely no evidence, leaving aside horseback psychiatric opinions. Nor is there any “mystery” in the fact that significant numbers of inner-city youths do not commit themselves to the drug world—most of them are straight arrows, or trying to be; unfortunately they are just about invisible so far as public opinion is concerned.
What is even more unfortunate is that this highly arguable notion of the young leads toward mistaken policy choices. For if we believe that such opposition exists between adolescents and their parents, we will avoid efforts at drug dissuasion which involve the parents. Indeed, if we believe youngsters are defiant of authority almost as a matter of principle, we will eschew most efforts at education.
_____________
This may be the moment to raise the question: why do these illusions continue to maintain their grip? The short answer is that they persist in the absence of better knowledge; at the same time, they work against acquiring that knowledge. It is not generally recognized—even in the profession, even among developmental psychologists—how little we actually know about fairly fundamental matters. Basic empirical data are so meager that “knowledge” is made up of some scattered information, some theory, some clinical observations, some tacit assumptions, some wishful thinking, and no doubt some old wives’ tales as well.
Curiously enough, that thinness is especially evident in our grasp of the basics—for example, the psychological processes in the ordinary middle-class family. On the other hand, we are flooded with information—a great deal of it dubious, to be sure—about topics which capture the collective imagination, eating disorders being the great recent example. It would not be at all difficult to compile a bibliography of several hundred items on the family dynamics of anorexia or bulimia, all published within the last decade; yet one could gather only a handful on the ordinary family. That disproportion has its effects—among other things, we generalize from the known to the unknown, hence tending to see anorexic or bulimic family dynamics as a more extreme instance of the norm. There is now an entire genre of ideological writings on the eating disorders, based on very limited or anecdotal evidence but claiming representative significance: as one book’s subtitle has it, “The Anorectic Struggle as a Metaphor for the Age.” Such intoxicating illusions flourish in the absence of genuine knowledge.
Another important source of illusion is that the phenomena we want to study may change, sometimes rapidly, without our being aware of it. That lag in recognition has been especially evident regarding adolescent pathologies. These grew sharply from the early 1960’s to about 1980. During that time there were phenomenal increases among adolescents and young adults in almost every index of disturbance for which we keep records—increases of two to four times in the rates for suicide, homicide, out-of-wedlock pregnancy, and various measures of delinquency. For those conditions for which we do not have reliable statistics over the same period of time—e.g., substance use and abuse, eating disorders, borderline and other severe disorders—there was probably an equivalent rise.
Yet in no case—except perhaps for drug use, which received media attention early on, much of it glamorizing—was that rise apparent to professional observers or to the public at large. Adolescent suicide did not become a matter of urgent concern until a few years ago, after the rate had begun to stabilize. Among most clinicians the resurgence of anorexia and bulimia, now common knowledge, was simply not evident until it had reached epidemic proportions among students in elite colleges. The steep decline in SAT scores did not come to public attention until the numbers had nearly reached bottom. Indeed, those working for educational reform in the mid- to late 1970’s will recall the incredulity with which their efforts were then received, it being the wisdom of the time that the American young were the brightest in the nation’s history.
It is quite possible that these steep increases and decreases are a historical anomaly, a result of generational crowding, as some economic demographers have argued. Or it may be that we have been in a historical era inducing rapid change throughout the social system, hence psychological changes in vulnerable populations. Time will tell; at least, time may tell. What we have to keep in mind is that writing in adolescent psychology and sociology has on the whole been insensitive to the effects of recent historical change. We do not like moving targets, and prefer to pretend that they are not moving, or that their movement is of little moment. That, however, is an optimistic assumption.
_____________
In short we know very little about adolescence, and much of what we think we know is false, incomplete, or out of date. Thus the young readily serve to reflect any image we want to project or to voice any program we hope to promote. “What are the young trying to tell us?” When we hear that grandiloquent question we suspect, rightly, that the speaker already knows the answer, and in preparing to speak for the young he has at hand a doctrine he plans to impose, on all of us but especially upon them.
At the outset of the 1960’s the self-deputed tribunes of youth were offering what Edward Shils has termed “the antinomian temptation”:
The highest ideal of antinomianism is a life of complete self-determination, free of the burden of tradition and conventions, free of the constraints imposed by institutional rules and laws and of the stipulations of authority operating within the setting of institutions. . . . All human beings . . . are entitled to whatever any individual is entitled to. All human beings are entitled to be gratified as the promptings of the self require it.
Though Shils was here addressing radical politics, his observations can be applied pari passu to other issues of the time, particularly to the psychoactive drugs, which were to be both the exemplar and instrument of total entitlement.
No one will want to argue that the antinomian outlook “caused” the drug epidemic; but there is little question that it rationalized its early stages and beyond that helped undo the immune system which had kept drugs—and much else—at bay. Nor did it limit itself to politics and drugs; it affected almost all realms of public life. Here is a quotation from a paper on education written ten years ago:
Among the values of traditionalism are merit, accomplishment, competition, and success; self-restraint, self-discipline, and the postponement of gratification; the stability of the family; and a belief in certain moral universals. The modernist ethos scorns the pursuit of success; is egalitarian and redistributionist in emphasis; tolerates or encourages sensual gratification; values self-expression as against self-restraint; accepts alternative or deviant forms of the family; and emphasizes ethical relativism.
That paragraph was written by me in an effort to explain the deterioration of American schooling. A new progressivism in American education had begun at about the same time as the change in sensibility—antinomian, modernist—which helped usher in the early phases of “enlightened” drug use; and the two were coincident not merely in time but philosophically as well. The doctrines justifying drug use were precisely those justifying the liberation of the schools from coercive authority: that the “true self” was imprisoned by the strictures, schedulings, dress codes, homework, and the like of focused schooling, or, more diffusely, by whatever it was that chained, corrupted, degraded, destroyed the human spirit, and kept all of us from being what we were meant to be, namely, totally free of limitation, hence reaching exalted levels of achievement. Those who led the psychedelic movement believed that the hallucinogens would liberate not merely goodness and spiritual wisdom, but also an epiphanic creativity previously seen only among the greats.
It should be apparent that drug tolerance did not succeed fully, and that enlightened thought did not accept the antinomian credo in its entirety. Nevertheless, its essential ideas became part of elite opinion, both with respect to the schools and in relation to drugs. Those views are now in retreat, as their malevolent effects become apparent—in the case of education, by the mounting evidence of failure, as we see in the dreadful international comparisons in school achievement; and in the case of drug tolerance, by the crack crisis and the devastation of the ghetto.
Yet modernist doctrine has so thoroughly captured elite opinion that its influence has not yielded easily, not until failure has become painfully plain, and even then grudgingly. The educational-excellence movement was derided at its beginnings, characterized as reactionary, simple-minded, and so on, even after the 1983 publication and unexpected success of A Nation at Risk. As late as 1984, our leading university presidents were sniffing scornfully at the notion of a genuinely serious problem in the schools, and that attitude has changed little since.
We find somewhat the same history in regard to drugs—an attenuated version of the antinomian doctrine, to the effect that though some drugs may be harmful, others are tolerable—they do no harm, indeed they do good, in relieving the onerous pressures besetting the young. To oppose their use is to yield to the forces of “hysteria.” Until a few years ago that position was dominant among those of advanced opinion. Norman Zinberg’s Drug, Set, and Setting (1984) sees the drug epidemic as a “vast social experiment” with which we interfere at some collective risk. Opposition is termed “prohibitionist”; support is called “anti-hysterical.”
While Zinberg writes as though his position is a beleaguered one, it has in fact been favored by most academics specializing in adolescence, mental health, and so on. Reading through the discussion of drug use in recent adolescent-psychology textbooks, one finds, first, that very little is said, most texts devoting scant space to the topic; and, second, that the tacit editorial voice carefully avoids opposition, especially to the use of such soft drugs as marijuana. In the most recent text to come my way, the following points are made: that neither premarital sex nor drug experimentation can be eliminated; that adults must not aim at “scaring the young away from experimenting with either”; that it is in any case hopeless, since such experimentation cannot be monitored or policed; that given the “inherent pleasure and the immediate gratification” of sex and drugs, the best approach is an honest sharing of information in which the pros and cons are presented.
One wonders what the author is talking about. What are the pros of heroin use? Why should we believe that parents are powerless to influence their children? Or that adolescents are so indifferent to their parents’ concern? We have here a residue of the modernist doctrine, applied to sensuality and the sumptuary ideals, which are thought to be nearly inviolate. It is that view, usually unspoken, which has until recently held sway in our ideologies of public life.
_____________
But why belabor the issue? Is it not history, done for, finished, kaput? Have we not turned the corner? We have indeed entered a period of good intentions and high resolve, but it is not at all certain that these will persist. What will persist is the modernist temper, though it lies low at the moment. It will sustain itself by calling attention to the certain failures of “prohibitionism.” It will thereupon explain—patiently, a bit condescendingly—that we have interfered with a natural process, perhaps a vast social experiment. It will in any case mock most efforts to inhibit drug use.
One sees this even at this moment—a peak moment for anti-drug sentiment. Many of the academics I talk to about the drug problem make sure to remind me, a bit patronizingly, that “Just Say No” won’t work, being too simplistic; that the problems are too deep, too complex, too rooted in an unjust social system, and so on. Some echo the views discussed above, that the drug-seeking drives are not far from inherent, being based in the pleasure principle, hence in human nature, and cannot be stamped out.
It is a curious view, since one hears it from the very people who believe fervently that other evils, some with a far longer history—e.g., ethnocentrism, religious zealotry—can be stamped out fairly easily. It is an especially curious view to hear from cosmopolitans, who should know that other industrialized countries have been able to keep themselves and their youngsters far less enslaved by drugs, and without imposing tyranny on their citizens, and that this nation was able to do so for many years, and not so long ago. Curious or not, myopic or not, the view survives, in suspended animation perhaps, but easily revived.
_____________