The Myth of the First Three Years: A New Understanding of Early Brain Development and Lifelong Learning
by John T. Bruer
Free Press. 244 pp. $25.00

In April 1997, Bill and Hillary Clinton co-hosted something called the “White House Conference on Early Childhood Development and Learning: What New Research on the Brain Tells Us About Our Youngest Children.” This event, widely reported in the media, was designed in part to send an important message to America’s parents: a child’s experiences and environment during his first three years play a crucial role in determining the course of his later life, directly affecting how his brain will develop and thus his intelligence, his ability to learn, and his lifelong mental health.

At the White House conference, the idea that "the first years last forever" where the brain is concerned was not a point of debate but a point of departure—and one, moreover, that was said to be well established by cutting-edge research. Yet as John T. Bruer shows, this view of infant development, which has become an ever more deeply entrenched piece of conventional wisdom, is a myth. Bruer is the president of the James S. McDonnell Foundation in St. Louis, which supports research in psychology and neuroscience. He happened to attend the White House conference, where, despite the ritual genuflections to brain research, only one neuroscientist was included among the speakers, and the relationship between neuroscience and most of what was discussed was merely rhetorical. He was disturbed enough by what he observed, and by the larger phenomenon it illustrated, to produce this significant book.

_____________

 

The belief that early experience is important for the developing child is not false; indeed, in some obvious (and therefore trivial) ways it must be true. A child raised alone in a dark closet for 36 months will not emerge looking and acting like a normal, healthy three-year-old, and the effects of the isolation will never be fully reversed. But, as Bruer persuasively argues, aside from offering a superfluous injunction against such extreme cases of neglect, research in neuroscience, in and of itself, tells us little about effective child-rearing.

Not that this area of research is without promise; quite the contrary. What exercises Bruer is, rather, the misappropriation of its findings to serve banal public-policy purposes. Examples of such misappropriation abound, and he has collected a good number of them here. They include the call that went out from the White House conference, in the name of science, for initiating or expanding a variety of government programs aimed at helping disadvantaged children, and the lament by the Boston Globe columnist Thomas Oliphant that “the undeniable fact that the human brain is almost entirely formed in the first three years is mocked by the fact that hardly any social resources are aimed at this critical period.” All such pronouncements tend to be built on overgeneralization from and misinterpretation of three basic findings that have emerged from brain science over the past few decades.

First, synapses—the connections among brain cells that enable them to communicate with one another—are formed at a very fast rate as the brain develops during the early years of life. Second, there are "critical periods" in brain development, during which the child must receive appropriate stimulation if development is not to be catastrophically disrupted. Third, the environment in which an animal is raised can affect the development of its brain.

Contrary to those who have made a myth of the first three years, however, none of this indicates the existence of some grand critical period for the entire brain, or that “enriching” a child’s environment while new synapses are forming will create more connections and “build a better brain.” Nor does it mean that if the alleged critical period is allowed to expire without the benefits of enrichment, the number of synapses will be fixed and the child’s experiences will no longer affect how his brain works or how he functions in the world.

For one thing, there is no known link between the number of synapses in the brain and the intellectual capacity of a child (or, for that matter, an adult). On the contrary, massive "pruning," or removal, of synapses occurs during the later stages of childhood, when new mental abilities are developing at a rapid clip. Moreover, although Bruer does not raise this point, growing proficiency at a mental task actually leads to less use of brain resources for that task. For another thing, critical periods in development apply neither to the entire brain nor to the entire interval during which new skills are being acquired, but only to a limited range of basic brain functions, as when the visual system begins to integrate the slightly different images registered by our two eyes.

Finally, extrapolating from studies performed on animals to draw conclusions about human behavior—let alone about social policy—is an enterprise fraught with risk. Experiments on the brain development of rats, though widely cited as proving the claims of the myth-makers, have not compared “normal” and “enriched” environments but what are best described as “deprived” and “less-deprived” environments. Whatever limited implications these experiments might have for human beings, the benefits of an environment that is indisputably richer in stimulation—exposing an infant, say, to fifteen minutes of music per day as opposed to placing him in complete silence for the same interval—cannot be proved by studying rats, whose brains have evolved to face a very different set of environmental challenges and developmental events from our own.

_____________

 

Bruer explores all these points at length, delving deep into the history of the early-years myth, analyzing its antecedents and founding documents (which consist primarily of thinly reasoned magazine articles, popular-science books, and foundation reports), and tracing the way it has established itself in the popular mind and marketplace.

His presentation drags at times, but that may be because there is no breezy or entertaining way to refute an appealing myth by means of sober facts of history and science, which is what Bruer has done. Much more regrettable is that he has needlessly chosen to denigrate Richard J. Herrnstein and Charles Murray’s The Bell Curve rather than to discuss the growing evidence of genetic influence on intelligence, personality, and behavior—evidence that would only strengthen his own case. Still, Bruer has performed a valuable public service, for the myth he demolishes and the public mindset in which it has thrived have costs that extend beyond dollars and cents and the misallocation of government resources.

The more we depend on science to shape our choices, the more we lose sight of other perfectly valid reasons to act as we should. If parents come to play with their children, read to them, and talk with them on instrumental grounds alone—because doing so "builds better brains"—what will they do if those grounds should be disproved, as Bruer has disproved the myth of the first three years? Will that then give them a reason for neglect?

The question may seem frivolous, but it points to where the real danger lies: in the ever-greater reliance of ordinary people on alleged scientific facts to guide decisions and behavior that should rather be directed by their common reason. In the face of the flood of new claims and counterclaims about health, education, and public policy, what we need is not an unreflective dependence on what "new studies show" but the application of skeptical logic and an appreciation that many scientific facts are still subjects of debate, not eternal truths. To those ends, The Myth of the First Three Years makes a signal contribution.
