The approach of the year 2000 is certain to be attended by a greater fanfare of predictions, prophecies, surmises, and forewarnings than any millennial year in history. In the past twelve months, at least four books on this subject have appeared, all of them concerned with the probable shape of American and world society in the year 2000.1 How many articles have appeared I cannot even guess. But books and articles are in any event only the exposed part of the iceberg. There are today centers, institutes, and bureaus, not to mention specific commissions, whose principal business it is to forecast or predict the future. There is with us, in short, as part of the already huge knowledge industry, the historical-prediction business; and this business is certain to become ever larger, ever more ramified. Through every conceivable means—game theory, linear programming, systems analysis, cybernetics, even old-fashioned intuition or hunch—individuals and organizations are working systematically on what lies ahead during the next thirty-two years, and indeed during the century or two after that. Nor is this an American enterprise only. In France there is the Futuribles project under the distinguished direction of Bertrand de Jouvenel. In England, the Social Science Research Council has established the Committee on the Next Thirty Years. There will be other forays into the future, in this country and abroad, for the lure of the game is spreading fast. An official of the State Department's Bureau of Educational and Cultural Affairs, Frank Snowden Hopkins, has already proposed the organization of an institute in which “rising young government administrators would each year spend some nine or ten months . . . studying the American future in all its aspects,” to which one can only say, nice work if you can find it—the future, that is.

Why the fascination with the future at a time when so many of us are preoccupied with the roots of our not always clear cultural identity? Daniel Bell, whose knowledge of what is going on in this matter is vast, suggests two rather different forces operating in human consciousness today. There is, first, the magic of the millennial number. Men have always been attracted, Bell reminds us, to the mystical allure of the chilioi, the Greek word for a thousand, from which we acquire our religious word chiliasm, the belief in a coming life free of imperfections.

There is, certainly, no question of the power of the chilioi on the human, especially Western, consciousness. As Professor Bell notes, Plato, in the Myth of Er with which he concludes The Republic, foretold that departed souls would return to earth after spending a thousand years in the netherworld. And there were widespread expectations in the early Christian era of a Second Coming at the end of a thousand-year period. To which we might add that St. Augustine, although he was much too shrewd to assign a precise date for the ending of the world, seems to have been convinced that the inevitable holocaust would take place approximately half a millennium after his time, which would have put it at about 1000 A.D.

(It is worth observing parenthetically that the fire which Augustine foresaw as eventually consuming the world has, in addition to Stoic intellectual roots, a certain flavor of Caltech modernity. We are told in The Next Ninety Years by Norman Brooks, Professor of Engineering at Caltech, that one of the prime dangers faced by contemporary society is “thermal pollution,” the discharge of heat generated by technology. Professor Brooks estimates that the total amount of heat generated in Los Angeles alone is now 3 per cent of the incident solar radiation. And, independently, we are informed by two Rutgers scientists that at present rates of discharge of heat into American rivers “some of the rivers could reach their boiling point by 1980 and even evaporate by 2010.” Here, surely, amid all the plights of modern man, is the most exasperating. The hotter our technology makes the atmosphere, the more we will use air-conditioning machinery to cool it, which means—under the unrepealable second law of thermodynamics—the more heat.)

By the late 17th century, Western philosophers, noting that the earth's frame had still not been consumed by Augustinian holocaust, took a kind of politician's courage in the fact, and declared bravely that the world was never going to end (Descartes, it seems, had proved this) and that mankind was going to become ever more knowledgeable and, who knows, progressively happy. Now, of a sudden, the year 2000 became the object of philosophical speculation. One of the more charming manifestations of this in the 18th century was a play, L'An 2000, written by the bohemian man of letters Restif de la Bretonne, which has been described as a heroic comedy representing how marriages would be arranged in the year 2000, at which time some twenty nations would be allied to France under the wise supremacy of “our well-beloved monarch Louis Francois XXII.” (I almost wrote Charles de Gaulle.) A few years earlier Sébastien Mercier had written a small volume, much read and discussed, titled L'An 2440, in which a more or less ideal future is limned, one that we have no difficulty seeing as an extension into the future of economic, political, and intellectual “trends” that no doubt seemed as real to Mercier as current “trends” do to us at the present time.

It would not do, while still on the millennial magic of the year 2000, to forget the redoubtable Edward Bellamy in this country. His Looking Backward, which deals with life as it might be expected to be in the year 2000, has just reappeared in a new edition by Harvard University Press, with a splendid introduction by its editor, John L. Thomas. Considering the immense appeal that this book exerted for years on the minds of American citizens, politicians, and businessmen, it can be regarded as one more example of the occasional superiority of the “soft data” of ideas and fancies over the “hard data” of demography in shaping the future.



The second reason Daniel Bell gives for the recent upsurge of interest in the forecasting of the future is technology. As he writes, there is in our society a kind of “bewitchment of technology.” So many other matters have been settled by technology, so much of space—including planetary—has been conquered by technology, why should not the marvelous skills of the computer conquer time by providing us with increasingly accurate glimpses of the future? As Bell puts it, “the possibility of prediction, the promise of technological wizardry, and the idea of a millennial turning point make an irresistible combination.” There is no question of this.

What is in question, however, is whether the marvels of electronic technology, cybernetics, linear programming, systems analysis, game theory, and the like, do add anything, can add anything to our success in an enterprise that is at least two hundred years old in the West: the serious, conscious business of predicting the future by observation of real or imagined continuities of event, change, and circumstance in time.

The wizardry of contemporary technology notwithstanding, the essential and lasting methodology of future-predicting was set forth in the early 18th century by the great Leibniz. One sentence, taken from his “Principles of Nature and of Grace,” will suffice to express the crucial elements of Leibniz's law of continuity: “The present is big with the future, the future might be read in the past, the distant is expressed in the near.”

I will come back in a moment to some of the premises and assumptions of Leibniz's law, for they are vital to our belief that predicting and forecasting are possible. For the moment, however, I want only to stress the fact that not all the marvels of computer technology and related devices have in any way supplanted our reliance upon this profound, if questionable, Leibnizian view of the relation among past, present, and future. Either the future does lie in the present, and hence is subject to observation through dissection, or it does not. And if it does not, all the computers and systems analysis and linear programming in the world will not help us. For it is sheer delusion to suppose that anything short of H. G. Wells's Time Machine can in fact get us into the future, as technology gets us across space to the moon.

For at least a couple of centuries, the essential meaning of Leibniz's law of historical continuity has been axiomatic in Western thought. It has been the basis of all that we call philosophy of history and social developmentalism. From it has come the widely accepted notion that there is an entity called civilization or culture, that this entity obeys certain immanent principles of growth in time, that the continuity of time is roughly the same as the continuity of this growth, that past, present, and future have not merely a chronological relation but a genetic relation, and that through sufficient study of the past and the present it is possible to foresee the future simply by extending or extrapolating ongoing processes.

This is precisely the mode of investigation, the framework of analysis, that we find in such titans of the 19th century as Comte, Hegel, Marx, and Spencer. One and all they were operating in terms of Leibnizian assumptions about the genetic relation of past, present, and future. We call them philosophers of history. But they could as properly be called future-predicters. For, it was not antiquarianism or preoccupation with the present that generated the works of these individuals. Obviously, they were interested in the past and present—and provided their readers with a great deal of enlightenment about past and present, just as do our Herman Kahns and Stuart Chases today—but the motivating interest was the shape of the future. The vision of future society, organized around the trinity of scientist, statesman, and industrialist, that Comte gave us in his The Positive Polity was far from being simple, speculative utopianism. It was a vision, in its root essentials, formed of projections of trends and tendencies that Comte supposed (and not without much reason) to be sovereign in his present and the recent past. Marx spent decades in the British Museum studying economic history—of England chiefly—not because he was enamored of English economic life but because, under Leibnizian assumptions that he no more questioned than had Comte, he discerned something called “capitalism,” something that was universal in type, actual or potential, and that would, he thought, obey laws of development which if correctly identified and clearly understood would make prediction of the future as scientifically unassailable as prediction of the movements of the earth around the sun. Marx and Engels, as we know, attacked the problem in two ways: genetically, by assessment of “iron” trends extending from past to present and, therefore, into the future; analytically, by study of the forces at hand, such as capitalism's supposed internal contradictions. 
The first gave overview; the second insight into mechanisms of change. Both, for Marx, were essential to the business of future-predicting.

There is no need of elaborating this. It is enough to stress that what Comte and Marx were engaged in exemplifies perfectly what other philosophers of history down to Spengler, Toynbee, and Sorokin (whose recent death took from us a mind matchless in this century for breadth of historical learning and insight) have been engaged in: prediction of the future in terms of extrapolation of discerned tendencies or trends in the present and recent past.

Nor are matters very different today. As one reads such a book as Herman Kahn and Anthony Wiener's The Year 2000, or their account in the Daedalus issue of the nature of their enterprise at the Hudson Institute, one has—if he has read the 19th-century philosophers of history mentioned above—almost the feeling of déjà vu. I would be the last to disparage anything simply because it has been done before, or because it exists in a line of like enterprises over a period of several centuries. The trouble is, there are so many references of a “hard data” sort, so many allusions throughout the contemporary literature on the future to all the puncture-proof, self-sealing devices of models, programming, and systems, that the unwary reader may be deceived into thinking that projections and forecasts of the future have the same secure relation to these devices that our accounting systems, traffic controls, and market analyses do. It would really be a shabby trick if we somehow left the inference around, to be picked up by the public, that computers and systems-analysts do look into the future in ways that were denied to a Tocqueville or a Marx.

“At the Hudson Institute,” Herman Kahn writes, “we have used three interrelated devices to facilitate making systematic conjectures about the future.” I will paraphrase the description of the “devices.” First, the Institute identifies “those long-term trends that seem likely to continue,” such trends as secular humanism, institutionalization of scientific and technological innovation, and steady economic growth. Second, the Institute personnel “cluster significant events by thirty-three year intervals, starting with 1900.” The purpose of this, we are informed with a straight face—though come to think of it, it may be deadpan—is to see which combinations give rise to new clusters, and to identify “emergent properties.” Third, the Institute constructs “significant baselines, statistical where possible, to project key variables in society—population, literacy, GNP, energy sources, military strength, and the like.”


I will not go into all the details that fill a four-hundred-page book about what the future may be expected to look like. Suffice it to say that, cast in the terminology much favored by our “hard data” brethren in the social sciences today—“surprise-free projections,” minimum versus maximum assumptions, “canonical variations,” and the like—we have what turns out to be the by now familiar forecast of impending proliferation of bourgeois, bureaucratic, and democratic elites; continued accumulation of scientific and technological knowledge; worldwide industrialization; increasing affluence, urbanization, literacy, and education; increased capability for mass destruction; increasing tempo of change; and, capping this edifying assemblage, “the increasing universality of these trends.”

Or take the indefatigable Stuart Chase. Obviously, he assures us solemnly, “this present book is not in the Utopian tradition . . . It takes a hardheaded look at ten current trends, all deriving from science since Galileo, and then attempts to project them into the next few decades, say to the year 2000.” Mr. Chase admits that he is compelled to use his imagination like any writer of Utopias, “but there is very little that is imaginative about the description of the trends.” No, indeed. Perish the thought. Let no contemporary apostle of the hardheaded look be thought guilty of imagination when there are computers and linear programmers to transpose old-fashioned imagination and guesswork into hard science. After all, look at the remarkable predictions regarding the course of events in Vietnam that the hardheaded Secretary McNamara furnished us with through the wizardry of technology.

Let me now state two equally important points: (1) contemporary forays into the future are no better, and generally worse, ceteris paribus, than the forays into the future that our great-grandfathers—Tocqueville, Comte, Marx, et al.—made; (2) the only real utility of these fast-accumulating reports and books on the future is the often enlightening, generally informative, sometimes brilliant perceptions they contain about the present. No doubt this point alone makes “future-predicting” worthwhile, for there is nothing like an assignment to gaze into the future for sharpening one's awareness of what lies around him in the present.


It is in these terms that the Daedalus volume and the Caltech symposium are especially fascinating and worthwhile. Why shouldn't they be, given some of the participants: in the first-named, Daniel Bell, Karl Deutsch, Theodosius Dobzhansky, Erik Erikson, Fred Iklé, Ernst Mayr, Wassily Leontief, Wilbert Moore, Roger Revelle, David Riesman, Herman Kahn—these are but a few; and in the Caltech symposium, Harrison Brown, James Bonner, J. George Harrar, Norman Brooks, John Weir, Thayer Scudder, Athelstan Spilhaus.

What we have in these two volumes is a great deal that is important to know about present conditions, present structures, and present rates and their apparent relation to the recent past. We have much speculation about what the future might be, we have a good deal about historical “trends,” and of course vast amounts of material on rates: birth rates, death rates, production rates, rates of air miles flown, entries into the national parks, investment rates, rates of just about everything in any way amenable to quantity-statement.

Reading all of these volumes, we can thrill with repressed horror at the thought of the mantle of too too solid flesh that will one day cover the earth (a mantle that the physicist-population expert, Sir Charles Darwin, once told an audience, much in the manner of the old fashioned temperance lecture, would reach, present rates continuing, one mile in height by the year 3500, or was it 2500?). One can feel his toes trampled on as he reads that by the year 2000 there will be two people for every foot of waterline in the U.S. One can hypnotize himself into a state of driver-fury by merely reading about the 250 million automobiles (we now have about 59 million) on American streets and highways. The thought of 225 billion passenger miles to be flown by the airlines in the year 2000, in contrast to a 1960 figure of 35 billion, is enough to keep everyone home, which would indeed be a change. But change is not, alas, what these books are predicting; they are only extrapolating present rates, many of which remind one of a mad physiologist predicting giants at age twenty on the basis of growth rates at age ten.


Only the unwary will be deluded into thinking that any of this is in fact the future. There have been statistician-soothsayers, I am certain, in all ages. In ancient Egypt there must have been such individuals to compute the number of pyramids there would be on the earth two thousand years later; before that someone to compute the number of pterodactyls; after that, to compute the number of knights on horseback, wayfarer chapels, not to mention witches. It is a great game for the statistically-minded (like predictions year by year in the Pentagon of that infinitesimally small chunk of time represented by our engagement in Vietnam), and, as I say, I do not for a moment disparage it. It tells us about the present.

But to pretend, as many will pretend, that we are being treated to a look into the future—a look that is “hardheaded,” that is technologically beyond what lay in the capacity of a Marx or Spencer—is to pretend nonsense. We do not even, apparently, have it in our technological power to make “predictions” that will themselves remain steady within the computer-priesthood for more than the shortest periods of time. Thus, when Daniel Bell opened one of his sessions by referring to the quite recent Rand projections of what the year 2000 would be like, he was promptly counter-punched by Fred Iklé with the horrendous disclosure that Rand predictions “already look a little foolish”; that among other inaccuracies, the population of the world will be “seven or eight billion” by that year instead of the “five billion” projected by Rand. Well, one could, of course, write quite an essay, if he were sufficiently bad-tempered, about the predictions made, the hard data predictions, during my short lifetime by the demographers.



Let me be very clear. I am not finding fault with, much less depreciating or scoffing at, what demographers and other statistical analysts do by way of explicating the world we live in, its structures, processes, and rates. It is indeed important to know sometimes how big “big” is, how fast “fast” is. These activities are valuable, as are the technological devices that today accompany and reinforce them. They will not, however (and Daniel Bell makes this point in one of the frequent observations—aloof but committed, skeptical but engaged—with which he interlards the proceedings in Daedalus), supplant speculation, raw speculation, when it comes to predictions that go more than a very, very short time from the immediate present.

And the reason for this—I now go back to the Leibnizian law of continuity, which remains as powerful today in its impact upon future-predicters as it was in the 18th and 19th centuries—the ineffaceable, unconquerable reason for this is that the present is not big with the future. Nor, let it be well understood, was the past ever big with what is now the present. We are confusing continuity of chronology with continuity of circumstance and event. We are mistaking our metaphoric reconstructions of the past, by which we assuage the pain of our need for temporal order, for causal connection.

In the 18th century a distinguished physical philosopher (Laplace, I believe it was) declared that if he could but know everything in the present—every thought, motive, reflex, need, etc.—he could with ease and accuracy predict everything in the future. The surface plausibility of this is matched only by its lurking naïveté. Since it is precisely the kind of naïveté that reigns today, however, I do not apologize for introducing it. Nothing, I assume, could seem more certain to the individual for whom reality consists of the hard data of atoms, molecules, reflexes, social-security numbers, and the like than that absolute knowledge of the hard data of the present should yield—properly processed in the machines—knowledge of the future. But it won't and it never will. And the reason, I repeat, is that the present does not contain the future, the far is not to be found in the near, nor was our present ever contained in the past. Not if what we are concerned with is change.

True, institutions, structures, established ways of behavior, will extend themselves into the future just as geographic terrain will. There will be, barring catastrophe or the appearance of the Genius, or the Prophet, or the Maniac, or the Random Event, technology, schools, kinship systems, magazines like COMMENTARY, and so on. This is safe predicting, for all we are predicting is persistence. And I do not depreciate this, for we would be a lot farther along in our understanding of the actual dynamics of change if there were real understanding, real acceptance, in the theoretical terms of social science, of the phenomena of stability and persistence of ways of social behavior. Wilbert Moore, in a wise utterance in the Daedalus issue, calls attention to our typical exaggeration of change in society and to our common confusion of change—actual, significant change—with mere motion, activity, and movement.

For some reason it is difficult to discuss stability and fixity with most social scientists. They do not believe in it. I am invariably told, gently, condescendingly, that what I mistake for hard, unchanging persistence in time is my inability to discern the actual changes—subterranean, microscopic—that go on incessantly, in the form of minute adaptations, conflicts, stresses, and strains which in the long view might be seen to be as directional and cumulative as the infinitesimal variations studied by the geneticist. But conflicts, stresses, and strains do not by any means bespeak change; all they can confidently be said to be are—conflicts, stresses, and strains! It is remarkable, as Lewis Coser has shown us in his classic study of the functions of social conflict, how much conflict actually turns out to be not merely containable within social structures but actually supportive of such structures. How many billions of role tensions and conflicts there have been in the long history of Western kinship it would be impossible to guess; but changes, basic, fundamental changes, in kinship structure in the West have been few and far between, and these changes have clearly had far more to do with the external and adventitious impacts of the Random Event, the Prophet, the Genius, and the Maniac in history than with any accumulation of minute role—or status—changes within the kinship system or with what are generally called trends of change in society.

“Trends” are particularly suspect. A trend, the dictionary tells us, is the general direction taken by a stream, a shoreline, etc.; it is an underlying or prevailing tendency or inclination. These are all tempting words for the historian or predicter of societal development. How easy it is, as we look back over the past—that is, of course, the “past” that has been selected for us by historians and social scientists—to see in it trends and tendencies that appear to possess the iron necessity and clear directionality of growth in a plant or organism. We think of these “trends” as cumulative movements, as genetic sequences, as actually causal. We forget that they are, one and all, a posteriori constructs, frequently metaphoric in character, always post hoc, propter hoc.

But the relation among past, present, and future is chronological, not causal. As one looks at the various sequences of events—“clusters,” as Herman Kahn calls them—there is not a sequence or linkage that would not make just as much sense in causal terms were it in fact the obverse, were it any one of the literally hundreds of “patterns” that events and changes might as easily—and, in causal or genetic terms as “rationally,” “logically”—have taken.

What Emile Durkheim wrote on all this is profound and, obviously, still relevant: It is said, writes Durkheim, “that history has for its object precisely the linking of events in their order of succession. But it is impossible to conceive how the stage which a civilization has reached at a given moment could be the determining cause of the subsequent stage. The stages that humanity successively traverses do not engender one another.

“All that we can observe experimentally . . . is a series of changes among which a causal bond does not exist. The antecedent stage does not produce the subsequent one, but the relation between them is exclusively chronological. Under these circumstances all scientific prevision is impossible.” (Italics added.) Pace Marx, Comte, Spengler, Toynbee, and, with them, any of those currently concerned with the year 2000 who might believe that sufficiently intensive analysis of the present will yield knowledge of future change.


That one can, through religious or philosophical metaphor, summarize, encapsulate, on the basis of skillful selection of a few events and changes from the past, is not to be doubted; that one can form in his mind, through this type of selection and through one or another metaphor, “trends” that have vivid meaning to his mind is not to be doubted either. We may with Tocqueville see Western history in the terms of a leveling that sweeps away all class differences; with Marx we may see the same Western history in terms of class conflict that sweeps away all equalities. Or we may see history in terms of magnification of power, rationalization of power, emancipation from power. All of these are philosophical constructions with which we seek to impose meaning upon the essentially meaningless. To confuse “trends,” whether Marx's, Tocqueville's, Stuart Chase's, or Herman Kahn's, with processes that have in fact genetic continuity and causal connection in time is, however, to take the metaphor of growth much too seriously in its application to human behavior in time. The hypothesis of growth is useful only in the understanding of entities that do actually grow and develop—such as plants and organisms—and for all else it is either naive or dangerous.

As one who enjoys an occasional roll of the dice at Las Vegas, I have a model of change in time that seems to me much better than models drawn from organic growth. My model is based on what happens with the throw of the dice. I may, in a given evening, fancy all manner of trends and tendencies present in my dice-throwing. Hypnotized occasionally by either surpassing good luck or bad luck, I may imagine the existence of continuities too extraordinary to be explained by chance. Such continuities can even seem genetic. But my good sense usually comes quickly to the rescue. I know that such continuities are random and chronological, not causal. Even if I were to throw twenty naturals in succession I would know that no genetic trend existed; only that the laws of probability had been stretched. I would know that twenty or a hundred naturals would not, either in the experience or in retrospect, have, in their unbroken succession, the slightest influence upon what the next throw would yield. To predict anything on the basis of my five or twenty naturals in succession would be, plainly, impossible.
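The independence the author leans on here is easy to check empirically. The following short simulation is my own illustration, not anything from the essay (the function name is invented): it estimates the chance of a natural, a 7 or 11 with two dice, which has probability 8/36, on the throw immediately following a streak of naturals, and finds the streak makes no difference.

```python
import random

NATURALS = (7, 11)  # a "natural" with two fair dice: 8 of the 36 outcomes, P = 8/36

def p_natural_after_streak(streak_len, trials=300_000, seed=7):
    """Estimate P(natural on this throw | the previous `streak_len`
    throws were all naturals), by straight simulation of dice."""
    rng = random.Random(seed)
    hits = attempts = 0
    run = 0  # length of the current streak of naturals
    for _ in range(trials):
        total = rng.randint(1, 6) + rng.randint(1, 6)
        if run >= streak_len:          # this throw follows a qualifying streak
            attempts += 1
            hits += total in NATURALS
        run = run + 1 if total in NATURALS else 0
    return hits / attempts

print(p_natural_after_streak(0))  # unconditional estimate, near 8/36 ≈ 0.222
print(p_natural_after_streak(3))  # after three straight naturals: much the same
```

Raising `streak_len` only thins the sample of qualifying throws; it never moves the estimate away from 8/36, which is precisely the point the essay makes with the dice.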

Could we therefore predict nothing? Not at all. We could confidently predict that the dice game would go on and on (we could make this prediction on the basis of surprise-free assumptions with, I should think, quite a high coefficient of probability), that the gambling casino would also go on and on, that Las Vegas would continue to sprawl ever farther out into the desert (just as the pyramids once went sprawling over the Egyptian desert), and that the number of Americans visiting Las Vegas, at present annual rates of increase, would by A.D. 2100 reach the point where there would be twenty-eight little old ladies in front of each slot machine in contrast to the two invariably there now.

Somehow this kind of predicting isn't very exciting once your hopes have been raised by apparent continuities of the dice, but it is still prediction, isn't it? And it is the kind of prediction, like it or not, that you will get in books on the year 2000.

The crucial point, though, is this: What is significantly different and novel about American society in 1968 did not “grow out of” American society of 1868, fond though we may be of the lovely, thought-narcotizing metaphor of growth with all its comfortable words regarding genetic continuity, trends, causality, and the like. (One of the Daedalus participants observes that what is new and vital in science today—that is, what has changed in time—could not possibly have been predicted a generation ago. Of course not. What is manifest today was not then latent, was not “in” the science of that time. Why should matters be different elsewhere in culture?) True, a really surprising amount of what existed in 1868 is with us still, and it is this similarity that leads us to think of it all in terms of growth and continuity of change. But it is not continuity of change that has operated; only continuity in the sense of continuity of the river's bed, that is, persistence of the unchanging or the only slightly and infrequently changed. This is continuity too, but not the kind that is summarized in visions of trends and tendencies.


Let us be clear on two points. (1) Events do not marry and have little events that grow into big events which in turn marry and have little events, etc.; (2) small social changes do not accumulate directionally and continuously to become big changes. We pretend in our histories and sociologies that such is the case, but it is all a posteriori, suffers badly from an affliction known as the pathetic fallacy, and does more to assuage the pain of intellectual disorder than it does to throw light on the actual processes of social change. Yes, the theory of natural selection might also be called a posteriori and poor in predictive power, but somehow the evolutionary biologist has been more successful at linking micro-changes in their additive and cumulative succession to macro-changes of speciation than we in the social sciences have (or ever will!).

And, of course, the biologist doesn't have to deal as we have to with the Random Event, the Maniac, the Prophet, and the Genius. Maybe there are equivalents in the timeless, structureless, typeless world of the biologist's “population thinking,” but I doubt it.

It is very different with studies of change in human society. Here the Random Event, the Maniac, the Prophet, and the Genius have to be reckoned with. We have absolutely no way of escaping them. The future-predicters don't suggest that we can avoid or escape them—or ever be able to predict or forecast them. What the future-predicters, the change-analysts, and trend-tenders say in effect is that with the aid of institute resources, computers, linear programming, etc. they will deal with the kinds of change that are not the consequence of the Random Event, the Genius, the Maniac, and the Prophet.

To which I can only say: there really aren't any; not any worth looking at anyhow.

1 Toward the Year 2000: Work in Progress. This is a special issue of Daedalus reporting the first study papers and proceedings of the American Academy of Arts and Sciences' Commission on the Year 2000, of which Daniel Bell is chairman. The Year 2000, by Herman Kahn and Anthony J. Wiener (Macmillan, 1967); The Next Ninety Years (California Institute of Technology, 1967); The Most Probable World, by Stuart Chase (Harper and Row, 1968). In these books are represented some of the foremost minds today working in the sciences, physical and social.
