Every civilized person will recall the following dialogue between Christ and his judge:
"To this end was I born, and for this cause came I into the world, that I should bear witness unto the truth ...."
Pilate saith unto him, "What is truth?"
No answer is recorded.
In the nineteen centuries since Pilate asked his flippant question, thousands of replies have been given, no two much alike. Something seems to be loose; it is scarcely credible that there can be as much truth in the world as the professional purveyors of that rare luxury would have us believe. It is just possible that Pilate's question received no answer because it was meaningless. Christ was not the man to lose his temper and refuse to answer merely because the question was asked in a spirit of jesting cruelty rather than in an honest endeavor to learn something.
Until recently those who have taken it upon themselves to tell us what truth is have for the most part been theologians, philosophers, metaphysicians, cranks, and visionaries suffering from some disorder of the ductless glands, or combinations of two or more of these. Of late a new claimant has entered the list, and science now makes more noise, if no more sense occasionally, than all the others combined.
The claims of the newcomer are impressive, to say the least, the more so as much of the jargon in which they are expressed is unfamiliar to most of us. Particularly is this the case when the ultimate justification for some wild speculation or dismal prophecy is an airy wave of the hand in the general direction of mathematics. "Of course you wouldn't understand any of that; but you can take my word for it that it is all perfectly sound, and you may believe with full confidence that it is true. Why, it has been proved mathematically."
Having noticed many such lofty snubs being handed out to those who have no opportunity to look into the matter for themselves, I decided it was about time for someone who makes his living at this mysterious stuff called mathematics to see whether the latest answers to Pilate's question make any better sense than the old. It is not necessary to fling the wave equation of quantum mechanics, the indeterminacy principle, or even the field equations of general relativity, at any bewildered head in order to convince anyone outside of a lunatic asylum that the new answers leave us precisely where we were before.
The evidence can be presented from start to finish with nothing more abstruse than what any child in the eighth grade knows. This can be done by unravelling a single strand of the tangled epic of humankind's attempts to think straight, or even to think at all. Those who enjoy asking unanswerable questions may still amuse themselves by echoing Pilate, and no harm will come of it, provided they do not force others to believe the answers they get.
The history of sane thinking can be traced back by documentary evidence a full four thousand years, and by reasonable inference another one or two thousand years beyond that. We shall find that the steps at first were slow and confused. As we should expect from the histories of the arts, periods of intense brilliance and amazingly rapid progress alternate with rather longer stretches of dullness and apathy, where conservatism and tradition throttled liberal thought, and where no significant advance was made for centuries. Petrie's explanation (noticed in Chapter III) may account for the tide-like ebb and flow, but it is neither necessary nor helpful for our simple purpose to theorize about causes; the main facts are sufficient. Reading them, we shall be in a fair position to judge for ourselves whether Pilate has been answered by science, or whether he is likely ever to be answered by anyone. He may have been making what Russell calls "a meaningless noise."
As we follow the story down from the hazy past to the present, we shall see an ever-accelerating speed in the piling up of new guesses as we approach our own time. There is nothing miraculous in this. Nor need we congratulate ourselves that our age is a thousand times as fertile as its predecessors, merely because one year in the present decade spawns a hundred theories instead of the nine or ten per century a few hundred years ago. Our greater productivity is due partly to the fact that scores or hundreds of workers dig like demons side by side in a single narrow field which, only a century ago, was abandoned to one cogitating, dyspeptic hermit and the crows. This, by definition, is progress. Put a lot of goldfish in a bathtub, give them all they can eat, and naturally they will breed.
To understand Pilate's question and, if possible, either answer it or silence him, we must see in broad outline what human beings in the past 4000 or 6000 years have imagined "truth" to be. This will be done in later chapters. Incidentally, we must glance occasionally at what has been held as to the nature of "proof" and the basis for human beliefs of any kind whatever.
The questions thus raised are fundamental. This is not a rhetorical exaggeration; "fundamental" means, literally, "at the foundations," or "at the beginnings" of everything considered. It is no paradox to say that the utter simplicity of fundamental questions is what makes them difficult to grasp. "Fundamental" habits of thought are drilled into us from infancy, and it is only after much patient scrutiny that we can discover anything which may be taken hold of for closer examination. Complicated problems are much easier than simple, fundamental ones to handle.
To avoid even the suspicion of paradox or mysticism, I shall labor this point a bit and try to bring out its simple meaning by a personal reminiscence of a man who had rediscovered for himself one of the devices by which scientists and mathematicians make some of their most striking advances. Incidentally this story contains an invaluable hint for any beginner who may be hoping to make some contribution—if only a humble one—to the advancement of science. It may suggest to him one way of overcoming the initial difficulty which stops many, that of finding something to do that is radically new, not trivial, and within one's powers.
In my student days I kept hearing tales of one of the professors—I shall call him Z, as he is still going strong, and I have no wish to embarrass him—who was a wonder for the amount of high-grade scientific research which he turned out apparently without the slightest effort. His seminar, where he developed his ideas in plain view of anyone who cared to watch, was jammed, mostly by men on the lookout for some hint of a good problem. Z dropped hints all over the place, and he never seemed to mind who picked them up or what became of them. His stock was inexhaustible.
Three of us, trying to start in another field and caring nothing at all about Z's brand of mathematics, decided to drop in on his seminar to find out, if we could, the secret of his prolific success. The first month passed in total darkness. Z was finishing up some elaborate investigation or another, and we got not the slightest idea of what he was talking about. Nevertheless we hung on, in the hope that the night would not last forever. Our hope was not disappointed. Running out of interesting ideas in what he was doing, Z very sensibly dropped the subject like a hot brick. He was not the man to waste months embroidering something that any competent worker could pick up and slave over for years. "That's enough of that," he remarked in the middle of an involved proof. "It's gone stale on us and I'm sick of it. Let's drop it and look for something fresh. Any suggestions?"
One man half jokingly proposed a seemingly trivial subject that everyone in the room had known all about since his first year in high school.
"All right," said Z, "let's see what's in it."
We all saw the truth of the trite proverb, that it is the first steps which count, developing before our eyes. While we looked on, Z proceeded to write out, in minute detail, every assumption from which the trivial subject proposed is developed in the schoolbooks. Even statements so universally accepted as "things which are equal to the same thing are equal to one another" were not omitted. To most of us this wealth of detail seemed a waste of Z's time and ours. The collecting and writing out of these fundamental assumptions took the full working time of four days. Z wanted to be sure that he left out nothing, no matter how obvious it might seem.
Satisfied at last that everything was nailed down, Z proceeded to the next step. This was the step that gave him something new. Each of the long list of assumptions was an "axiom" or "self-evident truth" which some of us had accepted as a necessary element in any sane thinking about the subject, and most of them were simple subject-predicate declarations. Z pointed out that there were at least three possibilities with regard to each of these "axioms": an axiom might be contradicted or denied; it might be erased, ignored, and not used; it might be either slightly or radically modified. Any one of these possibilities would violate our accustomed habits of thought. Common sense also protested strongly against the whole seditious business. Nevertheless Z went ahead and proceeded to modify the simplest one of all the assumptions very slightly. That was the only change he made, but it was, as I have intimated, a flagrant breach of common sense. Seeming to feel that some of us were ill at ease, Z comforted us with a remark which left us rather deflated: "Common sense is not what you need if you're going to find out anything worth knowing; it is uncommon sense." However, having seen him do the trick, I fancy that those who have any sense at all can find out something for themselves, provided they have the interest and patience.
The next steps took many weeks of slow, careful labor. The consequences of the slightly modified set of assumptions were developed with great care and thoroughness—by the laws of common logic. Nothing very exciting appeared. The original subject and its almost-identical twin were so much alike that we grew to loathe the sight of both of them. The original had been plain enough for any taste; two of her were a surfeit.
Then, gradually, we began to catch distant glimpses of a vast unexplored territory. Our new road was almost parallel to the old, well-travelled highway; the extremely slight divergence, if pursued far enough, must ultimately take us into regions whose existence no traveller of the old road had ever suspected. Confident that he was at last penetrating rich new territory, Z went ahead as fast as we could follow. Ten months after his first, careful steps, he had put us in possession of a boundless new province of knowledge, not one of whose landscapes resembled in the slightest degree the somewhat drab country from which we had set out. He had, in fact, made a radical advance. To make a radical advance one must tamper with the roots, not with the branches, of the tree of knowledge. That is what Z did. We shall see the same sort of radical advance in basic thinking gradually developing as we trace the thread of deductive reasoning through its long history.
It is easy enough for a competently trained man to pick a pretty flower or a glossy leaf off someone else's tree. To create a new tree of one's own, one must get at the very life of the tree itself and interfere with the seed. This demands more patience and possibly more insight than does the choosing of even the most beautiful flower on a growing tree. But the effort is rewarded many thousandfold.
The difficulties in the way of radical advances in thinking are chiefly two: our inability to throw off traditional patterns of thinking or reasoning which we have acquired through years of hard work at school, and the natural difficulty of taking hold of anything fundamental that is so completely simple that nothing sticks out from it to offer a hand-hold. It is something like trying to pick up a smooth, heavy box without handles. Again, the more familiar an "accepted truth" is, the harder it is to dispute, to modify, or to doubt. Yet that is precisely what we must do if we are not to remain content with what we have inherited from the dead, inadequate past.
To illustrate the difficulty of radical advances, one such may be recalled here. Is it not "obvious" that two events can happen in different places at the same time? To assume that such is the fact is sufficient for most human purposes, but is insufficient for some parts of physical science. Until 1905, when Einstein analysed this simple assumption about the simultaneity of events in different places, no sane mortal had seen anything suspicious about it. It is so simple that it must be a "universal truth."
But suppose we ask ourselves how we would set about establishing the "fact" that Jones shot Brown in New York, at exactly 12 p.m., last Monday and that Claude shot Philippe at precisely the same instant in Paris. We should need wireless signals, clocks, and other apparatus to give our procedure a meaning. In short, we should be able to describe some set of physical operations, which we or other human beings could perform, which would give us the two shootings "at the same time"—the same time as measured on our clock, and not as imagined in the mind of some mathematical philosopher. This might be possible. But, if we think it out, we begin to doubt the possibility if Jones and Brown, instead of being motionless in New York, should stage their event on an asteroid shooting through space so fast that wireless signals would be slowed down in communicating with it. The argument is presented in detail in any book on relativity. All we need note here is that this particular "obvious truth" is neither obvious nor necessarily true when picked to pieces. By analysing it, Einstein was led to his invention of the special theory of relativity—a radical advance. It violated common sense and tradition. The radical advances made in the technique of cold reason during the past four years are, at first sight, equally repugnant to common sense and accepted tradition.
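The quantitative core of the argument alluded to here is the Lorentz transformation: two events that are simultaneous in one frame (time separation zero) but spatially separated are not simultaneous for a moving observer, who measures a separation of γ(Δt − vΔx/c²). A minimal sketch with illustrative numbers—the distance and the asteroid's speed are assumptions chosen for the example, not figures from the text:

```python
import math

C = 299_792_458.0  # speed of light in metres per second

def time_offset(dt, dx, v):
    """Time separation of two events as judged from a frame moving at
    speed v, given their separation (dt seconds, dx metres) at rest."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * (dt - v * dx / C**2)

# Two shootings "at the same instant" (dt = 0) in New York and Paris,
# roughly 5,840 km apart, judged from an asteroid passing at half the
# speed of light: they are no longer simultaneous.
print(time_offset(0.0, 5_840_000.0, 0.5 * C))  # about -0.011 seconds
```

The sign of the result even depends on the direction of motion, which is why "at the same time" has no observer-independent meaning for spatially separated events.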
Since the beginning of this century the physical sciences have made one radical advance after another. The fundamentals of the science which our century inherited from its predecessors have been modified, now slightly, now quite perceptibly, till our outlook on the physical universe today bears but little resemblance to that of only thirty years ago. Great and striking as these advances are, there has been another, most rapidly developed since 1930, which has been slowly gathering momentum for all of 2300 years, which is of far deeper significance for "truth"—or Pilate's query—than any of the radical advances of science of the past thirty years. Being more fundamental, more radical, and simpler than any of the spectacular advances in science, naturally this new advance has escaped the notice which its far-reaching importance merits. Yet it is of profound significance for all theorizing and truth-seeking, scientific or other.
It seems reasonable that those whose interest in scientific advances is chiefly that of the onlooker interested in all progress should be keener for the theories of sciences than for the detailed comprehension of the laboratory technique behind the theories. Only a man who spends most of his life trying to locate elusive leaks in vacuum pumps can get any real thrill out of a minute description of complicated apparatus and the endless refined measurements which are the justification of his existence. What the man outside the laboratory wants to know is what all these measurements "mean" in terms of the world or the universe at large. Leaving aside the apparently inevitable applications to peace and war of the most far-fetched or seemingly useless experiments, the unprofessional follower of science is interested in the broader philosophical implications of the scientific theories and speculations growing out of the experiments. This obviously is a matter for reasoning rather than for experimenting to handle.
For instance, we have heard much in the past two years of the "expanding universe." It is a fascinating theory. The whole universe of spiral nebulae—"island universes," each with its millions upon millions of intensely brilliant suns—is expanding like a soap bubble. In the not infinitely distant future this universe will have dispersed into the depths of space like a cupful of bees in a summer sky. That is one of the possibilities predicted by the mathematics. Another is that the whole swarm will alternately contract and expand, pulsating like a cosmic jelly fish. Another is that all the nebulae (our own Milky Way included) will rush together, to flash up and out forever in one transcendent blaze of annihilation. This one, however, is less "mathematically probable" than either of the others.
To continue with the universe for a moment, let us follow the father of the expanding theory back for a step or two on his bold exploration of the abyss of time. If the universe is expanding—notice the cast iron logic, the reasonableness of the conclusions—it must have been more condensed, more concentrated, at some time in the past. As it may be expressed, the nebulae must have been crowded into a smaller space than that which they now occupy. Knowing the rate at which the expansion is taking place, we can easily calculate how many years ago all the nebulae were packed together, or if not that exactly, the approximate date at which the nebulae began boiling out of whatever matrix they may have been in when they were created. That is, we can fix the date of the creation of the material universe. All of this, notice, is reached by strict deductive or mathematical reasoning from hypotheses. Deductive reasoning is the kind in which we shall be chiefly interested. It is the breeder of theories, and it is the kind of reasoning that has been revolutionized since 1930.
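The "easy calculation" is just the reciprocal of the expansion rate: if every nebula at distance d recedes at speed v = H·d, then all of them coincided roughly 1/H years ago. A minimal sketch, assuming the modern value of about 70 km/s per megaparsec for the rate (the estimates of the 1930s were several times larger, which is why the early expansion ages came out embarrassingly short):

```python
# Estimate the "date of creation": if each nebula at distance d recedes
# at speed v = H * d, the naive expansion age is simply 1 / H.
# The value of H used below is an assumption for illustration.

KM_PER_MPC = 3.0857e19     # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7

def hubble_time_years(H_km_s_per_mpc):
    """Return 1/H expressed in years: the naive age of the expansion."""
    H_per_second = H_km_s_per_mpc / KM_PER_MPC  # convert H to 1/s
    return 1.0 / H_per_second / SECONDS_PER_YEAR

print(f"{hubble_time_years(70.0):.2e} years")  # on the order of 1.4e10
```

With 70 km/s per megaparsec the naive age is about fourteen billion years; with the roughly 500 of the early measurements, under two billion—less than the age of the Earth, a discrepancy the speculators of the day had to live with.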
Let us follow our guide another step or two. The outstanding riddle of physics today is the problem of cosmic radiation. Intensely penetrating rays of some kind have been detected shooting through our atmosphere in all directions. What are they, what causes them, and where do they come from? These questions are not yet answered scientifically. But see what a beautifully logical, simple answer we get if we interrogate the expanding—or rather contracted—universe.
Radium, we recall, emits intensely penetrating radiation, and in doing so gradually loses its mass. Given time enough, a lump of radium would dissipate itself away to a microscopic speck. Now imagine the matrix out of which all the stars and nebulae swarmed when the universe began expanding. What could it have been like? To jog our imaginations, let us recall that radioactive elements (like radium, for example), in emitting their radiation, break down into simpler, less radioactive elements. With this hint we can proceed. The universe, when it was created, was nothing more nor less than one gigantic atom of an intensely radioactive element. Indeed this monstrous atom was so supernaturally active that it forthwith began exploding in a blaze of cosmic radiation. As the rays shot off in all directions—these rays are still shooting; physicists find and measure them every day—the single, "noble" element of which the atom was composed began breaking down into baser gold, quicksilver, copper, iron, tin, oxygen, hydrogen, etc., etc., etc.,—in fact into the 93 elements which chemistry has discovered or suspected. Thus at one step we have given a consistent, logical explanation of the creation of the universe, the origin of the familiar chemical elements, and the genesis of cosmic radiation. And, to repeat, all of this has been done by perfectly sound mathematical, deductive reasoning proceeding from hypotheses (assumptions), which in their turn have been framed to fit a certain narrow range of experimental data (observations) made by experimentalists working with tangible clocks, meter sticks, telescopes, and so on, in laboratories and observatories constructed, for the most part, of steel and concrete. All this, surely, is solid enough. What could be solider?
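The dissipation described above follows the exponential law of radioactive decay: after each half-life, the mass of the active sample is cut in two. A minimal sketch using radium-226, whose half-life is about 1,600 years—the function and its sample numbers are illustrative, not drawn from the text:

```python
def remaining_mass(m0_grams, t_years, half_life_years=1600.0):
    """Mass left of a radioactive sample after t years.

    The default half-life is roughly that of radium-226 (~1,600 years).
    """
    return m0_grams * 0.5 ** (t_years / half_life_years)

# A one-gram lump after ten half-lives: about a thousandth of a gram,
# well on its way to a "microscopic speck."
print(remaining_mass(1.0, 16_000.0))
```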
One step more, and we shall come to the end of this true parable. If the magnificent sweep of this sublime speculation obviously embraces the destiny of man and the universe—as it does—must it not also overshadow the loftiest aspirations of man's spiritual nature? After all the other steps we have so successfully taken we need not boggle over this, the last. Whether we personally take it or not does not much matter here. Others have; we shall greet one of them in the next chapter.
Now for the conclusion of this inspiring parable. As I write this, it is 6:30 p.m., March 30, 1934. This date may prove to be of some importance in the history of speculative science, not because I am writing this, but because of what began happening exactly two hours ago. At 4:30 sharp, the man upon whose cautious, unspeculative astronomical work—observations—the speculators on expanding universes of any kind base all their speculations, began reporting to a group of scientific workers the carefully analysed results of his brute-fact observations for the past several months. Most of those who heard this man's report had known for months what to expect; the two or three gentlemen from the press did not. As the report was delivered in severely technical language from start to finish, it is unlikely that tomorrow's morning paper will contain more than a line or two mentioning that so-and-so reported.
The expanding universe was mentioned only once, and then by someone who asked a question. But the report could have been played up as the latest scientific verdict on the theory of the expanding universe and all the theories—physical, astronomical, cosmological, theological—to which that prolific speculation has given birth.
What is the verdict? Simply this. The experimental data, so far as they have been obtained, will support any one of three theories equally well. These theories are: (1) The universe is expanding; (2) The universe is not expanding; (3)—this one may be omitted, as it leads into technicalities which have nothing to do with the point of this parable. To decide between the three, further data will be necessary, and such are not likely to be forthcoming (although there is a possibility of finding some other experimental answer) until the 200-inch telescope, the mirror of which is still in the annealing oven, is doing business—say in ten or fifteen years. So if anyone asks today, "Is it really true that the universe is expanding like a gigantic soap bubble?", all we can answer is "We don't know; but if it gives you a thrill to imagine that it is, then you may speculate on the possibilities to your heart's and your soul's content." And if the questioner protests that the expansion has been "proved mathematically," he may be reminded that Lord Kelvin, one of the leading physicists of the Nineteenth Century, "proved mathematically" that flight through the air in a heavier-than-air machine is impossible.
From this example I think it can be seen that any discussion of a revolution in deductive reasoning (the kind that breeds theories) is likely to raise several questions of considerable interest.
We have glanced at the peculiar difficulties inherent in any attempt to make a radical advance in thinking, and it was mentioned that one such advance has been made within the past four years. Our unravelling of a single strand in the tangled history of exact thinking will lead us naturally to an outlook on this advance. The single strand which we shall follow is the history of deductive reasoning, or, as it is sometimes called, cold reason. This includes logical thinking and the subtle process by which mathematics reaches its (apparently) profound conclusions. Being fully aware of the phobia which most non-mathematicians have for symbols, and sympathizing with this justifiable fear of the unfamiliar, I shall avoid symbols entirely. I said awhile ago that the whole argument can be followed from beginning to end without technicalities of any sort, and I meant it. This is exactly what we should expect: the points to be seen are fundamental and therefore not complicated.
Between the experimental science of the laboratory and the fascinating speculations on the human significance and cosmic import of the humdrum experiments yawns a bottomless abyss, swirling from brim to brim with the dun-colored clouds of metaphysics, outmoded theology, and woolly mysticism. Between the walls of the smoky chasm, and sagging perilously down into the general fog, stretches a dizzy bridge no more substantial than a strand of cobweb. This is the tenuous cable to which daring speculators trust their scientific reputations (more precious to them than their weary lives) when they pass from the unexciting certainties of their laboratories to the exciting uncertainties of cosmic speculation. Many fall off the cable—cold reason, it is called—before they get halfway across and are heard of no more where reputable scientists should be heard of. Some, however, wisely shut their eyes the moment they leave their laboratories to venture out on the cable. These make the passage safely (except one who fell off recently half a yard from the beginning and became a bishop). The joyous noise of prophecy and the chanted revelation of the signs and wonders these successful acrobats visioned in the smoke of the abyss more than compensate for the profound silence of less adventurous men in the laboratories—who know the facts. Because these bold adventurers have reached their striking or inspiring conclusions by following the path of rigid, deductive mathematical reasoning, therefore those conclusions are true, no matter what material evidence some dull churl nursing a vacuum pump may adduce to the contrary. Such is the claim of the ardent speculators.
If it were found that the bridge across the abyss—the age-old fabric of strict deductive reasoning—is not what those who have used it for two thousand years or more have imagined it to be, would our belief in the truth of the conclusions reached by the seers and prophets undergo a change? I personally think not.
Those in whom "the will to believe"—in William James' phrase—is inborn, are independent of reason so far as reason touches their beliefs. At the other extreme are those to whom belief of any sort is a disregarded luxury seldom enjoyed. So neither the first nor the last need be disturbed when they see that our outlook on strict, deductive reasoning, and our estimate of the validity and universality of the "truths" attained by such reasoning, have suffered a radical change in the past four years.
It will be granted, I hope, as we reach the end of the whole story, that altogether too much faith has been put in the "truth" of results reached by mathematical or logical reasoning, or indeed by any other process of rigid, deductive reasoning. It was on this type of reasoning that theology leaned in the Dark Ages, with disastrous results to itself and to human society, to say nothing of common decency (as we shall see in a later chapter); and it is on this same type that scientific speculation and scientific mysticism lean today. A blind belief in the absolute, superhuman "truth" of results reached by so-called cold reason has bred, and continues to breed, superstitions as pernicious as any that ever cursed our credulous race. A little patience in following the story through the 4000 or 6000 years about which anything definite is known should convince anyone that this belief is the last superstition blocking our way to a sane outlook on the universe—whether it is expanding, or contracting, or just sitting still.
Ever since the Great Depression laid our sinful world by the heels there has been a loud chorus from the right: "What the world needs is a Great Revival of Belief." Of late the chorus seems to have been growing fainter. "The devil was sick, the devil a monk would be; the devil was well, the devil a monk was he." Whether the world would be better off if it believed more than it does now or did in the past is a question that no human being can answer. The only way to decide the matter would be by a strict application of the scientific method: Experiment, and find out. But whether we shall ultimately believe more or less than we do now, particularly as to the kind of "truth" which Pilate no doubt thought he was talking about, I take it that there can be no harm in trying to understand on exactly what grounds we hold our beliefs. This we shall endeavor to do as we go on, but again only so far as concerns those of our beliefs which have been based on the results reached by the strictest kind of reasoning.
Why do some believe that a particular scientific or mathematical theory is "true"? Notice the word "theory." A theory is not a fact of observation—witness the magnificent theory of the gigantic radioactive atom that gave birth to the universe. The "truth" of a theory cannot be appraised until we have some sort of an agreement on what "truth" is to mean, and some estimate of the efficacy of our reasoning processes for reaching this particular kind of truth. Such an agreement was reached, we shall see, sometime before the year 4000 B.C. Is the "truth" in any sense absolute—that is, eternal, everlastingly indestructible, and superhuman? Does this thing, or quality, that we have agreed to call "truth" exist somewhere, somehow, independently of all human efforts to create it? Is it natural—as human speech is natural, or is it super-natural? Are there such things as "eternal truths"? Is there such a thing as "truth"? When we invoke the "eternal verities," might we not as well shake our fists at the sky and shout "abracadabra"? Do any of these questions mean anything at all, or are they just meaningless jumbles of words that sound like meaningful questions but are, in fact, nonsense, like this seemingly profound query: "Is virtue more identical than beauty?"
Whether or not any of these questions are more than meaningless noises, they are of the sort that can be disposed of, one way or another, by those who are willing to take their courage in their hands, grasp what the last few decades have given them, and resolutely turn their backs on the great traditions of the past. In passing, why must we always be looking back over our shoulders as we race forward? Would not common sense suggest that we look out occasionally for unforeseen obstacles lest we come a cropper? But to go on.
These questions, meaningful or otherwise, as the case may be, are at least among those that philosophy has considered of some interest. In stating that they can be settled or discarded once for all, I fully realize that almost any philosopher will brand the statement as brash. All I can offer at the moment in reply is a request to examine the evidence. Some of it will be presented as we go, in later chapters.
Both patience and tolerance in no small degree are demanded of anyone who makes an honest effort to appraise anything fundamentally new. The first time we hear of a heresy which traverses our acquired habits of thinking and upsets the traditions of centuries, we may be inclined to turn a deaf ear. A repetition only makes us impatient. But if we have any just claim at all to be considered rational beings we should at least examine the evidence for and against a particular heresy before lighting the faggots under the heretic. The bigots of the Middle Ages were ready enough with the torch; had they been equally quick with their tolerance and their wits, many pioneers to whom civilization owes much of such human decency as it has would have died in bed instead of at the stake.
A brief indication of a few of the main steps which we shall take may make the trend of the general argument easier to follow. As has been emphasized, it is the simplicity, the fundamental character, of the advances described that makes them somewhat elusive to follow at times.
Like the rest of recorded history for which we have definite evidence, the story of straight thinking begins in Egypt. The date is 4241 B.C. In that year, or before, human beings must have taken the first (and most important) step toward logic, mathematics, and deductive reasoning in general. The evidence for this assertion will be presented in its proper place. Babylon also, somewhat later, made tremendous advances. This, however, is one of the developments that must be left aside, as its elaboration would demand more arithmetic than most of us relish.
One point about the first Egyptian advance is of the highest interest and significance for the entire subsequent history of exact thinking. This is the early emergence of fundamental questions and as yet unsettled difficulties inherent in all consistent thinking, if not explicitly, then certainly implicitly. Doubts which trouble us today, and which, historically at least, are responsible for the advances of recent years, are implicit in some of the earliest Egyptian work. These doubts entered with the earliest attempts to talk sense about the so-called "Infinite"—the unbounded, the unending, the uncountable. Mystics of course tell us that "only infinite mind can comprehend the Infinite," but this assurance is somewhat unsatisfying to those who must try to reason consistently about infinite collections of things. It will be seen, in passing, that mathematical analysis, as it at present exists, is based upon the notion of the infinite, and that mathematical analysis is the principal tool used in the fabrication of scientific theories. The theological infinite also has been influenced by mathematical speculations.
Attempts to reason consistently about the "infinitely great" or the "infinitely small" played a tremendous part in the cast-iron formulation of the classical "laws of thought" which we owe to the Greeks, particularly Aristotle.
After the splendid start made by the Egyptians about 4241 B.C. there is a dark void of 2400 years in the reliable history of exact thinking. The thread is picked up again for a moment about 1800 B.C. There we lose it, till it reappears in the Greece of the Sixth Century B.C. where we meet Pythagoras. His great contributions will be noticed in some detail.
The importance of the Golden Age of Greek thought for our particular purpose is somewhat different from that which it may have in other, less narrow, enquiries. To state the matter bluntly here, the supreme importance of the Greece of Aristotle, Plato, and Euclid for the history of abstract thinking is this: in that Golden Age were forged the chains with which human reason was bound for 2300 years. Here, for the first time, we see the beginning of the stifling curse of tradition. It is that curse which is important for our attempt to understand what is happening today, and not the concomitant blessings (such as they were) which accompanied the curse.
All of us, no doubt, have learned at school and elsewhere to appreciate the magnificent things the Greeks did in mathematics, astronomy, philosophy and the humanities. That side of the story has been told often enough, and there is no need to repeat it here. The other side is less familiar. That side is the one of interest to those who would try to appreciate the first fundamental advance in rigorous thinking that has been made—in 1930—since the time of Aristotle (384–322 B.C.). So it is not in any spirit of flippancy or crass disrespect that we emphasize the shortcomings, from a modern point of view, of the Greek way of looking at the world of thought. If we are ever to get on to tomorrow we cannot everlastingly be looking back at the fast-fading hues of today's sunset.
Mathematicians, or at least some of them, have realized this hard fact for many years. Occasionally they lose patience with those who insist that nothing essentially new has been done in the art of thinking, especially mathematical thinking, since the Greeks. Last week I heard one such outburst which is worth repeating here. It happened at a public lecture, followed by questions and discussion, on recent advances in thinking. The audience was about equally divided between those chiefly interested in science and those whose taste is for philosophy. One of the philosophers during the discussion remarked that "After all, Plato saw clearly the fundamental notions of mathematics and fixed the course which mathematics follows today." This was too much for a distinguished mathematician who happened to be present. Leaping to his feet, and speaking with great emphasis, he said: "Mr. Chairman! I should like this audience to know that Plato had not the faintest conception of what mathematics is or what it is all about. Plato knew no more about mathematics, as a modern mathematician understands mathematics, than Confucius knew about gasoline engines." Although I have no clear idea of what Confucius knew about gasoline engines, I have a feeling that the distinguished mathematician was right, for I do happen to know some of the things which Plato said about mathematical "truth," and I have spent some of the best years of my life watching the development of mathematics in its own right and in its applications to science.
Into the first 2400-year blank between Egypt and Greece we shall insert an account of what may have been responsible for the Greek habit of verbalizing the universe. Instead of consciously and persistently interrogating nature by putting her on the rack of experiment, the great thinkers of Greece preferred to make assumptions about what she might have said had she been questioned, and from these assumptions they deduced, by the strictest kind of reasoning, exactly what nature ought to be like (even if she isn't), precisely as some of the modern scientific speculators do with the more sophisticated magic of modern mathematical analysis. The analysis itself is irreproachable for the purpose in hand. However, if the assumptions (postulates, hypotheses) from which the analysis proceeds fail to accord with the results of observations in the laboratory, it seems unlikely that the theories manufactured from the hypotheses can be a very faithful picture of the universe.
A slight difficulty which is soon lost sight of is the fact that most of the hypotheses are grossly simplified or highly idealized approximations to the facts of observation, for the simple reason that the actual situation is so hopelessly snarled that neither mathematics nor logic can do anything with it in the present state of either. This simplification, as any scientific worker knows, is perfectly legitimate as a means for suggesting something new to try in the laboratory, and few scientists themselves believe in the "truth" of their idealized theories. Tomorrow or the day after may turn up a new fact that will modify the theory out of all recognition. In the meantime the false picture may have been added to the stock in trade of those enthusiastic speculators who undertake to tell us what the truth of science is, and who base their prophecies of the future of man and the universe on some fantastic guess that reputable science repudiated months ago.
At the risk of being tedious by more elaborate repetition of the same point later, I shall state here the three so-called "Laws of Thought" of Aristotle. First, "A thing is itself." Second, "A statement is either true or false." Third, "No statement is both true and false." These are not quite orthodox, but they bring out the aspect of the "laws" which we wish to emphasize (the usual statement is given in a later chapter). Our inherited patterns of reasoning are woven around these three, particularly the second and third.
Consider the second of these laws for a moment. Who but a maniac would deny that "a statement is either true or false" is a necessary rule of any system of reasoning which is to lead to consistent results? By "consistent" we mean here that the results do not violate the third law; namely, we do not reach the conclusion that some statement is both true and false. For example, we are not to reach the contradictory conclusions that "twice two is equal to four," and "twice two is not equal to four." To go on with the second law. On the supposedly unshakable rock that a statement is either true or false our cathedral of reason has been reared. Neither the Greeks nor ourselves (until recently) could conceive of a foundation in which this second law was not embedded as one of the cornerstones. Nobody but a madman (we believed) would try to get on without the second law.
What is the fact? The second law not only is not necessary for consistent reasoning, but is definitely unusable in vast regions of modern mathematics where its use, if attempted, produces flagrant contradictions—in violation of the third law, which for the present we have agreed to keep intact. This may sound like cubism or something equally bizarre in the realm of reason; but it cannot be helped, and if we are to be forced into this position the sensible thing to do is to accept the fact and see what can be made of it. The parts of mathematics where the second law is illegitimate are precisely those dealing with infinite collections—the parts of the greatest use in science and practical affairs, and also the parts from which the more daring speculators take off for their flights through the invisible universe.
It has been found that we can reason consistently (that is, retaining the third law) in patterns in which it is denied that a statement is either true or false. This in itself is no novelty. It is a quarter of a century old. As modern science and modern mathematics go—so rapid is the advance—anything a quarter of a century old is far on its way to becoming prehistoric. I have mentioned the matter here, out of its proper context, because it was one of the things directly responsible for the great advance of 1930. In that year was constructed, for the first time, a system of strict, deductive reasoning entirely different from that of Aristotle in which most of us are accustomed to reason. The famous three laws no longer enjoy the unique status which they maintained for all of 2300 years as the necessary rules of all consistent, fruitful thinking. Since 1930 scores of such alternative sets of rules have been constructed.
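The reasoning sketched above can be made concrete with a small illustration. What follows is my own minimal sketch (not anything from the text itself) of the three-valued logic of Łukasiewicz, one of the alternative systems of the kind described, in which each statement takes exactly one of three values, so no statement is ever both true and false (the third law survives in that sense), yet "p or not-p" is no longer always true (the second law is given up):

```python
# A sketch of a three-valued logic in the style of Lukasiewicz (c. 1920).
# Truth values: 1 = true, 0 = false, 0.5 = an intermediate "indeterminate".
# The connective definitions below are the standard ones for this system.

VALUES = [0, 0.5, 1]

def neg(x):        # not-x
    return 1 - x

def disj(x, y):    # x or y
    return max(x, y)

def conj(x, y):    # x and y
    return min(x, y)

# The second "law": is "p or not-p" always fully true?
excluded_middle = [disj(p, neg(p)) for p in VALUES]
print(excluded_middle)   # [1, 0.5, 1] -- fails for the middle value

# The third "law" holds in this sense: every p has exactly ONE of the
# three values, so no statement ever counts as both true and false.
```

The point of the sketch is only that such a system can be written down and reasoned in consistently; the intermediate value is never simultaneously "true" and "false," so no contradiction of the forbidden kind arises.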
The bearing of all this on Pilate's question becomes clear as we follow the history of cold reason through the Dark Ages. The thread reached Europe via Alexandria, by one of the most ironic accidents in history.
If my account of how the Middle Ages succeeded in hanging themselves with the perfected rope of reason which Greece bequeathed them seems rather unsympathetic, my excuse is what those rational bigots did to men like Roger Bacon and the fact that certain cults of modern mystics have definitely turned their faces and their minds back to the good old days of the Thirteenth Century. I may be wrong, but I believe we shall make a grave mistake if we recapture the devoted and sincere faith of the Middle Ages at the expense of our common sense. However, anyone is free to make his own choice, and I am not trying to "sell" any brand of reason or unreason whatever, but merely to give a short account of what has led up to our present outlook on all reason.
As we follow the thread through the Middle Ages we shall see reason—Aristotle's brand—enthroned on a level with God himself. Because Aristotle's system was the only one (as they imagined) by which human beings can reach consistent results, therefore that system must be the embodiment of some eternal, supernatural, superhuman truth. They had discovered truth at last: the logic of Aristotle was it. They even said that God himself could create anything except what was contrary to the laws of logic (Aristotle's logic). Thus logic—cold reason—was the ultimate reality, higher even than the God they worshipped, and it alone was the arbiter of fate and the body and soul of Truth. It has taken us long enough, God knows, to escape from this stupid nightmare, but at last we are free. The experiences of our race in Europe from 300 to 1500 A.D. should make us reluctant ever to discover another Absolute Truth.
After the Middle Ages nothing of first rate importance for our following of the thread happened till the year 1826, when the Greek tradition was first definitely broken. Technically the great tradition went to pieces over the matter of elementary geometry—the kind our children still learn at school. The technical part is of only minor importance for our main purpose—that of following the thread of strict reasoning through the centuries. What was of major importance in the advance of 1826 was the courage which the brilliant success of that advance gave mathematicians and scientists in general to question, deny, or dispute the fundamental assumptions in any field whatever of human activity. Next to the great first step taken by the Egyptians, this justification ("by works") in 1826 of sane, rational skepticism regarding traditional beliefs was, I think, the most important advance our race has ever made in its attempt to understand its reasoning processes. Rational skepticism is not the merely destructive engine that those who fear or dislike it imagine it to be. Without sane doubts now and then we should quickly revert to the Stone Age—or worse, possibly, to the Middle Ages. Doubt is often creative.
After this great step of 1826, progress accelerated at an ever increasing rate. In one of Henry Adams' books, The Phase Rule in History, there is an interesting diagram which aims to show the ever faster increase of invention and scientific discovery from the dawn of history to our own day. At first the rise is barely perceptible. As we approach the present the curve suddenly shoots up, almost out of the picture. Whether professional historians would accept the diagram as a reasonably accurate presentation of the facts may be doubted. Nevertheless, on a smaller scale, a similar diagram would give a fair picture of what has taken place since 1826 in mathematics. As we come to the year 1900 there is a short pause. Then, about 1910, the rate picks up again, rising rapidly to 1930, when it takes a sudden leap. Note in passing that knowledge advances in spite of depressions.
In the last chapter but one we shall sum up what we find and briefly indicate its possible significance for all theorizing and speculation. With the whole story before us, I think we shall agree that a suspended judgment on matters beyond the reach of human experiment or human experience is a saner frame of mind to cultivate than an eager acceptance of every passing speculation, merely because the speculations happen to be masterpieces of the creative imagination which appeal to our childhood instincts for an escape from the everyday world into the entrancing possibilities of fairyland. Our estimate of proof will have undergone a change. Unlike the mediaeval saint who exclaimed, "O Lord, only prove to me that it is impossible, and I will believe,"—a bargain grossly unfair to both parties at the time—we shall rather be moved to demand: "Show me your assumptions." And if we catch anyone trying to "prove" something for our edification while carefully concealing his hypotheses behind his rhetoric, we may be moved to tell him to whistle it. "Examine the assumptions" is a good working rule to practice before swallowing any theory in part or wholly. As this is another of those simple, fundamental things which are sometimes less obvious than they might be, let us examine it for a moment.
More than one philosopher has reproached the "mathematical mill"—deductive reasoning in its strictest form—for producing nothing "truer" than the assumptions from which the elaborate arguments start. We get out of the mill only what we put into it. If our assumptions conceal spectacular impossibilities, our mathematically deduced theories of the universe will fairly scintillate with dazzling plausibilities. It is the same with every properly conducted course of reasoning: we get out only what we put into the machine in the first place—unless we cheat or make mistakes. The "truths" deduced by mathematical reasoning are nothing but more or less ingeniously disguised tautologies—statements like this: "If A implies B, then A implies B,"—a statement which is true of all the propositions A and B that we can imagine. Essentially, a tautology tells us nothing we did not already know. Mathematics does not create new truths; it produces an endless string of tautologies. So, if we wish to form a just estimate of the spectacular theories of the universe which have been manufactured with the aid of mathematics, we shall not be wasting our time if we concentrate on the hypotheses which produced the theories.
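The tautology quoted above can be checked mechanically. Here is a small sketch of my own (the names are my choices, not the author's) which runs through every possible truth assignment and confirms that "If A implies B, then A implies B" is true in all of them, and hence conveys no information about A or B:

```python
# Verify that "If A implies B, then A implies B" is a tautology:
# true under every assignment of truth values to A and B.
from itertools import product

def implies(p, q):
    # The classical material conditional: false only when p is true, q false.
    return (not p) or q

assignments = list(product([False, True], repeat=2))
tautology = all(implies(implies(a, b), implies(a, b))
                for a, b in assignments)
print(tautology)   # True -- the statement rules out nothing
```

A statement that is true under every assignment excludes no possibility, which is the precise sense in which, as the text says, the mill gives back only what was put into it.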
If anyone imagines such questions to be purely academic, let him ponder the two following hypotheses: "All men are born free and equal;" "America entered the World War to make the world safe for democracy." Are these statements true? Are they false? Possibly they are neither. Whatever may be the fact, it is certain that extensive theories of government have been constructed on the first of these hypotheses, and that several thousand men lost their lives because somebody believed the second hypothesis to be true. This, however, is merely in passing. Let us take a less controversial example, to rub in the advisability of maintaining an open mind on speculations beyond the reach of experimental verification. The general principle is extremely simple: Any theory which is fabricated by mathematical reasoning (or other strict deductive reasoning) from scientific (or other) hypotheses, and which produces anything more than elaborate tautologies from those hypotheses, has simply blundered brilliantly, and has produced two rabbits from a hat which contained only one rabbit.
Examine the assumptions! Is it true, for example, that atoms are incompressible, perfectly elastic little spheres? What does it matter? This hypothesis generated no end of heated discussion in its heyday. More important, perhaps, it begot one of the older forms of the kinetic theory of gases, which in its turn produced improved steam engines and better boilers in disconcerting industrial profusion. As all these tangible gadgets had resulted from the hypothesis of the hard little spheres, many men of science believed in the existence of the hard little spheres as just that—hard little spheres. With sufficiently high-powered microscopes it would be possible, they declared, actually to see the little spheres at their endless game of knocking one another about all over the place. In passing, let us note that the physics of 1934 flatly denies the human possibility of ever detecting one atom in the act of hitting another, no matter how high-powered the microscope or how keen-eyed the detective. Thus do fashions change. But to go on with the hard little spheres.
Refining the theory a bit by adding another assumption or two unobtrusively here and there, two eminent physicists of the Nineteenth Century produced a very neat proof of the immortality of the human soul. Dean Wilshire dug this up for me; it will be found in the concluding chapter. Had these daring theorizers taken the trouble to doubt their assumptions about the atoms they might have proceeded with less enthusiasm to their truly remarkable proof. Happily for their peace of mind both of these eminent speculators passed away before a younger generation showed that the hard little atoms were a myth born of human physical ignorance and human mathematical incapacity to describe nature as she is, and not as incompetent mathematical philosophers might wish her to be to bolster up their old-fashioned creeds. Pick the assumptions to pieces till the stuff they are made of is exposed to plain view—this is the cardinal rule for understanding the basis of our beliefs.
The rules of reason themselves by which we do the picking to pieces are not the uniquely sacrosanct "laws" which we used to imagine them to be. They also can be picked to pieces. The "laws of thought" have been analysed, and it has been found that others will serve equally well for any human purpose. The eternal, superhuman "truth" which was supposed to reside in these laws is not what we have believed it to be for the past two thousand years. Thus we free ourselves at last of the most persistent of the stifling myths foisted on us by the dead traditions of a buried past.
Reverence for the past no doubt is a virtue that has had its uses; but if we are to go forward, the reverent approach to old difficulties is the wrong one. The recent advance in abstract thinking cuts under many of our traditional beliefs, among others the belief, held by many who use mathematics in their scientific work, that mathematical reasoning is, in some unique way, the one instrument above all others suitable for the discovery and revelation of superhuman, necessary, and eternal truths. By seeing the human origin of these supposed eternal truths we shall see also a possible escape from any type of superstition. The door to mysticism is still, of course, wide open. If it makes anyone happier to pass through it and back to the Middle Ages, that is strictly his own affair, and no one will try to stop him. Those who take that way out may even get more out of life than those who stay behind. But there is some satisfaction in a fresh start in the other direction with a clear head and open eyes; it is something like being cold sober in the midst of a merry party.
Before beginning the main business, I shall briefly outline a few of the case histories which were directly responsible for the writing of this book. These will illuminate the chapters to follow and, I hope, justify the inclusion of at least some of them. More of these curious examples of queer thinking will be sandwiched in as we go. In their own way these aberrations of reason are as revealing as any respectable theory: they emphasize how fine is the thread separating lofty (and sometimes profitable) speculation from a tissue of nonsense.