Richard Feynman (1918-1988) was one of the most important physicists of the 20th century. He was also, in his own words, a "curious character," meaning both that he was curious about things and that he was "curious" in the sense of being a curiosity. His personal peculiarities and adventures are detailed in the charming and humorous autobiographical stories in Surely You're Joking, Mr. Feynman! (1985) and What Do You Care What Other People Think? (1988), both by Feynman as told to his friend Ralph Leighton. After Feynman's death, Leighton published a third volume, Tuva or Bust! (1991).
The biography by John and Mary Gribbin is an excellent story of Feynman's life and an accessible discussion of his physics. We have everything from his childhood to his last words ("This dying is boring," spoken when he woke briefly from his final coma, p. 258). A man who genuinely desired to know and was enthusiastic about his discoveries and his teaching is a refreshing example in a day when intellectual life is increasingly nihilistic and autistic. But we also see Feynman's limitations. Philosophical questions never had the slightest appeal to him, nor religion. He was thus a purely scientistic scientist, ignoring or dismissing anything about life and the world that was not accessible to scientific method. Scientism, unfortunately, is not much help in combatting the forms of nihilism and relativism that are just barely philosophically sophisticated enough to sceptically "deconstruct" the epistemological foundations of science itself. Feynman's world would comfortably fit in with that of Paul Kurtz, founder and chairman of the Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP), the Council for Secular Humanism, and Prometheus Books, who does an admirable job of fighting pseudo-science and paranormal fraud, but who seems to put ordinary religious beliefs, or important metaphysical questions, into those categories too.
Two issues in particular come up in the course of the Gribbins' book that merit special discussion here. The first, although it appears to be a judgment of the Gribbins rather than of Feynman, nevertheless seems representative of Feynman's attitude:
The lesson to be drawn is that in some deep sense the truth about how the world works resides in the equations -- in this case, Maxwell's equations -- and not in the physical images that we conjure up to help our limited imaginations to visualize what is going on. [p. 27]
This is an excellent example of the Sin of Galileo, that the math is the real thing and that if the math works, then we understand what is going on. We even see this in scientists who are out-and-out Platonists, like the astronomer Allan Sandage ("I am a Platonist"), who is said in the August 1998 Scientific American to believe that the equations of fundamental physics are all that is real and that "we see only shadows on the wall" (p. 22). Feynman, probably a realist but not a true Platonist, nevertheless seems to have agreed with the Gribbins and with Sandage that the reality is represented in the equations, which are the stuff of things, just as Keanu Reeves sees objects consisting of the computer code of The Matrix.
The phrase "physical images that we conjure up to help our limited imaginations" dismissively refers to the conceptual part of physics. The implication is that the concepts of physics are an inessential and expendable part that is used to dress up the equations so that our "limited imaginations" can have something concrete to hold onto. All of mathematics, however, is in fact an artifact of concepts. Universals make numbers possible.
We need numbers, and numbers are meaningful, only when there are different individuals of the same kind. If there were only one dog, like the one Colossus of Rhodes, we would never need to number dogs. That there is more than one means not only that we can number them, but that we can order them. Thus, if there are five dogs in my neighbor's yard, and the gate is opened, one of them will be the first out the gate, another the second, and so forth. Similarly, there have been 42 Presidents of the United States, and they can be uniquely identified by their order, with George Washington as the 1st, Abraham Lincoln as the 16th, etc. (with the interesting circumstance that the 22nd and 24th are the same individual, Grover Cleveland). Mere numbering gives us cardinal numbers, while ordered numbering gives us ordinal numbers. Kinds denoted by mass terms, like "water" or "sand," and comparable magnitudes like distance and time, cannot be numbered in quite the same way. Indeed, they cannot be numbered at all without units, which must be based on entirely conventional quantities of measure. Thus, water or sand can be counted in units of volume or weight, and distance or time in units of, indeed, distance or time. As it happens, basic units for all those things develop pre-scientifically and even pre-philosophically.
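For what it is worth, the distinction can be put in a few lines of code. The sketch below is only illustrative; the dog names and the litre-to-gallon figure are stand-ins for the point that cardinal numbers answer "how many?", ordinal numbers answer "which one in order?", and mass terms cannot be numbered until a conventional unit has been chosen.

```python
dogs = ["Rex", "Fido", "Spot", "Lady", "Duke"]    # five individuals of one kind

cardinal = len(dogs)                              # 5: how many dogs there are
ordinals = {name: i + 1 for i, name in enumerate(dogs)}
# {"Rex": 1, "Fido": 2, ...}: which dog is first out the gate, second, and so on

# "Water" is a mass term; it gets a number only relative to a conventional unit.
water_litres = 3.5
water_gallons = water_litres / 3.785411784        # same water, different number
```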
In mathematics, numbering is abstracted from the numbered. This leaves the original concepts, and the whole metaphysical issue of the nature of universals, behind. Mathematicians need not worry about any of that stuff, and they usually don't, though at some point most mathematicians ask, at least of themselves, whether they are discovering real things in mathematics or whether it is something that they are just making up. This question may become more acute as the development of mathematical techniques produces its own artifacts, kinds of numbers that are not "natural" numbers, like negatives and imaginaries. Imaginaries especially challenge the common sense foundations of mathematics, since it is not at all obvious how such numbers could number actual things. Thus, while owning a negative number of dogs can be interpreted as meaning that I owe someone some dogs that I don't have (like a negative bank balance meaning debt), what could it possibly mean to say that I own 5i dogs? The operation of taking a square root, when applied to a negative number, produces something that, according to my calculator, is an "ERROR." Nevertheless, according to quantum mechanics the world is awash in imaginary wave functions; and the success of this as science can easily lead one to think that the conceptual interpretation of physics need not be taken seriously -- as though, when mathematics left those original concepts behind, this was a liberation into true reality, and the philosophical questions about concepts and universals could be permanently forgotten.
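The calculator's "ERROR" is a limit of real arithmetic rather than of arithmetic as such. A short sketch with Python's standard math and cmath modules, offered only as an illustration, shows the square root going through once complex numbers are admitted, and also shows why physics can live with imaginary wave functions: the measurable quantity, the squared magnitude of the amplitude, comes out as an ordinary real number.

```python
import math, cmath

try:
    math.sqrt(-1)                      # real arithmetic balks, like the calculator
except ValueError as error:
    print("real arithmetic:", error)   # "math domain error"

print(cmath.sqrt(-1))                  # 1j -- the imaginary unit

# A toy "wave function" with an imaginary part; the probability |psi|**2,
# which is what gets compared with experiment, is real.
psi = (1 + 2j) / math.sqrt(5)
print(abs(psi) ** 2)                   # 1.0 (up to floating-point error)
```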
On the other hand, the difference between pure mathematics and physics is that the equations of physics have a conceptual component. Newton's F = ma is an absurdly simple equation, but it is a fundamental equation of physics because F is a numbering of force, m is a numbering of mass, and a is a numbering of acceleration. Now, the equation itself is a claim about concepts, since "force" itself is shown not to be a primitive quantity but equivalent to the product of mass and acceleration. "Acceleration" in turn is not primitive, but a certain measure of distance and time (a = d/t/t = d/t²). Force is therefore in units of m·d/t². The units of physics are thus all built up from the most primitive ones.
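Spelled out as a worked equation, the build-up of units described in the paragraph above is nothing more than ordinary dimensional analysis:

```latex
% Units built up from the primitives of mass, distance, and time:
a = \frac{d}{t^{2}}, \qquad
F = m\,a = \frac{m\,d}{t^{2}}, \qquad
[F] = \mathrm{kg}\cdot\frac{\mathrm{m}}{\mathrm{s}^{2}} \;(= 1\ \mathrm{newton}).
```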
Now, one might ask, What is "mass"? What is "distance"? What is "time"? As questions of physics these are going to be very different from similar questions in philosophy. In physics, all one need say, to get started, is that "mass resists acceleration" (inertial mass) or "mass exerts gravitational attraction" (gravitational mass), that "distance is what we measure with this rod," and that "time is what we measure with this clock." Wow. These answers, of course, are not philosophically very satisfying. They are all one needs, however, to start doing the science. And there is a reason for that. Scientific explanations are logically only sufficient, not necessary, to the phenomena. This means that they are enough to explain something about what we are seeing, but that logically they are not the only possible explanation, and they do not explain everything about what we are seeing. Indeed, explaining everything is a tall order, though it is what, philosophically, we would like ultimately to have.
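The two operational characterizations of mass in fact belong to two different equations, and it is a substantive, experimentally confirmed fact, not a definition, that the same number works in both:

```latex
% Inertial mass: the m that resists acceleration.
F = m_{i}\,a
% Gravitational mass: the m that exerts (and responds to) gravitational attraction.
F = \frac{G\,m_{g}\,M}{r^{2}}
% As far as experiment can tell, m_i = m_g (the equivalence principle).
```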
The logically sufficient nature of science is due to the use of prediction and falsification in scientific method, as belatedly but finally understood by Karl Popper. This is why mathematics dominates the hardest sciences, since it is easiest with mathematics to make the most precise predictions about the most abstract matters, and pure quantities fill the bill admirably. All that is needed conceptually are the barest meanings necessary to get the quantities applied to the phenomena. Thus, it made not the slightest difference for the value of Newtonian physics if Newton thought that gravity was the Will of God or that space was "God's boundless uniform sensorium." These metaphysical, even theological, elaborations were irrelevant to the quantitative predictions of the theory.
Does the relative unimportance of most of Newton's interpretations of his quantities really mean, however, that "the truth about how the world works resides in the equations... and not in the physical images that we conjure up to help our limited imaginations to visualize what is going on"? No, because the equation, F = ma, really means nothing without its conceptual component, and the full meaning of the concepts is not irrelevant, just relatively underdetermined by Newtonian theory. If we want to explain and understand more about the phenomena, more equations will probably be necessary, but also more meaning for the concepts. There will not be a philosophically satisfying scientific theory until the whole meaning of the concepts can be specified and everything about the phenomena understood.
It is also possible that scientific theories, because of the nature of scientific method, cannot specify the whole meaning of their concepts. There may be, and likely is, an irreducible metaphysical element to all physical concepts. What that must be can be anticipated by metaphysicians, by speculation, but it also must be remembered that what science can ultimately grasp has historically produced some serious surprises. Thus, a philosopher like Hume said that "Elasticity, gravity, cohesion of parts, communication of motion by impulse; these are probably the ultimate causes and principles which we shall ever discover in nature" [Enquiry Concerning Human Understanding, Selby-Bigge edition, Oxford, 1902, 1972, p. 30]. Hume did not live long enough to see just how foolishly restricted was his estimation of the future of scientific knowledge. But Hume, however devastating his critique of induction, nevertheless believed in inductivist scientific method.
The scientism of Feynman, and of the Gribbins in their biography, is evident in their unconcern with, or dismissal of, the conceptual side of physics. Feynman knew that the conceptual side of quantum mechanics was a hash, and he frankly called it "incomprehensible"; but his lack of interest in philosophical questions meant that there was literally nothing to be done about this situation, except to see what would come next in the science. We are still waiting on that one. On the other hand, philosophical motivations are not uncommon in physics, where, for instance, the very existence of constants is an irritation, because they are arbitrary. Physicists are not going to be happy until every constant of nature can be derived from some conceptually elegant circumstance, the way Feynman himself derived the magnetic moment of the electron as 1.00115965246.
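The flavor of that kind of derivation can at least be suggested. The leading term of the QED series for the electron's magnetic moment is Schwinger's correction, α/2π, which the diagrammatic methods Feynman invented are used to extend to higher orders; the sketch below computes only that first term, and is offered as an illustration, not as Feynman's actual calculation.

```python
import math

alpha = 1 / 137.035999             # fine-structure constant (approximate)

# Leading-order QED result: g/2 = 1 + alpha/(2*pi) + higher-order terms.
print(1 + alpha / (2 * math.pi))   # about 1.00116; the higher-order terms
                                   # supply the further digits of the value
                                   # quoted above.
```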
This negative lesson from Feynman's life can be contrasted here with a very positive one, though it begins with a problem: Feynman was rather unproductive as a physicist in the period from 1961 to 1967. Had he just run out of ideas, or had something gone wrong?
Feynman had got to know [biologist James] Watson during the sabbatical year that Dick had spent as a 'graduate student' in biology. He had an opportunity to renew the acquaintance when he visited Chicago early in 1967, and when they met Watson gave Feynman a copy of the typescript of what was to become his famous book The Double Helix, about his discovery, together with Francis Crick, of the structure of DNA. Feynman read the book straight through, the same day. He had been accompanied on that trip by David Goodstein, then a young physicist just completing his PhD at Caltech, and late that night Feynman collared Goodstein and told him that he had to read Watson's book -- immediately. Goodstein did as he was told, reading through the night while Feynman paced up and down, or sat doodling on a pad of paper. Some time towards dawn, Goodstein looked up and commented to Feynman that the surprising thing was that Watson had been involved in making such a fundamental advance in science, and yet he had been completely out of touch with what everybody else in his field was doing. Feynman held up the pad he had been doodling on. In the middle, surrounded by all kinds of scribble, was one word, in capitals: DISREGARD. That, he told Goodstein, was the whole point. That was what he had forgotten, and why he had been making so little progress. The way for researchers like himself and Watson to make a breakthrough was to be ignorant of what everybody else was doing and plough their own furrow. [pp. 185-186]
What had gone wrong for Feynman was that he had begun taking too seriously the idea that modern knowledge is a collective enterprise. Just trying to keep up with his field had suppressed his own sources of inspiration, which were in his own solitary questions and examinations. This, indeed, is the fate of most research in most disciplines, to make the smallest, least threatening, possible addition to "current knowledge." Anything more would be presumptuous, anything more might elicit the fatal "Don't you know what so-and-so is doing?" from a Peer Reviewer, anything more might invite dismissal as some off-the-wall speculation -- not serious work.
So Feynman "stopped trying to keep up with the scientific literature or compete with other theorists at their own game, and went back to his roots, comparing experiment with theory, making guesses that were all his own..." [p. 186]. Thus he became productive again, as he had been when he had just been working things out for himself, before becoming a famous physicist.
While this is an important lesson for science, it is a supreme lesson for philosophy, where "current knowledge" can be dominated by theories, like Logical Positivism or deconstruction, that are simply incoherent. Trying to keep up with literature like that is a complete waste of time, even if contributions to it earn the praise of reviewers and are snapped up by prestigious journals. To participate in this may prudently recommend itself to the careerist, but it holds little hope of making any real contributions to the progress of philosophy.
To philosophy they are assigned with their wives and children, and in spite of Petrarch's povera e nuda vai filosofia ["you go poor and nude, philosophy"], they have taken a chance on it. [Arthur Schopenhauer, The World as Will and Representation, Vol. 1, E.F.J. Payne translation, Dover, 1858, p. xxvi]
New ideas do not come from committees, and although this dynamic is so well understood as to be part of folk wisdom, researchers in many areas of science or scholarship are so blinded by their own herd mentality, or collectivist ideology, or rent-seeking behavior, that they commonly act, both for themselves and in judgment of others, in denial of it. Of all the "curious" lessons of Richard Feynman's life, this is one of the best.