Brother Danny said:
“I have decided not to address most of the points in your post due to time constraints.
Back to the object at hand -- I'll write on cosmology soon, and materialism, if time allows.”
On the basis of this statement, as well as his previous statement about Gen 1 (see below), it looks like Danny is going to contend that Gen 1 has been falsified by modern cosmology, and that this is sufficient reason for rejecting the Christian faith in toto.
If so, he faces the following challenges:
i) Many conservative Christians don’t interpret Gen 1 chronologically.
That is not my own position, but if Danny has bet the house on Gen 1 in relation to the chronology of modern cosmology, his argument, even if otherwise valid, would leave the position of many Christians untouched.
ii) An ancient Israelite was perfectly aware of the fact that the sun was the source of the diurnal cycle.
They rose with the sun, and retired at sunset.
So one didn’t have to await the invention of the electric light bulb or modern cosmology to make this stunning observation.
However we interpret Gen 1:14-19, this does not represent a prescientific perspective which has been falsified by modern science, for one doesn’t need to be a child of the scientific age to raise the objection which Danny has leveled against this particular feature of the text.
iii) John Sailhamer, in Genesis Unbound, as well as his commentary on Genesis, has argued that 1:14 should not be rendered, “Let there be lights in the expanse to separate the day and night,” but rather, “Let the lights in the expanse be for separating day and night.”
Sailhamer is a Hebraist. Out of curiosity, I ran his interpretation by three other Hebraists: Bruce Waltke, John Currid, and David Clines.
Clines thought that this rendering was grammatically acceptable. Waltke thought it was unacceptable, while Currid also thought it was unacceptable, but for a different reason than Waltke.
So you had four eminent Hebraists evenly divided over the syntax of v14.
Then there was the view of Donald Wiseman, the late great Assyriologist, who felt that the traditional rendering of the verb (“to make”) was mistaken.
This illustrates the dangers of resting so much weight on one verse.
Many commentators have also noted parallels between the creation account and the flood account:
In the flood account we have a triple-decker ark with a window and a roof (6:16; 8:6,13). The animals occupy different decks. During the deluge the ark has water above (rain) and below (floodwaters).
Now, let's compare this to the world. In the creation account, the world has windows (7:11) and a roof (1:6-8; 14-16). It has water above and below (1:2,7). The world has three decks: sky, earth, water (cf. Exod 20:4). Animals occupy different "decks."
In that event it may also be that what we have in v14ff. is an architectural metaphor in which God is depicted as a cosmic carpenter.
So why was there light on day one? Why, indeed, was there a day one, with a diurnal cycle, before the sunlight of day four?
The literary answer would be because a carpenter cannot install skylights until he has put a roof on the house. But there was sunlight before there were skylights.
1:14 was not the whole of Danny’s objection. But it’s part of it.
Perhaps Danny’s implicit objection is that you cannot have plants without sunshine.
But if the world was made within the span of six consecutive calendar days, then the flora could certainly survive for a day (day 3) without sunshine (day 4), even if you accept the traditional reading of v14.
Remember, too, that even on the traditional reading, there is a light source already in place.
III. Creation ex nihilo
i) Gen 1 doesn’t explicitly teach creation ex nihilo, although that is a natural enough way of reading the text.
Be that as it may, creation ex nihilo is clearly implied in Col 1:16-17.
So what, exactly, is the thesis which Danny is opposing?
This is what the traditional reading of Gen 1 amounts to:
God made the world in the span of six consecutive calendar days. He made no use of any preexistent “stuff” (matter, energy). And everything was made in the way it’s described in Gen 1, which is to say, fully formed. There is no transition from simple to complex, or embryonic to mature. Rather, the final product is given by God’s immediate creative fiat.
The end-product is a fully furnished and finished world, with no intermediate stages of development.
ii) Now, the question this poses for Danny is this: what evidence would count against that claim?
How would the world look if it were, indeed, made that way? How would it differ from the world we see? What, if any, would be the detectable difference between the world of Gen 1 and the world we see today?
Incidentally, I’m not appealing to the theory of “apparent age,” which is a misnomer, for objects have no inherent appearance of any particular age.
We date an object on the basis of experience. If we had no prior experience, we could not date an object.
In addition, this is not an apologetic ploy, drummed up on the spot to do an end-run around modern science.
The interpretation I’m presenting here represents the traditional, prescientific reading of the text.
If you’re going to say that Gen 1 has been disproven by modern cosmology, then you need to consider the implications of Gen 1. What would such a world look like?
Of course, Danny doesn’t believe Gen 1. But that’s irrelevant.
For purposes of argument he must assume the viewpoint of Gen 1 if he’s going to claim that Gen 1 runs contrary to modern cosmology or the natural evidence.
So, assuming, for the sake of argument, that the world was created ex nihilo in the span of six days, would science be able to detect its true age? Would there be a discrepancy between such a world and the world we see around us?
What kind of evidence would count for or against creation ex nihilo? Creation ex nihilo is sui generis. It leaves no trace evidence, for the effect involves no intervening process, be it fast or slow.
iii) I’d add that the world in Gen 1 is not necessarily identical with the world we see today, for the prediluvian world was subject to a global flood. So there’s also the question of what impact that would have on the empirical record.
IV. Process & Product
“Which also record lots of other goodies, like that the earth was created before the stars, that the plants were created before the sun, that the "days" were present before the sun” (Brother Danny).
i) Danny tosses this off as if the order of events in Gen 1 is obviously wrong.
At the risk of stating the obvious, none of us was around to see the origin of the world we inhabit.
What is more, none of us has ever seen the origin of any other world.
So the original and originating order of events is not an observational datum.
Cosmology attempts to reconstruct the distant past through trace evidence and mathematical modeling, relying on uniformitarian assumptions to justify extrapolating the past from the present.
ii) Consider, for a moment, some of the fudge factors that go into physics and cosmology these days, as catalogued by Robert Oldershaw:
UNTESTABILITY OF THE FIRST KIND
We are now in a position to discuss the testability of the new physics, which is broadly defined here as the standard paradigm of particle physics,3 the standard paradigm of cosmology and recent variations on these themes: A comprehensive analysis of the testability of these paradigms could fill several books and so only a representative sampling of major testing problems associated with the new physics will be presented here. Proponents of the new physics will no doubt feel that the present discussion does not include positive support for these theories and therefore leaves the reader with too negative an impression. However, countless technical, general, and popular discussions of the new physics have tended to be so dogmatic and optimistic that a small dose of antidote could not hurt and may be very helpful. If the "believers" repeatedly claim the right to present very idealized overviews of the new physics, then they are obligated to grant "skeptics" the chance to present reasoned alternative viewpoints.
(1) Mentioned earlier was the remarkable example of superstring theory, a variation on the standard paradigm of particle physics in which fundamental particles are treated as one-dimensional strings rather than mathematical points. The community of theoretical physicists is very excited about this theory and some regard it as the ultimate unified paradigm of physics: "It is a miracle; it is the theory of the world." 4 However, as attested to by one of the primary spokesmen for the new physics (Steven Weinberg), "there seem to be no decisive tests in sight"1 by which the superstring theory could demonstrate its scientific validity.
(2) The standard paradigm of particle physics has been unable to retrodict successfully the masses of quarks and leptons or the organization of fundamental particles into regular families. The values of more than 20 parameters that are crucial to the paradigm, such as particle masses, the coupling strengths of the forces, and the magnitudes of CP violations, cannot be uniquely derived and therefore are freely adjustable.4-6
(3) The standard model is completely dependent upon the existence of the hypothetical "Higgs boson," yet none of the variants on the grand unification theme, a cornerstone of the new physics, can predict the mass of this crucial particle or how it interacts with other particles.7 One worries that the next new particle to be found with a mass between a few giga-electron-volts and a tera-electron-volt will be christened the Higgs boson by fiat.
(4) Many theories of the new physics require extra dimensions beyond the four dimensions of space-time with which we are familiar; 5 to 26 dimensions is typical and about 950 dimensions is the latest record. Yet there is no known way to test empirically for the existence of these extra dimensions.8
(5) The hypothesized unification of the four forces (gravitational, electromagnetic, weak, and strong) is predicted to occur at energies that are now and probably forever inaccessible to empirical testing.3
(6) The standard cosmological paradigm asserts that the key events in the evolution of the universe took place within 10⁻²⁵ s after the Big Bang. However, even in principle, we cannot obtain direct information on the state of the universe prior to decoupling at about 10¹³ s after the Big Bang.9
(7) The validity of the most widely accepted cosmological model (Big Bang plus inflation) is completely dependent upon the validity of the standard paradigm of particle physics, but the latter, as we have seen, suffers in many ways from untestability of the first kind.9
(8) Standard cosmological models have never been able to retrodict satisfactorily the existence of galaxies, neither did they predict the recently discovered bulk streaming of large numbers of galaxies, nor did they predict the existence of the enigmatic dark matter that constitutes more than 90% of the mass of the universe.10
It is sometimes stated11 and more often implied that "physics is nearly finished," that we have just about figured everything out. The above examples of inherent untestability, which concern fundamental aspects of the "nearly finished physics," tell a different story.
Moreover, examples of effective untestability are more numerous and even more worrisome.
UNTESTABILITY OF THE SECOND KIND
Below is a sampling of some of the more clearcut examples of the effective untestability of the new physics, beginning with particle physics. For an excellent and readable historical review of the development of the new physics, which is quite unique in that it avoids the seamless idealization that typifies recapitulations by the particle physicists themselves, the book Constructing Quarks by Pickering3 is highly recommended.
(1) When fractionally charged quarks were initially proposed as the fundamental constituents of hadrons, a truly definitive prediction--the existence of particles with unusual charges such as + 2/3 or - 1/3--appeared to be inevitable. After many fruitless searches, theoreticians hit upon a unique way out of the failed prediction: It was proposed that quarks were dynamically confined inside hadrons and so free quarks should not be observed. Necessity is indeed the mother of invention. If negative results can be circumvented by strategies of this sort, can the theory in question be regarded as testable?
(2) The predicted existence of magnetic monopoles is another example of a definitive prediction by Grand Unified Theories (GUTs), a very important prediction for the whole standard paradigm of particle physics. In spite of many heroic and imaginative attempts over a period of at least 40 years to detect these mythological particles, results have been consistently negative.12 Most distressing is the fact that negative results are often followed by changes in the predicted properties of the magnetic monopoles (e.g., masses or hypothetical spatial distribution) such that the new magnetic monopole predictions are not in conflict with existing observations. If physicists are unwilling to take "no" for an answer to the question of the existence of magnetic monopoles, what does this say about the testability of the standard paradigm?
(3) For some time, the gauge theories that provide the mathematical foundations of the new physics had the serious problem of predicting infinite values for some physical quantities. This was regarded as physically unacceptable and a technique, called renormalization, was developed to remove the unwanted infinities. The technique worked quite nicely but some physicists, for example, Dirac,13 were deeply concerned about the ad hoc nature of this resolution and regarded renormalization as an arbitrary device whose necessity indicated that something was fundamentally wrong with the theories.
(4) In the 1970s, measurements of the proton structure function were at variance with initial predictions of the quark model. This problem was "solved" by introducing and manipulating two theoretical devices: a quark-antiquark "sea" inside the proton (in addition to the regular quark constituents) and a new set of adjustable particles (gluons) to mediate the interquark forces.3 Is this acceptable model building or the addition of epicycles?
(5) Even the strongest supporters of the new physics admit that the introduction of the "Higgs mechanism" was a totally ad hoc solution to major problems of GUTs: Without it, the gauge theory of the electroweak force was nonrenormalizable and disagreed with observations.3 In general, a renormalizable gauge theory with spontaneous symmetry breaking is highly dependent upon such a mechanism.4 According to an eminent theorist who has played a major role in the development of the new physics, "...the only legitimate reason for introducing the Higgs boson is to make the standard model mathematically consistent... The biggest drawback of the Higgs boson is that so far no evidence of its existence has been found. Instead, a fair amount of indirect evidence already suggests that the elusive particle does not exist. Indeed, modern theoretical physics is constantly filling the vacuum with so many contraptions such as the Higgs boson that it is amazing a person can even see the stars on a clear night!"7 Indeed, and it is this tendency that severely reduces the effective testability of the new physics.
(6) To prevent inconsistencies between the electroweak and QCD theories, theorists invented a new set of quantum numbers for quarks called "color"; each "flavor" of quark could exhibit one of three colors.5 Like renormalization, this device worked quite well but, again, should we regard it as an ingenious and proper product of model building or as another epicycle?
(7) The same question applies to the introduction of yet another new quantum number, "charm," and the charmed quark (the GIM mechanism), which were initially introduced on an ad hoc basis to solve the kaon-decay anomaly and to provide symmetry with the four leptons known at that time (later more were found), and for which the supporting data are ambivalent at best.3
(8) Recently, the standard GUT model yielded the bold prediction that protons were unstable and had a lifetime on the order of 10³² years. This prediction is currently regarded as having been falsified, but what is more worrisome is that there is a seemingly limitless supply of alternative GUTs to take the place of the apparently falsified version. Since GUTs are so adjustable, how can one test a "one model fits all data" paradigm?
(9) A general prediction of the standard paradigm of particle physics has been that spin should be largely irrelevant in high-energy, large-angle, elastic scattering of hadrons. This prediction has been repeatedly falsified over the last 10 years in a series of proton-proton scattering experiments by Krisch and his collaborators.14 Reaction of the theorists has been predictable: initial disbelief in the data and, when the unhappy results would not quietly disappear, subsequent attempts to "save the phenomenon" by introducing new theoretical considerations.
(10) A recent addition to the new physics repertoire is "technicolor": a hypothetical new strong interaction for hypothetical subcomponents of the hypothetical Higgs boson (see No. 5 above). How many levels of untestable abstraction are we to allow?
(11) There are at least six versions of superstring theories and currently no way to decide their relative merits.1 Even if the superstring paradigm could overcome its previously mentioned untestability of the first kind, it would still be plagued by untestability of the second kind.
(12) Coupling of the graviton (hypothetical particle mediating gravitational interactions) to the hypothetical Higgs field would result in an enormous value for the cosmological constant. This clearly violates the observation that the cosmological constant is zero (or exceedingly small). So theorists have proposed that without the Higgs field the cosmological constant would have had a huge negative value, but that with Higgs field coupling this is neatly canceled by the huge positive value to give a net value of about zero, as observed.7 Is this not the sledgehammer approach to physics problems?
These are just a few examples of the remarkable resiliency of the new physics, a resiliency that leaves it on the verge of untestability. It is amazing how predictive failures (e.g., nondetection of fractionally charged quarks) are cleverly circumvented and then gradually the failure/solution comes to be regarded as an argument in support of the paradigm! The fundamental question is where should one draw the line between acceptable modification of a theory according to traditional model building and artificial modifications that spoil the conceptual elegance of the theory and have no physical justification other than the fact that they circumvent inconsistency between theory and observation? Unfortunately, there is no clear-cut dividing line by which one could make an objective decision.
In the case of a theory in the latter stages of ad hoc development (e.g., the Ptolemaic universe), the sheer ungainliness of the theoretical contraption begins to raise doubts in the minds of those who are not positively biased by participation in its construction, and gradually those doubts begin to spread throughout the scientific community. Could the new physics represent model building gone awry? Yes, that could be the case; so many new "fundamental" particles, force-carrying particles, and theoretical devices have been tacked on to the original quark model that the new physics is beginning to make the Ptolemaic universe look rather svelte. On the other hand, one must emphasize that the new physics has much more data to explain and therefore this heroic theoretical effort might not represent model building gone awry. Happily, there is an excellent prospect for deciding between these two possibilities and this will be the subject of Sec. V of this article. Briefly, the ability or inability of existing paradigms to predict correctly the form of the enigmatic dark matter that constitutes at least 90% of all matter in the cosmos will be a powerful tool for objectively and scientifically determining: (1) whether the new physics represents a natural or an artificial unification of our knowledge, and (2) if the latter is the case, then whether there are other "dark horse" paradigms that hold more promise.
Before this crucial test is discussed, however, let us take a brief look at the standard cosmological paradigm in terms of effective testability. The Big Bang theory has been the basic paradigm of cosmology for decades. The combination of the observed redshift-distance relation for galaxies and the isotropic microwave background radiation is strong evidence in support of the idea that 10-20 billion years ago the observed portion of the cosmos was in a much more compact state and that it has been expanding ever since then. According to the Big Bang model, the initial state was a singularity: All of the atoms, stars, and galaxies we now observe were compressed into an entity with infinite density, temperature, and pressure, and with a diameter equal to zero (in fact, space and time did not yet exist). For unknowable reasons (perhaps boredom), the singularity began to expand. Einstein, who developed the theory upon which the Big Bang model was based, general relativity, was very critical of the idea of an initial universal singularity because he felt that general relativity was being pushed beyond the limits of its applicability.2 He felt that one could turn the mathematical crank until values like zeros and infinities resulted but that this was not realistic. Aside from the problem of an initial singularity,9 the original Big Bang model had other major technical problems 9,15 such as the flatness problem, the horizon problem, and the smoothness problem. It also could not explain the existence of galaxies. It predicted a homogeneous distribution of matter on large scales whereas inhomogeneity in the distribution of matter has always been observed on the largest scales that could be adequately tested.16,17 It did not predict that the universe would be largely composed of an unknown form of matter referred to as the dark matter. 
It predicted uniform expansion of the galactic environment whereas observations reveal large deviations from a uniform Hubble flow.18 And there are many less dramatic predictive problems that have beset the original Big Bang theory, such as those associated with predicted elemental abundances19 and the predicted age of the universe.20
In order to rescue the Big Bang theory, particle physicists began to suggest ways in which the new physics might be brought to bear on cosmological problems. Although astrophysicists were initially somewhat demure about this new relationship, before long the majority of the physics community was proclaiming that the "marriage" of cosmology and particle physics heralded the birth of a complete understanding of nature. A much smaller group of scientists worried that it looked more like an incestuous affair with a high probability for yielding unsound progeny. Recently, a noted astrophysicist candidly characterized the interactions of astrophysics and particle physics as follows:
"[T]he big news so far is that particle physicists seem to be able to provide initial conditions for cosmology that meet what astronomers generally think they want without undue forcing of the particle physicist's theory. Indeed I sometimes have the feeling of taking part in a vaudeville skit: "You want a tuck in the waist? We'll take a tuck. You want massive weakly interacting particles? We have a full rack. You want an effective potential for inflation with a shallow slope? We have several possibilities." This is a lot of activity to be fed by the thin gruel of theory and negative observational results, with no prediction and experimental verification of the sort that, according to the usual rules of evidence in physics, would lead us to think we are on the right track..."21
Let us consider several examples of applications of the new physics to the realm of astrophysics and evaluate the testability of the results. Using GUTs as a theoretical toolbox, physicists fixed up the old Big Bang theory by the addition of a period of rapid expansion (inflation), which simultaneously removed the flatness, horizon, and smoothness problems.9 But, of course, the testability of the inflationary hypothesis is entirely dependent upon the testability of GUTs and we have already seen that there are major problems with the latter. Moreover, the inflated Big Bang model may have a serious problem with the age it predicts for the universe,20 and the one rigorous prediction that it does make: that Ω = 1 (which means that the density of the universe is equal to the "critical" value) has been repeatedly contradicted.9 A second example concerns the fact that the old Big Bang theory, when analyzed in terms of the new physics, appeared to generate too many magnetic monopoles; remember that no magnetic monopoles have ever been observed. Inflation deftly "solves" this problem by inflating space to such an extent that the resulting magnetic monopole density can be made arbitrarily low,15 in fact, low enough so that we could not have expected to detect a magnetic monopole even after 40 years of trying. Obviously, if the density of magnetic monopoles is now to be a freely adjustable parameter, via the inflation scenario, then the effective testability of theoretical constructs predicting the existence of magnetic monopoles is significantly reduced. One is reminded of that vaudeville act: "You got negative results? Not to worry. We just pump up the old cosmos a little. Oops, not too much! And, presto, your negative result is a vindicated expectation." A third example concerns the recent discovery that the large-scale structure of the cosmos is organized like "soap suds," with galaxies residing primarily in the interstices of huge bubblelike voids.
The new physics has offered at least three different interpretations for this phenomenon, all post facto, of course. The cosmic suds are attributed to (1) superconducting and furiously vibrating cosmic strings,22 or (2) biased galaxy formation in a WIMP-dominated universe10 (WIMP stands for a hypothetical class of weakly interacting massive particles), or (3) hard as it may be to believe, double inflation23 (if one inflation doesn't solve all the problems, how about two inflationary episodes?). All of these theoretical interpretations are so unconstrained that they are effectively untestable and their ad hoc status is embarrassingly obvious. Many other examples of new physics applications in the astrophysical realm just leave one's mouth hanging open: Quasars are cosmic strings,22 quasars are axion lumps,24 the solar neutrino problem is due to axions or other WIMPs in the Sun,25 neutron stars are really big quark nuggets,26 the difference between spiral and elliptical galaxies is whether they were "seeded" with self-intersecting or nonself-intersecting "loops of string," 27 the dark matter is WIMPs or "shadow matter"16 (the latter idea may have been cribbed from Lewis Carroll),... .
In general, when the proponents of the new physics apply their art to cosmological problems they usually invoke essentially untestable physics that is supposed to have taken place in the unobservable past in order to explain current observations. Are they solving the problems or hiding them behind a curtain of sophistry? Without adequate testability, how are we to decide this question?
iii) One might object that Oldershaw’s article is almost 20 years old, and therefore badly out of date.
And that may well be true. But if physical and cosmological theories have such an ephemeral shelf-life, then that hardly inspires confidence.
iv) It is not enough to say that Gen 1 is contrary to modern cosmology or historical geology.
To generate a conflict, you must say that, given the same process, Gen 1 yields a result contrary to modern cosmology, &c.
But does it follow that Gen 1 is at odds with the natural record given an alternative process, or even the absence of any process whatsoever?
A dating scheme assumes a certain process. Remove the process, or substitute a different process, and what becomes of the dating scheme?
Given a certain sequence, a certain order of events, a certain rate of change, you will have a predictable result.
But to say that this is inconsistent with Gen 1 is, at best, a tautology, and, at worst, a petitio principii.
If you assume a series of events different from Gen 1, then, by definition, that will generate a conflict. But all you’ve done is to beg the question in your own favor.
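The point can be made concrete with a toy model (my own illustration; nothing of the sort appears in Gen 1 or in Danny's argument). In a radiometric-style dating scheme, an age is inferred from present measurements only by assuming a process: a constant decay rate and a definite initial condition. Hold the measurements fixed, vary the assumed initial condition, and the inferred age changes:

```python
import math

def inferred_age(daughter, parent, half_life, initial_daughter=0.0):
    """Age implied by measured isotope amounts, GIVEN the assumed process.

    The half-life (assumed constant) and the initial daughter content
    are assumptions of the scheme, not observations.
    """
    lam = math.log(2) / half_life                # assumed constant decay rate
    d_radiogenic = daughter - initial_daughter   # assumed initial condition
    return math.log(1 + d_radiogenic / parent) / lam

# Identical measurements, different assumed initial conditions
# (half-life chosen to resemble potassium-40, 1.25 billion years):
age_a = inferred_age(daughter=1.0, parent=1.0, half_life=1.25e9)
age_b = inferred_age(daughter=1.0, parent=1.0, half_life=1.25e9,
                     initial_daughter=0.9)

# age_a comes out to 1.25 billion years; age_b to under 200 million,
# from the very same measured quantities.
```

Nothing in the sketch adjudicates between the assumptions; it simply shows that the date is a function of the assumed process, which is the point at issue.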
What would result from Gen 1 on the basis of its own internal, narrative assumptions?
V. Mathematical Physics
Theoretical physics and mathematical physics are not interchangeable, but they do overlap, for mathematical modeling is, to my knowledge, the chief tool used by the theoretical physicist to extend the experimental data beyond the range of the extant evidence.
And it’s striking to consider the level of mathematical incompetence among theoretical physicists. In the course of reviewing a book, Steven Weinberg made this admission:
For me, the modern computer is only a faster, cheaper, and more reliable version of the teams of clerical workers (then called "computers") that were programmed at Los Alamos during World War II to do numerical calculations. But neither I nor most of the other theoretical physicists of my generation learned as students to use electronic computers. That skill was mostly needed for number crunching by experimentalists who had to process huge quantities of numerical data, and by theorists who worked on problems like stellar structure or bomb design. Computers generally weren't needed by theorists like me, whose work aimed at inventing theories and calculating what these theories predict only in the easiest cases.
Still, from time to time I have needed to find the numerical solution of a differential equation, and with some embarrassment I would have to recruit a colleague or a graduate student to do the job for me.
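As an aside, the chore Weinberg describes, finding the numerical solution of a differential equation, is routine. Here is a minimal sketch (my own, purely illustrative) using the forward Euler method on dy/dt = -y, whose exact solution is e^(-t):

```python
import math

def euler(f, y0, t0, t1, steps):
    """Advance y' = f(t, y) from t0 to t1 with fixed-step forward Euler."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)   # one Euler step: y_{n+1} = y_n + h*f(t_n, y_n)
        t += h
    return y

# Solve y' = -y with y(0) = 1 out to t = 1; exact answer is exp(-1).
approx = euler(lambda t, y: -y, y0=1.0, t0=0.0, t1=1.0, steps=10_000)
exact = math.exp(-1.0)
```

With 10,000 steps the Euler approximation agrees with the exact value to better than three decimal places; real-world problems call for more sophisticated methods, but the basic idea is no deeper than this.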
Here’s a Nobel laureate in physics who can’t do his own number-crunching!
Or consider an anecdote of Freeman Dyson’s:
In the spring of 1985 Ed Witten, one of the most brilliant of the young physicists at Princeton University, announced that he would give a talk. Rumors were buzzing…Witten spoke very fast for an hour and a half without stopping. It was a dazzling display of mathematical virtuosity…when Ed Witten came to the end of his hour and a half, there was no Oppenheimer in the audience to make a sharp-tongued comment. The listeners sat silent. After a while the chairman of the session called for questions. The listeners still sat silent. There were no questions. Not one of us was brave enough to stand up and reveal the depths of our ignorance. As we trooped glumly out of the room, I could hear voices asking quietly the questions nobody asked out loud.
I describe this scene because it gives a picture of what it means to explore the universe at the highest level of abstraction. Ed Witten…has moved so far into abstraction that few even of his friends know what he is talking about….[His] role is to build superstrings into a mathematical structure which reflects to an impressive extent the observed structure of particles and fields in the universe. After they had heard him speak, many members of his audience went back to their desks and did the homework they should have done before, reading his papers and learning his language.
Infinite in All Directions (Harper & Row 1989), 15-16
It makes you wonder how many practitioners have mastered the basic tools of the trade.
Witten can do his own computations, and I daresay Roger Penrose can as well, but how many others are up to the task?
VI. Theory & Reality
Stephen Hawking is undoubtedly the world’s most famous cosmologist. But what epistemic status does he assign to his own theorizing?
“He's [Roger Penrose] a Platonist and I am a positivist. He's worried that Schrödinger's cat is in a quantum state, where it is half dead and half alive. He feels that can't correspond to reality. But that doesn't bother me. I don't demand that a theory correspond to reality because I do not know what it is. Reality is not a quality you can test with litmus paper. All I am concerned with is that the theory should predict the results of measurements.”
“Any sound scientific theory, whether of time or of any other concept, should in my opinion be based on the most workable philosophy of science: the positivist approach put forward by Karl Popper and others. According to this way of thinking, a scientific theory is a mathematical model that describes and codifies the observations we make.”
“If the universe was essentially unchanging in time, as was generally assumed before the 1920s, there would be no reason that time should not be defined arbitrarily far back. Any so-called beginning of the universe, would be artificial, in the sense that one could extend the history back to earlier times. Thus it might be that the universe was created last year, but with all the memories and physical evidence, to look like it was much older. This raises deep philosophical questions about the meaning of existence. I shall deal with these by adopting what is called the positivist approach. In this, the idea is that we interpret the input from our senses in terms of a model we make of the world. One cannot ask whether the model represents reality, only whether it works. A model is a good model if it first, interprets a wide range of observations, in terms of a simple and elegant model. And second, if the model makes definite predictions that can be tested, and possibly falsified by observation.”
Now, one may or may not agree with Hawking’s philosophy of science. But the immediate point is that you cannot simply appeal to a scientific theory or scientific evidence without addressing the metascientific question of what you think a theory is good for. Your science is only as good as your philosophy of science.
Here’s another quote from Dyson:
The main thing which I am trying to suggest by this discussion is the extreme remoteness of the superstring from any objects which we can see and touch. Even for experts in theoretical physics, superstrings were hard to grasp. Theoretical physicists are accustomed to living in a world which is removed from tangible objects by two levels of abstraction. From tangible atoms we move by one level of abstraction to invisible fields and particles. A second level of abstraction takes us from fields and particles to the symmetry-groups by which fields and particles are related. The superstring theory takes us beyond symmetry-groups to two further levels of abstraction. The third level of abstraction is the interpretation of symmetry-groups in terms of states in ten-dimensional space-time. The fourth level is the world of the superstrings by whose dynamical behavior the states are defined. It is no wonder that most of us had a hard time trying to follow Ed Witten all the way to the fourth level.
Now, I ask you, what’s the chance of a fourth-level abstraction, a mathematical construct four steps removed from the observational data, being even approximately descriptive of the actual world we live in?
True, string theory is not the only game in town, and Dyson’s summary is rather dated.
But could any rival theory operate at a lower level of abstraction?
Again, string theory is not the same thing as cosmology. But the methodology is similar, is it not?
VIII. The Veil of Perception
Not only are our mathematical models far removed from the observational data, but the observational data is several steps removed from our mental representation thereof. As one philosopher expresses the relationship:
“The direct realist view (Gibson 1972) is incredible because it suggests that we can have experience of objects out in the world directly, beyond the sensory surface, as if bypassing the chain of sensory processing. For example if light from this paper is transduced by your retina into a neural signal which is transmitted from your eye to your brain, then the very first aspect of the paper that you can possibly experience is the information at the retinal surface, or the perceptual representation that it stimulates in your brain. The physical paper itself lies beyond the sensory surface and therefore must be beyond your direct experience. But the perceptual experience of the page stubbornly appears out in the world itself instead of in your brain, in apparent violation of everything we know about the causal chain of vision.”
Before we can even theorize about observables, the “observable” already lies at the end of a black box, in which the sensible is encoded in the form of electromagnetic information, and then re-encoded in the form of electrochemical information, in order to reach the mind of the observer.
IX. Whither Cosmology?
What’s the current state of modern cosmology and physics, anyway?
Cosmology is not the same thing as physics, but their fate is intertwined.
Here are a few recent comments from leaders in the field:
M theory is not a theory in the usual sense. Rather it is a collection of theories, that look very different, but which describe the same physical situation. These theories are related by mappings, or correspondences, called dualities, which imply that they are all reflections of the same underlying theory. Each theory in the collection, works well in the limit, like low energy, or low dilaton, in which its effective coupling is small, but breaks down when the coupling is large. This means that none of the theories can predict the future of the universe, to arbitrary accuracy. For that, one would need a single formulation of M-theory that would work in all situations.
Up to now, most people have implicitly assumed that there is an ultimate theory that we will eventually discover. Indeed, I myself have suggested we might find it quite soon. However, M-theory has made me wonder if this is true. Maybe it is not possible to formulate the theory of the universe in a finite number of statements.
“WE DON’T know what we are talking about.” That was Nobel laureate David Gross at the 23rd Solvay Conference in Physics in Brussels, Belgium, during his concluding remarks on Saturday. He was referring to string theory - the attempt to unify the otherwise incompatible theories of relativity and quantum mechanics to provide a theory of everything.
“The state of physics today is like it was when we were mystified by radioactivity”
Gross - who received a Nobel for his work on the strong nuclear force, bringing physics closer to a theory of everything - has been a strong advocate of string theory, which also aims to explain dark energy. “Many of us believed that string theory was a very dramatic break with our previous notions of quantum theory,” he said. “But now we learn that string theory, well, is not that much of a break.”
He compared the state of physics today to that during the first Solvay conference in 1911. Then, physicists were mystified by the discovery of radioactivity. The puzzling phenomenon threatened even the laws of conservation of mass and energy, and physicists had to wait for the theory of quantum mechanics to explain it. “They were missing something absolutely fundamental,” he said. “We are missing perhaps something as profound as they were back then.”
Steven Weinberg recently said that this is one of the great sea changes in fundamental science since Einstein, that it changes the nature of science itself. Is it such a radical change?
In a way it is very radical but in another way it isn't. The great ambition of physicists like myself was to explain why the laws of nature are just what they are. Why is the proton just about 1800 times heavier than the electron? Why do neutrinos exist? The great hope was that some deep mathematical principle would determine all the constants of nature, like Newton's constant. But it seems increasingly likely that the constants of nature are more like the temperature of the Earth - properties of our local environment that vary from place to place. Like the temperature, many of the constants have to be just so if intelligent life is to exist. So we live where life is possible.
For some physicists this idea is an incredible disappointment. Personally, I don't see it that way. I find it exciting to think that the universe may be much bigger, richer and full of variety than we ever expected. And it doesn't seem so incredibly philosophically radical to think that some things may be environmental.
Could there be some kind of selection principle that will emerge and pick out one unique string theory and one unique universe?
Anything is possible. My friend David Gross hopes that no selection principle will be necessary because only one universe will prove to make sense mathematically, or something like that. But so far there is no evidence for this view. Even most of the hard-core adherents to the uniqueness view admit that it looks bad.
The beginning of the 21st century is a watershed in modern science, a time that will forever change our understanding of the universe. Something is happening which is far more than the discovery of new facts or new equations. This is one of those rare moments when our entire outlook, our framework for thinking, and the whole epistemology of physics and cosmology are suddenly undergoing real upheaval. The narrow 20th-century view of a unique universe, about ten billion years old and ten billion light years across with a unique set of physical laws, is giving way to something far bigger and pregnant with new possibilities.
I don’t know what strange and unimaginable twists our view of the universe will undergo while exploring the vastness of the landscape. But I would bet that at the turn of the 22nd century, philosophers and physicists will look back nostalgically at the present and recall a golden age in which the narrow provincial 20th century concept of the universe gave way to a bigger better megaverse, populating a landscape of mind-boggling proportions.
As Lenny says, this means that the old dream of a unified theory that makes unique and falsifiable predictions appears no longer possible. Much that physicists hoped to explain as necessary features of any possible universe turns out to be merely contingent, environmental features of one universe out of many possible ones.
X. The Road to Nowhere
Of course, not everyone agrees with string theory, but the very degree of disagreement illustrates the underlying point just as well.
All these guys are groping in the dark with a book of wet matches to light their way.