Computer simulations of the real world are becoming better and better as computer CPUs become faster, memories larger, and software smarter and more extensive. CGI effects in movies are becoming indistinguishable from scenes filmed with large casts of extras and expensive props. Computer games are becoming ever more "realistic", even as the blood and violence of the games increases. The online virtual world Second Life simulates a mini-universe, complete with realistic avatars, an economy, and real estate, and has reached 21.3 million registered users. Movies like The Matrix and SF novels like Iain M. Banks' Surface Detail portray a computer-based virtual reality that is indistinguishable from the real world. In the future, one can anticipate that as computer hardware and software advance, the perceived gap between the real and the virtual will shrink and perhaps disappear altogether.
Nick Bostrom of St. Cross College, University of Oxford, has examined the implications of this trend in a much-discussed paper, "Are You Living In a Computer Simulation?" (see reference below), which considers a future "post-human" civilization with access to enormous computing power.
Bostrom asserts that individuals in such a post-human civilization could, if they chose, perform near-perfect "ancestor simulations" of the past, including our own era. He presents arguments leading to the conclusion that one of three alternative propositions must be true. Either (1) the human species is very likely to go extinct before reaching such a post-human stage; or (2) the fraction of post-human civilizations that are interested in running a significant number of ancestor simulations is extremely small; or (3) we are almost certainly living in such a computer simulation rather than the real world.
The logic that led Bostrom to item 3 is a statistical one: if over a given time period there are very many simulations of the present-day period of history, each containing simulated sentient individuals who think that they are experiencing the real world, then the probability that a given sentient individual is actually experiencing true reality rather than simulated reality is extremely small.
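Bostrom's counting argument can be made concrete with a toy calculation (my own illustration, not Bostrom's formalism): if N indistinguishable simulations of our era exist alongside the one real history, a randomly chosen observer has only a 1-in-(N+1) chance of inhabiting base reality.

```python
# Toy version of Bostrom's counting argument (illustrative only).
# With N simulated copies of the present era plus 1 real one, all
# containing equally many observers, the chance of being "real" is 1/(N+1).

def prob_base_reality(num_simulations: int) -> float:
    """Probability that a randomly chosen observer inhabits the real world."""
    return 1.0 / (num_simulations + 1)

for n in (0, 1, 1000, 10**6):
    print(n, prob_base_reality(n))
```

For even a modest number of simulations, the probability of being "real" becomes tiny, which is the whole force of alternative (3).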
A group of theoretical nuclear physicists at the University of Washington has now asked whether such a simulation, if performed on a space-time lattice like those used in their own calculations, would leave detectable signatures in experimental data.
Let me first explain what QCD lattice gauge calculations are. The Standard Model uses quantum chromodynamics (QCD) to provide a detailed description of the strong interaction, the color force that acts between quarks through the medium of massless gluons. Physics in general has the perennial problem that, although it may be possible to accurately describe in detail the forces of the universe, the actions of these forces as they act between many particles quickly become so complex as to make calculations difficult or impossible. Theoretical physicists have developed many calculation techniques for dealing with the electromagnetic and weak interactions, because those forces become weaker with distance. However, most of these techniques are worthless for the strong interaction, because it grows stronger and stronger as the distance between two strongly interacting particles like quarks increases. The strong force is rather like a stretched spring that pulls harder and harder as the spring is stretched.
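The spring analogy can be illustrated with the Cornell potential, a standard phenomenological model of the quark-antiquark interaction (the parameter values below are typical textbook numbers, my own choices rather than anything from this column):

```python
# Cornell potential V(r) = -(4/3)*alpha_s*hbar_c/r + kappa*r
# (energy in GeV, separation r in fm).  The linear "confinement" term
# kappa*r acts like a spring whose pull never weakens, so the stored
# energy grows without bound as the quarks are separated.
HBAR_C = 0.1973  # GeV*fm

def cornell_potential(r_fm, alpha_s=0.3, kappa=0.9):
    """Quark-antiquark potential energy in GeV (kappa in GeV/fm)."""
    return -(4.0 / 3.0) * alpha_s * HBAR_C / r_fm + kappa * r_fm

for r in (0.2, 0.5, 1.0, 2.0):
    print(f"r = {r} fm, V = {cornell_potential(r):+.3f} GeV")
```

The Coulomb-like term dominates at short range, but beyond about half a femtometer the potential climbs steadily, which is why isolated quarks are never seen.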
Therefore, QCD calculations require a new approach. Lattice QCD, as the name implies, represents space-time as a 4-dimensional lattice, a grid with some spacing a between lattice points. For the most accurate calculations, one makes the extent L of the lattice as large as possible and the spacing a of the lattice as small as possible. For typical contemporary lattice QCD calculations, L is around 6-12 fm and a is around 0.1 fm (where 1 fm = 10⁻¹⁵ meters). Inputs to these calculations are the masses of each of the 6 quark flavors, the number of strong force colors (3), and some mathematical description of the strong interaction.
The lattice must end somewhere, and problems arise at the surface boundaries where the lattice stops. Conventionally, this problem is dealt with by connecting the left boundary of the lattice, in each of the space dimensions, to the right boundary, so that no particles on the lattice encounter a region where the lattice simply stops. If you do this to a 2-dimensional sheet of rubber, gluing the left edge to the right edge and then gluing the top edge of the resulting cylinder to its bottom edge, you get a torus, i.e., a doughnut shape. Similarly connecting the spatial edges of a 4-dimensional space-time lattice means that the space-time being simulated is a hyper-torus, and if a particle moves far enough in any space direction, it comes back to where it started. This is a peculiarity of such simulations, but if the lattice size L is large enough, it doesn't affect the quality of the calculation.
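The wrap-around boundary can be sketched in a few lines of Python (a minimal illustration, not actual lattice-QCD code): stepping past an edge with modular arithmetic returns the particle on the opposite edge.

```python
# Toroidal (periodic) boundary conditions on a toy lattice: stepping past
# the last site in any spatial direction wraps around to the first site.
L = 8  # lattice sites per spatial dimension (illustrative value)

def hop(site, step, size=L):
    """Move a lattice site by `step` in each dimension, wrapping at the edges."""
    return tuple((x + dx) % size for x, dx in zip(site, step))

print(hop((7, 0, 3), (1, -1, 0)))  # wraps to (0, 7, 3)
```

A particle that hops in the same direction L times returns exactly to its starting site, which is the lattice version of circumnavigating the hyper-torus.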
Of particular interest in lattice gauge calculations are the masses of mesons and baryons, combinations of 2 or 3 quarks. For example, one might place an up-quark and an anti-down-quark at nearby points on the lattice, turn on the gluon interactions as described by QCD, and compute the mass of the resulting pi-plus meson. Alternatively, one might place two up-quarks and one down-quark on the lattice and predict the mass of a proton. Usually, this would be done as a function of the lattice spacing a, the quark masses, and perhaps the number of color degrees of freedom. One would look for the predicted values to converge to a definite answer as a becomes very small and the number of colors goes to 3. Such calculations, performed on the largest available computers with large lattices with small spacing, have been very successful in predicting the known masses of most of the mesons and baryons. They have also had some success in predicting particle scattering properties.
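The convergence test as a becomes small can be sketched as a continuum extrapolation: fit the mass computed at several lattice spacings to m(a) = m0 + c·a² and read off the a = 0 intercept. The numbers below are invented purely for illustration.

```python
# Continuum extrapolation sketch (invented data): a hadron mass computed
# at several lattice spacings a is fit to m(a) = m0 + c*a**2; the
# intercept m0 is the continuum-limit prediction.

def fit_continuum(spacings_fm, masses_gev):
    """Least-squares linear fit of mass versus a^2; returns (m0, c)."""
    xs = [a * a for a in spacings_fm]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(masses_gev) / n
    c = sum((x - xbar) * (y - ybar) for x, y in zip(xs, masses_gev)) \
        / sum((x - xbar) ** 2 for x in xs)
    return ybar - c * xbar, c

# Fake results at a = 0.12, 0.09, 0.06 fm, generated from m0 = 0.938 GeV
m0, c = fit_continuum([0.12, 0.09, 0.06], [0.95096, 0.94529, 0.94124])
print(f"continuum-limit mass: {m0:.3f} GeV")  # -> 0.938 GeV
```

Real lattice analyses use more sophisticated fits, but the idea is the same: the physical answer is the one that survives as the lattice artifacts are extrapolated away.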
My colleagues at the University of Washington, Silas Beane, Zohreh Davoudi, and Martin Savage (BDS), have been extending the application of lattice QCD to nuclei, i.e., systems of quarks that form more than one proton or neutron. This allows them to predict the masses of light nuclei (deuterium, 3He, 4He, ...) and to check the models of nuclear forces that are more conventionally used in theoretical nuclear physics. They hope to gain a better understanding of the nuclear forces that hold neutrons and protons together in a nucleus and provide stability against radioactive decay. Their simulations, on the 0.1 fm (10⁻¹⁶ m) scale, can be considered to be at the very beginning of what could ultimately, with more computing resources, become a completely accurate simulation of atoms, molecules, people, and universes.
They argue that any simulation of the universe on a hyper-cubic lattice on the smallest distance scales brings with it certain artifacts of the lattice that should show up in experimental results. In particular, they examine three experimental results that could conceivably highlight the difference between the real world and a computer simulation: (1) the gyromagnetic ratio g of the μ lepton (or muon), (2) the fine structure constant α, and (3) the behavior of the highest energy cosmic rays. Let's consider these individually.
The muon is an electron-like fundamental particle with an electric charge ±e,
a "spin" angular momentum of ½ħ,
and a dipole magnetic field associated with its spin.
If it were simply a classical spinning sphere of charge, it would have a
dimensionless gyromagnetic ratio g of exactly 1. The Dirac equation
predicts g = 2 for a point-like spin-½ particle. However,
quantum vacuum polarization produces a cloud of virtual particles around it,
giving it a gyromagnetic ratio of slightly more than 2.
The muon’s g value has been measured experimentally to great precision in
storage-ring experiments and compared with the prediction of quantum
electrodynamics (QED). It has an
experimental value of 2×1.00116592089(54)(33), while QED
predicts a value of 2×1.00116591802(42)(26), where
the parenthesized figures are the statistical and systematic uncertainties in the last digits.
In other words, the experimental and theoretical values of g
differ in the 8th decimal place in a statistically significant way, and the
experimental value is greater than the theoretical value by 5.74×10⁻⁹.
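A quick check of the arithmetic on the numbers quoted above (my own calculation, treating the statistical and systematic uncertainties as independent):

```python
# Difference and significance of the muon g-2 values quoted above.
# Values are g/2; uncertainties such as (54) and (33) are in units of 1e-11.
from math import sqrt

exp_val, exp_stat, exp_sys = 1.00116592089, 54e-11, 33e-11
qed_val, qed_stat, qed_sys = 1.00116591802, 42e-11, 26e-11

diff_g = 2 * (exp_val - qed_val)  # difference in g itself
sigma_g = 2 * sqrt(exp_stat**2 + exp_sys**2 + qed_stat**2 + qed_sys**2)

print(f"difference in g: {diff_g:.2e}")   # ~5.74e-09, as quoted above
print(f"significance: {diff_g / sigma_g:.1f} sigma")
```

The discrepancy works out to roughly 3.6 combined standard deviations, which is why it is taken seriously as an open problem for QED.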
This difference, presently an unresolved issue for QED, could be
interpreted as an artifact of a simulation of our universe that does not give
the muon's g its correct value. BDS
conclude that a lattice spacing a
of about 3×10⁻⁹ fm would begin to show this difference in
the muon's g value.
The fine structure constant α is a dimensionless physical constant,
approximately equal to 1/137, that characterizes the strength of the
electromagnetic interaction in our universe.
Its value can be experimentally determined in two different ways: (1) by
measuring the gyromagnetic ratio g of the electron and (2) by measuring the Rydberg constant R∞
from the energies of transitions between the electron orbits of atoms.
These two approaches give values of α
that differ by (1.86 ± 5.51)×10⁻¹².
In other words, the difference is consistent with zero to high precision.
BDS conclude that this agreement implies 1/a > 4×10⁸ GeV,
i.e., a lattice spacing a
of less than about 2.5×10⁻¹⁰ fm.
A third lattice artifact that might be expected from a simulation performed on a
hyper-cubic lattice is that, at sufficiently small wavelengths, energetic
particles produced in high energy collisions would begin to "see" the
lattice structure and to develop preferences for certain directions, breaking
the intrinsic rotational symmetry of space.
In particular, the maximum attainable velocity of a charged particle,
limited by its collision with the abundant photons of the cosmic microwave
background, might be different along the lattice than across the lattice,
and this might produce observable effects for the highest energy protons of
cosmic rays. BDS examine the
evidence for such a cutoff in cosmic-ray data and conclude that 1/a > 1×10¹¹ GeV, corresponding to a lattice
spacing a of about 10⁻¹² fm (or 10⁻²⁷ m).
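The conversions between an energy scale 1/a in GeV and a lattice spacing a in fm used in these limits follow from ħc ≈ 0.197 GeV·fm. The sketch below (my own check, using the rounded bounds quoted above) reproduces the quoted spacings to order of magnitude.

```python
# Convert an energy scale 1/a (in GeV) to a lattice spacing a, using
# hbar*c ~ 0.1973 GeV*fm.  Inputs are the bounds quoted in the text.
HBAR_C = 0.1973  # GeV*fm

def spacing_fm(inv_a_gev):
    """Lattice spacing in fm for a given energy scale 1/a in GeV."""
    return HBAR_C / inv_a_gev

print(f"{spacing_fm(4e8):.1e} fm")   # fine-structure bound, ~1e-10 fm scale
print(f"{spacing_fm(1e11):.1e} fm")  # cosmic-ray bound, ~1e-12 fm scale
```

Smaller lattice spacings correspond to higher energy scales, so the cosmic-ray limit is by far the most stringent of the three.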
The conclusion of this exercise is that if we live in a simulation, it is a very good one, and either uses extremely powerful computers capable of simulations many orders of magnitude larger and more fine-grained than present technology would support, or uses qualitatively better computers (e.g., quantum computers), or uses qualitatively better simulation algorithms than assumed by BDS.
So, do we indeed live in such a simulation? It seems to me that, contrary to Bostrom's arguments, the BDS work in itself rules out that possibility, simply on the basis of the physical size of the computer that would be required. Any lattice simulating our universe would be very large. The whole universe, with a diameter of about 10²⁷ m, would have to be represented by the lattice. If the simulation extends only out to the Oort Cloud, a diameter of 10¹⁶ m would have to be included. Using the largest BDS minimum lattice spacing of about 10⁻²³ m, the simulation array would need about 10⁵⁰ elements on a side for the universe, or 10³⁹ elements on a side for the Oort Cloud. Each point on such a lattice would require the storage of some minimum number of bits, say 20, to represent its state.
How densely could such information be stored in some hypothetical post-human supercomputer? Let's suppose that the post-human supercomputer could be made of matter of nuclear density, say collapsed-matter neutronium, with the individual neutrons spaced 1 fm apart, and that each neutron could somehow store 20 bits of information. The universe simulation, even neglecting the time dimension and using a 3-D cube rather than a 4-D hypercube, would have to be a cube 10³⁵ meters (about 10¹⁹ light-years) on a side. The Oort Cloud simulation would have to be a cube 10²⁴ meters (about 10⁸ light-years) on a side. There is not enough matter in the universe to construct such an object, and if constructed it would immediately collapse into a giant black hole.
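The storage arithmetic is easy to reproduce (a sanity check with the same round numbers; the light-year conversion is my own):

```python
# Side length of a neutronium memory cube needed to store the lattice state.
# All inputs are the order-of-magnitude figures used in the text.
LATTICE_SPACING_M = 1e-23   # largest BDS minimum lattice spacing
NEUTRON_SPACING_M = 1e-15   # 1 fm between storage neutrons
BITS_PER_POINT = 20         # bits of state per lattice point
BITS_PER_NEUTRON = 20       # assumed storage capacity of one neutron
LIGHT_YEAR_M = 9.461e15

def memory_cube_side_m(simulated_diameter_m):
    """Side of the 3-D storage cube for a cubic simulated volume."""
    points_per_side = simulated_diameter_m / LATTICE_SPACING_M
    neutrons = points_per_side**3 * BITS_PER_POINT / BITS_PER_NEUTRON
    return neutrons ** (1.0 / 3.0) * NEUTRON_SPACING_M

for name, diameter_m in (("universe", 1e27), ("Oort Cloud", 1e16)):
    side = memory_cube_side_m(diameter_m)
    print(f"{name}: {side:.0e} m = {side / LIGHT_YEAR_M:.0e} light-years")
```

Either way, the required memory bank would dwarf the volume it is supposed to be simulating.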
I conclude that not
even Iain M. Banks' Culture could manage such a feat.
Unless something in the BDS assumptions is so wrong
that it changes the ground rules by 20 or so orders of magnitude, there is no way
that our present world could be a computer simulation.
"Are You Living In a Computer Simulation?", Nick Bostrom, Philosophical Quarterly 53, #211, 243-255 (2003).
Tests for Simulations:
"Testing Constraints on the Universe as a Numerical Simulation", Silas R. Beane, Zohreh Davoudi, and Martin J. Savage, Physical Review Letters 109, 153001 (2012).
John Cramer's new book: a nonfiction work describing his Transactional Interpretation of quantum mechanics, The Quantum Handshake - Entanglement, Nonlocality, and Transactions (Springer, January 2016), is available for purchase online as a printed book or eBook at: http://www.springer.com/gp/book/9783319246406 .
SF Novels by John Cramer: my two hard SF novels, Twistor and Einstein's Bridge, are newly released as eBooks and are available at: http://bookviewcafe.com/bookstore/?s=Cramer .
Columns Online: Electronic reprints of about 170 "The Alternate View" columns by John G. Cramer, previously published in Analog, are available online.