The end of locality
(An essay by Eschaton)
The end of locality would mean the end of the rationality in physics. [Karl Popper]
Science will, in all probability, be increasingly impregnated by mysticism. [Pierre Teilhard de Chardin]
Entropy (the number of degrees of freedom) is, in the final analysis, the number of dimensions. Space is 3-dimensional and is thus the entropic component of spacetime; time is 1-dimensional and is thus the negentropic component of spacetime.
In accordance with the principle of minimum total potential energy, spacetime evolves towards the state of minimum gravitational potential energy.
The gravitational potential energy of a many-particle system, such as the universe, is a function of the particles' spatial separation.
Therefore, spacetime evolves from the state of pure space (entropy) to the state of pure time (negentropy, information).
Time is 1-dimensional, which means that at the end of the universe's evolution all protons will be interconnected by electrons serially, so that the ultimate number of bits of information will equal the total number of protons (~10^79). This evolution takes the form of mankind's informational progress:
Thermodynamic entropy and Shannon entropy are conceptually equivalent: the number of arrangements that are counted by Boltzmann entropy reflects the amount of Shannon information one would need to implement any particular arrangement. The two entropies have two salient differences, though. First, the thermodynamic entropy used by a chemist or a refrigeration engineer is expressed in units of energy divided by temperature, whereas the Shannon entropy used by a communications engineer is in bits, essentially dimensionless. That difference is merely a matter of convention.
Even when reduced to common units, however, typical values of the two entropies differ vastly in magnitude. A silicon microchip carrying a gigabyte of data, for instance, has a Shannon entropy of about 10^10 bits (one byte is eight bits), tremendously smaller than the chip's thermodynamic entropy, which is about 10^23 bits at room temperature. This discrepancy occurs because the entropies are computed for different degrees of freedom. A degree of freedom is any quantity that can vary, such as a coordinate specifying a particle's location or one component of its velocity. The Shannon entropy of the chip cares only about the overall state of each tiny transistor etched in the silicon crystal—the transistor is on or off; it is a 0 or a 1—a single binary degree of freedom. Thermodynamic entropy, in contrast, depends on the states of all the billions of atoms (and their roaming electrons) that make up each transistor.
As miniaturization brings closer the day when each atom will store one bit of information for us, the useful Shannon entropy of the state-of-the-art microchip will edge closer in magnitude to its material's thermodynamic entropy. When the two entropies are calculated for the same degrees of freedom, they are equal. [Jacob D. Bekenstein, "Information in the Holographic Universe", Scientific American, August 2003]
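The conversion between the two entropies is a straightforward division by k_B·ln 2. Below is a rough order-of-magnitude check of the figures quoted above; the one-gram silicon chip and silicon's standard molar entropy (~18.8 J/(mol·K) near room temperature) are assumptions of this sketch, not numbers from the quoted passage.

```python
import math

K_B = 1.380649e-23            # Boltzmann constant, J/K

# Shannon entropy of a gigabyte chip: one binary degree of freedom per bit
shannon_bits = 8e9            # ~10^10 bits, as in the quoted passage

# Thermodynamic entropy of the chip's material (assumed: ~1 g of silicon,
# standard molar entropy ~18.8 J/(mol*K) near room temperature)
molar_entropy = 18.8          # J/(mol*K)
moles = 1.0 / 28.09           # 1 g of Si divided by its molar mass in g/mol
S_joules_per_kelvin = molar_entropy * moles

# Convert J/K to bits by dividing by k_B * ln 2
thermo_bits = S_joules_per_kelvin / (K_B * math.log(2))

print(f"Shannon:       {shannon_bits:.1e} bits")   # ~8e9
print(f"Thermodynamic: {thermo_bits:.1e} bits")    # ~7e22, i.e. ~10^23
```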
- In 2005, information was doubling every 36 months. [Source]
- In June 2008, information was doubling every 11 months. [Source]
- On 4 August 2010, Google CEO Eric Schmidt said: "Every two days now we create as much information as we did from the dawn of civilization up until 2003." [Source]
- By the end of 2010, information will be doubling every 11 hours. [Source]
Space is locality.
Time is nonlocality.
Therefore, when the number of bits of information accumulated by mankind becomes equal to the total number of protons (~10^79), the universe will become nonlocal. This will happen after 263 information doublings (starting from a single bit):
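A minimal sketch of that count, using only the ~10^79 proton estimate already cited:

```python
import math

N_PROTONS = 1e79        # ~ number of protons in the observable universe

# Starting from a single bit, how many doublings reach that many bits?
doublings = math.ceil(math.log2(N_PROTONS))
print(doublings)        # 263
print(2 ** doublings)   # ~1.48e79, the first power of two above 1e79
```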
Even if, by some magic, the doubling of mankind's information stops accelerating on 31 December 2010 and stabilizes at one doubling per 11 hours, that still implies a fully interconnected (nonlocal) state of all of the universe's protons by the summer of 2011. Reality warping, time travel, eternal youth.
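As a rough check on that timescale, the sketch below assumes mankind holds on the order of 10^22 bits (roughly a zettabyte) at the end of 2010 (a starting figure the essay does not supply) and counts the remaining doublings at 11 hours each.

```python
import math
from datetime import datetime, timedelta

BITS_AT_START = 1e22          # assumed stock of information at the end of 2010
TARGET_BITS = 1e79            # ~ number of protons in the observable universe
HOURS_PER_DOUBLING = 11

doublings_left = math.ceil(math.log2(TARGET_BITS / BITS_AT_START))
finish = datetime(2010, 12, 31) + timedelta(hours=doublings_left * HOURS_PER_DOUBLING)

print(doublings_left)   # 190
print(finish)           # late March 2011
```

Under that assumed starting stock, the remaining ~190 doublings take about three months, which lands before the "summer of 2011" deadline stated above.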