Information Is the Fundamental Constituent of the Universe

May 23rd, 2010

Let me begin by offering a simple definition of reality that I cooked up while in high school. I have no idea how philosophers define reality, but my own definition is straightforward: reality is anything that I could conceivably observe, either directly or indirectly. So the Andromeda Galaxy is part of reality because I can see it in the sky. Neutrinos are real because I can observe them using a very large tank of cleaning fluid deep inside a mine. Unicorns are not real because there is no way that I can directly or indirectly observe them -- unless, of course, one shows up someday and I could conceivably observe it.

I’m sure that philosophers will be able to dismiss my definition with some sniffy observation that it falls short of their rigorous standards. They might try to poke holes in my definition by referring to abstract ideas. Is love real, one might ask. I would argue that love is real if I can conceivably observe, however indirectly, love in action. And I think I can. Nevertheless, I’m sure that there are lots of tricky problems that could occupy philosophers and scholastics until the end of time. I have no patience for such logic-chopping; my definition may not be perfect, but it’s useful. It helps us understand reality. That’s good enough for me.

So now I’d like to direct your attention to one crucial aspect of this definition. It hinges on the notion of "observing" something. Observation, however indirect, and merely conceivable, is the central concept of the definition. And what, in turn, is observation? I’ll claim that observation is the acquisition of information about something. If I see the Andromeda Galaxy, or a neutrino, I necessarily obtain information about it in the process.

Let’s step back for a moment and chew this cud. The ability to acquire information about something is our means of establishing its reality. It’s the information that makes it real. So now let’s turn that idea around: information is not merely the hallmark of reality: it is reality. All this ado about mass and energy and antimatter and quarks -- all of this stuff is a distraction from the fundamental reality. When we think of reality in terms of such components, we’re only looking skin-deep. Mass and energy and quarks and so forth are merely manifestations of the underlying essence of the universe: information. If the universe is "everything that we can truly know", then information is necessarily the essence of that universe, because you can’t know mass, energy, or quarks: you can know only information.

Remember Einstein’s brilliant realization that a straight line is actually defined by the path of light? That basic idea applies in many other ways. Spatial distance is defined by particles; if you had an expanse of space devoid of any particles, then you could never measure that expanse; you need particles to act as place-markers. And if you can’t measure space, then you can’t know about it (the negative version of "To measure is to know"). In other words, space cannot exist without particles also existing to mark the space. If I could conceivably position particles in a region of space so that I can measure it, then that space is real; if I could not conceivably position particles, then the space is not real.

What about time? Einstein uses moving clocks to measure time, and you can actually calculate relativistic time dilation by creating a clock consisting of a photon bouncing between two mirrors in a direction perpendicular to the line of motion of the clock. But there’s another aspect to this process that deserves our consideration: causation. Our universe includes at its most fundamental level the concept of causation: that one event causes another event at a later time. That later time might be nanoseconds later, but it’s still later.
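That bouncing-photon clock can be worked out in a few lines. Here is a toy sketch of my own (the mirror separation is arbitrary; only the ratio of tick times matters):

```python
import math

# A light clock: a photon bounces between two mirrors separated by a
# distance L, perpendicular to the clock's motion at speed v. At rest,
# one round trip takes t0 = 2L/c. In motion, the photon traces a longer
# diagonal path, so the moving clock ticks slower by the Lorentz factor
# gamma = 1 / sqrt(1 - v^2/c^2).

C = 299_792_458.0  # speed of light, m/s

def tick_time(L, v):
    """Time for one photon round trip, as seen by a stationary observer."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    t0 = 2.0 * L / C       # proper tick time of the clock at rest
    return gamma * t0      # dilated tick time of the moving clock

t_rest = tick_time(1.0, 0.0)
t_fast = tick_time(1.0, 0.6 * C)       # at 60% of c, gamma = 1.25
print(round(t_fast / t_rest, 9))       # -> 1.25
```

Nothing more than the Pythagorean theorem applied to the photon's diagonal path, but it really does reproduce relativistic time dilation.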

Now, physicists have gotten their underwear in a tangle worrying about the problem of the directionality of time. Why must time always move forwards? Most of the basic physical laws are reversible: they work just as well running backwards in time as forwards in time. So how come our universe has unidirectional time?

The answer, I think, comes from a fundamental principle: that information (or something like it) is conserved. The universe cannot create information out of nothing. If you add to this the realization that information from two points in time can be used to determine even greater information at a third point in time, you have all you need to understand a great deal about the universe. But first, let me explain that latter notion: information gaining interest.

Suppose that I want to know the velocity of a particle. I measure its position at time t and again at time t + dt. Its velocity is then the distance it travelled divided by the time delay. Of course, there’s got to be some error in my measurement -- a perfect measurement would have an infinite amount of information. However, suppose that I use an extremely large value for dt. Then the accuracy of my velocity measurement is dramatically increased: I obtain vastly more information without any additional expenditure of information. Theoretically, I could obtain stupendous amounts of information, vastly exceeding the total information content of the universe, just by making measurements separated by billions of years. That would violate the conservation of information.
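Here is a quick sketch of that free lunch (the numbers are illustrative, mine, not from any real experiment):

```python
# Two position measurements, each with error dx, give a velocity estimate
# v = (x2 - x1) / dt whose worst-case error is 2*dx/dt: stretching the
# time baseline dt buys accuracy "for free", with no better instruments.

def classical_velocity_error(dx, dt):
    """Worst-case velocity error from two position fixes of precision dx."""
    return 2.0 * dx / dt

# Millimeter-precision fixes, ever longer baselines:
for dt in (1.0, 10.0, 100.0):
    print(dt, classical_velocity_error(0.001, dt))
# Each tenfold increase in dt cuts the velocity error tenfold.
```

Carried to a baseline of billions of years, this classical scaling is what seems to mint information out of nothing.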

But there’s a catch: the Uncertainty Principle. It disturbs me that quantum mechanics has developed in such a way as to minimize the role of the Uncertainty Principle; wave mechanics is much more popular, probably because wave mechanics has more immediate utility. But we should never forget that the Uncertainty Principle is the underlying basis of quantum mechanics, and it loudly declares a limit on the amount of information we can obtain in any single measurement. If you apply the Uncertainty Principle to my imaginary process of acquiring humongous amounts of information, you can quickly calculate that my scheme won’t work, because I have uncertainties in BOTH the position and the velocity of the particle. Those two uncertainties combine to rob my experiment of effectiveness.
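A back-of-the-envelope sketch of how the two uncertainties combine (my own toy estimate, using the standard ħ form of the principle):

```python
import math

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def velocity_error(m, dx, dt):
    """Total velocity error for two position measurements of precision dx
    separated by time dt. The classical part, 2*dx/dt, shrinks as dx
    shrinks, but pinning the position down to dx kicks the momentum by
    ~hbar/(2*dx), adding a quantum part, hbar/(2*m*dx), that GROWS as
    dx shrinks. You cannot squeeze both at once."""
    return 2.0 * dx / dt + HBAR / (2.0 * m * dx)

# Minimizing over dx gives a floor of 2*sqrt(hbar/(m*dt)): the error now
# falls only as 1/sqrt(dt) instead of the classical 1/dt, so stretching
# the baseline to billions of years no longer yields runaway information.
m_e = 9.109e-31                              # electron mass, kg
dt = 1.0
dx_best = math.sqrt(HBAR * dt / (4.0 * m_e)) # the optimal compromise
floor = 2.0 * math.sqrt(HBAR / (m_e * dt))
print(velocity_error(m_e, dx_best, dt))      # matches the floor, ~0.0215 m/s
```

The point of the sketch is the exponent: the quantum kick degrades the classical 1/dt improvement to 1/sqrt(dt), which is exactly the sort of brake on information acquisition the argument needs.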

The Second Law of Thermodynamics says the same thing: the information content of any isolated system can never increase. That’s conservation of information, stated pretty baldly.

The Big Bang was really just another manifestation of the conservation of information. Here’s my explanation of what happened: it all started with the creation of a stupendous amount of information. That information wasn’t located anywhere (remember, you can’t have space unless you have particles to measure it with, and since there were no particles, there was no space). I won’t speculate on where all that information came from; that’s almost a theological question. I’m instead assuming that all the information simply came into existence, and created its own space by its very existence. However, it created only a single point of space. This situation could not last; all that information concentrated in a single point would generate super-stupendous amounts of information with the passage of time (I haven’t yet figured out where time came from). So it had to spread out over space so as to conserve information. The rate of expansion was dictated by the Uncertainty Principle.

As you can readily observe, this is not even a half-baked idea -- I’d call it maybe 0.10-baked. But the concept is solid, and I’m confident that this idea, when fleshed out properly, will provide us with a useful explanation of many phenomena.

Postscript, October 28th, 2012
I now realize that it is not information that is conserved; it is a combination of information and time. Let’s go back to the Uncertainty Principle. It doesn’t say that information is conserved. It has two forms: the first is that the uncertainty in position multiplied by the uncertainty in momentum is constant; the second is that the uncertainty in energy multiplied by the uncertainty in time is constant. Let’s focus our attention on the second form and apply it to a photon. It possesses energy E = hν, so if we attempt to measure its energy, we’ll have this limitation:

ΔE Δt = h, or Δ(hν) Δt = h

Since there can be no uncertainty in the value of h, the uncertainty lies in ν and t. But ν is the frequency of the photon: the number of oscillations it completes in one second. Thus, if we measure the photon with a time precision of one second, then ΔE = h

I’m a bit uncertain that this next step is right, but it seems to me that the h on both sides should cancel out, even though we’re talking about different applications of h in the equation. If that step is correct, then we get:

Δν = 1

which means that we’re getting the number of oscillations per second precise to within a single oscillation. If that violates your sniff test, remember that we’re admitting a LOT of uncertainty in the time of measurement (one full second), so we should be able to obtain very little uncertainty in the number of oscillations per second.
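For what it’s worth, the worrisome step can be written out. Since h is an exact constant, an uncertainty scales through it linearly, Δ(hν) = h Δν, so (granting the ΔE Δt = h form of the principle) the cancellation goes through:

```latex
\Delta E\,\Delta t = h
\;\Longrightarrow\;
\Delta(h\nu)\,\Delta t = h
\;\Longrightarrow\;
h\,\Delta\nu\,\Delta t = h
\;\Longrightarrow\;
\Delta\nu\,\Delta t = 1
```

and with Δt = 1 second, this gives Δν = 1, as claimed.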

Let’s shift gears now and think in terms of the information communicated by photons. It is well known that the bandwidth of an electromagnetic signal is proportional to its frequency. Higher frequencies can carry more information per second. Let’s simplify this and say that each cycle carries one bit of information. Applying the one-bit-per-cycle rule to a measurement lasting a single second, we learn that the uncertainty in that measurement is equal to 1 cycle. Thus, a 1 MHz signal measured over one second can carry one million minus one bits of information. Over half a second we still measure the frequency as 1 MHz, but the uncertainty doubles to two bits per second. If we measure it over a period of one microsecond, then our measurement is 1 million bits per second with an uncertainty of 1 million bits per second -- not much information in that signal.
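The counting model behind these numbers can be sketched in a few lines (a toy calculation of mine, nothing more):

```python
# One-bit-per-cycle toy model: counting the cycles of a signal over T
# seconds is off by at most one cycle, so expressed as a rate the
# measurement carries an uncertainty of 1/T cycles (bits) per second.

def measured_rate(f_hz, T_s):
    """(measured rate, rate uncertainty), both in cycles per second."""
    return f_hz, 1.0 / T_s

print(measured_rate(1e6, 1.0))   # (1000000.0, 1.0): a million bits, +/- 1
print(measured_rate(1e6, 0.5))   # (1000000.0, 2.0): uncertainty doubles
rate, err = measured_rate(1e6, 1e-6)
print(rate, round(err))          # over 1 microsecond the uncertainty
                                 # swallows the whole signal
```

The shorter the measurement window, the larger the uncertainty in the rate, exactly in proportion.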

The temptation here is to conclude that one photon carries one bit of information, and that photons of higher energy (frequency) can reliably deliver that information over shorter periods of time. Suppose that you have a fixed amount of energy: enough to make one 1 MHz photon or a thousand 1 kHz photons. Suppose further that the receiver can measure photons over a one-second interval. Then the 1 MHz photon can deliver a million bits of information while the smaller photons can only deliver a thousand bits each -- but there are a thousand of them, so together they deliver a million bits. Thus, with the same amount of energy, you can deliver the same amount of information over the same time period.
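The energy bookkeeping checks out arithmetically; here is the sketch, using E = hν and the photon counts from the thought experiment above:

```python
H = 6.626_070_15e-34  # Planck constant, J*s

# Fixed energy budget: one 1 MHz photon, or the equivalent in 1 kHz photons.
E_budget = H * 1e6                     # energy of one 1 MHz photon, E = h*nu
n_small = round(E_budget / (H * 1e3))  # how many 1 kHz photons that buys
print(n_small)  # -> 1000

# At one bit per cycle, measured over a one-second interval:
bits_big = 1e6 * 1.0                # the single 1 MHz photon
bits_small = n_small * (1e3 * 1.0)  # a thousand 1 kHz photons
print(bits_big == bits_small)       # -> True: same energy, same information
```

Because both the energy and the cycle count scale linearly with frequency, the trade is exact: energy per second buys bits per second no matter how the photons are denominated.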

This suggests to me that it’s not information that’s conserved, it’s information rate in bits/second. Suppose then that the universe has a maximum of a googolplex of bits/second. Then it began life as, say, one photon with a frequency of one googolplex...