Information, What is it?

On the Shoulders of Giants – By Don Davis

One of the reasons that many thinkers are drawn to audio is that we get to dabble with the fundamental framework of the world around us and of life itself. As we peel away the outer, technical layers we encounter some of the same paradoxes that Einstein, Planck, Heisenberg and others encountered. When actively teaching, Don was sometimes criticized for taking tangents away from the subject matter. What the student didn’t realize was that the chart, thought or equation he had just shared on the overhead was so far-reaching that there was little point in elaborating until it had been fully considered by the observer. Twenty years after my first Syn-Aud-Con seminar, I am still contemplating some of those overheads.

This short article is just such a thought, and is offered to the thinkers among you. Pat Brown

Copyright 2008 Syn-Aud-Con - All Rights Reserved

INFORMATION, What is it?

What originally triggered my renewed interest in “Information Theory” was reading, in Professor Chris Bissell’s writings on Karl Küpfmüller, that “today we recognize information, along with energy and matter, as a third fundamental building block of the world.” We are left with the challenge of bringing information into a meaningful relationship with Einstein’s E = mc². Einstein’s original paper had m = E/c².

Einstein’s brilliant flash of insight was the realization that if c was indeed constant, then E and m could each be expressed in terms of the other. Since broaching the subject, I have found that my intuitive guess that Planck units might be applicable has proven correct.

The absolute minimum physical size of 1.0 nat’s worth of information is a square exactly 2 Planck lengths on a side. The Planck length is 1.6 × 10⁻³⁵ m. The energy associated with 1.0 nat follows from the definition: “1 kelvin can be defined as a requirement of 1.38 × 10⁻²³ joules (or 86.2 μeV) of energy input per increase of the log state count by 1.0 nat.”
Information, here in the U.S., is most commonly described in bits. There are 1.442695 bits per nat (bits = nats/ln 2). During WWII the term “ban” was in use in code breaking by Alan Turing at Bletchley Park in England. Bans = bits/log₂10, so 1.0 ban = 3.3219 bits = 2.3026 nats.
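The conversion factors among these three units can be checked in a few lines of Python (a minimal sketch; the function names are my own):

```python
import math

# 1 nat = 1/ln(2) bits ~= 1.442695 bits; 1 ban = log2(10) bits ~= 3.3219 bits.
def nats_to_bits(nats):
    return nats / math.log(2)      # bits = nats / ln 2

def bits_to_bans(bits):
    return bits / math.log2(10)    # bans = bits / log2(10)

print(nats_to_bits(1.0))           # ≈ 1.442695 bits per nat
print(bits_to_bans(3.3219))        # ≈ 1.0 ban
print(nats_to_bits(2.3026))        # ≈ 3.3219 bits, i.e. 1 ban
```

The nat arises naturally wherever the natural logarithm does; the bit and the ban are simply the same quantity measured in base 2 and base 10.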
Max von Laue, in the 1960s, wrote a mathematical history of energy as a tribute to Einstein. Von Laue had been the one scientist in Hitler’s Germany to defend Einstein’s Theory of Relativity in spite of the danger it placed him in. At the conclusion of this remarkable history, he wrote,
“Every thinking person cannot but be strongly impressed by the last consequence, which is extreme, but confirmed by experience, that at least for electrons, the mass is nothing but a form of energy which can occasionally be changed into another form. Up to now our entire conception of the nature of matter depended on mass. Whatever has mass – we thought – has individuality; hypothetically at least we can follow its fate throughout time. And this is certainly not true for electrons. If not, what remains of the substantial nature of these elements of all atomic nuclei, i.e., of all matter? These are grave problems for the future of physics.”

Erwin Schrödinger remarked, “The scientific picture of the real world around me is deficient. It gives me a lot of factual information, puts all our experience in a magnificently consistent order, but is ghastly silent about all and sundry that is really dear to our heart, that really matters to us.” Frank Wilczek, Herman Feshbach Professor of Physics at MIT and winner of the 2004 Nobel Prize, has written a paper on The Origin of Mass:

“According to the principles of quantum mechanics, the result of an individual collision is unpredictable. We can, and do, control the energies and spins of the electrons and positrons precisely, so that precisely the same kind of collision occurs repeatedly; nevertheless, different results emerge. By making many repetitions we can determine the probabilities for different outcomes. These probabilities encode basic information about the underlying fundamental interactions; according to quantum mechanics, they contain all the meaningful information. (Ed: emphasis mine.)”

“These results are a remarkable embodiment of the vision that elements of reality can be reproduced by purely conceptual constructions – Its from Bits.”

“Feynman diagrams can encompass all the content of Maxwell’s equations for radio waves and light, Schrödinger’s equation for atoms and chemistry, and Dirac’s more refined version including spin – all this, and more, is faithfully encoded in the squiggles.”

Feynman’s diagrams, for example, contain solid lines – the world-lines of electrically charged particles – and squiggles that straddle those world-lines, representing the photons the particles exchange.

“The wave patterns that describe protons, neutrons, and their relatives resemble the vibration patterns of musical instruments. In fact, the mathematical equations that govern these superficially very different realms are quite similar.”
After many years and discoveries of gluons and quarks, Wilczek concludes, “In a sense, these calculations settle the question. They tell us the origin of (most) mass…It is particularly unsatisfactory in the present case, because the answer appears to be miraculous. The computers construct for us massive particles using building blocks – quarks and gluons – that are themselves massless.”
“Mass, a seemingly irreducible property of matter, and a byword of its resistance to change and sluggishness, turns out to reflect a harmonious interplay of symmetry, uncertainty, and energy…and ordinary matter, we have recently learned, supplies only a small fraction of mass in the universe as a whole.”

“Experiment is the ultimate arbiter of scientific truth.”

It has been suggested by NIST physicists Mohr and Taylor, the chair and previous chair of CODATA, that the kilogram be defined as follows: “The kilogram is the mass of a body at rest whose equivalent energy equals the energy of a collection of photons whose frequencies sum to 135 639 274 × 10⁴² (cycles per second).” That’s quite a way past cosmic and gamma rays on the electromagnetic spectrum charts.
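Mohr and Taylor’s figure can be checked directly by combining E = mc² with E = hν and solving for the total frequency ν (a sketch using today’s exact SI constants; the small difference in the eighth digit reflects refinements in the measured value of h since their proposal):

```python
# Frequency sum nu such that h * nu = m * c^2, for m = 1 kg.
h = 6.62607015e-34   # Planck constant, J·s (exact in the SI since 2019)
c = 299792458.0      # speed of light, m/s (exact)

nu_sum = 1.0 * c**2 / h
print(nu_sum)        # ≈ 1.3564e50 Hz, i.e. roughly 135 639 2.. × 10^42 cycles per second
```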

From a remarkable paper on the “Physical Limits of Computing” by Michael P. Frank, we find that “The sum of the system’s entropy and its known information is always conserved…Information is just known entropy. Entropy is just unknown information…In situations where the information in question happens to be entropy, the nat is more widely known as Boltzmann’s constant, k_B.”
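Frank’s identification of the nat with Boltzmann’s constant can be made concrete: Shannon entropy measured in nats, multiplied by k_B·T, gives the minimum energy cost of erasing that information at temperature T (Landauer’s bound). A sketch, with a helper function of my own naming:

```python
import math

def entropy_nats(probs):
    """Shannon entropy in nats: H = -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI since 2019)

# A fair coin toss carries ln 2 ≈ 0.6931 nats (= 1 bit) of information.
H = entropy_nats([0.5, 0.5])
print(H)             # ≈ 0.6931 nats

# Erasing that one bit at room temperature costs at least k_B * T * H joules.
T = 300.0            # kelvin
print(k_B * T * H)   # ≈ 2.87e-21 J
```

The same constant thus shows up whether the “unknown information” is in a message or in the thermal jostling of molecules.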

Information Theory has generated a great deal of metaphysical (above the physical) speculation in useful directions. The metaphysical can perhaps best be sensed in Buckminster Fuller’s statement, “To share an idea is communication. To understand the idea is metaphysical.”

Professor Gitt’s Universal Laws for Information:

    • It is impossible to set up, store or transmit information without using a code.
    • It is impossible to have a code apart from a free and deliberate convention.
    • It is impossible to have information without a sender.
    • It is impossible that information can exist without having had a mental source.
    • It is impossible for information to exist without having been established voluntarily by free will.
    • It is impossible for information to exist without all five hierarchical levels: statistics, syntax, semantics, pragmatics, and apobetics (the purpose for which the information is intended, from the Greek apobeinon – result, success, conclusion).

Royal Truman writes in “The Problem of Information”:

“Gitt’s book, which has been published in several languages, develops these principles in great depth. The inviolability of these laws has been accepted in numerous university discussions and conferences. Like any proposed law of nature, a single exception would suffice to disprove it.”

There are no known codes that have not required an intelligent source or origin. This makes DNA exceptionally interesting, as it is an encoding/decoding mechanism that contains code, or language, representing the organism (ergo an intelligent origin). There is no documented increase in information resulting from a mutation.

One analysis of sender and receiver that I particularly responded to first outlined four cases in detail:

  • Intelligent sender and intelligent receiver
  • Non-intelligent sender and intelligent receiver
  • Intelligent sender and non-intelligent receiver
  • Non-intelligent sender and non-intelligent receiver

Quoting from Royal Truman, “Now let’s consider an absolute extreme case. The sender and receiver can only react mechanically. Suppose the setup must be fully automatic, meaning that when the sender or receiver is destroyed, a substitute has been provided for.”

“Compared to all the alternatives, this one requires the highest amount of intelligence from the agent who designed the system. Eventualities need to be anticipated and all resources for repair and energy need to be prepared in advance. Do we find anything so enormously complex? Yes – it is called life!”

This entire whirlpool of effort on the part of intelligent people to fathom the fundamental meaning of information reminds me of the metaphysical statement, “The most complex system imaginable is, by definition, the mind, since the mind must be at least one degree more complex than what it imagines.” (Author unknown)

If, for example, you accept the “big bang” theory, which postulates a finite universe of space and time (i.e., from zero time to now), then something outside of space and time had to be its origin – a quandary for those who devotedly believe in a random, chaotic universe.

Again, I sincerely hope that some of this investigation into the heart of our business will encourage some of you toward further exploration. dbd

Link to original document in pdf format: VOL36_AUG08_InfoTheory