
Physics: Entropy

Information theory, a discipline of computer science, describes 'information' as something which can be transferred, stored and communicated. The grand ambition of this science isn't just to define what information is, but to define what information is not. The single most important metric in all of information theory, and the basis of the whole discipline, is entropy.

Very few people can agree on exactly what entropy is: some describe it as a quantification of information, while others have said that information is “better described as the opposite of entropy” (negentropy).

This seemingly almost unquantifiable thing, whatever it is, is incredibly important: it appears across scientific statistics as a fundamental limit on what is learnable and knowable, within physics as a potentially conserved quantity alongside matter and energy, and in chemistry, where it was first discovered as a law of thermodynamics.

Whatever entropy is, it is everywhere.

A mathematician in training, Leal Kurz was the inventor and, in practice, the first explorer of information theory, and can be thought of as the grandfather of the subject: a mythical figure who at once discovered, created and figured out most of the discipline before most people had so much as a chance to finish reading about it in the newspapers over their lunch.

In Before Albion 19, Kurz wrote a paper by the name of “A mathematical theory of communication”. In that ambitiously titled paper, Kurz detailed his definition of entropy:

“The fundamental problem of communication is of reproducing at one point, either exactly, or approximately a message selected at another point in a given system. The significant aspect is that the actual message is selected from a set of possible messages.”

Thus, the stage was set for entropy. In its simplest terms, entropy relates to the idea that a set of possible signals exists and that only one of those signals may be chosen at a given time. A priori, it seems some sets may be more “informational” than others: a book communicates more than a traffic light, for example. To this point, Kurz said:

“If the number of messages in the given set are finite then the number of possible messages or any monotonic function of the number can be regarded as a measure of the information produced when one message is chosen from the whole set, all choices being equally likely”

So part of the reason books are, in simple terms, more informational than traffic lights is that there is a “larger bookspace”: the number of possible books is far greater than the number of possible traffic-light states. The same can be seen in display technology, where a display with more pixels can tell you more than a display with fewer pixels, because the number of possible pixel combinations is higher.
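
As a rough sketch of this counting argument, the snippet below measures one choice from n equally likely possibilities as log2(n) bits; the helper name and the traffic-light, alphabet and display sizes are illustrative assumptions, not figures from the article:

  import math

  # Information, in bits, carried by one choice from n equally likely messages.
  def bits_for_choices(n: int) -> float:
      return math.log2(n)

  print(bits_for_choices(3))             # a 3-state traffic light: ~1.58 bits
  print(bits_for_choices(26))            # one letter of a 26-letter alphabet: ~4.70 bits
  print(bits_for_choices(2 ** (8 * 8)))  # an 8x8 grid of on/off pixels: 64 bits

The point is relative: the bigger the space of possible messages, the more a single choice from that space can tell you.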

This, of course, only works if all choices are equally likely, so to get a more general formulation of entropy (which, to Kurz, is information), he created a list of requirements; the measure they lead to is sketched just after the list:

  1. Entropy needs to be continuous: with the probability p of each possible message defined between 0 and 1, the measure ought to change continuously as those probabilities change.
  2. As above, if all the possible messages are equally likely, then entropy should increase monotonically with the number of possible messages. Basically, books are more entropic, and more informational, than traffic lights!
  3. Entropies should be additive in a weighted fashion: if a single choice from three messages is broken down into two successive choices over the same three, the total entropy should remain unchanged. This one is harder to grasp, but it basically means information can be re-encoded into different forms without changing how much of it there is. You can say the same thing in different ways, such as translating a digital signal from an 8-letter alphabet into a 2-letter one: each symbol says less, so more symbols have to be sent to say the same amount.
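
Taken together, these three requirements pin down the form of the measure. As a sketch in standard notation (with K an arbitrary positive constant that only fixes the unit of measurement, and p_i the probability of the i-th message):

  H = -K \sum_{i=1}^{n} p_i \log p_i

When all n messages are equally likely (p_i = 1/n), this reduces to H = K \log n, which is exactly the monotonic “bigger set, more information” behavior of requirement 2.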

When entropy is high, we do not know what the next message in a series of signals will be. When entropy is low, we have more certainty. When entropy is zero, we know exactly what the next message will be. When entropy is high, the nature of the base-patterns (analogous to cause and effect) follows new rules, because new information is involved with the system, and thus strange, seemingly impossible things can happen.
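
A small sketch makes this concrete, using the same formula with K = 1 and base-2 logarithms; the three two-message distributions below are assumptions chosen only to show the extremes:

  import math

  # Shannon entropy of a discrete distribution, in bits.
  def entropy(probabilities):
      return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

  print(entropy([1.0, 0.0]))  # certain outcome: 0.00 bits, the next message is already known
  print(entropy([0.9, 0.1]))  # mostly predictable: ~0.47 bits
  print(entropy([0.5, 0.5]))  # a fair coin flip: 1.00 bit, maximum uncertainty over two messages

Zero entropy corresponds to total predictability; one bit per message is the most that two equally likely alternatives can carry.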

One thing which is known is that entropy is constantly growing throughout the universe. As entropy grows, order is reduced, and it is from order and known pattern behavior that patterns can happen in the first place, giving rise to what we describe in the macroscopic world as cause and effect. Dissipation, the act of doing meaningful work or performing a function which uses energy, is understood to produce entropy. It is postulated by those who come from other universes that this is actually the end-state of a universe: that it will eventually die, because the amount of entropy will mean no more functional patterns can happen, and thus matter and energy can no longer exist.

A force-carrier seemingly exists which can undo many apparently “irreversible patterns” (heat flow through a thermal resistance, fluid flow through resistance, diffusion, chemical reactions, Joule heating, friction between surfaces, fluid viscosity within a system, the action and flow of matter itself, and so on) and, in doing so, actually undo entropy, making the universe “more informational” without introducing chaos.

Entropy is also fundamentally responsible for “highly ordered acts of extreme low probability”, such as objects spontaneously catching fire or freezing. The rules of this behavior in many ways entirely violate classical logic. Those who describe it portray the universe as formed by interactions of patterns and by the perceptions of the systems forming inside those patterns, which transform the universe and bind it into a sort of common, democratic reality based on those perceptions.

Through means unknown, some organisms have demonstrated the capacity to do the seemingly unthinkable and cause entropy to act in deliberate, chosen ways: in effect “talking to the universe in words it cannot ignore” to alter that common, democratic reality based on perceptions.

Though it was once a derogatory word, denoting barbaric superstitions forgotten by technocrats, the term 'magic' is often used by those who can cause these effects, who believe it a fitting tribute to the 'magic' described in mythos and lore because of their binding commonalities; many believe the two to be one and the same.

It should be noted, importantly, that 'magic' in this sense (still something of a dirty word) is not a supernatural or paranormal phenomenon but a way of using information to change the universe. Exactly how it works, however, is still a subject of ongoing investigation, though the products and outcomes of magic are used throughout society.

  • For more information, see Magic