The precise relationship between energy and information that forms the foundation of the post-AI economy.

David Galbraith
5 min read · Apr 11, 2024


Information and energy may be equivalent, and their relationship in semiconductors, as defined by physics, comes down to a unit conversion and the difference between the effective number systems used in nature and in computers. The advent of AI makes this supremely important, as the requirements for compute and energy intersect.

Information and energy equivalence?

When mass is converted into energy you get a nuclear explosion. When you delete information it is converted into energy as heat. All modern computers do this: they delete information (turning it into heat) because the logic gates that process it run irreversibly.

Information, energy and mass are equivalent

This does not mean that energy and information are necessarily equivalent, but it is a tantalising possibility, and the equation for the energy released by deleting information is as simple in mathematical expression as energy and mass equivalence (E ∝ m): E ∝ temperature × number of bits. Instead of c², the factor of proportionality, as defined by Landauer, is k ln(2), and as we see below this is really just a unit conversion.

Information converting into energy in computers

Bits are transformed into energy when you delete them, which happens when calculations are made irreversible, as all current computers do (but don’t have to). Heat can, in principle, be eliminated by making the computation reversible. In practice this would mean logic gates with three input and three output wires (instead of the current two in and one out*), so that the input can be recovered from the output.

All current computers are wired so that information is lost as it passes through logic gates (apart from NOT), since there are fewer wires out than in and the input cannot be deduced from the output (they are irreversible). It doesn’t need to be like that: there are gate designs with three inputs and three outputs that preserve the bits, are reversible, and therefore don’t generate heat, because no information is lost.
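As an illustration, one well-known three-input, three-output reversible design (not named in the article, but the standard textbook example) is the Toffoli, or CCNOT, gate: it passes its first two bits through unchanged and flips the third only when both of the first two are 1. A minimal Python sketch contrasting it with an ordinary irreversible AND gate:

```python
from itertools import product

def and_gate(a, b):
    # Ordinary irreversible gate: 2 wires in, 1 wire out.
    return a & b

def toffoli(a, b, c):
    # Reversible gate: 3 wires in, 3 wires out.
    return a, b, c ^ (a & b)

# AND is irreversible: the output 0 could have come from three different inputs.
print(sorted({(a, b) for a, b in product([0, 1], repeat=2) if and_gate(a, b) == 0}))

# Toffoli is reversible: every distinct input maps to a distinct output (a bijection),
# and applying it twice returns the original input.
outputs = {toffoli(a, b, c) for a, b, c in product([0, 1], repeat=3)}
assert len(outputs) == 8
assert all(toffoli(*toffoli(a, b, c)) == (a, b, c) for a, b, c in product([0, 1], repeat=3))

# Setting c = 0 makes the third output equal to a AND b, so reversible gates can still compute.
assert all(toffoli(a, b, 0)[2] == (a & b) for a, b in product([0, 1], repeat=2))
```

Because every distinct input maps to a distinct output, running the gate backwards recovers the input exactly, which is why, in principle, no information (and hence no Landauer heat) has to be thrown away.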

Information and heat entropy

Instead of energy, we will measure entropy, which is the unusable energy per unit temperature. Two different types of entropy (heat entropy and information entropy), previously thought to be merely analogous, turn out to be essentially the same.

When Shannon called his measure of information ‘entropy’, he reputedly did so because von Neumann said that nobody really knew what entropy was, so it didn’t matter. Landauer effectively showed that they were the same.

Heat entropy

Heat entropy uses the symbol S, after Sadi Carnot, who laid the groundwork for the second law of thermodynamics (entropy increases) by describing the maximum possible efficiency of a heat engine. The word ‘entropy’ itself comes from “τροπή” (trope), meaning transformation, to highlight its parallels with the transformational nature of energy flows. (Carnot’s nephew, also called Sadi Carnot, was President of France when the Eiffel Tower was constructed.)

The heat entropy of a physical system is proportional to the logarithm of the number of equally likely microstate arrangements per macrostate (W, from “Wahrscheinlichkeit”, the German word for “likelihood”). The base of the logarithm is e, so it is the natural logarithm, ln.

i.e. S ∝ ln W

Boltzmann’s entropy introduced a statistical foundation to thermodynamics, linking the microscopic states of particles to macroscopic properties.

A macrostate is a measurable, overall characteristic of a system, like temperature or pressure; a microstate is a particular arrangement of its particles, specified by properties such as position, velocity or spin. An analogy: the number of different ‘photographs’ of the system’s particles that would all correspond to the same overall temperature.

The constant of proportionality, k, converts between units of temperature and energy.

So: S = k ln W
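As a toy illustration (my own example, not the article’s): take 100 two-state particles, treat the total number of ‘up’ states as the macrostate, and count the arrangements W that realise it.

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K

N = 100  # number of two-state particles
for n_up in (0, 1, 50):
    W = math.comb(N, n_up)   # microstates realising this macrostate
    S = k * math.log(W)      # S = k ln W
    print(f"{n_up:>2} up: W = {W:.3e}, S = {S:.3e} J/K")
```

The evenly mixed macrostate (50 up) can be realised in vastly more ways than the perfectly ordered one (0 up, where W = 1 and S = 0), which is the statistical content of the second law.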

Despite k being named Boltzmann’s constant, and despite it appearing in the famous entropy equation on his tombstone, Boltzmann never actually used it; it was Planck who introduced it. Its numerical value depends on the choice of units for energy and temperature, so if ‘natural’ Planck units are used, where macroscopic temperature is expressed in terms of the energy of atomic particles, k disappears and S = ln W.

Information Entropy

Shannon defined information entropy, H (possibly a tribute to Ralph Hartley, who laid the groundwork for measuring information).

Shannon’s work on information theory is perhaps more complete and more profound than Einstein’s work on gravitation.

The information entropy of a variable X is defined by the formula H(X) = −Σ p(x) log p(x), summed over the possible values x of X. It looks much more complicated than Boltzmann’s formula for heat because it allows the variable to be not just a number but, say, letters, where each letter has a different probability of occurring (as in English). It also allows the log to be in any base; for computers it is base 2, unlike Boltzmann’s formula for heat, which is in base e. The Shannon equation is written for a single symbol, so if you have a message with N symbols the total entropy is multiplied by N.

Don’t be put off by Shannon’s formula: if we set the base to 2 and deal with equally probable numbers instead of letters, things are much simpler.

The information entropy (potential information) in a truly random sequence is the number of digits required to represent the sequence in the base of the number system used, which is the logarithm of the number of possible states.

In computing, this base is 2, leading to entropy measured in binary ‘bits’.

So for a given number of potential states n (e.g., 2 if there is a 1 or a 0, or 4 if there is a 00, 01, 10, or 11), the information entropy for a single symbol is:

H(X) = log₂ n

For a binary message with N symbols, the total entropy H is:

H = N · H(X) = N · log₂ n

i.e., in a random sequence where 1011101 and 1100011 are equally likely, the entropy per symbol is 1 bit (log base 2 of the 2 possible states), and the total entropy for the N = 7 symbols is N · H(X) = 7 bits.
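A short sketch in Python (function name is mine) showing that the general formula reduces to log₂ n when all states are equally likely, and gives less than that when they are not:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits per symbol: H(X) = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Equal-probability case: n equally likely states gives H(X) = log2(n).
print(shannon_entropy([0.5, 0.5]))       # 1.0 bit  (n = 2)
print(shannon_entropy([0.25] * 4))       # 2.0 bits (n = 4)

# A message of N symbols drawn from n equally likely states: H = N * log2(n).
N, n = 7, 2
print(N * shannon_entropy([1 / n] * n))  # 7.0 bits for a 7-digit binary string

# Unequal probabilities (as with English letters) give less than log2(n) bits per symbol:
print(shannon_entropy([0.9, 0.1]))       # ~0.469 bits, well under 1
```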

Bringing heat and information entropy back together

So if information entropy is the log base 2 of the number of arrangements of ones and zeros, and heat entropy is k times the log base e of the number of possible arrangements of the components that determine, say, the temperature, then converting from one to the other just means converting the base and multiplying by k to get standard units of heat.
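A sketch of that conversion (standard constants, function name is mine): multiply the bit count by ln(2) to switch the logarithm from base 2 to base e, then by k to get SI units of J/K.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def bits_to_thermodynamic_entropy(bits):
    """S = k * ln(2) * H, converting bits (base-2 log) to J/K (base-e log, SI units)."""
    return k * math.log(2) * bits

print(bits_to_thermodynamic_entropy(1))    # ~9.57e-24 J/K for one bit
print(bits_to_thermodynamic_entropy(8e9))  # ~7.7e-14 J/K for one gigabyte (8e9 bits)
```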

Going back from entropy to energy:

1 bit converts into kT ln(2) joules of heat for a given temperature T.
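Putting numbers in (standard constants, with room temperature assumed at 300 K):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300           # roughly room temperature, in kelvin

energy_per_bit = k * T * math.log(2)
print(energy_per_bit)        # ~2.87e-21 J per bit at 300 K

# Erasing a gigabyte (8e9 bits) at room temperature:
print(8e9 * energy_per_bit)  # ~2.3e-11 J, tiny compared with what real chips dissipate
```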

*except for NOT gates which have one input and one output.


Written by David Galbraith

Architect: I used to design buildings, now I design companies. http://davidgalbraith.org
