The idea of (classical) information connects to bits (a 1 or a 0 stored in a particular memory location), and the manipulation of information is usually regarded as a topic in computer science. However, we can also interpret information processing as a physical phenomenon, since information is always encoded in some physical state. And of course, information processing is carried out in a physically realizable system. With this thought, information can in fact be linked to the study of the underlying physical processes.
Physics of Information.
Landauer's Principle. Rolf Landauer showed in 1961 that information erasure is a dissipative process. It always involves a compression of phase space and so is irreversible. Say 1 bit of information is represented by keeping a molecule in a box on either the left or right side of a defined boundary (left = 0 and right = 1). If, using a movable piston, we force the molecule to the left side with absolute certainty (irrespective of its initial state), then this process reduces the entropy of the gas by ΔS = k ln 2. If the process is isothermal at temperature T, this energy is dissipated from the box to the environment, and the work done in doing so is W = kT ln 2.
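To get a feel for the scale involved, here is a minimal sketch (my own, not from the notes) computing the Landauer bound W = kT ln 2 at room temperature; `landauer_limit` is a name I have made up for illustration:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA value).
k_B = 1.380649e-23

def landauer_limit(temperature_kelvin):
    """Minimum work (in joules) dissipated when erasing one bit
    at temperature T: W = k T ln 2 (Landauer's principle)."""
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the limit is tiny: about 2.87e-21 J per bit,
# many orders of magnitude below what real gates dissipate.
print(landauer_limit(300.0))
```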
In classical computing, gates with more than one input are irreversible (i.e., given the output of a gate, we cannot infer with certainty what the inputs were). For example, the NAND gate [ NAND(a,b) = NOT(a AND b) ] takes two inputs and produces one output, so a bit is erased. Thus at least W = kT ln 2 of work is needed to operate the gate (a theoretical lower bound).
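The irreversibility of NAND can be made concrete by grouping its inputs by output: one of the outputs has three distinct preimages, so the inputs cannot be recovered from the output alone. A small sketch:

```python
def nand(a, b):
    """NAND(a, b) = NOT (a AND b) for bits 0/1."""
    return 1 - (a & b)

# Group the four input pairs by their output value. Output 1 has three
# preimages, so observing the output cannot tell us which inputs produced it.
preimages = {}
for a in (0, 1):
    for b in (0, 1):
        preimages.setdefault(nand(a, b), []).append((a, b))

print(preimages)  # {1: [(0, 0), (0, 1), (1, 0)], 0: [(1, 1)]}
```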
Charles Bennett showed in 1973 that any computation can be carried out using reversible gates and so, in theory, would require no power expenditure. For example, the Toffoli gate [ TOFFOLI(a,b,c) = (a, b, c XOR (a AND b)) ] is a reversible equivalent of the NAND gate (3 inputs and 3 outputs). Replace every NAND with a TOFFOLI and we get a reversible circuit. But many extra bits are generated along the way, and perhaps this has only delayed the energy cost. Bennett's resolution: first compute through the circuit, print a copy of the answer (a logically reversible operation), and then run backwards through the circuit to return to the initial state. In practice, reversible gates dissipate energy per gate that is orders of magnitude greater than this theoretical limit.
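The two claimed properties of the Toffoli gate — that it is its own inverse, and that it reproduces NAND when the target bit is set to 1 — can be checked exhaustively in a few lines (a sketch of my own):

```python
def toffoli(a, b, c):
    """TOFFOLI(a, b, c) = (a, b, c XOR (a AND b)) on bits 0/1."""
    return (a, b, c ^ (a & b))

# With the target bit c = 1, the third output equals NAND(a, b).
for a in (0, 1):
    for b in (0, 1):
        assert toffoli(a, b, 1)[2] == 1 - (a & b)

# Applying the gate twice returns every input unchanged, so it is its
# own inverse -- no information is ever erased.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

print("Toffoli is reversible and simulates NAND")
```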
The resolution of the Maxwell's Demon problem is also associated with information erasure. Since the demon's memory for recording the energies of atoms passing through the gate is finite, at some point the old information has to be erased and the power cost is paid. Leo Szilard (a pioneer in the physics of information), in his 1929 analysis of the Maxwell's Demon problem, invented the concept of the bit (the term "bit" was later coined by John Tukey).
Nature is quantum mechanical. For example, the clicks registered in a Geiger counter exhibit a truly random Poisson process. In classical dynamics there is no place for true randomness (though chaotic systems exhibit behaviour that is indistinguishable from true randomness).
In quantum theory, performing a measurement of one of two non-commuting observables (observables don't commute if they cannot be simultaneously diagonalized, i.e., if they don't share an eigenvector basis; non-commuting observables obey the uncertainty principle) alters the state with respect to the other. Thus measuring disturbs the state of the system. There is no counterpart to this limitation in classical mechanics.
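As a concrete illustration (my own sketch, using the Pauli matrices as the standard textbook example of non-commuting observables):

```python
import numpy as np

# Pauli matrices: the standard non-commuting observables on a qubit.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# The commutator [X, Z] = XZ - ZX is nonzero, so X and Z do not commute
# and cannot be simultaneously diagonalized.
commutator = X @ Z - Z @ X
print(np.allclose(commutator, 0))  # False

# The X eigenstate |+> = (|0> + |1>)/sqrt(2) has a definite X value, but a
# Z measurement on it yields each outcome with probability 1/2: measuring Z
# disturbs the previously definite X value.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
z_basis_probs = np.abs(plus) ** 2
print(z_basis_probs)  # [0.5 0.5]
```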
(Also read: the Copenhagen interpretation.)
"The tradeoff between acquiring information and creating a disturbance is related to quantum randomness. It is because the outcome of a measurement has a random element that we are unable to infer the initial state of the system from the measurement outcome."
No Cloning Theorem. If we could copy a quantum state, we could make a copy of the original and measure the copy without altering the original state, thus beating the principle of disturbance. Unlike classical bits, quantum bits (qubits) cannot be copied with perfect fidelity. This is established as the No Cloning Theorem.
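A standard way to see the obstruction numerically (my own sketch with NumPy, not from the notes): a CNOT gate does copy the basis states |0⟩ and |1⟩ into a blank qubit, but it fails on a superposition, because a linear operation cannot clone every state:

```python
import numpy as np

# CNOT on two qubits (control first), in the basis |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)  # superposition (|0> + |1>)/sqrt(2)

def try_to_clone(psi):
    """Apply CNOT to |psi>|0> and return the resulting two-qubit state."""
    return CNOT @ np.kron(psi, zero)

# Basis states are copied: CNOT|0>|0> = |0>|0> and CNOT|1>|0> = |1>|1>.
assert np.allclose(try_to_clone(zero), np.kron(zero, zero))
assert np.allclose(try_to_clone(one), np.kron(one, one))

# But for |+> the output is the entangled state (|00> + |11>)/sqrt(2),
# NOT the product state |+>|+> that a true cloner would have to produce.
print(np.allclose(try_to_clone(plus), np.kron(plus, plus)))  # False
```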
The difference between classical and quantum information was made precise by John Bell in 1964, during his time at CERN. Bell showed that "quantum information can be (in fact, typically is) encoded in non-local correlations between different parts of a physical system, correlations with no classical counterpart." (More on this later.)
Next Post: Quantum Algorithms, Quantum Complexity and Quantum Parallelism.
This work is primarily an adaptation of Notes by John Preskill on Quantum Information (Caltech). The complete work can be accessed here.
Text - More explanation/links to research are required. Will be updated.