Information in Biology
CRI - Centre de Recherches Interdisciplinaires, Paris, May 2012
The concept of information is omnipresent in biology:
information is stored in DNA, transmitted in a signaling pathway or along a neuron, and translated by the ribosome.
In this short course, we will formalize and quantify these intuitive notions of biological information.
We will start from the basic definitions of Shannon's information theory, which will be linked to statistical mechanics.
We will then apply this framework to examine a variety of living information systems, starting from molecular channels, through neural networks, to population dynamics and evolution.
A coherent discussion in terms of information theory reveals common principles of noisy living information systems, which we will examine.
Lecture I: Information and Statistical Mechanics
[slides]
Shannon's information theory: fundamental properties (channels, capacity, noisy channels).
What is entropy? Intuitive and axiomatic definitions.
Basic properties of entropy, mutual information, and relative entropy (see the numerical sketch after this list).
Relation to statistical mechanics: Maxwell's demon, entropy of Markov chains and the second law, maximum entropy (Jaynes).
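Since these quantities are easiest to grasp numerically, here is a minimal Python sketch of entropy and mutual information for a binary symmetric channel. This is our illustration, not part of the course slides; the choice of channel, the error probability, and the function names are arbitrary.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits of a discrete distribution."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # convention: 0 log 0 = 0
        return -np.sum(p * np.log2(p))

    def mutual_information(p_x, channel):
        """I(X;Y) = H(Y) - H(Y|X), with input law p_x and a row-stochastic channel matrix."""
        p_y = p_x @ channel               # output distribution
        h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in channel]))
        return entropy(p_y) - h_y_given_x

    eps = 0.1                             # crossover (error) probability
    bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
    # The uniform input attains the BSC capacity C = 1 - H2(eps):
    print(mutual_information(np.array([0.5, 0.5]), bsc))   # ~0.531 bits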
Lecture II: Overview of Living Information - molecules, neurons, populations, and evolution.
[slides]
Living systems as information processors.
Sources of information in Life: (modernized) central dogma; environment; cell composition.
Living information channels: Receptors - signaling; codes; replication; population dynamics; quorum sensing.
Information processing: circuits and their elements; neural networks; genetic networks.
Information output: transcription; decisions; cell fate; differentiation; development.
Information loops: feedbacks (see the feedback sketch after this list).
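As one concrete instance of such a loop, here is a minimal birth-death simulation in which negative autoregulation of a gene reduces fluctuations in its product relative to constant production. This is our illustration only; all rates and the repression function are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(2)

    def simulate(feedback, steps=100_000, dt=0.01):
        """Birth-death dynamics: production k(x), degradation x, Poisson noise."""
        x, trace = 50.0, []
        for _ in range(steps):
            # Negative feedback: production is repressed by the product itself.
            k = 100.0 / (1.0 + x / 50.0) if feedback else 50.0
            x = max(x + rng.poisson(k * dt) - rng.poisson(x * dt), 0.0)
            trace.append(x)
        return np.array(trace[steps // 2:])   # discard the initial transient

    for fb in (False, True):
        t = simulate(fb)
        print(f"feedback={fb}: mean={t.mean():.1f}, CV={t.std() / t.mean():.3f}")
    # Both versions settle near x ~ 50, but the feedback loop shows a smaller
    # coefficient of variation: the loop feeds information about the current
    # state back into the production rate.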
Lecture III: Neural networks
[slides]
Basic coding theory.
How neurons transfer information: basic physiology.
Single neuron (Laughlin).
Neural networks (Hopfield); see the recall sketch after this list.
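To make the Hopfield model concrete, here is a minimal associative-memory sketch with Hebbian storage and asynchronous updates. This is our illustration; the network size, number of patterns, and noise level are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 5                                 # neurons, stored patterns
    patterns = rng.choice([-1, 1], size=(P, N))

    # Hebbian storage: W = (1/N) sum_mu xi^mu (xi^mu)^T, without self-coupling.
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)

    def recall(state, steps=10 * N):
        """Asynchronous dynamics: update one random neuron toward its local field."""
        s = state.copy()
        for _ in range(steps):
            i = rng.integers(N)
            s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    # Corrupt 10% of a stored pattern and let the network retrieve it.
    noisy = patterns[0].copy()
    flips = rng.choice(N, size=N // 10, replace=False)
    noisy[flips] *= -1
    print(np.mean(recall(noisy) == patterns[0]))  # typically 1.0 (perfect recall)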
Lecture IV-A: Molecular information
[slides]
Theory of noisy channels (rate-distortion).
The biophysical constraints on information processing by molecules.
The genetic code as a noisy information channel (see the toy error-correction sketch after this list).
More examples: operons, genetic networks, transcription factors.
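As a toy, non-biological illustration of error correction over a noisy channel (in the spirit of Hamming's paper in the reading list below), here is a minimal Hamming(7,4) sketch; the generator and parity-check matrices are the standard systematic ones, and the example data are arbitrary.

    import numpy as np

    # Standard systematic Hamming(7,4): 4 data bits + 3 parity bits.
    G = np.array([[1, 0, 0, 0, 1, 1, 0],    # generator: codeword = data @ G (mod 2)
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],    # parity check: H @ codeword = 0 (mod 2)
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    data = np.array([1, 0, 1, 1])
    code = data @ G % 2

    received = code.copy()
    received[2] ^= 1                         # the channel flips one bit

    # Any single-bit error yields a unique nonzero syndrome, equal to the
    # column of H at the error position.
    syndrome = H @ received % 2
    if syndrome.any():
        err = int(np.argmax((H.T == syndrome).all(axis=1)))
        received[err] ^= 1
    print(np.array_equal(received, code))    # True: the flip was corrected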
Lecture IV-B: Population dynamics, social interactions, and sensing
The fitness value of information (Kelly's horserace; see the sketch below).
Group dynamics.
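To illustrate Kelly's result that the growth-rate value of side information equals its Shannon information, here is a minimal horserace simulation. This is our sketch; the number of horses, the win probabilities, and the fair odds are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    p = np.array([0.5, 0.25, 0.25])         # win probabilities of three horses
    odds = 1 / p                            # "fair" odds
    T = 100_000
    winners = rng.choice(3, size=T, p=p)

    def growth_rate(bets):
        """Long-run growth in bits/race: capital multiplies by bets[w] * odds[w]."""
        return np.mean(np.log2(bets[winners] * odds[winners]))

    # Kelly (proportional) betting on fair odds: zero growth.
    print(growth_rate(p))                   # 0.0 exactly
    # Perfect side information: bet everything on the actual winner. The gain
    # equals the entropy of the race, i.e. the information the tip carries.
    print(np.mean(np.log2(odds[winners])))  # ~H(X) = 1.5 bits per race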
papers
Lecture I
Shannon, Communication in the presence of noise, Proceedings of the IRE 1949.
Landauer, Computation: a fundamental physical view, Physica Scripta 1987.
Bennett, Logical reversibility of computation, IBM Journal of Research and Development 1973.
Jaynes, Information Theory and Statistical Mechanics, Phys. Rev. 1957.
Lecture II
Crick, Ideas on Protein Synthesis, 1956.
Crick, Central Dogma of Molecular Biology, Nature 1970.
Lecture III
MacKay and McCulloch, The limiting information capacity of a neuronal link, Bulletin of Mathematical Biophysics 1952.
Laughlin, A simple coding procedure enhances a neuron's information capacity, Z. Naturforsch. 1981.
de Ruyter van Steveninck and Laughlin, The rate of information transfer at graded-potential synapses, Nature 1996.
Lecture IV
Hamming, Error detecting and error correcting codes, Bell Syst. Tech. J. 1950.
See also Hamming's later comments on this paper.
Hopfield, Kinetic Proofreading: A New Mechanism for Reducing Errors in Biosynthetic Processes Requiring High Specificity, PNAS 1974.
See an experimental verification in Hopfield et al., Direct experimental evidence for kinetic proofreading in amino acylation of tRNA, PNAS 1976.
Kelly, A new interpretation of information rate, Bell Syst. Tech. J. 1956.
Kussell and Leibler, Phenotypic Diversity, Population Growth, and Information in Fluctuating Environments, Science 2005.
additional papers
Information theory and statistical physics
L. Szilard, On the decrease of entropy, Z. für Physik 1929 (German version).
R.P. Feynman, Ratchet and pawl, Feynman Lectures on Physics, Vol. I, Ch. 46.
F. Attneave, Some informational aspects of visual perception, Psychological Review 1954.
Maximum Entropy and Information Theory
E.T. Jaynes, Information theory and statistical mechanics, in Brandeis University Summer Institute Lectures in Theoretical Physics 1963, ed. K.W. Ford.
E.T. Jaynes, The Gibbs Paradox, in Maximum Entropy and Bayesian Methods 1992, ed. C.R. Smith, G.J. Erickson, and P.O. Neudorfer.
E.T. Jaynes, Where do we stand on maximum entropy?, presented at the Maximum Entropy Formalism Conference, MIT 1978.
Information and Computation
R. Landauer, Irreversibility and heat generation in the computing process, IBM J. Res. Develop. Vol. 5, No. 3, 1961.
A. Turing, Computing Machinery and Intelligence, Mind 59: 433-460, 1950.
J. von Neumann, The general and logical theory of automata, Hixon Symp. on Cerebral Mechanisms in Behavior 1951.
Coding and Error Correction
C.E. Shannon, A universal Turing machine with two internal states, Automata Studies 1956.
G.A. Miller, The magical number seven, plus or minus two: some limits on our capacity for processing information, Psychological Review 1956.
Language
C.E. Shannon, Prediction and entropy of printed English, Bell System Technical Journal 1951.
background books
Feynman: Lectures on Computation.
Leff and Rex: Maxwell's Demon - Entropy, Information, Computing.
Cover and Thomas: Elements of Information Theory.