Abstract:
While Kolmogorov (1965) complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term. We show that the information distance is a universal cognitive similarity distance. We investigate the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs (a generalization of the Slepian-Wolf theorem of classical information theory), and the density properties of the discrete metric spaces induced by the information distances. A related distance measures the amount of nonreversibility of a computation. Using the physical theory of reversible computation, we give an appropriate (universal, antisymmetric, and transitive) measure of the thermodynamic work required to transform one object into another object by the most efficient process. Information distance between individual objects is needed in pattern recognition, where one wants to express effective notions of "pattern similarity" or "cognitive similarity" between individual objects, and in the thermodynamics of computation, where one wants to analyze the energy dissipation of a computation from a particular input to a particular output.
Published in: IEEE Transactions on Information Theory ( Volume: 44, Issue: 4, July 1998)
DOI: 10.1109/18.681318
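The information distance sketched in the abstract is, up to an additive logarithmic term, the max-distance E(x, y) = max{K(x|y), K(y|x)}, where K(x|y) is the conditional Kolmogorov complexity. K is uncomputable, but a practical sketch can stand in a real-world compressor for K, approximating K(x|y) by C(yx) - C(y) (compressed length of the concatenation minus that of the condition). This is an illustrative proxy only, not the paper's construction; the helper names below are my own:

```python
import zlib

def C(s: bytes) -> int:
    """Upper-bound proxy for Kolmogorov complexity K(s): compressed length."""
    return len(zlib.compress(s, 9))

def info_distance(x: bytes, y: bytes) -> int:
    """Proxy for E(x, y) = max{K(x|y), K(y|x)}, with K(x|y) approximated
    by C(y + x) - C(y): the extra cost of describing x once y is known."""
    k_x_given_y = C(y + x) - C(y)
    k_y_given_x = C(x + y) - C(x)
    return max(k_x_given_y, k_y_given_x)

a = b"the quick brown fox jumps over the lazy dog" * 20
b = b"the quick brown fox jumps over the lazy cat" * 20
c = bytes(range(256)) * 4  # data unrelated to a

# Near-duplicate strings should come out closer than unrelated ones.
print(info_distance(a, b) < info_distance(a, c))
```

Because zlib only upper-bounds K, the result is a heuristic similarity score; the universality and metric properties proved in the paper hold for the uncomputable K, not for this compressor stand-in.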
References:
1. P. A. Benioff, "Quantum mechanical Hamiltonian models of discrete processes that erase their histories: Applications to Turing machines," Int. J. Theoret. Phys., vol. 21, pp. 177-202, 1982.
2. P. A. Benioff, "Quantum mechanical Hamiltonian models of computers," Ann. New York Acad. Sci., vol. 480, pp. 475-486, 1986.
3. C. H. Bennett, "Logical reversibility of computation," IBM J. Res. Develop., vol. 17, pp. 525-532, 1973.
4. C. H. Bennett, "The thermodynamics of computation – A review," Int. J. Theoret. Phys., vol. 21, pp. 905-940, 1982.
5. C. H. Bennett, "Time/space trade-offs for reversible computation," SIAM J. Comput., vol. 18, pp. 766-776, 1989.
6. C. M. Caves, W. G. Unruh, and W. H. Zurek, "Comment on quantitative limits on the ability of a Maxwell demon to extract work from heat," Phys. Rev. Lett., vol. 65, p. 1387, 1990.
7. G. Chaitin, "A theory of program size formally identical to information theory," J. Assoc. Comput. Mach., vol. 22, pp. 329-340, 1975.
8. I. Csiszár and J. Körner, Information Theory, 1980.
9. A. N. Kolmogorov, "Three approaches to the definition of the concept 'quantity of information'," Probl. Inform. Transm., vol. 1, no. 1, pp. 1-7, 1965.
10. R. P. Feynman, "Quantum mechanical computers," Opt. News, vol. 11, p. 11, 1985.
11. E. Fredkin and T. Toffoli, "Conservative logic," Int. J. Theoret. Phys., vol. 21, no. 3/4, pp. 219-253, 1982.
12. P. Gács and J. Körner, "Common information is far less than mutual information," Probl. Contr. Inform. Theory, vol. 2, pp. 149-162, 1973.
13. P. Gács, "On the symmetry of algorithmic information," Sov. Math. Dokl., vol. 15, pp. 1477-1480, 1974.
14. P. Gács, 1987.
15. R. W. Keyes and R. Landauer, "Minimal energy dissipation in logic," IBM J. Res. Develop., vol. 14, pp. 152-157, 1970.
16. R. Landauer, "Irreversibility and heat generation in the computing process," IBM J. Res. Develop., pp. 183-191, July 1961.
17. R. Landauer, Int. J. Theoret. Phys., vol. 21, p. 283, 1982.
18. Y. Lecerf, "Machines de Turing réversibles. Récursive insolubilité en n ∈ N de l'équation u = θⁿ, où θ est un isomorphisme des codes," Compt. Rend., vol. 257, pp. 2597-2600, 1963.
19. R. Y. Levine and A. T. Sherman, "A note on Bennett's time-space trade-off for reversible computation," SIAM J. Comput., vol. 19, pp. 673-677, 1990.
20. M. Li and P. M. B. Vitányi, "Reversibility and adiabatic computation: Trading time and space for energy," Proc. Roy. Soc. London A, vol. 452, pp. 769-789, 1996.
21. M. Li, J. Tromp, and L. Zhang, "On the nearest neighbor interchange distance between evolutionary trees," J. Theor. Biol., vol. 182, pp. 463-467, 1996.
22. M. Li and P. M. B. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 1997.
23. K. Likharev, "Classical and quantum limitations on energy consumption in computation," Int. J. Theoret. Phys., vol. 21, pp. 311-326, 1982.
24. D. Sleator, R. Tarjan, and W. Thurston, "Short encodings of evolving structures," SIAM J. Discr. Math., vol. 5, pp. 428-450, 1992.
25. J. Ziv and N. Merhav, "A measure of relative entropy between individual sequences with application to universal classification," IEEE Trans. Inform. Theory, vol. 39, pp. 1270-1279, July 1993.
26. W. H. Zurek, "Thermodynamic cost of computation, algorithmic complexity and the information metric," Nature, vol. 341, pp. 119-124, 1989.
27. W. H. Zurek, "Algorithmic randomness and physical entropy," Phys. Rev. A, vol. 40, pp. 4731-4751, 1989.
28. A. K. Zvonkin and L. A. Levin, "The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms," Russ. Math. Surv., vol. 25, no. 6, pp. 83-124, 1970.