Entropy 2016, 18, 110
where the shift is by a bounded constant, independent of $w$. The constant $c_T$ is the Kolmogorov complexity of the program needed to describe $T$ so that $T_U$ can simulate it.
A variant of the notion of Kolmogorov complexity described above is given by the conditional Kolmogorov complexity,
$$K_{T_U}(w \mid \ell(w)) = \min_{P \,:\, T_U(P,\, \ell(w)) = w} \ell(P),$$
where the length $\ell(w)$ is given and made available to the machine $T_U$. One then has
$$K(w \mid \ell(w)) \le \ell(w) + c,$$
because if $\ell(w)$ is known, then a possible program is just to write out $w$. This means that $\ell(w) + c$ is just the number of bits needed for the transmission of $w$ plus the print instructions.
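The bound $K(w \mid \ell(w)) \le \ell(w) + c$ can be made concrete with a minimal sketch: when the decoder already knows $\ell(w)$, the "program" can be $w$ itself, since the decoder reads exactly $\ell(w)$ symbols and stops. The function names here are illustrative, not from the source.

```python
def encode_given_length(w: str) -> str:
    # When ell(w) is known to the decoder, a valid program is w verbatim.
    return w

def decode_given_length(program: str, length: int) -> str:
    # The decoder reads exactly `length` symbols; no end marker is needed.
    return program[:length]

w = "010011"
p = encode_given_length(w)
assert decode_given_length(p, len(w)) == w
assert len(p) == len(w)  # ell(P) = ell(w), matching K(w | ell(w)) <= ell(w) + c
```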
An upper bound is given by
$$K_{T_U}(w) \le K_{T_U}(w \mid \ell(w)) + 2 \log \ell(w) + c.$$
If one does not know $\ell(w)$ a priori, one needs to signal the end of the description of $w$. For this it suffices to have a "punctuation method", and one can see that this has the effect of adding the term $2 \log \ell(w)$ to the above estimate. In particular, the length of any program that produces a description of $w$ is an upper bound on the Kolmogorov complexity $K_{T_U}(w)$.
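One standard "punctuation method" is a self-delimiting length header: write $\ell(w)$ in binary with each bit doubled, terminate the header with the pair $01$, then append $w$. The header costs roughly $2 \log \ell(w)$ bits, which is exactly the overhead term in the estimate above. A minimal sketch (the encoding scheme is one common choice, not necessarily the one the authors have in mind):

```python
def self_delimiting(w: str) -> str:
    """Prefix-free encoding: double each bit of bin(len(w)), close the
    length field with '01', then append w itself."""
    length_bits = bin(len(w))[2:]          # about log2(ell(w)) bits
    header = "".join(b + b for b in length_bits) + "01"
    return header + w

def decode(stream: str) -> str:
    i = 0
    length_bits = ""
    while stream[i] == stream[i + 1]:      # doubled pairs belong to the length
        length_bits += stream[i]
        i += 2
    i += 2                                 # skip the '01' terminator
    n = int(length_bits, 2)
    return stream[i:i + n]

w = "1100101"
code = self_delimiting(w)
assert decode(code) == w
# Overhead over ell(w): 2 * len(bin(len(w))[2:]) + 2, i.e. about 2 log ell(w).
```

Because no codeword is a prefix of another, the decoder needs no external end-of-input signal, which is precisely what "punctuation" buys.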
One can think of Kolmogorov complexity in terms of data compression: the shortest description of $w$ is also its most compressed form. Upper bounds for Kolmogorov complexity are therefore provided easily by data compression algorithms. However, while providing upper bounds for complexity is straightforward, the situation with lower bounds is entirely different: constructing a lower bound runs into a fundamental obstacle caused by the fact that the halting problem is unsolvable. As a consequence, Kolmogorov complexity is not a computable function. Indeed, suppose one were to list programs $P_k$ (with increasing lengths) and run them through the machine $T_U$. If the machine halts on $P_k$ with output $w$, then $\ell(P_k)$ is an approximation to $K_{T_U}(w)$. However, there may be an earlier $P_j$ in the list such that $T_U$ has not yet halted on $P_j$. If $T_U$ eventually halts also on $P_j$ and outputs $w$, then $\ell(P_j)$ will be a better approximation to $K_{T_U}(w)$. So one would be able to compute $K_{T_U}(w)$ only if one could tell exactly on which programs $P_k$ the machine $T_U$ halts, but that is precisely the unsolvable halting problem.
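The easy direction, upper bounds from compression, can be sketched with an off-the-shelf compressor standing in for a short description. This uses `zlib` purely as an illustration; the compressed length bounds $K$ only up to the additive constant for the decompressor itself.

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length in bits of a zlib-compressed description of `data`.
    Any lossless compressed form is a legitimate upper bound on the
    Kolmogorov complexity, up to the constant cost of the decompressor."""
    return 8 * len(zlib.compress(data, 9))

regular = b"01" * 500          # highly regular: a short description exists
irregular = os.urandom(1000)   # random bytes: essentially incompressible

# The regular string gets a far smaller upper bound than the random one.
assert complexity_upper_bound(regular) < complexity_upper_bound(irregular)
```

No analogous lower-bound routine can exist: by the halting argument above, any procedure claiming to compute $K$ exactly would decide the halting problem.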
Kolmogorov complexity and Shannon entropy are related: one can view Shannon entropy as an averaged version of Kolmogorov complexity in the following sense (see Section 2.3 of [31]). Suppose given independent random variables $X_k$, distributed according to a Bernoulli measure $P = \{p_a\}_{a \in A}$ with $p_a = P(X = a)$. The Shannon entropy is given by
$$S(X) = -\sum_{a \in A} P(X = a) \log P(X = a).$$
There exists a $c > 0$ such that, for all $n \in \mathbb{N}$,
$$S(X) \le \frac{1}{n} \sum_{w \in W_n} P(w)\, K(w \mid \ell(w)) \le S(X) + \frac{\#A \, \log n}{n} + \frac{c}{n}.$$
The expectation value
$$\lim_{n \to \infty} E\left( \frac{1}{n} K(X_1 \cdots X_n \mid n) \right) = S(X)$$
shows that the average expected Kolmogorov complexity for length-$n$ descriptions approaches the Shannon entropy in the limit when $n \to \infty$.
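This averaged relation can be illustrated numerically: sample a long i.i.d. string from a Bernoulli measure, use a real compressor as a computable stand-in for the shortest description, and compare bits per symbol against $S(X)$. A minimal sketch, assuming a two-letter alphabet with $p_a = 0.9$ (an illustrative choice) and `zlib` as the compressor; a practical compressor only upper-bounds the complexity, so the measured rate sits somewhat above the entropy.

```python
import math
import random
import zlib

def shannon_entropy(p: dict) -> float:
    """S(X) = -sum_a P(X=a) log2 P(X=a), in bits per symbol."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

p = {"a": 0.9, "b": 0.1}
S = shannon_entropy(p)  # about 0.469 bits per symbol

random.seed(0)
n = 100_000
w = "".join(random.choices(list(p), weights=list(p.values()), k=n))

# Compressed length per symbol: a computable upper bound on K(w)/n.
bits_per_symbol = 8 * len(zlib.compress(w.encode(), 9)) / n
# For large n this rate approaches S(X) from above, as the averaged
# inequality above predicts, though zlib does not reach the entropy exactly.
```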
Differential Geometrical Theory of Statistics

- Title: Differential Geometrical Theory of Statistics
- Authors: Frédéric Barbaresco, Frank Nielsen
- Editor: MDPI
- Location: Basel
- Date: 2017
- Language: English
- License: CC BY-NC-ND 4.0
- ISBN: 978-3-03842-425-3
- Size: 17.0 x 24.4 cm
- Pages: 476
- Keywords: Entropy, Coding Theory, Maximum entropy, Information geometry, Computational Information Geometry, Hessian Geometry, Divergence Geometry, Information topology, Cohomology, Shape Space, Statistical physics, Thermodynamics
- Categories: Natural Sciences, Physics