Page 215 in Differential Geometrical Theory of Statistics
Entropy 2016, 18, 433
such that $\int_F \omega \in \Omega^q(M)$. The vector space of random differential $q$-forms is denoted by $\Omega^q(E)$.
Let $\Phi_U \times \varphi_U$ be a local chart of $M$. We set
$$(\Theta_U, \Xi) = (\varphi_U(U), \Xi) = \Phi_U(E_U).$$
We recall that in $\Phi_U(E_U)$ the partial differentiation $\frac{\partial}{\partial\theta}$ is called the horizontal differentiation in $E_U$.
Therefore we use the relation
$$\int_F \circ \frac{\partial}{\partial\theta} = \frac{d}{d\theta} \circ \int_F$$
for setting the de Rham complex of random differential forms, namely
$$\Omega(E):\quad 0 \rightarrow \mathbb{R} \rightarrow \Omega^0(E) \rightarrow \cdots \rightarrow \Omega^q(E) \rightarrow \Omega^{q+1}(E) \rightarrow \cdots \rightarrow \Omega^m(E) \rightarrow 0.$$
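The commutation relation above is the classical rule of differentiation under the integral sign, applied fiberwise. A minimal numerical sketch of it, using NumPy with an illustrative one-parameter family $f(x,\theta) = e^{-\theta x^2}$ standing in for a random form on a one-dimensional fiber (all names here are hypothetical, not from the text):

```python
import numpy as np

# Illustrative family f(x, theta) = exp(-theta * x^2); the integral over x
# plays the role of the fiber integral over F.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

def fiber_integral(theta):
    """Riemann-sum approximation of the integral of f(x, theta) over the fiber."""
    return np.sum(np.exp(-theta * x**2)) * dx

theta0, h = 1.0, 1e-5

# Left-hand side: fiber integral of the horizontal derivative df/dtheta.
lhs = np.sum(-x**2 * np.exp(-theta0 * x**2)) * dx

# Right-hand side: derivative in theta of the fiber integral (central difference).
rhs = (fiber_integral(theta0 + h) - fiber_integral(theta0 - h)) / (2 * h)

print(lhs, rhs)  # both close to -sqrt(pi)/2 ≈ -0.8862
```

Both sides approximate $\frac{d}{d\theta}\sqrt{\pi/\theta}$ at $\theta = 1$, which is $-\sqrt{\pi}/2$; their agreement is exactly the relation used to define the complex.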
The complex $\Omega(E)$ is a complex of $\Gamma$-modules. Here
$$\Gamma = \mathrm{Aut}(\Xi, \Omega).$$
Then the cohomology space $H^*(\Gamma, \Omega(E))$ is bigraded,
$$H^{p,q}(\Gamma, \Omega(E)) = H^p(\Gamma, \Omega^q(E)).$$
The probability density $p$ is $\Gamma$-invariant. It is an element of $H^{0,0}(\Gamma, \Omega(E))$.
8.4.5. Another Homological Nature of Entropy
One of the main purposes of [14] is the homological nature of the entropy. The classical entropy function of a statistical model $[E, \pi, M, D, p]$ is defined by
$$E(\pi(e)) = \int_{E_{\pi(e)}} p(e^*)\log(p(e^*)).$$
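As a concrete numerical reading of this formula, the following sketch evaluates $\int p(e^*)\log p(e^*)$ on a discretized fiber, with a Gaussian density as a hypothetical choice of $p$ (the sign convention follows the text as printed; the conventional Shannon entropy is the negative of this quantity):

```python
import numpy as np

# Hypothetical fiber density: a Gaussian N(0, sigma^2) on a discretized fiber.
sigma = 1.0
e_star = np.linspace(-12.0 * sigma, 12.0 * sigma, 100001)
de = e_star[1] - e_star[0]
p = np.exp(-e_star**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# E(pi(e)) = integral of p(e*) log p(e*) over the fiber, by a Riemann sum.
E_num = np.sum(p * np.log(p)) * de

# Closed form for a Gaussian fiber: -(1/2) log(2 pi e sigma^2).
E_closed = -0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(E_num, E_closed)  # ≈ -1.4189 for sigma = 1
```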
In the complex $\Omega(E)$ we perform the machinery of Eilenberg [59]. That yields the exact sequence (of random cohomology spaces)
$$\cdots \rightarrow H^{q-1}_{res}(E, \mathbb{R}) \rightarrow H^q_e(E, \mathbb{R}) \rightarrow H^q_{dR}(E, \mathbb{R}) \rightarrow H^q_{res}(E, \mathbb{R}) \rightarrow \cdots$$
We take into account the identities
$$p(\gamma \cdot e) = p(e), \qquad \gamma \cdot (\pi(e)) = \pi(\gamma \cdot e).$$
Then we have
$$E(\gamma \cdot \pi(e)) = \int_{E_{\gamma\cdot\pi(e)}} p(\gamma \cdot e^*)\log(p(\gamma \cdot e^*)) = \int_{E_{\pi(\gamma\cdot e)}} p(e^*)\log(p(e^*)) = E(\pi(e)).$$
Thus the entropy $E(\pi(e))$ is $\Gamma$-equivariant. Therefore, it defines an equivariant cohomology class
$$[E] \in H^0_e(M, \mathbb{R}).$$
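The invariance argument above can be checked numerically in a toy setting. A minimal sketch, assuming $\gamma$ acts by a translation (a measure-preserving map on a one-dimensional fiber) and taking a Gaussian as a hypothetical density $p$; the entropy of the transported density equals that of the original:

```python
import numpy as np

def entropy(p_vals, de):
    """Integral of p log p over a grid (sign convention as in the text)."""
    return np.sum(p_vals * np.log(p_vals)) * de

def gaussian(x, mu, sigma=1.0):
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-15.0, 25.0, 400001)
de = x[1] - x[0]

# gamma acts by the translation e -> e + 5, so the identity p(gamma . e) = p(e)
# makes the transported density a Gaussian centred at mu = 5 on the moved fiber.
E_orig = entropy(gaussian(x, 0.0), de)
E_shifted = entropy(gaussian(x, 5.0), de)
print(E_orig, E_shifted)  # equal up to discretization error
```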
Differential Geometrical Theory of Statistics
- Title: Differential Geometrical Theory of Statistics
- Authors: Frédéric Barbaresco, Frank Nielsen
- Editor: MDPI
- Location: Basel
- Date: 2017
- Language: English
- License: CC BY-NC-ND 4.0
- ISBN: 978-3-03842-425-3
- Size: 17.0 x 24.4 cm
- Pages: 476
- Keywords: Entropy, Coding Theory, Maximum entropy, Information geometry, Computational Information Geometry, Hessian Geometry, Divergence Geometry, Information topology, Cohomology, Shape Space, Statistical physics, Thermodynamics
- Categories: Natural Sciences, Physics