Differential Geometrical Theory of Statistics, page 295
Entropy 2016, 18, 442
solutions. As shown in Figure 1, the upper envelope of Gaussian densities corresponds to the lower envelope of parabolas. We have

C_{i,j}(a,b) = M_i(a,b) \left( \log w_j' - \log \sigma_j' - \frac{1}{2}\log(2\pi) - \frac{(\mu_j' - \mu_i)^2 + \sigma_i^2}{2(\sigma_j')^2} \right)
             + \frac{w_i \sigma_i}{2\sqrt{2\pi}\,(\sigma_j')^2} \left[ (a + \mu_i - 2\mu_j')\, e^{-\frac{(a-\mu_i)^2}{2\sigma_i^2}} - (b + \mu_i - 2\mu_j')\, e^{-\frac{(b-\mu_i)^2}{2\sigma_i^2}} \right],   (22)

M_i(a,b) = -\frac{w_i}{2} \left( \operatorname{erf}\!\left( \frac{b-\mu_i}{\sqrt{2}\,\sigma_i} \right) - \operatorname{erf}\!\left( \frac{a-\mu_i}{\sqrt{2}\,\sigma_i} \right) \right).   (23)
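Note that, under the convention implied by Equations (22) and (23), M_i(a,b) is minus the probability mass of the i-th weighted component on [a,b], so that C_{i,j}(a,b) is consistent with −∫_a^b w_i p(x; μ_i, σ_i) log(w_j' p(x; μ_j', σ_j')) dx. A minimal Python sketch of the two closed forms (the function and variable names are ours, not from the paper); it can be cross-checked against numerical quadrature of this integral:

```python
import math

def M(w_i, mu_i, s_i, a, b):
    # Eq. (23): minus the probability mass of the weighted component w_i N(mu_i, s_i^2) on [a, b].
    r = math.sqrt(2.0) * s_i
    return -0.5 * w_i * (math.erf((b - mu_i) / r) - math.erf((a - mu_i) / r))

def C(w_i, mu_i, s_i, w_j, mu_j, s_j, a, b):
    # Eq. (22): closed form of -int_a^b w_i p(x; mu_i, s_i) log(w_j p(x; mu_j, s_j)) dx.
    const = (math.log(w_j) - math.log(s_j) - 0.5 * math.log(2.0 * math.pi)
             - ((mu_j - mu_i) ** 2 + s_i ** 2) / (2.0 * s_j ** 2))
    # Boundary term coming from integrating (x - mu_j)^2 against the Gaussian on [a, b].
    g = lambda x: (x + mu_i - 2.0 * mu_j) * math.exp(-(x - mu_i) ** 2 / (2.0 * s_i ** 2))
    boundary = w_i * s_i / (2.0 * math.sqrt(2.0 * math.pi) * s_j ** 2) * (g(a) - g(b))
    return M(w_i, mu_i, s_i, a, b) * const + boundary
```

Letting a → −∞ and b → +∞ recovers the familiar full-support cross-term −w_i E_{p_i}[log(w_j' p_j')], since the boundary term vanishes and M_i → −w_i.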
2.3.4. The Case of Gamma Distributions

For simplicity, we only consider gamma distributions with the shape parameter k > 0 fixed and the scale λ > 0 varying. The density is defined on (0, ∞) as p(x; k, λ) = x^{k-1} e^{-x/λ} / (λ^k Γ(k)), where Γ(·) is the gamma function. Its CDF is Φ(x; k, λ) = γ(k, x/λ)/Γ(k), where γ(·, ·) is the lower incomplete gamma function. Two weighted gamma densities w_1 p(x; k, λ_1) and w_2 p(x; k, λ_2) (with λ_1 ≠ λ_2) intersect at a unique point

x = \left( \log\frac{w_1}{\lambda_1^k} - \log\frac{w_2}{\lambda_2^k} \right) \Big/ \left( \frac{1}{\lambda_1} - \frac{1}{\lambda_2} \right)   (24)

if x > 0; otherwise, they do not intersect. From straightforward derivations,
C_{i,j}(a,b) = \log\frac{w_j'}{(\lambda_j')^k\,\Gamma(k)}\, M_i(a,b) + w_i \int_a^b \frac{x^{k-1} e^{-x/\lambda_i}}{\lambda_i^k\,\Gamma(k)} \left( \frac{x}{\lambda_j'} - (k-1)\log x \right) dx,   (25)

M_i(a,b) = -\frac{w_i}{\Gamma(k)} \left( \gamma\!\left(k, \frac{b}{\lambda_i}\right) - \gamma\!\left(k, \frac{a}{\lambda_i}\right) \right).   (26)

Similar to the case of Rayleigh mixtures, the last term in Equation (25) relies on numerical integration.
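The gamma-mixture pieces above can be sketched in Python: Equation (24) for the intersection point, Equation (26) via a standard series expansion of the lower incomplete gamma function (the Python standard library does not provide one), and Equation (25) with trapezoidal quadrature for its last term, as just noted. Function names are ours; as before, M_i(a,b) is minus the mass of the weighted component on [a,b]:

```python
import math

def gamma_pdf(x, k, lam):
    # Gamma density with shape k and scale lam, defined on (0, inf).
    return x ** (k - 1) * math.exp(-x / lam) / (lam ** k * math.gamma(k))

def intersection(w1, lam1, w2, lam2, k):
    # Eq. (24): unique crossing point of two weighted gamma densities (lam1 != lam2),
    # or None when the formula yields x <= 0 (no intersection on the support).
    x = (math.log(w1 / lam1 ** k) - math.log(w2 / lam2 ** k)) / (1.0 / lam1 - 1.0 / lam2)
    return x if x > 0 else None

def lower_inc_gamma(s, x, terms=200):
    # Series gamma(s, x) = x^s e^{-x} sum_{n>=0} x^n / (s (s+1) ... (s+n)).
    total, term = 0.0, 1.0 / s
    for n in range(terms):
        total += term
        term *= x / (s + n + 1)
    return x ** s * math.exp(-x) * total

def M(w_i, lam_i, k, a, b):
    # Eq. (26): minus the mass of the weighted gamma component on [a, b].
    return -w_i / math.gamma(k) * (lower_inc_gamma(k, b / lam_i) - lower_inc_gamma(k, a / lam_i))

def C(w_i, lam_i, w_j, lam_j, k, a, b, n=20000):
    # Eq. (25); the last term is computed by trapezoidal quadrature (requires a > 0).
    f = lambda x: gamma_pdf(x, k, lam_i) * (x / lam_j - (k - 1) * math.log(x))
    h = (b - a) / n
    integral = h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))
    return math.log(w_j / (lam_j ** k * math.gamma(k))) * M(w_i, lam_i, k, a, b) + w_i * integral
```

For integer shapes the series can be checked against the closed form γ(2, x) = 1 − (1 + x)e^{−x}.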
3. Upper-Bounding the Differential Entropy of a Mixture

First, consider a finite parametric mixture model m(x) = ∑_{i=1}^k w_i p(x; θ_i). Using the chain rule of the entropy, we end up with the well-known lemma:

Lemma 1. The entropy of a d-variate mixture is upper bounded by the sum of the entropies of its marginal mixtures: H(m) ≤ ∑_{i=1}^d H(m_i), where m_i is the 1D marginal mixture with respect to variable x_i.
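As a quick sanity check of Lemma 1 (our illustration, not from the paper), take the degenerate one-component case in d = 2: for a bivariate Gaussian with correlation ρ, the joint and marginal entropies are all closed-form, and the gap of the bound is −½ log(1 − ρ²) ≥ 0, vanishing exactly when the marginals are independent:

```python
import math

def h_gauss_1d(sigma):
    # Differential entropy of N(mu, sigma^2); independent of the mean.
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)

def h_gauss_2d(s1, s2, rho):
    # Joint entropy 0.5 * log((2 pi e)^2 det(Sigma)), with det(Sigma) = s1^2 s2^2 (1 - rho^2).
    det = (s1 * s2) ** 2 * (1.0 - rho ** 2)
    return 0.5 * math.log((2.0 * math.pi * math.e) ** 2 * det)
```

Here H(m) = h_gauss_2d(s1, s2, ρ) ≤ h_gauss_1d(s1) + h_gauss_1d(s2), with equality if and only if ρ = 0; the same marginal decomposition applies component-wise to multivariate GMMs.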
Since the 1D marginals of a multivariate GMM are univariate GMMs, we thus get a loose upper bound. A generic sample-based probabilistic bound on the entropies of distributions with given support is reported in [31]: the method builds probabilistic upper and lower piecewise linear CDFs from an i.i.d. finite sample set of size n and a given deviation probability threshold. It then constructs, algorithmically, the maximum entropy distribution lying between those two bounds [31] using a so-called string-tightening algorithm.

Instead, we proceed as follows: consider finite mixtures of component distributions defined on the full support R^d that have finite component means and variances (like exponential families). We then use the fact that the maximum entropy distribution with prescribed mean and variance is a Gaussian distribution, and conclude the upper bound by plugging the mixture mean and variance into the differential entropy formula of the Gaussian distribution. In general, maximum entropy under moment constraints yields an exponential family as the solution.
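A minimal sketch of this moment-based bound for a univariate GMM (function names are ours): compute the mixture mean and variance by the law of total variance, and plug the variance into the Gaussian entropy formula h = ½ log(2πe σ²), which upper bounds H(m):

```python
import math

def gauss_pdf(x, mu, s):
    return math.exp(-(x - mu) ** 2 / (2.0 * s ** 2)) / (s * math.sqrt(2.0 * math.pi))

def mixture_variance(ws, mus, ss):
    # Law of total variance: var(m) = sum_i w_i (s_i^2 + mu_i^2) - mean(m)^2.
    mean = sum(w * mu for w, mu in zip(ws, mus))
    return sum(w * (s ** 2 + mu ** 2) for w, mu, s in zip(ws, mus, ss)) - mean ** 2

def entropy_upper_bound(ws, mus, ss):
    # Maximum-entropy argument: H(m) <= 0.5 * log(2 pi e var(m)).
    return 0.5 * math.log(2.0 * math.pi * math.e * mixture_variance(ws, mus, ss))
```

The bound can be compared against a brute-force grid estimate of −∫ m log m; it is loose when the components are well separated, since a Gaussian matching only two moments ignores multimodality.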
Differential Geometrical Theory of Statistics
- Authors: Frédéric Barbaresco, Frank Nielsen
- Editor: MDPI
- Location: Basel
- Date: 2017
- Language: English
- License: CC BY-NC-ND 4.0
- ISBN: 978-3-03842-425-3
- Size: 17.0 x 24.4 cm
- Pages: 476
- Keywords: Entropy, Coding Theory, Maximum entropy, Information geometry, Computational Information Geometry, Hessian Geometry, Divergence Geometry, Information topology, Cohomology, Shape Space, Statistical physics, Thermodynamics
- Categories: Natural Sciences, Physics (Naturwissenschaften Physik)