Number of the records: 1  

Optimality conditions for maximizers of the information divergence from an exponential family

  1.
    0041122 - ÚTIA 2007 RIV CZ eng K - Conference Paper (Czech conference)
    Matúš, František
    Optimality conditions for maximizers of the information divergence from an exponential family.
    [Optimal conditions for maximizing the information divergence of an exponential family.]
    WUPES '06 Proceedings of 7th Workshop on Uncertainty Processing. Prague: University of Economics, 2006 - (Vejnarová, J.; Kroupa, T.), pp. 96-110. ISBN 80-245-1079-0.
    [WUPES 2006. Mikulov (CZ), 16.09.2006-20.09.2006]
    R&D Projects: GA AV ČR IAA100750603
    Institutional research plan: CEZ:AV0Z10750506
    Keywords : Kullback-Leibler divergence * relative entropy * exponential family * information projection * cumulant generating function * log-Laplace transform
    Subject RIV: BD - Theory of Information

    The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q, subject to Q in E. All directional derivatives of the divergence from E are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new conditions for the case that P is not projectable to E.
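    The definition above, D(P, E) = inf { D(P || Q) : Q in E }, can be illustrated numerically. The sketch below is not from the paper: the one-parameter family on {0, 1, 2}, the statistic f(x) = x, and the test measure P are illustrative assumptions, and the infimum is approximated by bounded scalar minimization rather than derived in closed form.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical example: a one-parameter exponential family on the finite
    # set {0, 1, 2}, generated by the statistic f(x) = x and the uniform
    # base measure, so Q_theta(x) is proportional to exp(theta * f(x)).
    f = np.array([0.0, 1.0, 2.0])

    def q(theta):
        w = np.exp(theta * f)
        return w / w.sum()

    def kl(p, r):
        # Kullback-Leibler divergence D(p || r) in nats, with 0 log 0 = 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / r[mask])))

    def divergence_from_family(p):
        # D(p, E) = inf over theta of D(p || Q_theta), found numerically.
        res = minimize_scalar(lambda t: kl(p, q(t)),
                              bounds=(-20.0, 20.0), method="bounded")
        return res.fun

    p = np.array([0.5, 0.0, 0.5])   # a measure outside the family
    print(divergence_from_family(p))
    ```

    For this P the mean of f is 1, which is matched within the family only by the uniform distribution (theta = 0), so the printed value is close to D(P || uniform) = log(3/2).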

    The information divergence of a probability measure P from an exponential family E is defined as the infimum of the divergences of P from Q over Q in E. The directional derivatives of this divergence were computed using new results on the conjugate of the log-Laplace transform. New first-order necessary conditions for P to be a maximizer of this divergence were formulated.
    Permanent Link: http://hdl.handle.net/11104/0134694

