Number of the records: 1  

Optimality conditions for maximizers of the information divergence from an exponential family

  1.
    0098126 - ÚTIA 2008 RIV CZ eng J - Journal Article
    Matúš, František
    Optimality conditions for maximizers of the information divergence from an exponential family.
    [Podmínky optimality pro informační divergenci od exponenciální rodiny.]
    Kybernetika. Vol. 43, No. 5 (2007), pp. 731-746. ISSN 0023-5954
    R&D Projects: GA AV ČR IAA100750603
    Institutional research plan: CEZ:AV0Z10750506
    Keywords : Kullback-Leibler divergence * relative entropy * exponential family * information projection * log-Laplace transform * cumulant generating function * directional derivatives * convex functions * first order optimality conditions * polytopes
    Subject RIV: BA - General Mathematics
    Impact factor: 0.552, year: 2007

    The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q over Q in E. All directional derivatives of the divergence from E are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new ones for the case when P is not projectable to E.
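
    A formula-level sketch of the setup described in the abstract (the symbol Z for the finite ground set and the calligraphic E for the family are notational assumptions made here, not quoted from the paper):

    % D(P||Q) is the Kullback-Leibler divergence; \mathcal{E} is the exponential family.
    % Z denotes the finite ground set (an assumed symbol for this sketch).
    \[
      D(P \,\|\, Q) = \sum_{z \in Z} P(z)\,\log\frac{P(z)}{Q(z)},
      \qquad
      D(P \,\|\, \mathcal{E}) = \inf_{Q \in \mathcal{E}} D(P \,\|\, Q),
    \]
    % The paper's first-order optimality conditions concern maximizers of the map
    %   P \mapsto D(P \,\|\, \mathcal{E})
    % over probability measures P on Z.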

    The information divergence of a probability measure P from an exponential family E is defined as the infimum of the information divergences of P from Q over Q in E. For discrete E, all directional derivatives of the information divergence from E were found explicitly. To this end, the behaviour of the conjugate of the log-Laplace transform was studied. All first-order optimality conditions were found.
    Permanent Link: http://hdl.handle.net/11104/0157116

     
    File: 0098126.pdf
    Downloads: 0
    Size: 933.8 KB
    Version: Publisher’s postprint
    Access: open-access
     