Most of us are familiar with - or at least have heard of - the Shannon entropy of a random variable, $H(X) = -\mathbb{E}[\log p(X)]$. It is the most commonly used measure of randomness of a distribution $p$ over a set $X$.

Rényi entropy, developed by the Hungarian mathematician Alfréd Rényi, generalizes Shannon entropy and includes other entropy measures as special cases. If a discrete random variable $X$ has $n$ possible values, where the $i$-th outcome has probability $p_i$, then the Rényi entropy of order $\alpha$ is defined to be

$$H_\alpha(X) = \frac{1}{1-\alpha}\log\left(\sum_{i=1}^{n} p_i^\alpha\right)$$

for $\alpha \ge 0$, $\alpha \ne 1$. The Shannon entropy is the limiting case as $\alpha \to 1$; it is likewise the limiting case of the Tsallis entropy.

At integral values of $q > 1$ there is typically a very well-defined construction of $S_q$ in terms of an integration of some function on a $q$-branched manifold. After one has done such an integration, one happily forgets about the manifold used and just tries to do the analytic continuation parametrically in the variable $q$. There are always a lot of issues about existence and well-posedness when one tries to do these analytic continuations - but for someone like me who was brought up on a daily diet of Feynman path integrals, it's a very common issue to deal with, and we have a lot of tools to address it. I call this "analytic" continuation because often enough one needs to do the interpolation via contours in the complex plane - and the continuation can depend on what contours one chooses through the poles and branch cuts of the $S_q$ that one started with.

For further reading, see Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives (Information Science and Statistics), Jose C.
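As a quick sanity check of the definition above, here is a minimal sketch (function name `renyi_entropy` is my own, not from any library) that computes the Rényi entropy of order $\alpha$ in nats and illustrates numerically that it approaches the Shannon entropy as $\alpha \to 1$:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a discrete distribution p.

    For alpha >= 0, alpha != 1:  H_alpha = log(sum_i p_i^alpha) / (1 - alpha).
    At alpha == 1 we return the Shannon entropy, which is the alpha -> 1 limit.
    """
    if alpha == 1:
        # Shannon entropy: H(X) = -sum_i p_i log p_i  (terms with p_i = 0 vanish)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
shannon = renyi_entropy(p, 1)
# Taking alpha slightly away from 1 should give nearly the same value,
# illustrating the limiting behaviour.
near_one = renyi_entropy(p, 1 + 1e-6)
```

One easy consistency check: for a uniform distribution over $n$ outcomes, $H_\alpha = \log n$ for every order $\alpha$, since all the $p_i^\alpha$ terms are equal.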