This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an “information geometry” background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory. <...>
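For concreteness, here is the standard finite-alphabet definition of the central quantity named above (a routine statement of the usual definition, not a claim about this tutorial's particular notation): for probability distributions $P$ and $Q$ on a finite alphabet $\mathcal{X}$, the information divergence (Kullback-Leibler distance, relative entropy) is
\[
D(P\,\|\,Q) \;=\; \sum_{x \in \mathcal{X}} P(x)\,\log\frac{P(x)}{Q(x)},
\]
with the conventions $0\log 0 = 0$ and $D(P\,\|\,Q) = +\infty$ whenever $P(x) > 0$ for some $x$ with $Q(x) = 0$. The geometric flavor mentioned above comes from treating $D(P\,\|\,Q)$ as an analogue of squared Euclidean distance; for instance, the I-projection of $Q$ onto a convex set $\mathcal{E}$ of distributions is the $P \in \mathcal{E}$ minimizing $D(P\,\|\,Q)$.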
Information theory, as we shall be concerned with it, is a branch of the mathematical theory of probability and statistics. As such, its abstract formulations are applicable to any probabilistic or statistical system of observations. Consequently, we find information theory applied in a variety of fields, as are probability and statistics. It plays an important role in modern communication theory, which formulates a communication system as a stochastic or random process.