
Estimation of Mutual Information

Description

This book presents the mutual information (MI) estimation methods recently proposed by the author and published in a number of major journals. It includes two types of applications: learning a forest structure from data for multivariate variables and identifying independent variables (independent component analysis). MI between a pair of random variables is mathematically defined in information theory. It measures how dependent the two variables are, takes nonnegative values, and is zero if and only if the variables are independent. Knowing the value of MI between two variables is often necessary in machine learning, statistical data analysis, and various sciences, including physics, psychology, and economics. However, the true value of MI is not available in practice and can only be estimated from data. The essential difference between the author's estimations and others is that consistency and independence testing are proved for the estimations proposed here: an estimation satisfies consistency when it converges to the true value as the sample size grows, and it satisfies independence testing when the estimated MI is zero with probability one as the sample size grows whenever the two variables are independent. Thus far, no other MI estimation has satisfied both of these properties at once.
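To make the definition concrete, the following Python sketch shows a generic histogram plug-in MI estimator. It is an illustrative baseline only, not one of the estimators proposed in the book; note that such a naive estimate is typically positive even for independent samples, which is precisely the independence-testing shortcoming that motivates the estimators discussed above.

import numpy as np

def plugin_mutual_information(x, y, bins=10):
    # Generic histogram plug-in estimate of I(X; Y) in nats.
    # Illustrative baseline only, not the estimator proposed in the book.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y (row vector)
    mask = pxy > 0                           # skip empty cells to avoid log(0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_dep = x + 0.5 * rng.normal(size=5000)      # dependent on x
y_ind = rng.normal(size=5000)                # independent of x
print(plugin_mutual_information(x, y_dep))   # clearly positive
print(plugin_mutual_information(x, y_ind))   # near zero, but not exactly zero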

Bibliographic Details

April 2025, approx. 120 pages, Behaviormetrics: Quantitative Approaches to Human Behavior, English
Springer Nature EN
978-981-1307-33-1
