Minimum Gamma-Divergence for Regression and Classification Problems
SpringerBriefs in Statistics

By: Shinto Eguchi

Paperback | 15 April 2025 | Edition Number 2024

At a Glance

Paperback


$74.99


Ships in 7 to 10 business days

This book introduces the gamma-divergence, a measure of distance between probability distributions proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been studied extensively as a tool for robust estimation when the power index γ is positive, but it can also be defined for negative γ, provided the corresponding integrability condition is satisfied. The author therefore considers the gamma-divergence on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence for negative γ; in particular, the gamma-divergence with γ equal to -1 is called the geometric-mean (GM) divergence.
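For reference, one common parameterization of the gamma-divergence between densities g and f is the one used by Fujisawa and Eguchi (2008); the book's exact normalization may differ, so the following is only an illustrative sketch:

```latex
% Gamma-divergence in one common parameterization (an assumption here;
% the book's normalization may differ).  As \gamma \to 0 it reduces to
% the Kullback-Leibler divergence \int g \log(g/f)\,dx.
D_\gamma(g, f) \;=\;
  \frac{1}{\gamma(1+\gamma)} \log \int g(x)^{1+\gamma}\,dx
  \;-\; \frac{1}{\gamma} \log \int g(x)\, f(x)^{\gamma}\,dx
  \;+\; \frac{1}{1+\gamma} \log \int f(x)^{1+\gamma}\,dx .
```

For discrete distributions the integrals become sums, which is the setting in which the negative-γ cases, including the GM divergence at γ = -1, are developed in the book.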



The book begins with an overview of the gamma-divergence and its properties, then discusses its applications in areas including machine learning, statistics, and ecology. Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are treated as typical examples. Regression models that, explicitly or implicitly, assume one of these distributions for the dependent variable in a generalized linear model are then discussed, and the minimum gamma-divergence method is applied to them.
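As a rough illustration of the quantity involved, the sketch below evaluates the gamma-divergence between two discrete probability vectors under the parameterization shown above. The function name and the normalization are assumptions for illustration, not code or notation taken from the book.

```python
import numpy as np

def gamma_divergence(p, q, gamma):
    """Gamma-divergence between two discrete pmfs p and q.

    Uses one common parameterization (an assumption; the book's
    normalization may differ).  Requires gamma not in {0, -1} and the
    relevant power sums to be finite and positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    term_p = np.log(np.sum(p ** (1.0 + gamma))) / (gamma * (1.0 + gamma))
    cross = -np.log(np.sum(p * q ** gamma)) / gamma
    term_q = np.log(np.sum(q ** (1.0 + gamma))) / (1.0 + gamma)
    return term_p + cross + term_q

# Example: a Bernoulli(0.3) pmf versus a Bernoulli(0.5) pmf.
p = [0.7, 0.3]
q = [0.5, 0.5]
print(gamma_divergence(p, q, gamma=0.5))    # positive-gamma (robust) case
print(gamma_divergence(p, q, gamma=-0.5))   # a negative-gamma case
```

Both values are nonnegative and vanish only when the two distributions coincide, which is the basic divergence property the minimum gamma-divergence method exploits.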



In ensemble learning, AdaBoost is derived from the exponential loss function as a weighted majority vote, and the book points out that the exponential loss is deeply connected to the GM divergence. For the Boltzmann machine, maximum likelihood estimation must resort to approximation methods such as the mean-field approximation because the partition function is intractable to compute. By working with the GM divergence and the exponential loss instead, it is shown that the partition function need not be computed at all, so estimation can be carried out without variational inference.
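To make the exponential-loss connection concrete, here is a minimal AdaBoost sketch in the usual textbook form (standard AdaBoost, not code from the book): each round reweights the training points by exp(-alpha * y * h(x)), which is exactly the exponential loss of the current weighted majority vote.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=20):
    """Minimal AdaBoost with decision stumps (exponential-loss form).

    y must contain labels in {-1, +1}.  Returns the stumps and their
    vote weights alpha; the combined classifier is
    sign(sum_t alpha_t * h_t(x)), a weighted majority vote.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)              # exponential-loss sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Up-weight mistakes, down-weight correct points:
        # this is the exponential-loss update exp(-alpha * y * h(x)).
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, np.array(alphas)

def predict(stumps, alphas, X):
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)

# Toy usage on a synthetic problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps, alphas = adaboost(X, y)
print(np.mean(predict(stumps, alphas, X) == y))
```

How this exponential loss relates to the GM divergence, and how the same idea removes the partition function from Boltzmann machine estimation, is developed in the book itself.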




More in Probability & Statistics

Implementing R for Statistics - Christophe Chesneau | RRP $180.95 | $165.75
Introduction to Medical Statistics : 4th Edition - Martin Bland | RRP $70.95 | $62.75 (12% off)
Research Methods and Statistics in Psychology : 8th Edition - Hugh Coolican
Rationality : What It Is, Why It Seems Scarce, Why It Matters - Steven Pinker
Sampling Theory and Practice - Casey Murphy
Practical Statistics - Nancy Maxwell | $437.75
Foundations of Statistics - Everett Davies
Mathematical Statistics with Applications : 7th Edition - Dennis Wackerly
Statistics for The Behavioral Sciences : 10th Edition - Frederick J. Gravetter
The Art of Statistics : Learning from Data - David Spiegelhalter | RRP $26.99 | $22.99 (15% off)