Statistical Data Analysis and Entropy

Author: Nobuoki Eshima

Publisher: Springer Nature

Published: 2020-01-21

Total Pages: 263

ISBN-13: 9789811525537

This book reconsiders statistical methods from the point of view of entropy, and introduces entropy-based approaches for data analysis. Further, it interprets basic statistical methods, such as the chi-square statistic, t-statistic, F-statistic, and maximum likelihood estimation, in the context of entropy. In terms of categorical data analysis, the book discusses the entropy correlation coefficient (ECC) and the entropy coefficient of determination (ECD) for measuring association and/or predictive power in association models and generalized linear models (GLMs). Through the association and GLM frameworks, it also describes ECC and ECD in correlation and regression analyses for continuous random variables. In multivariate statistical analysis, canonical correlation analysis, the T²-statistic, and discriminant analysis are discussed in terms of entropy. Moreover, the book explores the efficiency of test procedures in statistical tests of hypotheses using entropy. Lastly, it presents an entropy-based path analysis for structural GLMs, which is applied in factor analysis and latent structure models. Entropy is an important concept for dealing with the uncertainty of systems of random variables and can be applied in statistical methodologies. This book motivates readers, especially young researchers, to address the challenge of new approaches to statistical data analysis and behaviormetric studies.
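
Measures such as the ECC and ECD are built from entropy-type quantities. As a loose illustration of the general idea only (not the book's exact definitions), the sketch below computes the mutual information of a hypothetical 2x2 contingency table and normalizes it by the smaller marginal entropy:

```python
from math import log2

# Hypothetical 2x2 table of joint cell probabilities P(X = i, Y = j).
# Illustrative only; this is NOT Eshima's exact ECC/ECD definition.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

def entropy(ps):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in ps if p > 0)

px = [sum(row) for row in joint]           # marginal of X
py = [sum(col) for col in zip(*joint)]     # marginal of Y
mi = entropy(px) + entropy(py) - entropy([p for row in joint for p in row])
# Normalizing by the smaller marginal entropy gives a value in [0, 1].
assoc = mi / min(entropy(px), entropy(py))
print(round(mi, 4), round(assoc, 4))
```

The normalized value is 0 for independent variables and 1 when one variable determines the other, which is the kind of behavior an entropy-based association coefficient is designed to have.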

Loss Data Analysis

Author: Henryk Gzyl

Publisher: Walter de Gruyter GmbH & Co KG

Published: 2018-02-05

Total Pages: 210

ISBN-13: 3110516136

This volume deals with two complementary topics. On the one hand, the book deals with the problem of determining the probability distribution of a positive compound random variable, a problem which appears in the banking and insurance industries, in many areas of operational research, and in reliability problems in the engineering sciences. On the other hand, the methodology proposed to solve such problems, which is based on an application of the maximum entropy method to invert the Laplace transform of the distributions, can be applied to many other problems. The book contains applications to a large variety of problems, including the problem of dependence of the sample data used to estimate empirically the Laplace transform of the random variable. Contents: Introduction; Frequency models; Individual severity models; Some detailed examples; Some traditional approaches to the aggregation problem; Laplace transforms and fractional moment problems; The standard maximum entropy method; Extensions of the method of maximum entropy; Superresolution in maxentropic Laplace transform inversion; Sample data dependence; Disentangling frequencies and decompounding losses; Computations using the maxentropic density; Review of statistical procedures.
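
The core trick can be sketched on a toy problem: after the change of variable y = exp(-x), values of the Laplace transform of a positive random variable become fractional moments E[Y^a], and the maximum entropy density consistent with a few such moments has the exponential-family form exp(-sum_j lam_j * y^(a_j)). The target density, the exponents, and the plain gradient descent on the convex dual below are illustrative choices, not the book's implementation:

```python
import numpy as np

# Toy target: Y ~ Beta(2, 1) on (0, 1), density f(y) = 2y, whose
# fractional moments are E[Y^a] = 2 / (a + 2).  We recover a maximum-
# entropy density exp(-sum_j lam_j * y^(a_j)) matching those moments.
alphas = np.array([1.0, 2.0])
mu = 2.0 / (alphas + 2.0)                  # target moments

n = 4000
y = (np.arange(n) + 0.5) / n               # midpoint quadrature grid on (0, 1)
dy = 1.0 / n
powers = y[:, None] ** alphas              # y^(a_j), shape (n, 2)

def density(lam):
    w = np.exp(-(powers @ lam))
    return w / (w.sum() * dy)              # normalized maxent density

lam = np.zeros(len(alphas))
for _ in range(5000):                      # gradient descent on the convex dual
    moments = (powers * density(lam)[:, None]).sum(axis=0) * dy
    lam -= 5.0 * (mu - moments)            # dual gradient is mu - E_f[y^a]

fitted = (powers * density(lam)[:, None]).sum(axis=0) * dy
print(np.round(fitted, 3), np.round(mu, 3))
```

At the optimum the maxent density reproduces the constrained moments, which is the consistency property the inversion method rests on; the book's actual machinery (superresolution, data dependence) builds on this basic step.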

Entropy Measures for Data Analysis

Author: Karsten Keller

Publisher: MDPI

Published: 2019-12-19

Total Pages: 260

ISBN-13: 3039280325

Entropies and entropy-like quantities play an increasing role in modern non-linear data analysis. Fields that benefit from this application range from biosignal analysis to econophysics and engineering. This issue is a collection of papers touching on different aspects of entropy measures in data analysis, as well as theoretical and computational analyses. The relevant topics include the difficulty to achieve adequate application of entropy measures and the acceptable parameter choices for those entropy measures, entropy-based coupling, and similarity analysis, along with the utilization of entropy measures as features in automatic learning and classification. Various real data applications are given.
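
One concrete measure from this literature is permutation entropy, which quantifies the diversity of ordinal patterns in a time series. The function below is a minimal sketch rather than a production implementation, applied to the often-cited toy series (4, 7, 9, 10, 6, 11, 3):

```python
from math import factorial, log2

def permutation_entropy(series, order=3):
    """Normalized permutation entropy over ordinal patterns of length `order`."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the permutation that sorts the window.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = sum(-(c / total) * log2(c / total) for c in counts.values())
    return h / log2(factorial(order))      # normalize to [0, 1]

print(permutation_entropy(list(range(50))))          # monotone series: one pattern
print(permutation_entropy([4, 7, 9, 10, 6, 11, 3]))  # ~0.589
```

A perfectly ordered series yields 0, white noise approaches 1, and the measure is robust to monotone transformations of the data, which is part of why ordinal methods are popular in biosignal analysis.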

Loss Data Analysis

Author: Henryk Gzyl

Publisher: Walter de Gruyter GmbH & Co KG

Published: 2023-03-06

Total Pages: 222

ISBN-13: 3111048187

This volume deals with two complementary topics. On the one hand, the book deals with the problem of determining the probability distribution of a positive compound random variable, a problem which appears in the banking and insurance industries, in many areas of operational research, and in reliability problems in the engineering sciences. On the other hand, the methodology proposed to solve such problems, which is based on an application of the maximum entropy method to invert the Laplace transform of the distributions, can be applied to many other problems. The book contains applications to a large variety of problems, including the problem of dependence of the sample data used to estimate empirically the Laplace transform of the random variable. Contents: Introduction; Frequency models; Individual severity models; Some detailed examples; Some traditional approaches to the aggregation problem; Laplace transforms and fractional moment problems; The standard maximum entropy method; Extensions of the method of maximum entropy; Superresolution in maxentropic Laplace transform inversion; Sample data dependence; Disentangling frequencies and decompounding losses; Computations using the maxentropic density; Review of statistical procedures.

Entropy, Large Deviations, and Statistical Mechanics

Author: Richard S. Ellis

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 372

ISBN-13: 1461385334

This book has two main topics: large deviations and equilibrium statistical mechanics. I hope to convince the reader that these topics have many points of contact and that in being treated together, they enrich each other. Entropy, in its various guises, is their common core. The large deviation theory which is developed in this book focuses upon convergence properties of certain stochastic systems. An elementary example is the weak law of large numbers. For each positive ε, P{|S_n/n| ≥ ε} converges to zero as n → ∞, where S_n is the nth partial sum of independent identically distributed random variables with zero mean. Large deviation theory shows that if the random variables are exponentially bounded, then the probabilities converge to zero exponentially fast as n → ∞. The exponential decay allows one to prove the stronger property of almost sure convergence (S_n/n → 0 a.s.). This example will be generalized extensively in the book. We will treat a large class of stochastic systems which involve both independent and dependent random variables and which have the following features: probabilities converge to zero exponentially fast as the size of the system increases; the exponential decay leads to strong convergence properties of the system. The most fascinating aspect of the theory is that the exponential decay rates are computable in terms of entropy functions. This identification between entropy and decay rates of large deviation probabilities enhances the theory significantly.
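
The Bernoulli case makes the claimed entropy-decay link concrete: by Cramér's theorem, P(S_n/n ≥ a) for fair coin flips decays at rate I(a), the relative entropy of Bernoulli(a) with respect to Bernoulli(1/2). The quick numerical check below uses exact binomial tails; the choices p = 1/2 and a = 0.7 are arbitrary illustrations:

```python
from math import comb, log

# P(S_n/n >= a) ~ exp(-n * I(a)), with I(a) the Kullback-Leibler
# divergence of Bernoulli(a) from Bernoulli(p).
p, a = 0.5, 0.7

def rate(a, p):
    return a * log(a / p) + (1 - a) * log((1 - a) / (1 - p))

def empirical_rate(n):
    """-log P(S_n >= a*n) / n, computed from the exact binomial tail."""
    tail = sum(comb(n, k) for k in range(round(a * n), n + 1))
    return -(log(tail) - n * log(2)) / n

for n in (100, 200, 400):
    print(n, round(empirical_rate(n), 4))
print("I(a) =", round(rate(a, p), 4))
```

The empirical decay rates decrease toward I(a) ≈ 0.0823 as n grows; the remaining gap is the usual O(log n / n) prefactor correction, which is exactly the sense in which the entropy function controls the exponential decay.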

Concepts and Recent Advances in Generalized Information Measures and Statistics

Author: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado

Publisher: Bentham Science Publishers

Published: 2013-12-13

Total Pages: 432

ISBN-13: 1608057607

Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantifiers are powerful tools for the study of general time and data series independently of their sources, this book will be useful to all those doing research connected with information analysis. The tutorials in this volume are written at a broadly accessible level and readers will have the opportunity to acquire the knowledge necessary to use the information theory tools in their field of interest.
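
Two of the simplest quantifiers in this family are the normalized Shannon entropy and an LMC-style statistical complexity C = H * D, where D measures the distance of a distribution from the uniform one. The histogram below is made up for illustration:

```python
from math import log

# Normalized Shannon entropy H and an LMC-style complexity C = H * D,
# where D is the Euclidean disequilibrium from the uniform distribution.
# The input histograms are invented examples.
def shannon(ps):
    return -sum(p * log(p) for p in ps if p > 0)

def lmc_complexity(ps):
    n = len(ps)
    h = shannon(ps) / log(n)                   # normalized entropy in [0, 1]
    d = sum((p - 1.0 / n) ** 2 for p in ps)    # disequilibrium
    return h, h * d

print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))  # uniform: maximal H, zero C
print(lmc_complexity([0.7, 0.1, 0.1, 0.1]))
```

Complexity vanishes both for the fully ordered and the fully random (uniform) cases and peaks in between, which is what makes such quantifiers useful for distinguishing structured dynamics from noise.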

Maximum Entropy and Bayesian Methods Garching, Germany 1998

Author: Wolfgang von der Linden

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 380

ISBN-13: 9401147108

In 1978 Edwin T. Jaynes and Myron Tribus initiated a series of workshops to exchange ideas and recent developments in technical aspects and applications of Bayesian probability theory. The first workshop was held at the University of Wyoming in 1981, organized by C.R. Smith and W.T. Grandy. Due to its success, the workshop was held annually during the last 18 years. Over the years, the emphasis of the workshop shifted gradually from fundamental concepts of Bayesian probability theory to increasingly realistic and challenging applications. The 18th international workshop on Maximum Entropy and Bayesian Methods was held in Garching/Munich (Germany), 27-31 July 1998. Opening lectures by G. Larry Bretthorst and by Myron Tribus were dedicated to one of the pioneers of Bayesian probability theory, who died on the 30th of April 1998: Edwin Thompson Jaynes. Jaynes revealed and advocated the correct meaning of 'probability' as the state of knowledge rather than a physical property. This interpretation allowed him to unravel longstanding mysteries and paradoxes. Bayesian probability theory, "the logic of science" as E.T. Jaynes called it, provides the framework to make the best possible scientific inference given all available experimental and theoretical information. We gratefully acknowledge the efforts of Tribus and Bretthorst in commemorating the outstanding contributions of E.T. Jaynes to the development of probability theory.

Maximum Entropy and Bayesian Methods

Author: G. Erickson

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 300

ISBN-13: 9401150281

This volume has its origin in the Seventeenth International Workshop on Maximum Entropy and Bayesian Methods, MAXENT 97. The workshop was held at Boise State University in Boise, Idaho, on August 4-8, 1997. As in the past, the purpose of the workshop was to bring together researchers in different fields to present papers on applications of Bayesian methods (these include maximum entropy) in science, engineering, medicine, economics, and many other disciplines. Thanks to significant theoretical advances and the personal computer, much progress has been made since our first Workshop in 1981. As indicated by several papers in these proceedings, the subject has matured to a stage in which computational algorithms are the objects of interest, the thrust being on feasibility, efficiency and innovation. Though applications are proliferating at a staggering rate, some in areas that hardly existed a decade ago, it is pleasing that due attention is still being paid to foundations of the subject. The following list of descriptors, applicable to papers in this volume, gives a sense of its contents: deconvolution, inverse problems, instrument (point-spread) function, model comparison, multisensor data fusion, image processing, tomography, reconstruction, deformable models, pattern recognition, classification and group analysis, segmentation/edge detection, brain shape, marginalization, algorithms, complexity, Ockham's razor as an inference tool, foundations of probability theory, symmetry, history of probability theory and computability. MAXENT 97 and these proceedings could not have been brought to final form without the support and help of a number of people.