Ebook: Chapman and Hall/CRC Monographs on Statistics and Applied Probability 135, Approximating Data (PDF, MOBI)
ISBN 9781482215861 (ISBN-10: 1482215861)

The first detailed account of statistical analysis that treats models as approximations. The idea of truth plays a role in both Bayesian and frequentist statistics. The Bayesian concept of coherence is based on the fact that two different models or parameter values cannot both be true. Frequentist statistics is formulated as the problem of estimating the "true but unknown" parameter value that generated the data. Forgoing any concept of truth, Data Analysis and Approximate Models: Model Choice, Location-Scale, Analysis of Variance, Nonparametric Regression and Image Analysis presents statistical analysis and inference based on approximate models. Developed by the author, this approach consistently treats models as approximations to data, not to some underlying truth. The author develops a concept of approximation for probability models with applications to:

- Discrete data
- Location-scale
- Analysis of variance (ANOVA)
- Nonparametric regression, image analysis, and densities
- Time series
- Model choice

The book first highlights problems with concepts such as likelihood and efficiency and covers the definition of approximation and its consequences. A chapter on discrete data then presents the total variation metric as well as the Kullback-Leibler and chi-squared discrepancies as measures of fit. After focusing on outliers, the book discusses the location-scale problem, including approximation intervals, and gives a new treatment of higher-way ANOVA. The next several chapters describe novel procedures for nonparametric regression based on approximation. The final chapter assesses a range of statistical topics, from the likelihood principle to asymptotics and model choice.

This book is a philosophical study of statistics via the concept of data approximation. It is an approach developed by this well-regarded author during his career, and he has now decided to pull his ideas together as a monograph.
The main idea is that models are, at best, an approximation of real data, and any analysis must take this into account. The approach is therefore closely related to robust statistics and nonparametric statistics, and it can be used to study nearly any statistical technique. The last chapter presents an interesting discussion of the frequentist versus Bayesian debate in statistics.
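The discrepancy measures named in the description are standard quantities for comparing an empirical discrete distribution with a model. As a minimal sketch (not code from the book, and using the textbook definitions rather than the author's specific formulations), they can be computed for two probability vectors over the same finite support:

```python
# Sketch of the three discrepancy measures for discrete distributions:
# total variation distance, Kullback-Leibler divergence, and the
# chi-squared discrepancy. The example vectors p and q are illustrative.
import numpy as np

def total_variation(p, q):
    """Total variation distance: half the L1 distance between pmfs."""
    return 0.5 * float(np.abs(p - q).sum())

def kullback_leibler(p, q):
    """KL divergence D(p || q); terms with p_i = 0 contribute zero."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi_squared(p, q):
    """Chi-squared discrepancy: sum of (p_i - q_i)^2 / q_i."""
    return float(np.sum((p - q) ** 2 / q))

# Observed relative frequencies vs. a fitted model pmf
p = np.array([0.30, 0.45, 0.25])
q = np.array([0.25, 0.50, 0.25])
print(total_variation(p, q))   # 0.05
print(chi_squared(p, q))       # 0.015
print(kullback_leibler(p, q))
```

All three vanish when the model matches the data exactly and grow with the mismatch; the book's point is that one asks how small such a discrepancy must be for the model to count as an adequate approximation, not whether the model is true.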