This book is a sequel to Statistical Inference: Testing of Hypotheses (published by PHI Learning). Intended for postgraduate students of statistics, it introduces the problem of estimation in the light of the foundations laid down by Sir R.A. Fisher (1922) and follows both classical and Bayesian approaches to solving these problems.
The book starts by discussing the growing levels of data summarization, culminating in maximal summarization, and connects this with sufficient and minimal sufficient statistics. It gives a complete account of theorems and results on uniformly minimum variance unbiased estimators (UMVUE), including the famous Rao-Blackwell theorem, which suggests an improved estimator based on a sufficient statistic, and the Lehmann-Scheffe theorem, which yields a UMVUE. It discusses the Cramer-Rao and Bhattacharyya variance lower bounds for regular models, introducing Fisher information, and the Chapman-Robbins-Kiefer variance lower bound for Pitman models. Besides, the book introduces different methods of estimation, including the famous method of maximum likelihood, and discusses large sample properties, such as consistency, consistent asymptotic normality (CAN) and best asymptotic normality (BAN), of different estimators.
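To illustrate the kind of result covered in these chapters (a standard statement, not a quotation from the book), the Cramer-Rao inequality says that, under the usual regularity conditions, any unbiased estimator T(X) of g(θ) satisfies

```latex
\operatorname{Var}_\theta\bigl(T(X)\bigr) \;\ge\; \frac{\bigl(g'(\theta)\bigr)^{2}}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{2}\right],
```

where I(θ) is the Fisher information; an unbiased estimator attaining this bound is automatically a UMVUE.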
Separate chapters are devoted to finding the Pitman estimator, among equivariant estimators, for location and scale models, by exploiting the symmetry structure present in the model, and to Bayes, empirical Bayes and hierarchical Bayes estimators in different statistical models. A systematic exposition of the theory and results in different statistical situations and models is one of the several attractions of the presentation. Each chapter concludes with several solved examples, in a number of statistical models, augmenting the exposition of theorems and results.
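As a further illustration of the Bayesian material described above (again a textbook-standard fact, not an excerpt), the Bayes estimator of θ under squared error loss is the posterior mean:

```latex
\delta^{\pi}(x) \;=\; \mathbb{E}\bigl[\theta \mid X = x\bigr]
\;=\; \frac{\displaystyle\int \theta \, f(x \mid \theta)\, \pi(\theta)\, d\theta}
{\displaystyle\int f(x \mid \theta)\, \pi(\theta)\, d\theta},
```

where π(θ) is the prior density and f(x | θ) the likelihood; empirical and hierarchical Bayes methods differ in how the prior π is specified or estimated.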
KEY FEATURES
• Provides clarifications for a number of steps in the proofs of theorems and related results.
• Includes numerous solved examples to improve analytical insight on the subject by illustrating the application of theorems and results.
• Incorporates chapter-end exercises to review students’ comprehension of the subject.
• Discusses detailed theory on data summarization, unbiased estimation with large sample properties, and Bayes and minimax estimation in separate chapters.
MANOJ KUMAR SRIVASTAVA, Ph.D., is Associate Professor, Department of Statistics, Institute of Social Sciences, Dr. B.R. Ambedkar University (formerly Agra University), Agra. With a teaching experience of nearly two decades, Dr. Srivastava has published/presented several research papers in reputed national and international journals/conferences. He is a member of the Indian Society of Agricultural Statistics, New Delhi; Assam Statistical Review, Dibrugarh University, Assam; Calcutta Statistical Association; Indian Bayesian Society, Indian Chapter of the International Society for Bayesian Analysis (ISBA); and Advances and Application in Statistics, Allahabad University.

ABDUL HAMID KHAN, Ph.D., is former Dean, Faculty of Science, and Chairman, Department of Statistics and Operations Research, Aligarh Muslim University (AMU), Aligarh. Dr. Khan has a teaching experience of more than 40 years at AMU. Earlier, he served at Al Fateh University, Libya, and Al Yarmouk University, Jordan, as Associate Professor, and at King Saud University, Riyadh, Kingdom of Saudi Arabia, as Professor of Statistics. Dr. Khan has published more than 75 papers in reputed journals. An active member of many statistical societies in India and abroad, he is also associated with universities/colleges/PSC/UPSC, etc., in different capacities.

NAMITA SRIVASTAVA, Ph.D., is Associate Professor, Department of Statistics, St. John’s College, Agra. With a teaching experience of nearly two decades, Dr. Srivastava has presented many research papers in conferences/seminars. She is a member of a number of professional organizations, including the Indian Society of Agricultural Statistics, New Delhi; Assam Statistical Review, Dibrugarh University, Assam; Calcutta Statistical Association; and Indian Bayesian Society, Indian Chapter of the International Society for Bayesian Analysis (ISBA).