Reviews:
"In the last decade, Efron has played a leading role in laying down the foundations of large-scale inference, not only in bringing back and developing old ideas, but also linking them with more recent developments, including the theory of false discovery rates and Bayes methods. We are indebted to him for this timely, readable and highly informative monograph, a book he is uniquely qualified to write. It is a synthesis of many of Efron's own contributions over the last decade with closely related material, together with some connecting theory, valuable comments, and challenges for the future. His avowed aim is 'not to have the last word' but to help us deal 'with the burgeoning statistical problems of the twenty-first century'. He has succeeded admirably." — Terry Speed, International Statistical Review
Publisher's description:
We live in a new age for statistical inference, where modern scientific technology such as microarrays and fMRI machines routinely produce thousands and sometimes millions of parallel data sets, each with its own estimation or testing problem. Doing thousands of problems at once is more than repeated application of classical methods. Taking an empirical Bayes approach, Bradley Efron, inventor of the bootstrap, shows how information accrues across problems in a way that combines Bayesian and frequentist ideas. Estimation, testing and prediction blend in this framework, producing opportunities for new methodologies of increased power. New difficulties also arise, easily leading to flawed inferences. This book takes a careful look at both the promise and pitfalls of large-scale statistical inference, with particular attention to false discovery rates, the most successful of the new statistical techniques. Emphasis is on the inferential ideas underlying technical developments, illustrated using a large number of real examples.