Results by Title
28 books about Probability & Statistics

Are Changing Constituencies Driving Rising Polarization in the U.S. House of Representatives?
Jesse Sussell
RAND Corporation, 2015
This report addresses two questions: first, whether the spatial distribution of the American electorate has become more geographically clustered over the last 40 years with respect to party voting and socioeconomic attributes; and second, whether this clustering process has contributed to rising polarization in the U.S. House of Representatives.


The Broken Dice, and Other Mathematical Tales of Chance
Ivar Ekeland
University of Chicago Press, 1993
Library of Congress QA273.E4313 1993  Dewey Decimal 519.2
Ivar Ekeland extends his consideration of the catastrophe theory of the universe begun in his widely acclaimed Mathematics and the Unexpected, by drawing on rich literary sources, particularly the Norse saga of Saint Olaf, and such current topics as chaos theory, information theory, and particle physics.
"Ivar Ekeland gained a large and enthusiastic following with Mathematics and the Unexpected, a brilliant and charming exposition of fundamental new discoveries in the theory of dynamical systems. The Broken Dice continues the same theme, and in the same elegant, seemingly effortless style, but focuses more closely on the implications of those discoveries for the rest of human culture. What are chance and probability? How has our thinking about them been changed by the discovery of chaos? What are all of these concepts good for? . . . Ah, but, I mustn't give the game away, any more than I should if I were reviewing a detective novel. And this is just as gripping a tale. . . . Beg, borrow, or preferably buy a copy. . . . I guarantee you won't be disappointed."—Ian Stewart, Science


The Chicago Guide to Writing about Multivariate Analysis, Second Edition
Jane E. Miller
University of Chicago Press, 2013
Library of Congress T11.M484 2013  Dewey Decimal 808.066519
Many different people, from social scientists to government agencies to business professionals, depend on the results of multivariate models to inform their decisions. Researchers use these advanced statistical techniques to analyze relationships among multiple variables, such as how exercise and weight relate to the risk of heart disease, or how unemployment and interest rates affect economic growth. Yet, despite the widespread need to plainly and effectively explain the results of multivariate analyses to varied audiences, few are properly taught this critical skill.
The Chicago Guide to Writing about Multivariate Analysis is the book researchers turn to when looking for guidance on how to clearly present statistical results and break through the jargon that often clouds writing about applications of statistical analysis. This new edition features even more topics and real-world examples, making it the must-have resource for anyone who needs to communicate complex research results.
For this second edition, Jane E. Miller includes four new chapters that cover writing about interactions, writing about event history analysis, writing about multilevel models, and the “Goldilocks principle” for choosing the right size contrast for interpreting results for different variables. In addition, she has updated or added numerous examples, while retaining her clear voice and focus on writers thinking critically about their intended audience and objective. Online podcasts, templates, and an updated study guide will help readers apply skills from the book to their own projects and courses.
This continues to be the only book that brings together all of the steps involved in communicating findings based on multivariate analysis—finding data, creating variables, estimating statistical models, calculating overall effects, organizing ideas, designing tables and charts, and writing prose—in a single volume. When aligned with Miller’s twelve fundamental principles for quantitative writing, this approach will empower readers—whether students or experienced researchers—to communicate their findings clearly and effectively.


The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives
Stephen T. Ziliak and Deirdre N. McCloskey
University of Michigan Press, 2008
Library of Congress HB137.Z55 2008  Dewey Decimal 330.015195
“McCloskey and Ziliak have been pushing this very elementary, very correct, very important argument through several articles over several years and for reasons I cannot fathom it is still resisted. If it takes a book to get it across, I hope this book will do it. It ought to.”
—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics
“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”
—Kenneth Rothman, Professor of Epidemiology, Boston University School of Health
The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.
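Ziliak and McCloskey's core distinction, between statistical significance and scientific magnitude, can be seen in a few lines of arithmetic. The sketch below is an illustration of that point, not an example from the book; the numbers and the `z_test` helper are invented for the purpose.

```python
import math

def z_test(mean_diff, sd, n):
    """Two-sided z-test of a mean difference against zero."""
    z = mean_diff / (sd / math.sqrt(n))
    # Two-sided p-value from the standard normal survival function.
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

# A 0.01-unit difference with standard deviation 1, measured on a
# million observations: z is about 10 and p is astronomically small,
# yet the effect itself remains a negligible 0.01 units.
z, p = z_test(mean_diff=0.01, sd=1.0, n=1_000_000)
print(f"z = {z:.1f}, p = {p:.2g}")  # overwhelmingly "significant"
print("effect size is still just 0.01")  # but substantively trivial
```

The point of the sketch is the one the authors press: a p-value measures sampling noise relative to sample size, not whether an effect matters.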
Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).


Error and the Growth of Experimental Knowledge
Deborah G. Mayo
University of Chicago Press, 1996
Library of Congress QA275.M347 1996  Dewey Decimal 001.434
We may learn from our mistakes, but Deborah Mayo argues that, where experimental knowledge is concerned, we haven't begun to learn enough. Error and the Growth of Experimental Knowledge launches a vigorous critique of the subjective Bayesian view of statistical inference, and proposes Mayo's own error-statistical approach as a more robust framework for the epistemology of experiment. Mayo genuinely addresses the needs of researchers who work with statistical analysis, and simultaneously engages the basic philosophical problems of objectivity and rationality.
Mayo has long argued for an account of learning from error that goes far beyond detecting logical inconsistencies. In this book, she presents her complete program for how we learn about the world by being "shrewd inquisitors of error, white gloves off." Her tough, practical approach will be important to philosophers, historians, and sociologists of science, and will be welcomed by researchers in the physical, biological, and social sciences whose work depends upon statistical analysis.


Good Thinking: The Foundations of Probability and Its Applications
I.J. Good
University of Minnesota Press, 1983
Library of Congress QA273.4.G66 1983  Dewey Decimal 519.2
Good Thinking was first published in 1983.
Good Thinking is a representative sampling of I. J. Good's writing on a wide range of questions about the foundations of statistical inference, especially where induction intersects with philosophy. Good believes that clear reasoning about many important practical and philosophical questions is impossible except in terms of probability. This book collects from various published sources 23 of Good's articles, with an emphasis on the more philosophical than the mathematical. He covers such topics as rational decisions, randomness, operational research, measurement of knowledge, mathematical discovery, artificial intelligence, cognitive psychology, chess, and the nature of probability itself. In spite of the wide variety of topics covered, Good Thinking is based on a unified philosophy which makes it more than the sum of its parts. The papers are organized into five sections: Bayesian Rationality; Probability; Corroboration, Hypothesis Testing, and Simplicity; Information and Surprise; and Causality and Explanation. The numerous references, an extensive index, and a bibliography guide the reader to related modern and historical literature.
This collection makes available to a wide audience, for the first time, the most accessible work of a very creative thinker. Philosophers of science, mathematicians, scientists, and, in Good's words, anyone who wants “to understand understanding, to reason about reasoning, to explain explanation, to think about thought, and to decide how to decide” will find Good Thinking a stimulating and provocative look at probability.


Handbook of Quantitative Ecology
Justin Kitzes
University of Chicago Press
An essential guide to quantitative research methods in ecology and conservation biology, accessible for even the most math-averse student or professional.
Quantitative research techniques have become increasingly important in ecology and conservation biology, but the sheer breadth of methods that must be understood—from population modeling and probabilistic thinking to modern statistics, simulation, and data science—as well as a lack of computational or mathematics training have hindered quantitative literacy in these fields. In this book, ecologist Justin Kitzes answers those challenges for students and practicing scientists alike.
Requiring only basic algebra and the ability to use a spreadsheet, the Handbook of Quantitative Ecology is designed to provide a practical, intuitive, and integrated introduction to widely used quantitative methods. Kitzes builds each chapter around a specific ecological problem and arrives, step by step, at a general principle through the act of solving that problem. Grouped into five broad categories—difference equations, probability, matrix models, likelihood statistics, and other numerical methods—the book introduces basic concepts, starting with exponential and logistic growth, and helps readers to understand the field's more advanced subjects, such as permutation tests, stochastic optimization, and cellular automata. Complete with online solutions to all numerical problems, Kitzes's Handbook of Quantitative Ecology is an ideal coursebook for both undergraduate and graduate students of ecology, as well as a useful and necessary resource for mathematically out-of-practice scientists.
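The kind of spreadsheet-level model the book begins with, logistic growth expressed as a difference equation, fits in a few lines. This is an illustrative toy, not code from the book, and the parameter values are invented.

```python
# Discrete logistic growth: N(t+1) = N(t) + r*N(t)*(1 - N(t)/K),
# where r is the growth rate and K the carrying capacity.
def logistic_trajectory(n0, r, K, steps):
    """Iterate the logistic difference equation; return the population series."""
    series = [n0]
    for _ in range(steps):
        n = series[-1]
        series.append(n + r * n * (1 - n / K))
    return series

# Starting from 10 individuals with r = 0.5 and K = 1000, the population
# grows roughly exponentially at first, then levels off near K.
pop = logistic_trajectory(n0=10, r=0.5, K=1000, steps=50)
print(round(pop[-1]))  # → 1000
```

Exactly this kind of iteration can also be done column-by-column in a spreadsheet, which is the book's stated prerequisite level.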


The Hidden Game of Baseball: A Revolutionary Approach to Baseball and Its Statistics
John Thorn, Pete Palmer, with David Reuther
University of Chicago Press, 2015
Library of Congress GV867.T49 2015  Dewey Decimal 796.357021
Long before Moneyball became a sensation or Nate Silver turned the knowledge he’d honed on baseball into electoral gold, John Thorn and Pete Palmer were using statistics to shake the foundations of the game. First published in 1984, The Hidden Game of Baseball ushered in the sabermetric revolution by demonstrating that we were thinking about baseball stats—and thus the game itself—all wrong. Instead of praising sluggers for gaudy RBI totals or pitchers for wins, Thorn and Palmer argued in favor of more subtle measurements that correlated much more closely to the ultimate goal: winning baseball games.
The new gospel promulgated by Thorn and Palmer opened the door for a flood of new questions, such as how a ballpark’s layout helps or hinders offense or whether a strikeout really is worse than another kind of out. Taking questions like these seriously—and backing up the answers with data—launched a new era, showing fans, journalists, scouts, executives, and even players themselves a new, better way to look at the game.
This brand-new edition retains the body of the original, with its rich, accessible analysis rooted in a deep love of baseball, while adding a new introduction by the authors tracing the book's influence over the years. A foreword by ESPN's lead baseball analyst, Keith Law, details The Hidden Game's central role in the transformation of baseball coverage and team management and shows how teams continue to reap the benefits of Thorn and Palmer's insights today. Thirty years after its original publication, The Hidden Game is still bringing the high heat—a true classic of baseball literature.


The History of Statistics: The Measurement of Uncertainty before 1900
Stephen M. Stigler
Harvard University Press, 1986
Library of Congress QA276.15.S75 1986  Dewey Decimal 519.509
This magnificent book is the first comprehensive history of statistics from its beginnings around 1700 to its emergence as a distinct and mature discipline around 1900. Stephen M. Stigler shows how statistics arose from the interplay of mathematical concepts and the needs of several applied sciences including astronomy, geodesy, experimental psychology, genetics, and sociology. He addresses many intriguing questions: How did scientists learn to combine measurements made under different conditions? And how were they led to use probability theory to measure the accuracy of the result? Why were statistical methods used successfully in astronomy long before they began to play a significant role in the social sciences? How could the introduction of least squares predate the discovery of regression by more than eighty years? On what grounds can the major works of men such as Bernoulli, De Moivre, Bayes, Quetelet, and Lexis be considered partial failures, while those of Laplace, Galton, Edgeworth, Pearson, and Yule are counted as successes? How did Galton’s probability machine (the quincunx) provide him with the key to the major advance of the last half of the nineteenth century?
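Galton's quincunx is easy to imitate in code: a ball bounces left or right at each pin, and the slot counts pile up into the binomial, approximately normal, curve. The sketch below is a hypothetical toy, not an example from Stigler's book.

```python
import random

def quincunx(rows, balls, seed=0):
    """Drop `balls` balls through `rows` pins; count final slot occupancy.

    A ball's slot is simply its number of rightward bounces, so the slot
    distribution is Binomial(rows, 1/2), the bell-shaped pile Galton saw.
    """
    rng = random.Random(seed)
    counts = [0] * (rows + 1)
    for _ in range(balls):
        slot = sum(rng.random() < 0.5 for _ in range(rows))
        counts[slot] += 1
    return counts

counts = quincunx(rows=10, balls=10_000)
print(counts)  # middle slots collect the most balls, the tails the fewest
```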
Stigler’s emphasis is upon how, when, and where the methods of probability theory were developed for measuring uncertainty in experimental and observational science, for reducing uncertainty, and as a conceptual framework for quantitative studies in the social sciences. He describes with care the scientific context in which the different methods evolved and identifies the problems (conceptual or mathematical) that retarded the growth of mathematical statistics and the conceptual developments that permitted major breakthroughs.
Statisticians, historians of science, and social and behavioral scientists will gain from this book a deeper understanding of the use of statistical methods and a better grasp of the promise and limitations of such techniques. The product of ten years of research, The History of Statistics will appeal to all who are interested in the humanistic study of science.


How Our Days Became Numbered: Risk and the Rise of the Statistical Individual
Dan Bouk
University of Chicago Press, 2015
Library of Congress HG8531.B68 2015  Dewey Decimal 368.3200973
Long before the age of "Big Data" or the rise of today's "self-quantifiers," American capitalism embraced "risk" and proceeded to number our days. Life insurers led the way, developing numerical practices for measuring individuals and groups, predicting their fates, and intervening in their futures. Emanating from the gilded boardrooms of Lower Manhattan and making their way into drawing rooms and tenement apartments across the nation, these practices soon came to change the futures they purported to divine.
How Our Days Became Numbered tells a story of corporate culture remaking American culture: a story of intellectuals and professionals in and around insurance companies who reimagined Americans' lives through numbers and taught ordinary Americans to do the same. Making individuals statistical did not happen easily. Legislative battles raged over the propriety of discriminating by race or of smoothing away the effects of capitalism's fluctuations on individuals. Meanwhile, debates within companies set doctors against actuaries and agents, resulting in elaborate, secretive systems of surveillance and calculation.
Dan Bouk reveals how, in a little over half a century, insurers laid the groundwork for the much-quantified, risk-infused world that we live in today. To understand how the financial world shapes modern bodies, how risk assessments can perpetuate inequalities of race or sex, and how the quantification and claims of risk on each of us continue to grow, we must take seriously the history of those who view our lives as a series of probabilities to be managed.


Infinite-Dimensional Optimization and Convexity
Ivar Ekeland and Thomas Turnbull
University of Chicago Press, 1983
Library of Congress QA402.5.E39 1983  Dewey Decimal 519
In this volume, Ekeland and Turnbull are mainly concerned with existence theory. They seek to determine whether, when given an optimization problem consisting of minimizing a functional over some feasible set, an optimal solution—a minimizer—may be found.


An Introduction to Mathematical Statistics
Fetsje Bijma, Marianne Jonker, and Aad van der Vaart
Amsterdam University Press, 2017
Statistics is the science that focuses on drawing conclusions from data by modeling and analyzing the data using probabilistic models. In An Introduction to Mathematical Statistics the authors describe key concepts from statistics and give a mathematical basis for important statistical methods. Much attention is paid to the sound application of those methods to data. The three main topics in statistics are estimators, tests, and confidence regions. The authors illustrate these in many examples, with a separate chapter on regression models, including linear regression and analysis of variance. They also discuss the optimality of estimators and tests, as well as the selection of the best-fitting model. Each chapter ends with a case study in which the described statistical methods are applied. This book assumes a basic knowledge of probability theory, calculus, and linear algebra. Several annexes to the book are available online.


The Logic of Decision
Richard C. Jeffrey
University of Chicago Press, 1983
Library of Congress QA279.5.J43 1983  Dewey Decimal 519.542
"[This book] proposes new foundations for the Bayesian principle of rational action, and goes on to develop a new logic of desirability and probability."—Frederic Schick, Journal of Philosophy


Modern Factor Analysis
Harry H. Harman
University of Chicago Press, 1976
Library of Congress QA278.5.H38 1976  Dewey Decimal 519.53
This thoroughly revised third edition of Harry H. Harman's authoritative text incorporates the many new advances made in computer science and technology over the last ten years. The author gives full coverage to both theoretical and applied aspects of factor analysis from its foundations through the most advanced techniques. This highly readable text will be welcomed by researchers and students working in psychology, statistics, economics, and related disciplines.


Modern Sampling Methods: Theory, Experimentation, Application
Palmer Johnson
University of Minnesota Press, 1959
Modern Sampling Methods: Theory, Experimentation, Application was first published in 1959. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; they are published unaltered from the original University of Minnesota Press editions.
Of both theoretical and practical use to statisticians and research workers using sampling techniques, this book describes five new multistage sampling models. The models are described, compared, and evaluated through a skillfully designed experiment. The number of stages in all five models is the same; the manner in which they differ is in the particular sampling technique applied at each of the several stages. Recommendations are given on the choice of the most suitable model for a given practical situation. A mathematical appendix presents two lemmas that are useful for derivation of sampling formulas in multistage sampling.


The Nature of Scientific Evidence: Statistical, Philosophical, and Empirical Considerations
Edited by Mark L. Taper and Subhash R. Lele
University of Chicago Press, 2004
Library of Congress Q180.55.S7N37 2004  Dewey Decimal 507.2
An exploration of the statistical foundations of scientific inference, The Nature of Scientific Evidence asks what constitutes scientific evidence and whether scientific evidence can be quantified statistically. Mark Taper, Subhash Lele, and an esteemed group of contributors explore the relationships among hypotheses, models, data, and inference on which scientific progress rests in an attempt to develop a new quantitative framework for evidence. Informed by interdisciplinary discussions among scientists, philosophers, and statisticians, they propose a new "evidential" approach, which may be more in keeping with the scientific method. The Nature of Scientific Evidence persuasively argues that all scientists should care more about the fine points of statistical philosophy because therein lies the connection between theory and data.
Though the book uses ecology as an exemplary science, the interdisciplinary evaluation of the use of statistics in empirical research will be of interest to any reader engaged in the quantification and evaluation of data.


Observation and Experiment: An Introduction to Causal Inference
Paul R. Rosenbaum
Harvard University Press, 2017
Library of Congress Q175.32.C38R67 2017  Dewey Decimal 001.4340151954
In the face of conflicting claims about some treatments, behaviors, and policies, the question arises: What is the most scientifically rigorous way to draw conclusions about cause and effect in the study of humans? In this introduction to causal inference, Paul Rosenbaum explains key concepts and methods through realworld examples.


Prediction and Regulation by Linear Least-Square Methods
Peter Whittle
University of Minnesota Press, 1963
Library of Congress QA279.2.W48 1983  Dewey Decimal 519.54
Prediction and Regulation by Linear Least-Square Methods was first published in 1963. This revised second edition was issued in 1983.
During the past two decades, statistical theories of prediction and control have assumed an increasing importance in all fields of scientific research. To understand a phenomenon is to be able to predict it and to influence it in predictable ways. First published in 1963 and long out of print, Prediction and Regulation by Linear Least-Square Methods offers important tools for constructing models of dynamic phenomena. This elegantly written book has been a basic reference for researchers in many applied sciences who seek practical information about the representation and manipulation of stationary stochastic processes. Peter Whittle's text has a devoted group of readers and users, especially among economists. This edition contains the unchanged text of the original and adds new works by the author and a foreword by economist Thomas J. Sargent.


A Primer of Probability Logic
Ernest W. Adams
CSLI, 1996
Library of Congress QA10.A34 1998  Dewey Decimal 511.3
This book is meant to be a primer, that is, an introduction, to probability logic, a subject that appears to be in its infancy. Probability logic is a subject envisioned by Hans Reichenbach and largely created by Adams. It treats conditionals as bearers of conditional probabilities and discusses an appropriate sense of validity for arguments with such conditionals, as well as ordinary statements, as premisses.
This is a clear, well-written text on the subject of probability logic, suitable for advanced undergraduates or graduates, but also of interest to professional philosophers. There are well-thought-out exercises, and a number of advanced topics are treated in appendices, while some are brought up in exercises and some are alluded to only in footnotes. By this means, it is hoped that the reader will at least be made aware of most of the important ramifications of the subject and its tie-ins with current research, and will have some indications concerning recent and relevant literature.


Proximity and Preference: Problems in the Multidimensional Analysis of Large Data Sets
Reginald Golledge
University of Minnesota Press, 1982
Proximity and Preference was first published in 1982.
How does one design experiments for collecting large volumes of data such as those needed for marketing surveys, studies of travel patterns, and public opinion polls? This is a common problem for social and behavioral scientists. The papers in this collection address the problems of working with large data sets primarily from the perspectives of geography and psychology, two fields that share a common quantitative research methodology.
After an introductory paper on substantive and methodological aspects of the interface between geography and psychology, the book is divided into three sections: experimental design and measurement problems; preference functions and choice behavior; and special problems of analyzing large data sets with multidimensional methods. Each paper is directed toward some fundamental problem, such as those relating to experimental design, data reliability, and the selection of analytical methods appropriate for data sets of various sizes, completeness, and reliability.


Randomness
Deborah J. Bennett
Harvard University Press, 1998
Library of Congress QA273.15.B46 1998  Dewey Decimal 519.2
From the ancients’ first readings of the innards of birds to your neighbor’s last bout with the state lottery, humankind has put itself into the hands of chance. Today life itself may be at stake when probability comes into play—in the chance of a false negative in a medical test, in the reliability of DNA findings as legal evidence, or in the likelihood of passing on a deadly congenital disease—yet as few people as ever understand the odds. This book addresses the trouble we have in trying to learn about probability. A story of the misconceptions and difficulties civilization overcame in progressing toward probabilistic thinking, Randomness is also a skillful account of what makes the science of probability so daunting in our own day.
To acquire a (correct) intuition of chance is not easy to begin with, and moving from an intuitive sense to a formal notion of probability presents further problems. Author Deborah Bennett traces the path this process takes in an individual trying to come to grips with concepts of uncertainty and fairness, and also charts the parallel path by which societies have developed ideas about chance. Why, from ancient to modern times, have people resorted to chance in making decisions? Is a decision made by random choice “fair”? What role has gambling played in our understanding of chance? Why do some individuals and societies refuse to accept randomness at all? If understanding randomness is so important to probabilistic thinking, why do the experts disagree about what it really is? And why are our intuitions about chance almost always dead wrong?
Anyone who has puzzled over a probability conundrum is struck by the paradoxes and counterintuitive results that occur at a relatively simple level. Why this should be, and how it has been the case through the ages, for bumblers and brilliant mathematicians alike, is the entertaining and enlightening lesson of Randomness.
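One classic instance of those counterintuitive results at a simple level is the birthday problem (an illustration chosen here, not necessarily one of Bennett's): among just 23 people, a shared birthday is more likely than not.

```python
def p_shared_birthday(n):
    """Exact probability that at least two of n people share a birthday
    (assuming 365 equally likely birthdays and no leap years)."""
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (365 - i) / 365
    return 1 - p_distinct

# Most people's intuition demands far more than 23 people for even odds.
print(round(p_shared_birthday(23), 3))  # → 0.507, just over one half
```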


Ratio Correlation: A Manual for Students of Petrology and Geochemistry
Felix Chayes
University of Chicago Press, 1971
Library of Congress QE431.5.C47  Dewey Decimal 552.001519537


Risk Quantification and Allocation Methods for Practitioners
Jaume Belles-Sampera, Montserrat Guillén, and Miguel Santolino
Amsterdam University Press, 2017
Risk Quantification and Allocation Methods for Practitioners offers a practical approach to risk management in the financial industry. This in-depth study provides quantitative tools to better describe qualitative issues, as well as clear explanations of how to transform recent theoretical developments into computational practice, and key tools for dealing with the issues of risk measurement and capital allocation.


The Seven Pillars of Statistical Wisdom
Stephen M. Stigler
Harvard University Press, 2016
Library of Congress QA276.15.S754 2016  Dewey Decimal 519.5
What gives statistics its unity as a science? Stephen Stigler sets forth the seven foundational ideas of statistics—a scientific discipline related to but distinct from mathematics and computer science and one which often seems counterintuitive. His original account will fascinate the interested layperson and engage the professional statistician.


Statistics on the Table: The History of Statistical Concepts and Methods
Stephen M. Stigler
Harvard University Press, 2002
Library of Congress QA276.15.S755 1999  Dewey Decimal 519.509
This lively collection of essays examines in witty detail the history of some of the concepts involved in bringing statistical argument "to the table," and some of the pitfalls that have been encountered. The topics range from seventeenth-century medicine and the circulation of blood, to the cause of the Great Depression and the effect of the California gold discoveries of 1848 upon price levels, to the determinations of the shape of the Earth and the speed of light, to the meter of Virgil's poetry and the prediction of the Second Coming of Christ. The title essay tells how the statistician Karl Pearson came to issue the challenge to put "statistics on the table" to the economists Marshall, Keynes, and Pigou in 1911. The 1911 dispute involved the effect of parental alcoholism upon children, but the challenge is general and timeless: important arguments require evidence, and quantitative evidence requires statistical evaluation. Some essays examine deep and subtle statistical ideas such as the aggregation and regression paradoxes; others tell of the origin of the Average Man and the evaluation of fingerprints as a forerunner of the use of DNA in forensic science. Several of the essays are entirely nontechnical; all examine statistical ideas with an ironic eye for their essence and what their history can tell us about current disputes.


Thinking Through Statistics
John Levi Martin
University of Chicago Press, 2018
Library of Congress HA29.M135 2018  Dewey Decimal 001.422
Simply put, Thinking Through Statistics is a primer on how to maintain rigorous data standards in social science work, and one that makes a strong case for revising the way that we try to use statistics to support our theories. But don’t let that daunt you. With clever examples and witty takeaways, John Levi Martin proves himself to be a most affable tour guide through these scholarly waters.
Martin argues that the task of social statistics isn't to estimate parameters, but to reject false theory. Using a combination of visualizations, reanalyses, and simulations, he illustrates common pitfalls that can keep researchers from doing just that. Thinking Through Statistics gives social science practitioners accessible insight into troves of wisdom that would normally have to be earned through arduous trial and error, and it does so with a lighthearted approach that ensures this field guide is anything but stodgy.


The Total Survey Error Approach: A Guide to the New Science of Survey Research
Herbert F. Weisberg
University of Chicago Press, 2005
Library of Congress HM538.W45 2005  Dewey Decimal 300.723
In 1939, George Gallup's American Institute of Public Opinion published a pamphlet optimistically titled The New Science of Public Opinion Measurement. At the time, though, survey research was in its infancy, and only now, six decades later, can public opinion measurement be appropriately called a science, based in part on the development of the total survey error approach.
Herbert F. Weisberg's handbook presents a unified method for conducting good survey research centered on the various types of errors that can occur in surveys—from measurement and nonresponse error to coverage and sampling error. Each chapter is built on theoretical elements drawn from specific disciplines, such as social psychology and statistics, and follows through with detailed treatments of the specific types of error and their potential solutions. Throughout, Weisberg is attentive to survey constraints, including time and ethical considerations, as well as controversies within the field and the effects of new technology on the survey process—from Internet surveys to those completed by phone, by mail, and in person. Practitioners and students will find this comprehensive guide particularly useful now that survey research has assumed a primary place in both public and academic circles.


Unifying Political Methodology: The Likelihood Theory of Statistical Inference
Gary King
University of Michigan Press, 1998
Library of Congress JA71.K563 1998  Dewey Decimal 320.01
One of the hallmarks of the development of political science as a discipline has been the creation of new methodologies by scholars within the discipline: methodologies that are well-suited to the analysis of political data. Gary King has been a leader in the development of these new approaches to the analysis of political data. In his book, Unifying Political Methodology, King shows how the likelihood theory of inference offers a unified approach to statistical modeling for political research and thus enables us to better analyze the enormous amount of data political scientists have collected over the years. Newly reissued, this book is a landmark in the development of political methodology and continues to challenge scholars and spark controversy.
"Gary King's Unifying Political Methodology is at once an introduction to the likelihood theory of statistical inference and an evangelist's call for us to change our ways of doing political methodology. One need not accept the altar call to benefit enormously from the book, but the intellectual debate over the call for reformation is likely to be the enduring contribution of the work."
Charles Franklin, American Political Science Review
"King's book is one of the only existing books which deal with political methodology in a clear and consistent framework. The material in it is now and will continue to be essential reading for all serious students and researchers in political methodology." R. Michael Alvarez, California Institute of Technology
Gary King is Professor of Government, Harvard University. One of the leading thinkers in political methodology, he is the author of A Solution to the Ecological Inference Problem: Reconstructing Individual Behavior from Aggregate Data and other books and articles.


