A paper of Witztum, Rips and Rosenberg in this journal in 1994 made the extraordinary claim that the Hebrew text of the Book of Genesis encodes events which did not occur until millennia after the text was written. In reply, we argue that Witztum, Rips and Rosenberg’s case is fatally defective, indeed that their result merely reflects on the choices made in designing their experiment and collecting the data for it. We present extensive evidence in support of that conclusion. We also report on many new experiments of our own, all of which failed to detect the alleged phenomenon.
This article outlines the scientific work and life of the Finnish statistician, probabilist, and mathematician Gustav Elfving (1908–1984). Elfving’s academic career, scientific contacts, and personal life are sketched, and his main research contributions to the fields of statistics, probability, and mathematics are reviewed. (Elfving’s pioneering work in optimal design of experiments is not covered, as this topic will be treated elsewhere in this issue.) A chronological bibliography of Gustav Elfving is also given.
Gustav Elfving contributed to the genesis of optimal experimental design theory with several papers mainly in the 1950s. These papers are presented and briefly analyzed. The connections between Elfving’s results and the results of his successors are elucidated to stress the relevance of Elfving’s impact on the development of optimal design theory.
We introduce Parrondo's paradox that involves games of chance. We consider two fair gambling games, A and B, both of which can be made to have a losing expectation by changing a biasing parameter $\epsilon$. When the two games are played in any alternating order, a winning expectation is produced, even though A and B are now losing games when played individually. This strikingly counterintuitive result is a consequence of discrete-time Markov chains and we develop a heuristic explanation of the phenomenon in terms of a Brownian ratchet model. As well as having possible applications in electronic signal processing, we suggest important applications in a wide range of physical processes, biological models, genetic models and sociological models. Its impact on stock market models is also an interesting open question.
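The abstract does not spell out the games themselves, so the following minimal simulation sketch uses the capital-dependent form in which Parrondo's games are usually stated: game A wins with probability $1/2 - \epsilon$, while game B wins with probability $1/10 - \epsilon$ when the current capital is divisible by 3 and $3/4 - \epsilon$ otherwise. The probabilities, the value $\epsilon = 0.005$ and the AABB switching schedule are illustrative assumptions, not details taken from the paper.

```python
import random

# Illustrative parameters only (standard capital-dependent form of the
# games, not taken from the abstract): epsilon > 0 biases both games
# so that each is losing on its own.
EPS = 0.005

def play_A(capital):
    """Game A: win 1 with probability 1/2 - eps, otherwise lose 1."""
    return capital + (1 if random.random() < 0.5 - EPS else -1)

def play_B(capital):
    """Game B: the coin used depends on whether capital is divisible by 3."""
    p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return capital + (1 if random.random() < p else -1)

def average_final_capital(strategy, n_rounds=10_000, n_runs=500):
    """Average final capital over independent runs, starting from 0."""
    total = 0
    for _ in range(n_runs):
        capital = 0
        for t in range(n_rounds):
            capital = strategy(capital, t)
        total += capital
    return total / n_runs

if __name__ == "__main__":
    only_A = lambda c, t: play_A(c)
    only_B = lambda c, t: play_B(c)
    # Periodic AABB switching; random switching shows the same effect.
    aabb = lambda c, t: play_A(c) if (t // 2) % 2 == 0 else play_B(c)

    print("A alone     :", average_final_capital(only_A))  # negative drift
    print("B alone     :", average_final_capital(only_B))  # negative drift
    print("AABB switch :", average_final_capital(aabb))    # positive drift
```

With $\epsilon = 0$ both games are fair; the small bias makes each losing on its own, yet the switched sequence drifts upward, which is the ratchet effect described above.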
It is shown that the method of maximum likelihood occurs in rudimentary forms before Fisher [Messenger of Mathematics 41 (1912) 155–160], but not under this name. Some of the estimates called "most probable" would today have been called "most likely." Gauss [Z. Astronom. Verwandte Wiss. 1 (1816) 185–196] used invariance under parameter transformation when deriving his estimate of the standard deviation in the normal case. Hagen [Grundzüge der Wahrscheinlichkeits-Rechnung, Dümmler, Berlin (1837)] used the maximum likelihood argument for deriving the frequentist version of the method of least squares for the linear normal model. Edgeworth [J. Roy. Statist. Soc. 72 (1909) 81–90] proved the asymptotic normality and optimality of the maximum likelihood estimate for a restricted class of distributions. Fisher had two aversions: noninvariance and unbiasedness. Replacing the posterior mode by the maximum likelihood estimate, he achieved invariance, and using a two-stage method of maximum likelihood he avoided appealing to unbiasedness for the linear normal model.
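To make the maximum likelihood argument concrete, here is a standard modern formulation of the point about the linear normal model (a sketch in present-day notation, not a reconstruction of Hagen's or Gauss's own derivations). For observations $y_i = x_i^{\mathsf T}\beta + \varepsilon_i$ with the $\varepsilon_i$ independent $N(0, \sigma^2)$, the log-likelihood is
\[
\ell(\beta, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - x_i^{\mathsf T}\beta\bigr)^2,
\]
so for every fixed $\sigma^2$ the maximizing $\beta$ is exactly the least-squares solution, which is the sense in which the maximum likelihood argument yields the frequentist version of least squares. Maximizing over $\sigma^2$ gives $\hat{\sigma}^2 = n^{-1}\sum_{i}(y_i - x_i^{\mathsf T}\hat{\beta})^2$, and because maximum likelihood is invariant under parameter transformations, the estimate of $\sigma$ is simply $\sqrt{\hat{\sigma}^2}$; a posterior mode, by contrast, is not preserved when one reparametrizes from $\sigma^2$ to $\sigma$, which is the invariance issue alluded to above.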
Lucien Le Cam is currently Emeritus Professor of Mathematics and Statistics at the University of California, Berkeley. He was born on November 18, 1924, in Croze, Creuse, France. He received a Licence ès Sciences from the University of Paris in 1945, and a Ph.D. in Statistics from the University of California at Berkeley in 1952. He has been on the faculty of the Statistics Department at Berkeley since 1952 except for a year in Montreal, Canada, as the Director of the Centre de Recherches Mathématiques (1972–1973). He served as Chairman of the Department of Statistics at Berkeley (1961–1965) and was co-editor with J. Neyman of the Berkeley Symposia.
Professor Le Cam is the principal architect of the modern asymptotic theory of statistics and has also made numerous other contributions. He developed a mathematical system that substantially extended Wald's statistical decision theory to the version being used today. With his introduction of the distance between experiments, we now have a coherent statistical theory that links the asymptotics and the statistical decision theory. Encompassed in the theory are the concepts of contiguity, asymptotic sufficiency, a new method of constructing estimators (the one-step estimator), the theory of local asymptotic normality (LAN), metric dimension and numerous other seminal ideas. The metric dimension, introduced in 1973, has been found to be fundamentally important in studying nonparametric or semiparametric problems. This monumental work culminated in a big book, Asymptotic Methods in Statistical Decision Theory, published by Springer in 1986.
Professor Le Cam's scientific contributions are not limited to theoretical statistics. At age 23 he introduced the characteristic functional technique (after Kolmogorov, but independently) to study the spatial and temporal distribution of rainfall and its relation to stream flow. It resulted in a model known as Le Cam's model in hydrology. In the domain of probability theory, he was one of the early contributors to the study of convergence of measures in topological spaces. He refined the approximation theorems and the concentration inequalities of Kolmogorov and made extensions of these results to infinite-dimensional spaces. We also owe to him the introduction of the concepts of $\tau$-smooth and $\sigma$-smooth measures that are widely used today.
In honor of his 70th birthday in 1994, a weeklong workshop and a conference were held at Yale University, organized by Professor David Pollard. In addition, a Festschrift for Le Cam, Research Papers in Probability and Statistics, was published by Springer in 1997. He is married to Louise Romig, the daughter of a founder of statistical quality control, Harry Romig. They have three grown children, Denis, Steven and Linda.