Ian Hacking
(1936-2023)
Ian Hacking was a philosopher and historian of science (trained in analytic language philosophy) who documented the development of
probability from the seventeenth century to the late nineteenth in his major works,
The Emergence of Probability (1975) and
The Taming of Chance (1990).
Hacking identifies probability with the mathematics of randomness and
chance, which did not appear until the Renaissance. From the beginning, he says, probability was
dual. It has an
epistemic aspect, having to do with degrees of belief, and an
ontological aspect, having to do with the behavior of randomizing devices like dice and coins over long runs of many trials. The first is epistemic,
a priori probability; the second is the ontological,
a posteriori frequency statistics that we get from experiments.
Probabilities are theories used to establish degrees of belief. Statistics are experiments that may validate some theories.
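To make this duality concrete, here is a minimal sketch in Python (an illustration of mine, not Hacking's): the a priori probability that a fair die shows a six is fixed by theory at 1/6, while the a posteriori frequency is whatever a long run of actual trials delivers.

```python
import random

# Epistemic, a priori side: the degree of belief theory assigns
# to a fair die showing a six, before any trial is run.
a_priori = 1 / 6

# Ontological, a posteriori side: the frequency an actual
# randomizing device produces over a long run of trials.
random.seed(42)  # fixed seed so the sketch is reproducible
trials = 100_000
sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
frequency = sixes / trials

print(f"a priori probability: {a_priori:.4f}")
print(f"observed frequency:   {frequency:.4f}")
```

Over a long run the two numbers converge, which is the sense in which statistical experiments may validate probabilistic theories.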
In
The Taming of Chance, Hacking argues for a nineteenth-century "erosion of
determinism," making room for genuine
chance. (Other historians, e.g.,
Stephen Brush, made similar claims at about the same time.)
The most decisive conceptual event of twentieth century physics has been the discovery that the world is not deterministic. Causality, long the bastion of metaphysics, was toppled, or at least tilted: the past does not determine exactly what happens next. This event was preceded by a more gradual transformation. During the nineteenth century it became possible to see that the world might be regular and yet not subject to universal laws of nature. A space was cleared for chance.
This erosion of determinism made little immediate difference to anyone. Few were aware of it. Something else was pervasive and everybody came to know about it: the enumeration of people and their habits. Society became statistical. A new type of law came into being, analogous to the laws of nature, but pertaining to people. These new laws were expressed in terms of probability. They carried with them the connotations of normalcy and of deviations from the norm. The cardinal concept of the psychology of the Enlightenment had been, simply, human nature. By the end of the nineteenth century, it was being replaced by something different: normal people.
I argue that these two transformations are connected. Most of the events to be described took place in the social arena, not that of the natural sciences, but the consequences were momentous for both.
Throughout the Age of Reason, chance had been called the superstition of the vulgar. Chance, superstition, vulgarity, unreason were of one piece. The rational man, averting his eyes from such things, could cover chaos with a veil of inexorable laws. The world, it was said, might often look haphazard, but only because we do not know the inevitable workings of its inner springs. As for probabilities — whose mathematics was called the doctrine of chances — they were merely the defective but necessary tools of people who know too little.
There were plenty of sceptics about determinism in those days: those who needed room for freedom of the will, or those who insisted on the individual character of organic and living processes. None of these thought for a moment that laws of chance would provide an alternative to strictly causal laws. Yet by 1900 that was a real possibility, urged as fact by an adventurous few. The stage was set for ultimate indeterminism.
(The Taming of Chance, Cambridge, 1990, pp.1-2)
Most of the mathematicians who developed the calculus of probabilities (
Abraham de Moivre,
Pierre-Simon Laplace, Carl Friedrich Gauss, and others), and most nineteenth-century physical scientists, believed that the randomness in chance events, including the atomic and molecular randomness that succeeded in explaining irreversibility and the second law of thermodynamics, might be the result of some unknown underlying universal laws of nature, such as the "law of large numbers" and the "
normal distribution."
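As a rough sketch of the kind of lawlike regularity these scientists had in mind (an illustration of mine, not an example from Hacking): the law of large numbers says that averages of chance events settle toward a fixed value, and de Moivre showed that sums of many coin tosses arrange themselves in the bell-shaped pattern later called the normal distribution.

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the sketch is reproducible

# Law of large numbers: the average of fair coin tosses
# (heads = 1, tails = 0) settles toward 0.5 as trials accumulate.
for n in (100, 10_000, 1_000_000):
    average = sum(random.randint(0, 1) for _ in range(n)) / n
    print(f"{n:>9} tosses: average = {average:.4f}")

# Normal distribution: the number of heads in 100 tosses, repeated
# many times, clusters around 50 in a bell-shaped histogram.
counts = Counter(
    sum(random.randint(0, 1) for _ in range(100)) for _ in range(20_000)
)
for heads in range(40, 61):
    print(f"{heads:3d} {'#' * (counts[heads] // 100)}")
```

The point of the illustration is only that large aggregates of chance events exhibit stable, lawlike regularities.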
Laplace explained the
appearance of chance as the result of human ignorance. He said,
"The word 'chance,' then expresses only our ignorance of the causes of the phenomena that we observe to occur and to succeed one another in no apparent order."
For most of them, the growing indeterminism described by Hacking was traceable to human ignorance of the detailed motion of atomic particles. To be sure, there were some vociferous nineteenth-century proponents of "absolute" chance, such as
Charles Sanders Peirce and the French philosophers
Charles Renouvier and
Alfred Fouillée, who inspired Peirce and his colleague
William James.
But the kind of
indeterminism we have as a result of quantum mechanical
indeterminacy is quite different from typical nineteenth-century conceptions of
probability and
chance.
For example,
Arthur Stanley Eddington, who was intimately familiar with the statistical mechanical basis of the second law of thermodynamics, maintained that the determinism of classical physics, which had presumably subsumed even
chance and probability, was gone forever.
In
The Nature of the Physical World (1928), Eddington dramatically announced,
"It is a consequence of the advent of the quantum theory that physics is no longer pledged to a scheme of deterministic law."
Prominent
dissenters from quantum theory, such as
Max Planck,
Albert Einstein,
Louis de Broglie,
Erwin Schrödinger, and
David Bohm, hoped that an underlying deterministic explanation would be found some day for quantum randomness.
Many philosophers, and a few scientists, still hold to this possibility of a return to strict
determinism and
causality.
The example of C. S. Peirce
Hacking uses
Charles Sanders Peirce as his model of a nineteenth-century thinker who embraced ontological chance (Peirce called it
tychism). While Peirce is an excellent choice, he is not at all typical. And Peirce had his doubts about chance; for example, he criticized chance's role in the Darwinist version of evolution.
Peirce actually modeled his thinking on the work of
Charles Darwin, but he was not satisfied with Darwin's fortuitous variation and natural selection. He falsely associated it with the Social Darwinist thinking of his time and called it a "greed philosophy." Peirce also rejected the deterministic evolution scheme of Herbert Spencer and proposed his own grand scheme for the evolution of everything, including the laws of Nature! He called this third possibility
synechism, a coined term for continuity, in clear contrast to the merely random events of his tychism.
With his typical triad of chance, determinism, and continuity, Peirce's evolutionist thinking resembles that of
Hegel. It was the basis for the evolutionary growth of variety, of irregular departures from an otherwise mechanical universe, including life and Peirce's own original thoughts. For Peirce and Hegel, ideas are living things with meanings that grow over time. Peirce was a "realist" in that he believed these ideas have a metaphysically real existence.
Peirce argued that the laws of nature themselves change with time, or at least that laws "emerge" at different epochs, and that the laws of biology are not reducible to the laws of chemistry and physics, an idea Peirce likely got from Émile Boutroux.
Hacking ends
The Taming of Chance with a paean to Peirce...
Peirce denied determinism. He also doubted that the world is a determinate given. He laboured in a community seeking to establish the true values of Babbage's constants of nature; he said there aren't any, over and above those numbers upon which we increasingly settle. He explained inductive learning and reasoning in terms of merely statistical stability. At the level of technique, he made the first self-conscious use of randomization in the design of experiments: that is, he used the law-like character of artificial chances in order to pose sharper questions and to elicit more informative answers. He provided one of the standard rationalia for statistical inference — one that, named after other and later workers, is still with us. He had an objective, frequentist approach to probability, but pioneered a measure of the subjective weight of evidence (the log odds). In epistemology and metaphysics, his pragmatic conception of reality made truth a matter of what we find out in the long run. But above all, he conceived of a universe that is irreducibly stochastic.
(ibid, pp.200-1)
I end with Peirce because he believed in absolute chance, but that is not my focus. His denial of the doctrine of necessity was incidental to a life permeated by statistics and probabilities. Somebody had to make a first leap to indeterminism. Maybe it was Peirce, perhaps a predecessor. It does not matter. He 'rejoiced to find' himself in the company of others, including Renouvier. He did argue against the doctrine of necessity, but it was not an argument that convinced him that chance is an irreducible element of reality. He opened his eyes, and chance poured in — from a world which, in all its small details, he was seeing in a probabilistic way. In this respect, although he was very much a nineteenth-century man, he was already living in a twentieth-century environment. His working days of experimental routine, and his voyages of the mind, took place in a new kind of world that his century had been manufacturing: a world made of probabilities.
Peirce is the strongest possible indicator that certain things which could not be expressed at the end of the eighteenth century were said at the end of the nineteenth. I do not use him here because he is the happy upshot of preceding chapters, the point at which groping events finally led to the truth as we now see it. Not at all: some of what he wrote strikes me as false and much of it is obscure. I use him instead to exemplify a new field of possibilities, the one that we still inhabit. Chance poured in at every avenue of sense because he was living in a new probabilistic world. One can't grasp that just by reading him on the romantic subject of absolute chance. You have to glimpse the almost innumerable ways in which his world had become constructed out of probabilities, just like ours.
(ibid, pp.214-5)
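The "log odds" measure of the weight of evidence that Hacking credits to Peirce in the first passage above can be stated compactly. A minimal sketch of the standard formulation (my gloss and my function name, not Peirce's own notation): the weight of evidence for a probability p is log(p / (1 - p)), so that weights add where independent odds multiply.

```python
import math

def weight_of_evidence(p: float) -> float:
    """Log odds for probability p: log(p / (1 - p))."""
    return math.log(p / (1 - p))

# Independent odds multiply, so their log-odds weights add:
# odds of 3:1 combined with odds of 2:1 give odds of 6:1.
w1 = weight_of_evidence(0.75)   # odds 3:1
w2 = weight_of_evidence(2 / 3)  # odds 2:1
print(math.exp(w1 + w2))        # ≈ 6.0, the combined odds of 6:1
```

Turning multiplication into addition is what makes log odds a natural "weight": independent pieces of evidence simply stack.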
Free Will
Hacking ends his opening argument with a famous quote from Kant on
free will (from the essay
Idea for a Universal History with a Cosmopolitan Intent), which shows Kant believed that statistics may appear random but are clearly governed by universal laws.
Whatsoever difference there may be in our notions of the freedom of will metaphysically considered, it is evident that the manifestations of this will, viz. human actions, are as much under the control of universal laws of nature as any other physical phenomena. It is the province of History to narrate these manifestations; and, let their causes be ever so secret, we know that History, simply by taking its station at a distance and contemplating the agency of the human will upon a large scale, aims at unfolding to our view a regular stream of tendency in the great succession of events — so that the very same course of incidents which, taken separately and individually, would have seemed perplexed, incoherent, and lawless, yet viewed in their connection and as the actions of the human species and not of independent beings, never fail to discover a steady and continuous, though slow, development of certain great predispositions in our nature. Thus, for instance, deaths, births, and marriages, considering how much they are separately dependent on the freedom of the human will, should seem to be subject to no law according to which any calculation could be made beforehand of their amount: and yet the yearly registers of these events in great countries prove that they go on with as much conformity to the laws of nature as the oscillations of the weather.
(ibid, p.15)
Hacking also looks briefly at twentieth-century arguments for freedom and tries to understand why they differ from a century earlier. He explains why probability seemed to create space for freedom in 1936, despite the fact that it had seemed to rule it out in 1836.
But this hardly explains why leading quantum scientists like
Max Planck,
Albert Einstein, and especially
Erwin Schrödinger, who endorsed the 19th-century view of probability and statistical mechanics developed by
Ludwig Boltzmann, should by 1936 have been more determinist than Hacking believes Peirce and other thinkers of the late 19th century had become.
The second wave of quantum mechanics, which commenced in 1926, established that the fundamental laws of microphysics are irreducibly probabilistic.
In 1936 John von Neumann proved the first 'no hidden variables' theorem: no necessitarian, purely deterministic laws can underlie quantum physics. Some physicists and many kibitzers inferred that physics proves the reality of human freedom. Even today some say this solves the problem of free will.
The contrast between the sensibility of the 1830s and the 1930s seems paradoxical. In the 1930s, the conviction that the laws of nature are probabilistic was thought to make the world safe for freedom. The inference went in the opposite direction in the 1830s: if there were statistical laws of crime and suicide, then criminals could not help themselves. In 1930, probability made room for free will; in 1830, it precluded it.
This contrast only seems paradoxical. In the 1930s the laws of physics, which had long been the model of impersonal and irrevocable necessity, were shorn of their magisterial power. They had once ordained the slightest motion of the lightest atom and hence the fall of every sparrow, perhaps the Fall itself.
By 1936 they described only the probabilities of the future course of any individual particle. At most the collective behaviour of an enormous collection of entities or events was determined. Hence individuals within the ensemble might act freely. In the 1830s, in contrast, human behaviour was lumped under new probabilistic laws that were constantly compared to the law of gravity. Physics was still inexorable. Laws of society were like laws of physics and hence could not be violated. The 1930s pulled physics, and hence all law, away from determinism. The 1830s pulled laws of society towards physics, and hence towards determinism. That's why probability seemed to create space for freedom in 1936, and seemed to rule it out in 1836.
(ibid, p.116)
Hacking wrote an introductory essay for the 50th-anniversary edition of Kuhn's classic
The Structure of Scientific Revolutions.