The Pigeon in the Machine: The Concept of Control in Behaviourism and Cybernetics

Ana Teixeira Pinto

When, in 1913, John B. Watson gave his inaugural address at Columbia University, “Psychology as the Behaviourist Views It,”1 he made clear what he meant when he said that psychology was a discipline whose “theoretical goal is the prediction and control of behaviour.” Strongly influenced by Ivan Pavlov’s study of conditioned reflexes, Watson wanted to claim an objective scientific status for applied psychology. In order to anchor psychology firmly in the field of the natural sciences, however, psychologists would have to abandon speculation in favour of the experimental method. That is, psychologists would have to conduct research directly on living things and their parts. Dogs, rats, pigeons, and human infants came to share the countertop with Bunsen burners and glass flasks.

Animals, as Henry David Thoreau once noted, are all beasts of burden, “made to carry a portion of our thoughts.”2 They are a ragged replacement for humanity in both the material and the metaphorical sense. The history of control as a scientific concern is also the history of the beasts, big and small, whose lives have been but raw matter for experiments. Three of these animals have achieved iconic status: Pavlov’s Dog, Schrödinger’s Cat, and Skinner’s Pigeon. Needless to say, Pavlov’s dog was not one but many, and the same holds true for Skinner’s lab pigeons. Schrödinger’s cat, on the other hand, was not a cat at all but a rhetorical figure; shorthand for a breathing, living thing, small enough to fit into a suitcase. Indeed, this is not a text about animals, but it originated in a work about animals by the artist Jan Peter Hammer, which David Riff commissioned for the Bergen Biennial in 2013. I thank them both for the inspiration that our conversations have brought me.

The concept of control in the life sciences emerged out of the Victorian obsession with order. In a society shaped by glaring asymmetries and uneven development, a middle-class lifestyle was as promising as it was precarious, since downward mobility was the norm. Economic insecurity was swiftly systematized into a code of conduct, and the newfound habits of hygiene were extrapolated from medicine to morals. Both behaviourism and eugenics stem from an excessive preoccupation with proficiency and the need to control potential deviations. Watson, for instance, was convinced that thumb-sucking bred “masturbators”3—though the fixation with order extends much farther than biology. For Erwin Schrödinger, life was synonymous with order; entropy was a measure of death or disorder. Not only behaviourism but all other disciplinary fields that emerged in the early twentieth century in the USA, from molecular biology to cybernetics, revolve around the same metaphor.

After World War I, under the pressure of rapid industrialization and massive demographic shifts, the old social institutions—like family, class, or church—began to erode.4 The crisis of authority that ensued led to “ongoing attempts to establish new and lasting forms of social control.”5 Behaviourism was to champion a method through which “coercion from without” is easily masked as “coercion from within”—two types of constraint that would later be re-conceptualized as resolution and marketed as vocation to a growing class of young professionals and self-made career-seekers. Watson’s straightforward characterization of “man as a machine” was to prove instrumental in sketching out the conceptual framework for the emergence of a novel social technology devoted to control.

Yet what does it mean to identify human beings with mechanisms? What does it mean to establish similarities between living tissue and electronic circuitry? Machines are passive in their activity; they are replicable and predictable, and made out of parts such as cogs and wheels; they can be assembled and re-assembled. Machines, one could say, are the ideal slave, and slavery is the political unconscious behind every attempt to automate the production process.

The scientific field of applied psychology appealed to an emerging technocracy because it promised to prevent social tensions from taking on a political form, thereby managing social mobility in a society that would only let people up the ladder a few at a time.6 Behaviourism, as Watson explicitly stated, was strictly “non-political,” which is not to say that it would forsake authoritarianism and regimentation. Pre-emptive psychological testing would detect any inklings of “conduct deviation,” “emotional upsets,” “unstandardized sex reactions,” or “truancy,” and warrant a process of reconditioning to purge “unsocial ways of behaving.”7 Developing in parallel to the first Red Scare, behaviourism is not a scientific doctrine; it is a political position. Just as the rhetoric of British Parliamentarism sought to stave off the French Revolution, the rhetoric of American liberalism masks the fear of communist contagion: the imperatives of individualism and meritocracy urge individuals to rise from their class rather than with it.

Dogs, Rats, and a Baby Boy

Behaviourism had an uneasy relationship with the man credited with founding it, the Russian physiologist Ivan Pavlov. Although Watson seemed to praise Pavlov’s comparative study of psychological responses in higher mammals and humans, he never showed any intention of pursuing such a route. Instead, he focused on how social agents could shape children’s dispositions through the method he had borrowed from Pavlov. In his “Little Albert Experiment,” Watson and his assistant Rosalie Rayner tried to condition an eleven-month-old infant to fear stimuli that he would not normally have been predisposed to fear.

Little Albert was first presented with several furry lab animals, amongst which was a white rat. After having established that Little Albert had no previous anxiety concerning the animal, Watson and Rayner began a series of tests that sought to associate the presence of the rat with a loud, unexpected noise, which Watson would elicit by striking a steel bar with a hammer. Upon hearing the noise, the child showed clear signs of distress, crying compulsively. After a sequence of trials in which the two stimuli were paired (the rat and the clanging sound), Little Albert was again presented with the rat alone. This time round, however, the child seemed clearly agitated and distressed. Replacing the rat with a rabbit and a small dog, Watson also established that Little Albert had generalized his fear to all furry animals. Though the experiment was never successfully reproduced, Watson became convinced that it would be possible to define psychology as the study of the acquisition and deployment of habits.

In the wake of Watson’s experiments, American psychologists began to treat all forms of learning as skills—from “maze running in rats […] to the growth of a personality pattern.”8 For the behaviourist movement, both animal and human behaviour could be entirely explained in terms of reflexes, stimulus-response associations, and the effects of reinforcing agents upon them. Following in Watson’s footsteps, Burrhus Frederic Skinner researched how specific external stimuli affected learning, using a method that he termed “operant conditioning.” While classic—or Pavlovian—conditioning simply pairs a stimulus and a response, in operant conditioning the animal’s behaviour is initially spontaneous, but the feedback that it elicits reinforces or inhibits the recurrence of certain actions. Employing a chamber which became known as the Skinner Box, Skinner could schedule rewards and establish rules.9 An animal could be conditioned for many days, each time following the same procedure, until a given pattern of behaviour was stabilized.
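The operant loop just described—spontaneous action, environmental feedback, reinforced recurrence—can be caricatured in a few lines of Python. This is a toy sketch under invented parameters (the action names and the reinforcement value are illustrative, not Skinner’s actual procedure):

```python
import random

# Toy sketch of operant conditioning: behaviour is initially spontaneous;
# reinforcement raises the propensity of the rewarded action, so its
# relative frequency increases over repeated trials.

def condition(actions, rewarded, trials=1000, reinforcement=0.1, seed=0):
    rng = random.Random(seed)
    propensity = {a: 1.0 for a in actions}  # all actions equally likely at first
    for _ in range(trials):
        # spontaneous ("operant") behaviour: sample an action by its propensity
        r = rng.uniform(0, sum(propensity.values()))
        for action, weight in propensity.items():
            r -= weight
            if r <= 0:
                break
        if action == rewarded:  # feedback from the environment ("food pellet")
            propensity[action] += reinforcement
    return propensity

repertoire = condition(["press_lever", "groom", "wander"], rewarded="press_lever")
```

Only the reinforced action ever gains weight, so over many trials it comes to dominate the repertoire—a stabilized pattern of behaviour in miniature.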

What behaviourists failed to realize was that only under laboratory conditions are particular outcomes necessarily produced by specific stimuli. As John A. Mills notes, “… in real life situations, by contrast, we can seldom identify reinforcing events and give a precise, moment-to-moment account of how reinforcers shape behaviour.”10 Outside of the laboratory, the same response can be the outcome of widely different antecedents, and a single cause is notoriously hard to identify. All in all, “… one can use the principle of operant conditioning as an explanatory principle only if one has created beforehand a situation in which operant principles must apply.”11

Not surprisingly, both Watson and Skinner put forth fully fleshed-out fictional accounts of behaviourist utopias: Watson, in his series of articles for Harper’s magazine; and Skinner, in his 1948 novel Walden Two. The similarities are striking, though Skinner lacks the callous misogyny and casual cruelty of his forerunner. For both authors, crime is a function of freedom. If social behaviour is not managed, one can expect an increase in the number of social ills: unruliness, crime, poverty, war, and the like. Socializing people in an appropriate manner, however, requires absolute control over the educational process. Behaviourist utopia thus involves the surrender of education to a technocratic hierarchy, which would dispense with representative institutions and due political process.12

Apoliticism, as we have already noted, does not indicate that a society is devoid of coercion. Rather than representing social struggles as antagonistic, along the Marxist model of class conflict, behaviourists such as Watson and Skinner reflected the ethos of self-discipline and efficiency espoused by social planners and technocrats. Behaviourist utopias, as Kerry Buckley notes, “… worshipped efficiency alone,” tacitly ignored any conception of good and evil, and “weigh[ed] their judgments on a scale that measured only order and disorder.”13

The pigeon-guided missile experiment by B.F. Skinner, commonly known as Project Pigeon, later renamed Project Orcon, for “organic control” (1944–1948). Courtesy of the B.F. Skinner Foundation.

Pigeons, Servos, and Kamikaze Pilots

Much like behaviourism, cybernetics is predicated on input-output analyses. Skinner’s description of operant behaviour as a repertoire of possible actions, some of which are selected by reinforcement, is not unlike Wiener’s description of informational loops. Behaviourism, just like cybernetics, is based on a recursive (feedback) model, which is known in biology as reinforcement. Moreover, behaviourism and cybernetics have often shared more than an uncanny affinity. During World War II, both Norbert Wiener and B. F. Skinner worked on parallel research projects for the US military. Whilst Wiener, together with the engineer Julian Bigelow, was attempting to develop his Anti-Aircraft Predictor (AA-Predictor), a machine that was supposed to anticipate the trajectory of enemy planes, Skinner was trying to develop a pigeon-guided missile.

The idea for Project Pigeon (which was later renamed Project Orcon—“ORganic CONtrol”—after Skinner complained that nobody took him seriously) predates American participation in the war, yet the Japanese kamikaze attacks of 1944 gave the project a renewed boost. Though the kamikaze pilots did not significantly impact the course of the war, their psychological significance cannot be overstated—the Japanese soldiers were often depicted as lice, or vermin, but the kamikaze represented the even more unsettling identity between the organic and the mechanical.

Technically speaking, every mechanism usurps a human function. Faced with the cultural interdiction against producing his own slave-soldiers, Skinner reportedly pledged to “provide a competent substitute” for the human kamikaze. The Project Pigeon team began to train pigeons to peck when they saw a target through a bull’s-eye. The birds were then harnessed to a hoist so that their pecking movements provided the signals to control the missile. As long as the pecks remained in the centre of the screen, the missile would fly straight, but off-centre pecks would cause the screen to tilt, which, via a connection to the missile’s flight controls, would cause the missile to change course and travel towards its designated target. Skinner’s pigeons proved reliable under stress, acceleration, pressure, and temperature differences. In the following months, however, as Skinner’s project was still far from operational, Skinner was asked to produce quantitative data that could be analyzed at the MIT Servomechanisms Laboratory. Skinner allegedly deplored being forced to adopt the language of servo-engineering, and scorned the usage of terms such as “signal” and “information.” Project Pigeon was finally cancelled on October 8, 1944, because the military believed that it had no immediate promise for combat application.


Naval History and Heritage Command, Photo Archives, NH 62696.

In the meantime, Wiener’s team was trying to simulate, with the help of a differential analyser, the four different types of trajectories that an enemy plane could take in its attempt to escape artillery fire. As Peter Galison notes, “here was a problem simultaneously physical and physiological: the pilot, flying amidst the explosion of flak, the turbulence of air, and the sweep of searchlights, trying to guide an airplane to a target.”14 Under the strain of combat conditions, human behaviour is easily reduced to a limited number of reflex reactions. Commenting on the analogy between the mechanical and the human behaviour pattern, Wiener concluded that the pilot’s evasion techniques would follow the same feedback principles that regulated the actions of servo-mechanisms—an idea he would swiftly extrapolate into a more general physiological theory.15

Though Wiener’s findings emerged out of his studies in engineering, “the Wiener predictor is based on good behaviourist ideas, since it tries to predict the future actions of an organism not by studying the structure of the organism but by studying the past behaviour of the organism.”16 Feedback, in Wiener’s definition, is “the property of being able to adjust future conduct by past performance.”17 Wiener also adopted the functional analysis that accompanies behaviourism—dealing with observable behaviour alone—and the view that all behaviour is intrinsically goal-oriented and purposeful. A frog aiming at a fly and a target-seeking missile are teleological mechanisms: both gather information in order to readjust their course of action. Similarities notwithstanding, Wiener never gave behaviourists any credit, instead offering them only disparaging criticism.
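The “good behaviourist idea” behind the predictor—forecasting future conduct from past performance alone, with no model of the organism’s inner structure—can be illustrated with a deliberately crude sketch. The linear extrapolation and the sample track below are hypothetical stand-ins for Wiener’s far more sophisticated statistical method:

```python
# Predicting "future conduct by past performance": the next position of a
# target is estimated purely from its observed track, with no model of the
# pilot. A naive linear extrapolation stands in for Wiener's predictor.

def predict_next(track):
    if len(track) < 2:
        return track[-1]  # with one observation, the best guess is no change
    # assume the most recent trend continues: x[t+1] = x[t] + (x[t] - x[t-1])
    return track[-1] + (track[-1] - track[-2])

past_positions = [0.0, 1.0, 2.1, 3.9, 6.0]  # hypothetical observed positions
predicted = predict_next(past_positions)    # extrapolates the last trend (≈ 8.1)
```

Nothing about the pilot enters the calculation: the “organism” is present only as a time series of past behaviour, which is precisely the behaviourist reduction Stibitz had in mind.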

In 1943, the AA-Predictor was abandoned as the National Defense Research Committee concentrated on the more successful M9, the gun director that Parkinson, Lovell, Blackman, Bode, and Shannon had been developing at Bell Labs. A strategic failure, much like Project Pigeon, the AA-Predictor could have ended up in the dustbin of military history, had the encounter with physiology not proven decisive for Wiener’s description of man-machine interactions as a unified equation, which he went on to develop both as a mathematical model and as a rhetorical device.

Circuits and the Soviets

Rather than any reliable anti-aircraft artillery, what emerged out of the AA-project was Wiener’s re-conceptualization of the term “information,” which he was about to transform into a scientific concept.18 Information—heretofore a concept with a vague meaning—began to be treated as a statistical property, extracted by the mathematical analysis of a time series. This paved the way for information to be defined as a mathematical entity.

Simply put, this is what cybernetics is: the treatment of feedback as a conceptual abstraction. Yet, by suggesting that “everything in the universe can be modelled into a system of information,” cybernetics also entails a “powerful metaphysics, whose essence—in spite of all the ensuing debates—always remained elusive.”19 One could even say that cybernetics is the conflation of several scientific fields into a powerful exegetical model, which Wiener sustained with his personal charisma.20 Wiener was, after all, “a visionary who could articulate the larger implications of the cybernetic paradigm and make clear its cosmic significance.”21 Explaining the cardinal notions of statistical mechanics to the layman, he drew a straightforward yet dramatic analogy: entropy is “nature’s tendency to degrade the organized and destroy the meaningful,”22 thus “the stable state of a living organism is to be dead.”23 Abstract and avant-garde art, he would later hint, are “a Niagara of increasing entropy.”24

“Entropy,” which would become a key concept for cybernetics, was first applied to biology by the physicist Erwin Schrödinger. While attempting to unify the disciplinary fields of biology and physics, Schrödinger felt confronted with a paradox. The relative stability of living organisms stood in apparent contradiction to the second law of thermodynamics, which states that the entropy of any closed system tends to increase over time as usable energy dissipates.25 How, then, are living organisms able to “obviate their inevitable thermal death”?26 Schrödinger solved his puzzle by recasting organisms as thermodynamic systems that extract “orderliness” from their environment in order to counteract increasing entropy. This idea entailed a curious conclusion: the fundamental divide between living and non-living was not to be found between organisms and machines but between order and chaos. For Schrödinger, entropy became a measure of disorder.27

Schrödinger’s incursions into the field of the life sciences were rebuffed by biologists, and his theories were found wanting. His translation of biological concepts into the lexicon of physics would nevertheless have a major impact, as Schrödinger introduced into scientific discourse the crucial analogy that would ground the field of molecular biology: “the chromosome as a message written in code.”28

Courtesy of Sally Edelstein, last accessed 10 October 2014.

The code metaphor was conspicuously derived from the war efforts and their system of encoding and decoding military messages. Claude Shannon, a cryptologist, had also extrapolated the code metaphor to encompass all human communication, and like Schrödinger, he employed the concept of entropy in a broader sense, as a measure of uncertainty. Oblivious to the fact that the continuity Schrödinger had sketched between physics and biology was almost entirely metaphorical, Wiener would later describe the message as a form of organization, stating that information is the opposite of entropy. In a rhetoric straight from the Cold War, Wiener also described the universe as an increasingly chaotic place in which, against all odds, small islands of life fight to preserve order and increase organisation.29
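Shannon’s broader sense of entropy as a measure of uncertainty has a precise form, H = −Σ p·log₂(p). The following textbook illustration (not drawn from the essay’s sources) shows why a near-certain outcome carries almost no information:

```python
import math

# Shannon entropy in bits: a measure of the uncertainty of an outcome.
# A fair coin is maximally uncertain; a heavily biased coin carries little
# "surprise", hence little information per toss.

def entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair = entropy([0.5, 0.5])      # 1.0 bit: maximal uncertainty for two outcomes
biased = entropy([0.99, 0.01])  # roughly 0.08 bits: the outcome is nearly certain
```

On this definition a message is informative to the degree that it resolves uncertainty, which is why Wiener could describe information as the opposite of entropy.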

Emboldened by Wiener’s observations on the epistemological relevance of the new field, the presuppositions that underpinned the study of thermodynamic systems spread to evolutionary biology, neuroscience, anthropology, psychology, language studies, ecology, politics, and economics. Between 1946 and 1953, ten conferences under the heading “Cybernetics—Circular Causal and Feedback Mechanisms in Biological and Social Systems” were held under the sponsorship of the Josiah Macy Jr. Foundation. The contributing scholars tried to develop a universal theory of regulation and control, applicable to economic as well as to mental processes, and to sociological as well as to aesthetic phenomena. Contemporary art, for instance, was described as an operationally closed system, which reduces the complexity of its environment according to a program it devises for itself.30 Behaviourism—the theory which had first articulated the aspiration to formulate a single encompassing theory for all human and animal behaviour, based on the analogy between man and machine—was finally assimilated into the strain of cybernetics which became known as cognitivism.

By the early 1950s, based on W. Ross Ashby’s cybernetics and Claude Shannon’s information theory, the ontology of man became equated with the functionality of programming. Molecular and evolutionary biology treated genetic information as an essential code, the body being but its carrier. Cognitive science and neurobiology described consciousness as the processing of formal symbols and logical inferences, operating under the assumption that the brain is analogous to computer hardware and the mind to computer software. Exalting cybernetics as a new philosophy of universal application, Occidental authors made ever more fantastic claims. In the 1950s, Norbert Wiener suggested that it was theoretically possible to telegraph a human being, and that it was only a matter of time until the necessary technology became available.31 In the 1980s, scientists argued that it would soon be possible to upload human consciousness and have one’s grandmother run on Windows—or stored on a floppy disk. Science fiction brimmed with fantasies of immortal life as informational code. Stephen Wolfram even went so far as to claim that reality is a program run by a cosmic computer. Consciousness is but the “user’s illusion”; the interface, so to speak.

The debate concerning the similarities and differences between living tissue and electronic circuitry also gave rise to darker man-machine fantasies: zombies, living dolls, robots, brainwashing, and hypnotism. Animism is correlated with the problem of agency: who or what can be said to have volition—a question which involves a transfer of purpose from the animate to the inanimate. “Our consciousness of will in another person,” Wiener argued, “is just that sense of encountering a self-maintaining mechanism aiding or opposing our actions. By providing such a self-stabilizing resistance, the airplane acts as if it had purpose, in short, as if it were inhabited by a Gremlin.” This Gremlin, “the servomechanical enemy, became […] the prototype for human physiology and, ultimately, for all of human nature.”32

Defining peace as a state of dynamic equilibrium, cybernetics proved to be an effective tool for escaping a vertical, authoritarian system and entering a horizontal, self-regulating one. Many members of the budding counterculture were drawn to its promise of spontaneous organization and harmonious order. This order was already in place in Adam Smith’s description of free-market interaction, however. Regulating devices—especially after Watt’s incorporation of the governor into the steam engine in the 1780s—had been correlated with a political rhetoric that spoke of “dynamic equilibrium,” “checks and balances,” “self-regulation,” and “supply and demand” ever since the dawn of British liberalism.33 Similarly, the notion of a feedback loop between organism and environment was already present in the theories of both Malthus and Darwin, and Adam Smith’s classic definition of the free market (a blank slate that brackets out society and culture) also happens to be the underlying principle of the Skinner Box experiments.

Unsurprisingly, the abstractions performed by science have materially concrete effects. At the end of every thought experiment is a (dead) cat. The notion of a chaotic, deteriorating universe, in which small enclaves of orderly life are increasingly under siege, echoed the fears of communist contagion and the urge to halt the Red Tide. The calculation of nuclear missile trajectories, the Distant Early Warning System, and the development of deterrence theory, together with operations research and game theory, were all devoted to predicting the coming crisis. Yet prediction is also an act of violence that re-inscribes the past onto the future, foreclosing history. The war that had initially been waged to “make the world safe for democracy” had also “involved a sweeping suspension of social liberties, and brought about a massive regimentation of American life.”34 Unable to account for the belligerent bodies of the North Korean and the Viet Cong, or the destitute bodies of the African American, cybernetics came to embrace the immateriality of the post-human. Ignoring political differences, pigeons, rats, communists, and kamikaze pilots ended up conflated with the servo-mechanical gremlin—all in all, nothing but the noise that hinders information flows inside electronic circuitry.

At length, cybernetics went on to become the scientific ideology of neo-liberalism, the denouement of which was the late-eighties notion of the “end of history”35 that posited the wide cultural convergence of an iterative liberal economy as the final form of human government. In 1997, Wired magazine ran a cover story titled “The Long Boom,” whose header read: “We’re facing twenty-five years of prosperity, freedom, and a better environment for the whole world. You got a problem with that?” In the wake of the USSR’s demise and the fall of the Berlin Wall, “The Long Boom” claimed that, no longer encumbered by political strife and ideological antagonism, the world would witness unending market-driven prosperity and unabated growth. Though from our current standpoint the article’s claims seem somewhat ludicrous, its brand of market-besotted optimism shaped the mindset of the nineties. It also promoted what would become known as the Californian dream: a weak utopia that ignored the “contradiction at the center of the American dream: some individuals can prosper only at the expense of others.”36

Dialectical materialism—the theory that cybernetics came to replace—presupposed the successive dissolution of political forms into the higher form of History, but feedback is no dialectics. Friedrich Engels defined dialectics as the most general laws of all motion, which he associated with the triadic laws of thought: the law of the transformation of quantity into quality; the law of the unity and struggle of opposites; and the law of the negation of the negation. Although feedback and dialectics represent motion in similar ways, cybernetics is an integrated model, while dialectical materialism is an antagonistic one. Dialectics implies a fundamental tension and an unresolved antagonism, whilst feedback knows no outside or contradiction, only perpetual iteration. Not surprisingly, cybernetics was briefly outlawed under Joseph Stalin, who denounced it as bourgeois pseudoscience because it conflicted with materialist dialectics by equating nature, science, and technical systems.37, 38 Simply put, feedback is dialectics without the possibility of communism. Against the backdrop of an Augustinian noise, history itself becomes an endlessly repeating loop, revolving around an “enclosed space surrounded and sealed by American power.”39

Unlike social or psychic systems, however, thermodynamic systems are not subject to dialectical tensions. Nor do they experience historical change. They only accumulate a remainder—a kind of refuse—or increase in entropy. In the 1960s this refuse materialized in the Manson Family and the Viet Cong guerrillas. At present, the rejects of globalization, be they Somali pirates or Al-Shabaab militants, remain the subject of the Global War on Terror.

  • 1. This was the first of a series of lectures that later became known as the “Behaviourist Manifesto”.
  • 2. Henry David Thoreau, Walden and Other Writings, (New York: Bantam, 1980), 85.
  • 3. Kerry W. Buckley, Mechanical Man—John Broadus Watson and the Beginnings of Behaviorism (New York: Guilford Press, 1989).
  • 4. Ibid.
  • 5. Ibid, 114.
  • 6. Kerry W. Buckley, Mechanical Man—John Broadus Watson and the Beginnings of Behaviorism (New York: Guilford Press, 1989), 113.
  • 7. Ibid, 152.
  • 8. John A. Mills, Control—A History of Behavioral Psychology (New York: NYU Press, 1998), 84.
  • 9. The original Skinner Box had a lever and a food tray, and a hungry rat could get food delivered to the tray by learning to press the lever.
  • 10. John A. Mills, Control—A History of Behavioral Psychology (New York: NYU Press, 1998), 124.
  • 11. Ibid, 141.
  • 12. Kerry W. Buckley, Mechanical Man—John Broadus Watson and the Beginnings of Behaviorism (New York: Guilford Press, 1989).
  • 13. Ibid, 165.
  • 14. Peter Galison, “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision,” Critical Inquiry 21, no. 1 (Autumn 1994): 228–266.
  • 15. In the tradition of James Watt’s steam engine governor, an automatic device that uses error-sensing negative feedback to adjust its performance.
  • 16. George Stibitz, quoted in Peter Galison, “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision,” Critical Inquiry 21, no. 1 (Autumn 1994): 228–266.
  • 17. Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society ([1950] Reprint of the revised and updated edition of 1954) (Cambridge, MA: Da Capo Press, 1988), 33.
  • 18. As Peter Galison notes in “The Ontology of the Enemy,” Wiener’s novel usage of the term emerges in November 1940, in a letter to MIT’s Samuel H. Caldwell.
  • 19. David Mindell, Jérôme Segal, and Slava Gerovitch, “Cybernetics and Information Theory in the United States, France and the Soviet Union,” in Science and Ideology: A Comparative History, ed. Mark Walker (London: Routledge, 2003), 67.
  • 20. Ibid.
  • 21. N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999), 7.
  • 22. Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society ([1950] Reprint of the revised and updated edition of 1954) (Cambridge, MA: Da Capo Press, 1988).
  • 23. Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (Cambridge, MA: MIT Press, 1961), 58.
  • 24. Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society ([1950] Reprint of the revised and updated edition of 1954) (Cambridge, MA: Da Capo Press, 1988), 134.
  • 25. Entropy, one should note, is an empirical description, not a physical law.
  • 26. Slava Gerovitch, From Newspeak to Cyberspeak—A History of Soviet Cybernetics (Cambridge, MA: MIT Press, 2002), 65.
  • 27. Ibid.
  • 28. Ibid, 67.
  • 29. Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (Cambridge, MA: MIT Press, 1961).
  • 30. Edgar Landgraf, in Emergence and Embodiment: New Essays on Second-Order Systems Theory, eds. Bruce Clarke and Mark B. Hansen (Durham, NC: Duke University Press, 2009).
  • 31. Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society ([1950] Reprint of the revised and updated edition of 1954) (Cambridge, MA: Da Capo Press, 1988), 103.
  • 32. Peter Galison, “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision,” Critical Inquiry 21, no. 1 (Autumn 1994): 228–266.
  • 33. Otto Mayr, Authority, Liberty and Automatic Machinery in Early Modern Europe (Baltimore, MD: Johns Hopkins University Press, 1986), 139–140.
  • 34. Kerry W. Buckley, Mechanical Man—John Broadus Watson and the Beginnings of Behaviorism (New York: Guilford Press, 1989), 114.
  • 35. The concept of the “end of history” was put forth by conservative political scientist Francis Fukuyama in his 1992 book, The End of History and the Last Man.
  • 36. Richard Barbrook and Andy Cameron, “The Californian Ideology,” Mute, n° 3 (Autumn 1995).
  • 37. Maxim W. Mikulak, “Cybernetics and Marxism-Leninism,” in The Social Impact of Cybernetics, ed. Charles Dechert (Notre Dame, IN: University of Notre Dame Press, 1966).
  • 38. Richard Barbrook and Andy Cameron, “The Californian Ideology,” Mute, n° 3 (Autumn 1995).
  • 39. Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, MA: MIT Press, 1997), 8.