In the decades immediately following the Second World War, architectural practice and spatial theory experienced a crisis of legitimacy, driven by a growing awareness that what was perceived as the utopianism and formalism of aesthetic and functional modernism was incapable of addressing the complexities of the contemporary world, especially in relation to human scale and function, urban ensembles and social diversity. Connoisseurship gave way to the demands of the growing research-industrial complex. Architecture and architectural education turned, in part, to cybernetics, concerned with control and communication, and to systems theory, to rationalize and validate their work. In the process, information lost its materiality and its body. That is, the information and systems age, which supplanted the industrial and machine age, struggled to maintain (or willingly ceded) its materiality.
Katherine Hayles periodizes cybernetics and systems theory by distinguishing between first-generation, or first-order, theory focused on homeostasis (1945-1960) and second-generation, or second-order, theory (1960-1980), which incorporated concepts of reflexivity. First-order cybernetics was concerned with reducing noise to facilitate signal and filtering out non-sense to emphasize meaning. It created a top-down, hierarchical model, which, despite an ostensible opposition to conventional scientific method, remained epistemologically representational, articulating knowledge of, and mapping, a pre-existing environment rather than creating and adapting to a new one. This mode of thought has given rise to what John Johnston characterizes as “two conflicting cultural narratives, the adversarial and the symbiotic,” in which humans either lose control of their environment at the hands of technology or merge with technological systems. Either option creates fear and alienation because humans are not active participants in creation. With the machine, a discrepancy between technics and culture opens up, because humans are no longer “tool bearers.”
However, philosophers, such as Gilbert Simondon, who conceptualizes transduction and individuation in relation to technical mentality, Gilles Deleuze and Felix Guattari, who develop a machinic phylum, Bernard Stiegler, who relates technics and time, as well as second-generation systems theorists like Niklas Luhmann, who reconsiders the relationship between systems and environments, offer alternative means for considering information theory. These bottom-up models share a conviction that technological purpose and function are neither pre-given nor teleological, and they treat noise as a productive excess that allows for change and evolution, part of a creative process rather than a hindrance to efficient communication.
In addition, the British strand of cybernetics, which, inspired by Norbert Wiener and the early Macy Conferences, arises out of the brain and cognitive sciences rather than engineering and warfare, constitutes what Andrew Pickering calls “ontological theatre,” a move away from treating the brain and cognitive behavior as representational toward a performative model that elucidates an emergent interplay of human and material agency. Among the British strand of second-order cyberneticists, Gordon Pask, whose design work includes cybernetic machines, theatrical sets and architecture, informs his research and inventions with progressive notions of materiality and process. He understands that matter is anything but inert, in the process challenging hylomorphic models of thought that impose form on static materials, as well as dualisms between material and message, nature and technology, episteme and techne, hardware and software, genes and bodies, individuals and collectives.
As Alberto Toscano notes, “To think beyond mechanicism and vitalism is no mean feat.” Pask, however, moves beyond unproductive dichotomies that posit the individual and objects as either atomistic, separate from the environment, or holistic, existing only as a part of the environment. He replaces formalism with transformation, designing works that “inform” matter, positing information as a process by which some thing is modulated by something else via interchange and communication. His designs reveal metastable processes of techno-, onto- and phylo-genesis, rather than the stable systems of equilibrium sought by science and early cybernetics. His works refuse to treat information as a purely abstract concept of exchange, revealing instead a simultaneous individuation of thought and individuation of matter.
Cybernetics is broadly defined as the study of feedback, communication and control in living organisms, machines and organizations, the means by which they process information, react to information and change in response to these tasks. Early cybernetics developed a theory of communication and control applicable to animals, humans and machines, and established laws of information that explained the tendency among biological and technological systems to organize and counter entropy. Cybernetics, like other sciences, posited a clear divide between nature and culture to facilitate its primary aim, which was to understand intelligent behavior and to ensure survival despite nature’s tendency to degrade the organized and destroy the meaningful.
Norbert Wiener is justifiably credited as the visionary behind the advent of cybernetics. However, it is worth noting that at the early Macy Conferences Claude Shannon developed a theory of information, Warren McCulloch a model of neural functioning, and John von Neumann brought an interest in computer technology; as a group, these figures enabled the complex model of humans as information-processing entities, essentially similar to intelligent machines, to emerge. Wiener writes, “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.” Part of a utopian view of information, messages here have no material presence, acting as pure pattern, distinct from energy, remaining immaterial until encoded in print, electrical pulse or digital bit. Shannon’s quantitative theory of information, first articulated the same year as Wiener’s cybernetics, also defined information in statistical terms based on entropy and randomness in thermodynamic systems. In short, the conjectured advantage of their concept of information was its lack of presence. As Hayles writes:
Here, at the inaugural moment of the computer age, the erasure of embodiment is performed so that “intelligence” becomes a property of the formal manipulation of symbols rather than enaction in the human lifeworld. […] All that mattered was the formal generation and manipulation of informational patterns. Aiding this process was a definition of information, formalized by Claude Shannon and Norbert Wiener, that conceptualized information as an entity distinct from the substrates carrying it.
Cybernetics, in other words, constituted an epistemic shift in which technological progress became informational rather than material or energetic.
The concept of life was central to cybernetics from the start, its machines and robots anthropomorphized and measured in terms of how well they represented life. Early systems theory also addressed fundamental philosophical issues, such as the definition of life. For Wiener, biological life arose through negentropy, while Ludwig von Bertalanffy advocated an organismic notion of life in which the whole or system is more than the sum of its isolated parts. Systems theory adapted the Heraclitean view that everything is in flux and contested the Parmenidean view that only static or fixed being is real. For Bertalanffy, forms and structures are secondary, rather than primary and transcendent, to the open system that gives rise to dynamic and volatile forms.
As Johnston points out, while there is little agreement on what constitutes life, biologists generally agree that it entails metabolism, with which to extract energy from the environment, and reproduction, with which to evolve mechanisms for adaptability. British cyberneticists began to consider the ways in which life extended to machines and technology and in which machinic life functions as more than a simulacrum of organic or natural life. Grey Walter’s tortoises (Figs. 1-3), three-wheeled robots with single photocell sensors programmed to avoid obstacles and to seek out light sources, constitute an early example. Walter did not conceive his tortoises as tools for conventional scientific research or data gathering, but rather as active participants that engaged their environment as if it were an unknown system to explore, from which they could learn and to which they could adapt. These machines pointed to the unimaginable richness of performance that could be generated from a few simple parts. Such mechanisms were still representational, operating via analogy and identity in their approach to the brain. The tortoises never evolved new goals, nor did the environment respond to their actions. Nevertheless, they became a model for black box theory, or imperceptibility, which would become central to cybernetic and systems theory.
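The tortoises’ sense-act loop can be caricatured in a few lines of Python. The following is an illustrative sketch, not a reconstruction of Walter’s valve circuitry; the steering rule, the light model and all parameters are invented for the purpose:

```python
import math

def light_intensity(pos, source):
    """Photocell reading: inverse-square falloff from a point source."""
    d2 = (pos[0] - source[0]) ** 2 + (pos[1] - source[1]) ** 2
    return 1.0 / (1.0 + d2)

def step(pos, heading, source, obstacles, speed=0.5):
    """One sense-act cycle: probe three candidate headings, steer toward
    the brightest, and turn away if the move would hit an obstacle."""
    candidates = [heading - 0.4, heading, heading + 0.4]
    heading = max(candidates, key=lambda h: light_intensity(
        (pos[0] + math.cos(h), pos[1] + math.sin(h)), source))
    nxt = (pos[0] + speed * math.cos(heading),
           pos[1] + speed * math.sin(heading))
    for ob in obstacles:
        if (nxt[0] - ob[0]) ** 2 + (nxt[1] - ob[1]) ** 2 < 1.0:
            return pos, heading + math.pi / 2   # bump: stay put, turn away
    return nxt, heading

pos, heading = (0.0, 0.0), 0.0
source, obstacles = (10.0, 0.0), [(5.0, 0.2)]
for _ in range(60):
    pos, heading = step(pos, heading, source, obstacles)
```

Even this crude version exhibits the behavior Walter prized: a handful of simple rules yields a trajectory that seeks light while negotiating an obstacle placed in its path.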
Ross Ashby’s black box theory introduced an ontology of unknowability, which he exemplified through the processes of doctors treating patients with aphasia and scientists analyzing rats in a maze, extending the concept to more quotidian instances as well. He writes, “The child who tries to open a door has to manipulate the handle (the input) so as to produce the desired movement of the latch (the output); and he has to learn how to control the one by the other without being able to see the internal mechanism that links them.” In turn, epistemology, whether inductive or deductive, became insufficient for considering new cybernetic machines. As Johnston writes of computers, thinking and logic machines, they “inhabit and traverse us in unnoticed ways, giving structure and meaning to what we all too casually call life.”
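Ashby’s door-handle example can be rendered as a toy exercise in Python. The sketch below is my illustration rather than Ashby’s own formalism: a hidden mechanism is probed only through its inputs and outputs, and the observer learns to control the one by the other without ever seeing inside:

```python
import random

def make_black_box():
    """An opaque mechanism: hidden wiring links input to output."""
    secret = random.Random(42).sample(range(8), 8)   # a hidden permutation
    return lambda inp: secret[inp]

box = make_black_box()

# The observer compiles a protocol of observed input/output pairs...
protocol = {inp: box(inp) for inp in range(8)}

# ...and can then control the output by choice of input, still without
# any view of the internal mechanism that links the two.
desired_output = protocol[3]
chosen_input = next(i for i, out in protocol.items() if out == desired_output)
```

The observer’s knowledge here is entirely behavioral, a table of conduct rather than a diagram of parts, which is precisely the epistemic situation of the child at the door.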
Black box theory is a precursor to Michel Foucault’s notion that life could emerge only through “the grid of knowledge constituted by natural history” or by becoming an invisible process within the depths of the body, or Deleuze and Guattari’s concept of becoming imperceptible. Heidegger also famously conceived of presence-at-hand as an awareness of technology only when it breaks or its operations become askew. Systems and cybernetics theory similarly realized that scientific discourse was performative, embodied, situated and material, rendered invisible by the efficacy of science. As Gilbert Simondon argues, the industrial revolution brought the gradual devolution of the subject from active to passive operator, becoming only a part of a larger system.
As Simondon argues, technology has become autonomous, developing at its own pace and influenced by its own internal logic. As a result, technology cannot be reduced to function or utility or theorized in terms of human aims and intentions. Technology is rather an ensemble or assemblage imbricating machines, humans and environments. Technology is a process of invention rather than fixed and isolated objects, but it nevertheless contains, like the subjects who use it, materiality and agency. Technology is not the application of scientific knowledge but the precondition for it. Technology exists prior to any theory/practice or episteme/techne split. Placing ourselves as subjects before the taking-form of objects precludes us from witnessing the process of ontogenesis. In the absence of a clear understanding of that autonomy and its dynamics, it becomes difficult to develop strategies for responding to this new world. This inability is countered through conceptualizing individuation as a process that requires matter and form to exist.
Inspired by Simondon, Bernard Stiegler contends, “The modern age is essentially that of modern technics,” and industrial (or information-based) civilization hinges on permanent innovation, yielding a divorce between culture and technology in which “Technics evolves more quickly than culture.” Between the inorganic beings of the physical sciences and the organized beings of biology, there exists a third genre of “being”: “inorganic organized beings,” or technical objects. These nonorganic organizations of matter have their own dynamic when compared with that of either physical or biological beings, a dynamic, moreover, that cannot be reduced to the “aggregate” or “product” of these beings.
Johnston argues that “Technical systems form when a technical evolution stabilizes around a point of equilibrium concretized by a particular technology.” It is significant, therefore, that cybernetics and systems theory, unlike artificial intelligence, which modeled familiar cognitive feats like playing chess, solving equations and performing logical deductions, were interested in the challenges posed by the performative brain. As Pickering notes:
We have little conscious access to processes of adaptation, for example, including a whole range of what one might call altered states and strange performances: dreams, visions, synesthesia, hallucination, hypnotic trance, extrasensory perception, the achievement of nirvana and the weird abilities of Eastern yogis and fakirs—the “strange feats” of “grotesque cults,” such as suspending breathing and the heartbeat and tolerating intense pain.
Increasingly concerned with adaptation and bottom-up strategies of synthesis, Ross Ashby defines environment as “those variables whose changes affect the organism, and those variables which are changed by the organism’s behaviour.” As early as 1947, Ashby had introduced self-organization as a process by which a structure or pattern emerges from a system without a central authority or external element imposing it through intentional planning, or in which internal order increases over time, moving from entropy to negentropy. In the 1960s, Heinz von Foerster developed similar self-referential systems in which structural changes are produced by the system itself. As Hayles writes, “Reflexivity is the movement whereby that which has been used to generate a system is made, through a changed perspective, to become part of the system it generates.” More importantly, reflexivity challenged early cybernetics and ushered in the advent of second-generation systems theory, which gradually came to view a system as that which reproduces itself through, in the terms of Humberto Maturana and Francisco Varela, autopoietic or allopoietic behavior rather than for a purpose.
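Ashby’s notion of self-organization through ultrastability, the principle behind his homeostat, can likewise be caricatured in code. In the hedged sketch below (the dynamics, bounds and parameters are all invented for illustration), a system whose essential variables escape their limits randomly rewires its own couplings until it happens upon a configuration that holds them stable, with no external designer imposing the final structure:

```python
import random

rng = random.Random(0)

def random_weights():
    """A blind step-change of the internal coupling parameters."""
    return [rng.uniform(-1.0, 1.0) for _ in range(4)]

x = [1.0, 1.0]          # the essential variables
w = random_weights()    # internal couplings
rewirings = 0

for _ in range(5000):
    # discrete-time coupled dynamics of the two essential variables
    x = [w[0] * x[0] + w[1] * x[1],
         w[2] * x[0] + w[3] * x[1]]
    if abs(x[0]) > 10.0 or abs(x[1]) > 10.0:   # essential variable escaped
        w = random_weights()                   # random internal rewiring
        x = [1.0, 1.0]
        rewirings += 1
```

The order that survives is simply whichever randomly found configuration keeps the essential variables within bounds, a bottom-up emergence of stability rather than a planned one.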
If, following Ashby, second-generation systems theory posited two systems of continuous, interacting variables – system and environment – it also introduced a conundrum. Function cannot define and change organization from within, but if change comes from outside, the system is no longer self-organizing. In other words, the epistemology of second-order cybernetics becomes a vicious circle in which a constructivist approach imprisons us in a perspectivism that can no longer effect change. A number of responses concerning the performative brain and technologies of a decentered self as other would partially address this impasse, foregrounding couplings, assemblies, what Pickering calls “ontological theatre.” The second-generation cyberneticists, beginning with Ashby, however, embraced this indetermination. Their approach entailed a continuing interaction with human and nonhuman materials to explore (as distinct here from research) an evolutionary approach to design in which, as Pickering notes, “The entire task of cybernetics was to figure out how to get along in a world that was not enframable, that could not be subjugated to human designs,” adding “the cybernetic approach […] necessarily entailed a degree of respect for the other.”
Gilbert Simondon reformulates information theory on the basis of a new philosophy of technology, in which the individual emerges as an ongoing process, constituting individuation as a process of becoming. In 1964, Simondon organized a cybernetics conference at Royaumont, “The Concept of Information in Contemporary Science,” which featured Wiener as plenary speaker. In his introduction, Simondon notes, “In fact, historically, cybernetics appeared as something new directed to achieving a synthesis; in sum, we find ourselves brought back to the time of Newton, or to the time when the great philosophers were mathematicians or scientists in the natural sciences and inversely. This is doubtless the context in which it is now possible to listen to what Professor Wiener has to present to us.”
While Simondon approved of Wiener’s link between biological and technological organisms, where Wiener had developed a static system of classification, Simondon developed a dynamic theory that captured technological objects in their development and in their relationship to milieus. He was interested in their immanent power and process of individuation rather than their overarching structural organization. Simondon also criticized early cybernetics for its singular focus on machines with feedback mechanisms, writing:
It would not even be right to found a separate science for the study of regulatory and control mechanisms in automata built to be automata: technology ought to take as its subject the universality of technical objects. In this respect, the science of Cybernetics is found wanting; even though it has the boundless merit of being the first inductive study of technical objects and of being a study of the middle ground between the specialized sciences, it has particularized its field of investigation to too great an extent, for it is part of the study of a certain number of technical objects. Cybernetics at its starting point accepted a classification of technical objects that operates in terms of criteria of genus and species: the science of technology must not do so. There is no species of automata: there are simply technical objects; these possess a functional organisation, and in them different degrees of automatism are realized.
Systems science was interested in this dynamic, flowing character of reality, which is evident in its turn from analysis to synthesis. The synthetic approach does not exclude analysis, but in the Systems Age synthesis has priority over analysis, and function over structure.
Simondon develops “mechanology,” a theory of synthesis of perception that evades the form/content duality of Kant’s fixed structures in favor of synthesis as a process of becoming, which he terms individuation. Simondon traces two stages of individuation, the first of which generates an individuated subject, equivalent to Kant’s a priori transcendental subject, and the second of which creates an individualized subject, equivalent to Kant’s a posteriori empirical subject. Whereas the central notion of early cybernetics was system, the comparable concept in mechanology is soma, the body (of humans and nonhumans) as material entity. For Simondon, any cultural imbalance arises because technical objects are not afforded meaning in the way aesthetic objects are. Resolving problems is never a matter of information exchange but rather a transformation of signals into signification, information into meaning, which is a material process of excess that moves beyond prior limitations and continues its individuation.
This progress assumes that each structure is consciously endowed by its maker with characteristics which correspond to all the components of its functioning, as if an artificial object differed in no way from a physical system studied in all knowable aspects of energy exchange and of physical and chemical transformations. In the concrete object each piece is not merely a thing designed by its maker to perform a determined function; rather, it is part of a system in which a multitude of forces are exercised and in which effects are produced that are independent of the design plan. The concrete technical object is a physicochemical system in which mutual actions take place according to all the laws of science. The ultimate goal of the design can only be perfectly realized in the construction of the object if it identified with universal scientific knowledge.
The mechanism driving the process of individuation is what Simondon calls transduction, a transfer or transformation of information through a material medium. The materiality of the medium influences the possibilities, the virtual potentials of propagation. Individuation, that is, takes place in between form and matter, as opposed to the imposition of a preconceived form on inert matter. This process applies equally to information, which cannot elide the medium through which it is transmitted.
If the bottom-up approach of cybernetics and systems theory was its advantage, it was also its undoing. The symbolic, analytic, top-down paradigm dominated government funding until the 1980s, when the ubiquity of personal computers allowed networks to replace the single, large, powerful computer as the repository of information and brought attention to the synthetic approach of interconnection. Nevertheless, cybernetics influenced key creative figures, including Gordon Pask, who brought cybernetics to architecture. Stiegler writes:
Accounting for the technical dynamic non-anthropologically, by means of the concept of “process,” means refusing to consider the technical object as a utensil, a means, but rather defining it “in itself.” A utensil is characterized by its inertia. But the inventiveness proper to the technical object is a process of concretization by functional overdetermination. […] The industrial technical object is not inert. It harbors a genetic logic that belongs to itself alone, and that is its “mode of existence.”
This overdetermination became the driving force behind Pask’s artistic and architectural innovations, which can only be measured in terms of productive excess.
Design traditionally conceives of its processes, overtly or tacitly, as hylomorphic and dualistic. An architect formulates a plan, which is then imposed upon inert matter to create an object experienced by a subject. Despite an awareness of the myopia inherent in such an approach, architects in the 1960s who turned to systems theory and cybernetics for validation still maintained such dichotomies. Christopher Alexander and Peter Eisenman, for instance, despite vast differences, both sought transcendent structures within the grasp of their immediate control, eliminating indetermination. For Eisenman, architecture was an extension of Chomskian deep structure, constituting a formal competence with structure and rules that elicit an endless series of variations. For Alexander, process was mechanistic and form deterministic. Like Colin Rowe and Kevin Lynch, Alexander maintained vestiges of behaviorism. As Janet Daley writes:
Alexander betrays a quite primitive and unfortunate theory of language. He seems to confuse, for example, ‘intelligibility’ with ‘utility’. He says at one point that a certain statement about ‘needs’ has ‘so many ways of interpreting it, that the statement is almost useless’. He then implies that the statement is thus meaningless (‘We don’t know what it really says’). What he means by ‘useless’ is, apparently, ‘not capable of immediate application to the given problem’. To condemn all statements which have no immediate utility as being unintelligible is almost incredibly philistine and insensitive. Statements are not tools or engineering implements. This kind of view represents a gross misconception of what language is about.
Any reference to systems or feedback, that is, still depends on the pre-existence of a code to decrease indetermination. There is no room for reflexivity or excess in these design approaches.
For architecture, Gordon Pask represented an attempt to ameliorate this reactionary tendency. He founded System Research in 1953 to explore strategies of learning, knowledge, task analysis, design processes and other systems-based issues. He also created “maverick machines” – works that obfuscated the boundary between art and technology. He writes:
The design goal is nearly always underspecified and the “controller” is no longer the authoritarian apparatus which this purely technical name commonly brings to mind. In contrast the controller is an odd mixture of catalyst, crutch, memory and arbiter. These, I believe, are the dispositions a designer should bring to bear upon his work (when he professionally plays the part of a controller) and these are the qualities he should embed in the systems (control systems) which he designs.
His route to architecture first traversed art, set design, installation art, technology and industry. Pask was convinced that cybernetic architecture would “elicit [the inhabitant’s] interest as well as simply answering his queries,” citing Musicolour and the Colloquy of Mobiles, two of his installation projects, as examples. Like Ashby, Pask was interested in simulating the learning process using electronics to represent the nervous system. His first cybernetic machines included a musical typewriter and a self-adapting metronome, but the most theatrical was Musicolour (Figs. 4-6), a lighting rig that interacted with musical performances to control a light show and create a synaesthetic combination of image and sound. Music was converted via microphone into electrical signals, which controlled lights. However, the parameters of the circuitry were constantly variable. It was designed to get bored, eliciting something new and different from the performer. Pickering writes, “Thus, a Musicolour performance staged the encounter of two exceedingly complex systems—the human performer and the machine (we can come back to the latter)—each having its own endogenous dynamics but nevertheless capable of consequential performative interaction with the other in a dance of agency.”
Pask contended that through performance both machine and performer learned from each other, performatively rather than cognitively. One could not think through the ceaselessly changing strategies or dynamics of Musicolour. The process unfolded non-linearly. Neither machine nor performer had a preconceived agenda, and the process challenged distinctions between passive machines and active humans or the post-human conceit of a passive subject controlled by machines. Musicolour was an assemblage. The wiring diagram suggests a simple system, which nonetheless produced great complexity. But unlike earlier systems theory, this complexity was immanent to the machine as process rather than transcendent and a priori. Pask believed that “man is prone to seek novelty in his environment and, having found a novel situation, to learn how to control it. [. . .] In slightly different words, man is always aiming to achieve some goal and he is always looking for new goals. [. . .] My contention is that man enjoys performing these jointly innovative and cohesive operations. Together, they represent an essentially human and inherently pleasurable activity.”
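One way to make Musicolour’s “boredom” concrete is a toy habituation model. The following Python sketch is an interpretation, not Pask’s analog circuitry, and the bands, gains and decay constants are invented: each input band drives a light, repeated use dulls that band’s gain, and rest restores it, so that only a varied performance sustains the machine’s response:

```python
N_BANDS = 4
gain = [1.0] * N_BANDS     # per-band responsiveness ("interest")

def respond(band):
    """One performance step: light output for the struck band, then
    habituate that band and let the rested bands recover."""
    brightness = gain[band]
    for b in range(N_BANDS):
        if b == band:
            gain[b] *= 0.7                      # boredom: repetition dulls
        else:
            gain[b] = min(1.0, gain[b] + 0.1)   # rest restores interest
    return brightness

# A monotonous performer is gradually ignored...
monotone = [respond(0) for _ in range(10)]
# ...while a varied performer keeps the machine's interest.
varied = [respond(step % N_BANDS) for step in range(10)]
```

A monotonous input thus fades toward darkness while a varied one keeps the lights bright, which is the limited sense in which even this toy machine “elicits something new” from its performer.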
As an object, Musicolour was ugly; as art, dubious; and commercially, a failure. Nevertheless, it accomplished Pask’s goals. He contends that:
any competent work of art is an aesthetically potent environment. [. . .] Condition is satisfied implicitly and often in a complex fashion that depends upon the sensory modality used by the work. Thus, a painting does not move. But our interaction with it is dynamic for we scan it with our eyes, we attend to it selectively and our perceptual processes build up images of parts of it. [. . .] Of course, a painting does not respond to us either. But our internal representation of the picture, our active perception of it, does respond and does engage in an internal ‘conversation’ with the part of our mind responsible for immediate awareness.
Here Pask seems to suggest that any competent work of art could satisfy cybernetic criteria. Pask’s own work, however, suggests a continued attempt to employ art and technology to engage spectators, externalize the discursive conceits and put the cybernetic ideas out front instead of concealing or merely assuming them. What is perhaps most noteworthy is that Pask never considered himself a scientist in any conventional sense and considered the algorithms with which so many other systems theorists were concerned to not be the defining facet of cybernetics, that approach being limited to “electronic engineers examining their particular kinds of hypotheses about managers.” Pask wanted to explore adaptation and learning, not mimic them.
He continued to do so through his chemical computers (Figs. 7-10), which consisted of electrodes dipped into a dish of ferrous sulphate. As currents passed through the electrodes, filaments of iron would grow and branch outward into the liquid. The “threads,” as Pask referred to them, were metastable, growing in states of high current density but dissolving or maintaining equilibrium otherwise. The threads were literally rhizomatic, growing in unpredictable fashion. The threads were also reflexive, as extensions of the electrodes influenced current densities in the dish, which in turn influenced the electrodes. The computer’s processes were contingent on its history and its emergent properties, none of which was predetermined. The computer had a memory, and it could learn. Pask’s ideal was to develop this model into a factory manager, neither mechanistic nor deterministic, neither idealistic nor subjective.
These threads and the systems hypothetically constructed with or through them exploit the immanent power of matter rather than preconceived goals, purposes, functions or needs. Perhaps most importantly, Pask’s chemical computers could grow organs and acquire senses that were not built into them or defined in advance. In one experiment, Pask connected a microphone to the computer and held it out the window. In response to the urban noise, the computer grew an ear and acquired new sensitivity to magnetic fields. As Pask writes:
We have made an ear and we have made a magnetic receptor. The ear can discriminate two frequencies, one of the order of fifty cycles per second and the other of the order of one hundred cycles per second. The “training” procedure takes approximately half a day and once having got the ability to recognize sound at all, the ability to recognize and discriminate two sounds comes more rapidly. [. . .] The ear, incidentally, looks rather like an ear. It is a gap in the thread structure in which you have fibrils which resonate at the excitation frequency.
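The metastability Pask describes (growth under high current density, dissolution otherwise) can be caricatured with a one-variable toy model. The sketch below is not electrochemistry; the deposition and dissolution rules and all constants are invented for illustration:

```python
def evolve(length, current, steps=200, dissolve=0.05):
    """Toy thread dynamics: deposition scales with current density
    (a fixed current spread over the existing thread), while
    dissolution proceeds at a constant rate."""
    for _ in range(steps):
        density = current / (1.0 + length)
        length = max(0.0, length + density - dissolve)
    return length

grown = evolve(0.0, current=0.5)       # high current: the thread extends
starved = evolve(grown, current=0.01)  # current withdrawn: it dissolves
```

The equilibrium is metastable in the model’s limited sense: the thread persists only while the current sustains it, and withdrawing the current lets it dissolve back into the medium.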
Pask and System Research worked with the Rolling Stones, Pink Floyd and Captain Beefheart, but most relevant for architecture was his association with Cedric Price, his undergraduate classmate at Cambridge. In addition to Musicolour, the project most influential on his architecture was “Colloquy of Mobiles” (Figs. 11-14), produced for “Cybernetic Serendipity,” the 1968 exhibition at the Institute of Contemporary Arts curated by Jasia Reichardt. “Colloquy of Mobiles,” like Walter’s tortoises, had limited “evolutionary” capacities, but incorporated human interaction into its assemblage, an idea Pask would extend into architectural exploration.
The interest in cybernetic, adaptive or performative architecture was in part a reaction against the failure of high aesthetic and functionalist modernism, especially in urban planning. It was also part of a counter-culture that sought new ways to reinvent the self. Pask writes, “The high point of functionalism is the concept of a house as a ‘machine for living in.’ But the bias is towards a machine that acts as a tool serving the inhabitant. This notion will, I believe, be refined into the concept of an environment with which the inhabitant cooperates and in which he can externalize his mental processes.” Cybernetic architecture, in other words, challenged the primary tenets of engineering and statics. Bridges, televisions, computers and phones are generally designed to be indifferent to their environment insofar as they can withstand fluctuations rather than adapt to them. Unlike structures that just stand there under all circumstances, cybernetic machines make reality (always in the making) rather than respond to graspable and immediate causes.
While Archigram’s 1963 Living City installation at the ICA included a flicker machine, cybernetic architecture is perhaps best represented by the unbuilt Fun Palace (Figs. 15-16). Conceived in 1960 by Joan Littlewood, with whom Pask had worked previously on set design projects, the Fun Palace constituted her “childhood dream of a people’s palace, a university of the streets, re-inventing Vauxhall Gardens, the eighteenth century Thames-side entertainment promenade, with music, lectures, plays, restaurants under an all-weather-dome.” In short, the project was conceived as an adaptive and reconfigurable series of spaces that could support an enormous variety of spatial and temporal activities, including musical and theatrical performances, games, tests, interactive jam sessions, dances, science experiments, lectures, films, modeling and crafts. Pask agreed to join the Fun Palace team and organized the Fun Palace Cybernetics Subcommittee; alongside Littlewood and Price, he became the third major personality behind the project.
The Fun Palace architecture featured informal, flexible, open and transient spaces that not only supported the program but also facilitated the development of its agenda. The individual features (ramps, moving walkways, floating walls, floors and ceilings, suspended auditoriums connected by a gantry crane that could move individual spaces) were not entirely new ideas. But the materiality (charged vapor barriers, optical barriers, air curtains, fog dispersal machines, horizontal and vertical blinds) challenged the ostensible permanence of architecture. In fact, no fabric within the complex was designed to last more than a decade, and some elements were designed for single use.
The Fun Palace contested boundaries of self and other, linking knowledge to experience and performance, which in turn presents us with differentials of speeds and intensities, temporal and spatial scales. Through materiality, the dynamics of speed, delay, affect and sensation are revealed as physical parameters of corporeality, continually reconfigured through the interplay of divergent and indeterminate realities in which the informational data stream is connected to human and nonhuman bodies, not divorced from them. Through the Fun Palace, techne is revealed as an interconnectedness among technologies of machines, the self and the environment, exceeding any narrowly utilitarian purpose.
Technology is also revealed as an ensemble capable of expansion to produce new networks of relations that mediate between the organic and inorganic through feedback rather than alienating one from the other. As Adrian Mackenzie writes, “Transductive processes occur at the interface between technical and non-technical, human and non-human, living and non-living.” Through what Simondon calls “the mentality of the technical object,” the designer can unite disparate fields, facilitating emergence, but the passing of a threshold belongs to their potential. The Fun Palace, in short, was an exceedingly complex system with its own dynamics, with which other machines and humans could interact but which could never be known exhaustively.
There are inherent issues with such an approach to architecture. Aside from the fact that the Fun Palace (as well as similar projects by Archigram, Archizoom and other contemporary designers) was never constructed, there is a difficulty in maintaining the indeterminacy of stochastic processes in built form, which tends to reify and hypostasize as much as express. The materiality and concretization of architecture reintroduces control and reductive notions of information and communications. Pask’s own logic diagram of the Fun Palace’s “cybernetic control system” featured “unmodified people” as input and “modified people” as output, echoing Ashby’s black box theory and framing adaptive architecture as a transformative technology of the self. However, under the heading “Determination of what is likely to induce happiness,” Pask comes dangerously close to presenting the Fun Palace as a tool for behavior modification or social control, made even more overt by the Cybernetic Subcommittee’s diagram, in which humans and activities were reduced to data in a flowchart.
Opening things up to chance processes, performance and improvisation, creating without a plan or knowledge of what will happen next, does not necessarily require cybernetic building strategies to initiate creativity. Such strategies are, however, well equipped to avoid the return to control and communications found in Pask’s diagrams. Conceptualizing through the flow of information is as important as tectonics and other architectural strategies, especially when, as was the case with the Fun Palace, no one could say exactly what the building was. As Deleuze writes, “The work of art has nothing to do with communication. The work of art strictly does not contain the least bit of information. To the contrary, there is a fundamental affinity between the work of art and the act of resistance. There, yes. It has something to do with information and communication as acts of resistance.”
When Stiegler calls technics the industrialization of memory, he is arguing that human intelligence arises through the use of tools, gestures, signs, language and props that make cognitive activities repeatable, and humans are defined by tools that make the exteriorization of memory possible. Through the passage from potential to actual energy, information intervenes as the precondition for actualization. As Toscano writes, “This intervention of information – as an event which produces the ontogenetic communication between a structural germ and a metastable domain – signifies that actualization cannot be anticipated by any logical, material or mathematical form. This is yet another application of Simondon’s dictum: ‘the a posteriori becomes a priori, the event becomes principle’.” The interactivity of information has to remain distinct from the fixity of form. Information is neither unity nor identity, but is rather a material process of ongoing individuation. That is, the subjects sending or receiving messages and the materials through which they are transmitted are in a constant state of becoming. Informing means individuating and individualizing.
Information is imbricated with matter and energy. It “informs” matter in the sense that it modulates it, but matter reciprocates with immanent potential for being formed in particular directions. Most importantly, form is never absolute or complete, “transducing” itself into a material through a series of transformations that transmit energy. The resistance Deleuze identifies is possible because this process leaves matter in a metastable state in which the right force or pressure will break the system’s equilibrium and generate emergence, which changes materiality and alters milieu, making the boundary dividing living and non-living both contingent and collective, topological and temporal. Transduction echoes William James’s notion of plasticity: “the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at once.” The process of transduction exposes moments of rupture in material and signifying processes, eliciting a reappraisal of techno-social relations within architecture.
Ashby, W. Ross. Design for a Brain: The Origin of Adaptive Behaviour. London: Chapman & Hall, Ltd., 1966.
Ashby, W. Ross. An Introduction to Cybernetics. London: Chapman & Hall, Ltd., 1957.
Bertalanffy, Ludwig von. General System Theory: Foundations, Development, Applications. New York: George Braziller, 1968.
Daley, Janet. “A Philosophical Critique of Behaviourism in Architectural Design.” In Design Methods in Architecture, edited by Geoffrey Broadbent and Anthony Ward, 71-75. London: Lund Humphries for the Architectural Association, 1969.
Deleuze, Gilles. “Having an Idea in Cinema.” In Deleuze and Guattari: New Mappings in Politics, Philosophy, and Culture, edited by Eleanor Kaufman and K.J. Heller, 14-19. Minneapolis: University of Minnesota Press, 1998.
Deleuze, Gilles, and Felix Guattari. A Thousand Plateaus: Capitalism and Schizophrenia. Translated by Brian Massumi. Minneapolis: University of Minnesota Press, 1987.
Ezard, John. “Joan Littlewood.” Guardian, September 20, 2002, 23.
Foucault, Michel. The Birth of the Clinic: An Archaeology of Medical Perception. Translated by A.M. Sheridan. New York: Routledge, 1973.
Foucault, Michel. The Order of Things: An Archaeology of the Human Sciences. New York: Routledge, 2002.
Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: The University of Chicago Press, 1999.
Heidegger, Martin. Being and Time. Translated by Joan Stambaugh. Albany: State University of New York Press, 1996.
James, William. Essays in Radical Empiricism. New York: Longmans, Green, and Co., 1912.
James, William. The Principles of Psychology, Volume 1. New York: Cosimo, Inc., 2007.
Johnston, John. The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI. Cambridge: The MIT Press, 2008.
Mackenzie, Adrian. Transductions: Bodies and Machines at Speed. New York: Continuum, 2002.
Pask, Gordon. “The Architectural Relevance of Cybernetics.” Architectural Design 39 (September 1969): 494-96.
Pask, Gordon. “A Comment, a Case History and a Plan.” In Cybernetics, Art, and Ideas, edited by J. Reichardt, 76-99. Greenwich, CT: New York Graphics Society, 1971.
Pask, Gordon. “The Natural History of Networks.” In Self-Organizing Systems: Proceedings of an Interdisciplinary Conference, edited by M. Yovits and S. Cameron, 232-63. New York: Pergamon, 1960.
Pask, Gordon. “Organic Control and the Cybernetic Method.” Cybernetica 1 (1958): 155-73.
Pickering, Andrew. The Cybernetic Brain: Sketches of Another Future. Chicago: The University of Chicago Press, 2008.
Simondon, Gilbert. On the Mode of Existence of Technical Objects. Translated by Ninian Mellamphy. London, ON: University of Western Ontario, 1980.
Spinoza, Baruch. “Ethics.” In Complete Works, edited by Michael L. Morgan, 213-82. Indianapolis: Hackett Publishing Company, Inc., 2002.
Stiegler, Bernard. Technics and Time, 1: The Fault of Epimetheus. Translated by Richard Beardsworth and George Collins. Stanford: Stanford UP, 1998.
Toscano, Alberto. The Theatre of Production: Philosophy and Individuation between Kant and Deleuze. New York: Palgrave Macmillan, 2006.
Wiener, Norbert. Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge: The MIT Press, 1948.