Escape From the Future: Architecture, Language, and the Computational Turn

“Many people would argue that natural languages are much more broadly based than programming languages, a stance that relegates code to the relatively small niche of artificial languages intended for intelligent machines. Recently, however, strong claims have been made for digital algorithms as the language of nature itself. If, as Stephen Wolfram, Edward Fredkin, and Harold Morowitz maintain, the universe is fundamentally computational, code is elevated to the lingua franca not only of computers but of all physical reality.”1

N. Katherine Hayles

“…computational irreducibility occurs whenever a physical system can act as a computer. The behavior of the system can be found only by direct simulation or observation: no general predictive procedure is possible.”2

Stephen Wolfram

The creators of this online journal and forum controversially argue that computation will engender the final stage of development in the relationship between architecture and computers by completely eliminating the concept of form from the architectural equation.3 The use of language (in this case, the language of computer code) to evade the trappings of form has precedent in the postmodern use of semiotics to free architecture from the formal dogma of Modernism and the Classical tradition. In contrast to the semiotic critique, however, whose analytical methods were defined by the very logocentric system it was attempting to undermine, the computational turn represents a far more substantial historical break, with the potential to create a completely autonomous architecture freed from Classical notions of past and future. This essay explores the evolution from the semiotic to the computational model in architecture as a way of better understanding the circumstances that made these radical leaps into language both possible and necessary.

Beginning in the late sixties and arguably culminating in the Deconstructivist Architecture exhibit at the MoMA in 1988, architects systematically interrogated what Mark Wigley dubbed “the dream of pure form,”4 exposing the inherently subjective and arbitrary nature of the Modernist canon. Drawing upon Ferdinand de Saussure’s notion of “the arbitrary nature of the sign,”5 architectural form was subjected to a relentless semiotic critique. In his seminal 1984 essay “The End of the Classical, The End of the Beginning, The End of the End,” Peter Eisenman dismantled what he referred to as the ‘three fictions’ of architecture: representation, reason, and history.6 Eisenman argued that the representational function of architecture had essentially remained unchanged from the time of the Renaissance. The abstraction associated with Modernism, which claimed to liberate itself from the “outward trappings of Classical style” by representing pure function, was for Eisenman merely the replacement of “the message of antiquity” with the “message of utility.”7 This meant that the underlying ‘representational fiction’ which placed ‘meaning and value’ outside of architecture itself was still completely intact.8 Michael Hays summarized Eisenman’s critique as follows:

“In Eisenman’s view, modern architecture was never fully modern. Though it did produce a certain opacity of the architectural sign (most often referred to as its abstractness), modern architecture was never really free of the burden to mean; the referent still survives, albeit problematically, in cherished modernist emblems like the industrial shed, grain silo, and steamship, their workmanlike materials and their social utility.”9

Instead, Eisenman pursued an autonomous architecture, “…a representation of itself, of its own values and internal experience.”10 This enabled him to pry the discipline from mythical origins, utopist futures, and narratives of meaningful presence, in favor of the “meaning-free, arbitrary, and timeless.”11 Eisenman used the underlying, syntactical structure of language to liberate himself (and many other architects) from form and the semantic entanglements that had thoroughly exhausted it.

Cut to the mid-nineties. With the ‘Decon’ show at the MoMA now history and the digital revolution lurking on the horizon, architecture occupied a tenuous position between a recent past that relieved it of “the burden to mean”12 and a future where new technologies promised to make the expression of almost anything possible. It was at this moment, in the vacuum created by the postmodern project, that for the first time in history a form was created that appeared to be completely a-signifying. This form became known as the “blob.”13 Jorge Silvetti, in his 2002 Gropius Lecture at Harvard, dramatically described the moment of the blob’s first appearance on the architectural scene:

“And what a sudden, frightening abyss it opened up in front of us as the computer certainly intimated that it could produce forms that not only do not have precedent, but, more perplexing, may not even have referents! Freedom from semantics, history, and culture was perhaps made possible for the first time in civilization.”14

The blob’s formlessness is what allowed it to escape conventional signification but is also paradoxically what stripped it down to nothing but form. Like its cinematic counterpart, the power of the blob was its ability to absorb into its surface everything around it. In the 1996 essay that introduced the new paradigm to the world, Greg Lynn’s blob completely assimilated the Platonic, eliminating any vestige of the referential that the originary “MetaBlob,” in being purely spherical, may have possessed: “In this regard, even what seems to be a sphere is actually a blob without influence: an inexact form that merely masquerades as an exact form because it is isolated from adjacent forces.”15 According to Silvetti, this radical instability of meaning became unbearable and was quickly filled in by organic, biological, and process-based analogues.

“Since as creatures that may wish to produce a form without meaning also harbor the even more compelling and contrary impulse to be repulsed by that which we cannot name or understand, we began to invest Blobs with the meaning of whatever we could associate with them.”16

This ultimately led to what Silvetti considers the dominant trend of contemporary architectural representation, something he labeled “Literalism.”17 The formlessness and inherent immateriality of the blob exposed it to multiple readings, allowing for a limitless variety of material attributes to be projected against it. For Silvetti, the blob gave birth to the contemporary practice of making buildings that look like the metaphors upon which they are based, permanently fusing language and form. In this regard the blob was to architecture what, according to Danto, Warhol’s Brillo Boxes were to art.18 Each brought about, in its own way and in relation to its own discipline, the collapse of signifier and signified and the end of historical categories. The blob paved the way for the eventual replacement of the semiotic language model with the computational one.

Amidst this atmosphere of post-blob ‘literalism’ architecture is once again turning inward. Focus has shifted from the dynamic outer appearance of form to the underlying genetic code that makes it possible; from meaningless form to formless meaning. Karl Chu, Haresh Lalvani, Michael Silver, and a host of other architects, each inspired in their own way by the kind of CA simulations that Wolfram has been conducting, have developed their own strategies for exploring the potential of computation in architecture. Karl Chu, one of the originators of the new paradigm, employs what are known as “L-systems” to create fractal-like, self-similar morphologies where the whole and its parts have the same structure. In a 2004 essay, “Metaphysics of Genetic Architecture and Computation,” Chu divided contemporary architectural discourse into two divergent trends, the “morphodynamical” and the “morphogenetic.”19 According to Chu, morphogenetic systems contain an “…internal principle that generates architectural form and organization”20 that morphodynamical systems such as Greg Lynn’s do not. “L-systems” are recursive, which means that objects are defined in terms of previously defined objects of the same class. This is what makes them ‘generative’ in a way that the parametric constraints of animation software are not. Chu characterized the dynamic ‘soft morphology’ of the blob and its progeny as nothing more than a placeholder for numerical values.21 These values

“…map changes in time, density and space: the frequency of a gene, the concentration of a chemical, the position and velocity of an aircraft, the pressure of a gas, the rate of change in interest rates, the fluctuations of the dollar, the density of population, the earnings of a firm, the rise and fall of stocks, the diagrammatic flow of traffic, etc.”22
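Chu’s distinction can be made concrete in a few lines. The following is a minimal sketch using Lindenmayer’s textbook “algae” rules rather than any rule set Chu published: an L-system in which each generation is produced by rewriting the one before it, so that the result is defined recursively rather than by plugging numerical values into a fixed form.

```python
# Minimal L-system sketch. The rewriting rules are Lindenmayer's
# classic "algae" example, assumed here for illustration; they are
# not taken from Chu's work.

RULES = {"A": "AB", "B": "A"}  # every symbol is rewritten in parallel

def generate(axiom, generations):
    """Each generation is defined in terms of the previous one."""
    if generations == 0:
        return axiom
    previous = generate(axiom, generations - 1)
    return "".join(RULES.get(symbol, symbol) for symbol in previous)

for g in range(5):
    print(g, generate("A", g))
# 0 A
# 1 AB
# 2 ABA
# 3 ABAAB
# 4 ABAABABA  (string lengths follow the Fibonacci sequence)
```

Because every part of the output is produced by the same rewriting rule, whole and part share one structure; it is this recursion, not the substitution of parameters, that makes the system generative in Chu’s sense.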

Figure 1: From Stephen Wolfram’s A New Kind of Science (Copyrighted image courtesy of Wolfram Research, Inc.)

This characterization confirms Silvetti’s diagnosis and, twenty years later, echoes Eisenman’s in its claim that architecture is still trapped within a referential system. Instead of columns as “surrogates of trees,” however, and windows that “resemble the portholes of ships,”23 blobs morphologically (and, according to Chu, ‘spuriously’) describe frequencies, velocities, pressures, and other dynamic conditions in nature that otherwise lack formal embodiment. Computational architects such as Chu seek to halt the endless profusion of smooth, computer-generated forms that have once again done nothing more than replace one ‘representational fiction’ with another. No less controversially, Haresh Lalvani is seeking to map what he calls “the architectural genome,” “…a universal code for all morphologies.”24 Once the genome is mapped, he argues, the pairs past and future, as well as natural and artificial, will cease to be dialectically opposed and will fuse into one. Michael Silver’s project Automason 1.0 uses generative codes to address real problems associated with building construction. Silver was inspired by the emergent properties of CA, which “…[consist] of a field of discrete cells divided into small groups of neighborhoods [that are] defined in terms of finite states, on or off, transparent or opaque, white or black,”25 and evolve from a simple set of rules to achieve an astonishing level of complexity. (Figure 1)
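What a “simple set of rules” means here can be shown directly. Below is a minimal sketch of an elementary cellular automaton of the kind Wolfram catalogues; rule 30 is chosen arbitrarily for illustration, and none of this is code from the projects discussed. Each cell consults only itself and its two neighbors, yet the global pattern that unfolds is astonishingly intricate.

```python
# Elementary one-dimensional cellular automaton (Wolfram rule 30).
# A hedged illustration; not drawn from any architect's actual tools.

RULE = 30  # try 90 or 110 for other characteristic behaviors

def step(cells, rule=RULE):
    """Advance one generation; each cell reads (left, self, right)."""
    n = len(cells)
    new = []
    for i in range(n):
        neighborhood = (cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        new.append((rule >> neighborhood) & 1)  # look up the rule bit
    return new

row = [0] * 63
row[31] = 1  # a single "on" cell against an empty background
for _ in range(30):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Nothing in the printed cascade can be read back off the one-line rule, which is precisely the disjunction between surface and interior discussed later in this essay.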

He identified a similar potential in masonry technology, which is also based on a step-by-step process following principles of adjacency and iteration. Silver is proposing a teleonomic architecture, where building construction would remain a goal-oriented process with the one exception that the mason would be unconscious of the goal. A builder would receive instructions in the field from a hand-held device, with a brick being laid in accordance with each new cell of the evolving CA pattern that appeared on the screen. (Figure 2)

Silver controversially argued that: “The patterns created in the process [would be] entirely natural to both the craftsman and the mathematics. With simple programs building details obtain their complexity for free; no external agent, author or extraneous system is needed to design them.”26

Figure 2: San Jose State University Museum of Art and Design, Competition Entry, 2003. (Courtesy of Michael Silver.)
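Silver’s translation from simulation to construction can be pictured as a short loop. The sketch below is a guess at the scheme rather than Automason 1.0’s actual code; the instruction wording and the two-brick palette are invented for illustration. Each CA generation becomes one course of masonry, and each cell becomes the state of one brick.

```python
# Sketch of the Automason idea: one CA generation per masonry course,
# one cell per brick. The instruction format is invented here; it is
# not the documented output of Automason 1.0.

def step(cells, rule=30):
    """One generation of an elementary CA with wrap-around edges."""
    n = len(cells)
    return [(rule >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def masonry_instructions(width, courses):
    """Yield one bricklaying instruction per cell, course by course."""
    row = [0] * width
    row[width // 2] = 1  # seed cell
    for course in range(courses):
        for position, cell in enumerate(row):
            yield f"course {course}, brick {position}: {'dark' if cell else 'light'}"
        row = step(row)

for instruction in masonry_instructions(width=9, courses=3):
    print(instruction)  # what the mason's hand-held device might display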
In contrast to “…the deconstructive architect [who] puts the pure forms of the architectural tradition on the couch and identifies the symptoms of a repressed impurity,”27 the computational architect has no psychoanalytic agenda. Computation does not critique form; it replaces it. According to N. Katherine Hayles, the use of code in what is now controversially being called the post-human era marks a radical departure from the postmodern use of natural language in at least two important ways. First, the postmodern critique of “the metaphysics of presence”28 was only possible against the background of an “originary Logos”29; something that computation, in its reduction of “…ontological requirements to a bare minimum,”30 has done away with completely. Hayles aptly pointed out that the starting point for a CA simulation requires no more than the “…elementary distinction between something and nothing,”31 a solid pixel versus an empty one. The second important distinction that Hayles makes between these two language models is that the emergent characteristics of computation imply a radical “disjunction between surface and interior,”32 where what is manifested at a global scale can in no way be deciphered (or therefore, destabilized) by recourse to the zeros and ones of the code that created it. Digital languages produce a surface transparency which only disguises a highly abstract and impenetrable opacity; for “[u]nlike the depth model of meaningful interiority in the analogue subject, the further down into the coding levels the programmer goes, the less intuitive is the code and the more obscure the meaning.”33 Because of this, code cannot be used as a language of or for interpretation. By extension, buildings which result from code writing escape the circularity of metaphysical or hermeneutical arguments by irreconcilably severing the origin from the outcome, and by paradoxically placing all of their complexity on the surface.

Cellular automata have been used to study a baffling number of subjects including but not limited to ethnography, signaling networks, the human uterus, chaos, concrete structures, ecologies, fluid dynamics, forest insect infestations, red blood cells, crystals, bacteria, jigsaw puzzles, genetically modified plants, snowflakes, sand mandalas, weather, drainage networks, urban sprawl, computer games, heat transfer, artificial life, combat, painting, debris flow, the immune system, education, traffic, hormones, smallpox, artificial morphogenesis, SARS, yeast proteins, musical composition, intracellular ion migration, sand piles, stock markets, geophysics, grazing, limb growth, and not least of all, architecture.34 The fact that all of these systems can be simulated using CA makes a strong argument in favor of Wolfram’s thesis that nature itself may in fact be computational. However, a simulation is by definition not the real thing. So if the ultimate ambition of computational architecture is to get as close as possible to the unmediated production of structure and space, to an architecture that is purely itself, then there is clearly a missing link between simulations like CA and their architectural manifestation. If life itself remains the ultimate model of emergence, then, according to Elizabeth Grosz, CA still fall short of the kind of Bergsonian duration that would produce genuine novelty.35 Grosz, in her recent book The Nick of Time: Politics, Evolution, and the Untimely, argued that “…algorithmic models share the same philosophical or ontological problems” as mathematical ones, and that in simulations like CA, “time becomes merely the neutral, regulatable background in which objects or relations change, rather than an inherent ingredient in such research.”36 Grosz made the keen observation that the ‘duration of steps’ within a CA simulation could be sped up or slowed down without in any way affecting the outcome.37 A New Kind of Science confirms this: Wolfram makes it clear that his most crucial discoveries could only have been made once the computer sped up the computational process.38 So while CA evolve in real time, their duration is dependent upon the limits of technology. The more crucial question arises, however, when we take a minute to actually imagine a world in the not-too-distant future where Grosz’s demands are satisfied and architecture is self-organizing, unmediated, and possesses true duration. In this world, which, according to Haresh Lalvani, is quite possible,

“[b]uildings would grow, respond, adapt and recycle, they would self-assemble and self organize, they would remember and be self-aware, they would evolve, and they would reproduce and die. Organic architecture, were it to attain biology, would design itself. It would also perpetuate itself. Architecture would then become “life”, and paradoxically, buildings would no longer need architects. Organic architecture, in this limit case scenario, would also define the end of architecture (as we define it now).”39

The “End” that Eisenman’s essay ‘ended’ was the representation of an ultimate point in the future that functioned “…as a value laden effect of the progress or direction of history.”40 While this perception of a break in historical continuity is exactly what freed Eisenman to treat every project as its own origin with its own arbitrary set of rules and tactics, it wasn’t until the appearance of the blob a decade later that historical categories would actually come to an end, making a truly emergent architecture possible. While computational architecture is informed on a theoretical level by the “non-dialectical,” “non-directional,” “non-goal oriented”41 program that Eisenman’s work initiated, one crucial place where it contradicts the postmodern semiotic paradigm is in its tendency toward the transcendental, in the form of the technological telos that has ultimately come to define it. So while CA may not exhibit goal-oriented behavior, computational architects do. For invariably their work circles around a desire to reach that almost utopist point in the future where all barriers will finally be broken down. Karl Chu sees the “convergence of computation and biogenetics”, for example, as leading to what he dramatically calls “…the unmasking of the primordial veil of reality.”42 So while the postmodern use of semiotics enabled architecture to escape from the future, the post-human use of computer code may be turning architecture once again into one of its dependents. The closer architecture gets to science, the more inevitable it seems that that future, which we have successfully managed to evade for almost three decades, will return to cast a shadow on the present.

By Brad Horn

Notes:

1 N. Katherine Hayles, My Mother Was a Computer: Digital Subjects and Literary Texts (Chicago: The University of Chicago Press, 2005), 15.
2 Stephen Wolfram, “Undecidability and Intractability in Theoretical Physics,” 1985. http://www.stephenwolfram.com/publications/articles/physics/85-undecidability/2/text.html.
3 The Journal of Architecture and Computation, http://www.comparch.org.html.
4 Mark Wigley, “Deconstructivist Architecture,” in Deconstructivist Architecture: The Museum of Modern Art, New York, ed. Philip Johnson (New York: Museum of Modern Art, 1988), 10.
5 Ferdinand De Saussure, Course In General Linguistics (New York: McGraw-Hill Humanities/Social Sciences/Languages, 1965), 67.
6 Peter Eisenman, “The End of the Classical, The End of the Beginning, The End of the End,” Perspecta 21 (1984): 155.
7 Ibid., 157.
8 Ibid., 157.
9 K. Michael Hays, Architectural Theory Since 1968 (Cambridge: The MIT Press, 1998), 522.
10 Eisenman, 167.
11 Eisenman, 166.
12 Hays, 522.
13 Greg Lynn, www.glform.com.
14 Jorge Silvetti, “The Muses are Not Amused: Pandemonium in the House of Architecture,” Harvard Design Magazine 19 (Fall 2003 / Winter 2004): 26.
15 Greg Lynn, “Blobs: Or Why Tectonics is Square and Topology is Groovy,” ANY 14 (1996): 60.
16 Silvetti, 26, 27.
17 Silvetti, 27.
18 Arthur Danto, After the End of Art: Contemporary Art and the Pale of History (New Jersey: Princeton University Press, 1997), 24, 36.
19 Karl Chu, “Metaphysics of Genetic Architecture and Computation,” Perspecta 35 (2004): 79.
20 Ibid., 83.
21 Chu, 82, 83.
22 Chu, 83.
23 Eisenman, 165.
24 Haresh Lalvani, “Genomic Architecture,” Journal of Architecture and Computation (2005), http://www.comparch.org.html.
25 Silver.
26 Silver.
27 Wigley, 11.
28 Hayles, 41.
29 Hayles, 22.
30 Hayles, 22.
31 Hayles, 23.
32 N. Katherine Hayles, “Simulating Narratives: What Virtual Creatures Can Teach Us,” Critical Inquiry 26, no. 1 (Autumn 1999): 15.
33 Ibid., 14.
34 www.wolframscience.com (Current Bibliography: A current list of publications based on NKS).
35 Henri Bergson, Matter and Memory (New York: Zone Books, 1990), 186, 205, 207, 210.
36 Elizabeth Grosz, The Nick of Time: Politics, Evolution, and the Untimely (London: Duke University Press, 2004), 242.
37 Ibid., 242.
38 Stephen Wolfram, A New Kind of Science (Illinois: Wolfram Media Inc., 2002), 45.
39 Lalvani.
40 Eisenman, 169.
41 Eisenman, 170.
42 Karl Chu, “Genetic Architecture,” Journal of Architecture and Computation, July 2005, http://www.comparch.org.html.

Process/Drawing

Writing software is at the core of Casey Reas’s artistic practice. The digital is his medium of choice rather than a means of manipulation. He reflects on the evolution of his work in software and why the history of using computers to produce visual images is largely an unrecorded one in the history of art, but why this might all be set to change as scripting takes on a new primacy in contemporary art.

I started playing with computers as a child. Our family’s Apple IIe machine was a toy for playing video games and writing simple programs in BASIC and Logo.1 I spent years exploring and testing it, but I preferred drawing and my interest in computers slowly dissipated. In 1997 I was introduced to John Maeda and the work of his students in the Aesthetics and Computation Group at MIT.

[pdf download]

Metaphysics of Genetic Architecture and Computation

With the dissolution of the last utopian project of Man in the name of Communism, the great spectre that once haunted Europe and the rest of the world has all but vanished, leaving in its wake an ideological vacuum that is now being filled by the tentacles of globalisation with its ecumenical ambition. As humanity has become mesmerised by the triumphant spell of capitalism, what remains less apparent in the aftermath of this dissolution is that the world is moving incipiently towards a threshold that is far more radical and fantastic than any utopic vision since the dawn of the Enlightenment. Once again, the world is witnessing the rumblings of a Promethean fire that is destined to irrupt into the universe of humanity, calling into question the nature and function of life–world relations as they so far have existed. These rumblings, stemming in large measure from the convergence of computation and biogenetics in the latter part of the 20th century, have already begun to invoke gravid visions of the unthinkable: the unmasking of the primordial veil of reality.

[pdf download]

Dazzle Topologies

One of the great lessons of the 20th Century that our particular generation of architects has inherited is our appreciation of the infra-thin scale: the primal math containing the profound secrets to all animate matter. Whether it is the splitting of the atom, or the isolation of the DNA strand in the first half of the century, or more recently discoveries surrounding the genome project, collectively they represent within their own respective disciplines the smallest increment of information necessary to recreate all possible expressions in the game of life. The significant value for architecture lies in our capacity to speculate upon biological mimesis as a new paradigm for both material and programmatic behavior. In other words, world history has entered into a radical phase where the very destiny of life as we know it can now be altered by reconfiguring the “computational logics” of natural selection. Beyond the profound ethical consequences of these considerations — which clearly need to be addressed — the significance of these advances for contemporary material practices resides in the infinite performative scenarios available in the creation of sentient matter.

Auto Braids / Auto Breeding


Today we confront an accelerated technological revolution unparalleled in modern times. The pursuit of electronic networks aimed at overcoming the inhibiting effects of distance has irreversibly altered the accepted protocols of vision and thereby its corresponding value. In this project of erasing distance, the loss of the real becomes synonymous with speed. Appearances are liquidated and swiftly reabsorbed into the endless disciplinary logic of commodified skins. The challenge for architecture is to develop a corresponding theory: one that mediates between aesthetics of the inert and energy to excess; between the limit and transgression. As Foucault wrote, “Its model is that of a perpetual battle rather than a contract regulating transaction or the conquest of a territory. A power exercised rather than possessed: it is not the privilege acquired or preserved, but the overall effect of its strategic positions.”

Take today’s “travel gear” as an example: a heterogeneous collection of portable, exotic hardware, permitting the individual an extraordinary degree of personal autonomy, mobility, information and control. Alluringly soft and hard, crisp or curved, glossy or grained, slippery or matte, convex or concave, these streamlined, ergonomically precise vehicles represent the infinite range of topological blends that are available to us.
Now that membrane technology can effectively re-enact a broad range of sentient expressions (for example, elastic modification, climatic adjustment, and material responsiveness from memory retention), these “imitations” challenge the conventional precept that inorganic bodies are distinctly separate from their organic counterparts. Biologically mimetic and artificially intelligent, these life-like specimens represent an entirely new synthetic ecology. No longer simple inanimate objects, they are an indispensable extension of our desire to bring architecture to life.


In appreciation of this shift towards the “play of surface,” Auto Braids / Auto Breeding (a “display-scape” specifically fabricated for our exhibition of Jean Prouvé’s modular systems) set out to produce just such an instance of biological mimesis. Faithful to the proposition that all topological behavior is ultimately determined by its base mathematical logics, we began to look for an ideal computational unit capable of initiating infinite variation throughout the replicating process. Unlike the early modernist conception of the building block as a continuous reinstatement of sameness leading to generic patterns of pure unadulterated repetition, our “digital module” was conceived as an iterative seedling, part of a larger “field operation.” In other words, the discrete iconographic image of the repetitive unit made popular during the early years of mass production was purposefully suppressed here in favor of an “intelligent behavior” — the measure of our unit’s value was determined by its ability to initiate multiple affiliations within a changing aggregative context. As a singular isolated wave pattern, its deceptively simple collection of local topological characteristics functioned as the base material for the production of a larger, continuously changing “sea of surface intensities.”


Inspired by Jean Prouvé’s commitment to the most advanced technology of his time and the legendary contributions he made to the development of modular systems, the exhibition installation set out to reinterpret these conceptual directives from a new and contemporary perspective within the theoretical aims of Auto-Braids / Auto-Breeding. Celebrating the current opportunities afforded by high-end 3-D modeling software and five-axis rapid prototyping milling, a series of interlocking modular elements was produced for assembly as an exhibition “display-scape”. Offered as one continuous surface and capable of varying spatial configurations due to changing programmatic and contextual requirements, this new topological terrain represents a universal meta-stage for the entire collection of artifacts. Through the rigorous and changing assignment of destination, sequence, and proximity, an endless scenario of conceptual affiliations is achieved, each offering a subtle or dramatic reinterpretation of its professed value in history. Intended to function as a curatorial game board, this new membrane and its matrix of landing sites serves to offer a range of recombinatory flexibility ideal for any collection continuously undergoing change.

The simultaneous pursuit of curatorial and topological “multiplicity” inherent in Auto-Braids / Auto-Breeding represents a much larger architectural aspiration beyond the limits of the gallery. It concerns the future development and application of ‘biological mimesis’ as a new paradigm for architectural production. As we enter this new phase of morphogenetic and technological expansion, we unleash a range of material and programmatic opportunity capable of altering the very destiny of architecture.

by Evan Douglis

Tectonics, Economics and the Reconfiguration of Practice: The Case for Process Change by Digital Means

The current programming culture in architecture could all too easily be written off as a youthful, geeky obsession with the algorithmic and the parametric among nascent practitioners, who have had little if any opportunity to build. The activities of Gehry Technologies run counter to this stereotype. Building on 15 years of experience at Gehry and Partners, Gehry Technologies was founded in 2002 as an independent organisation dedicated to the business of technological innovation and the development of architectural software tools. Dennis R Shelden, chief technology officer, discusses the wider implications of a concentrated focus on technological tools and organisational processes for designers and the business of building. [pdf download]

Bodies Unfolding

After seeing Buckminster Fuller’s Dymaxion World Map, a map projected on a flattened icosahedron, we began working on the idea of using computer technology to transfer our physical bodies into two-dimensional images. The profundity of simple sculptural gestures translated through the mechanics of a map projection intrigued us. The representation of three-dimensional objects on two-dimensional surfaces has been a perennial concern for artists through the centuries. The idea of simultaneity, where an object is experienced all at once, was a major theme of the Cubists and Futurists. “Selfportrait.map” explores this idea in a contemporary way using new digital imaging tools.

The earliest map projections were produced by first visually and later mathematically projecting three-dimensional details onto two-dimensional surfaces. With the advent of computers, ever more complex objects could be electronically recorded and transformed. As artists accustomed to working with physical materials like clay, stone or steel, we considered the manipulation of three-dimensional forms in virtual space a non-traditional extension of the sculptural process.

The fragility and tenuous nature of life is a recurring theme in our work. Selfportrait.map explores the digital reordering of three-dimensional forms through a reshaping of the digitized body. A new way of representing the human figure was created by remapping its surface onto a set of simple shapes. In the process of unfolding the scans the computer generated a complex network of jagged seams and torn edges. Stitching utilities exist that allow the projections to be repaired, but we considered their complexity to be highly evocative.

The scanner produces two kinds of body data. The first is a set of x, y, z spatial coordinates recorded by an automated laser range finder. The second is a flat, camera-generated texture map of the same object’s surface. The camera travels around the object recording it as a string of digital snapshots. The two are combined into one by literally projecting the photographic data onto the three-dimensional surface scan.

We were aware from the start that the images captured by a traditional camera were limited to a single viewpoint. This fixes the photographic eye in time. Whether the picture is of a speeding bullet or a still life, the camera is limited by its position and temporal mechanics. Viewing all surfaces of a figure in a single instant, as one views a map of the world, places the viewer outside the frame of lens-based, perspective vision. With the photographic data produced by a three-dimensional scanner the views of the body become omnidirectional. This allows an unlimited number of views to be derived from a single scan and has the effect of liberating the image from the conventions of photographic representation. The result, in visual terms, is similar to surrounding the subject with an infinite number of eyes that can see all sides of an object at once.

The fusion of spatial coordinates and photographic textures stored as a plastic data file was of particular interest to us. Using a suite of software tools we chopped our scanned bodies into innumerable slices. As each layer was separated from the next it could be viewed as an individual drawing. Changing the angle of the drawing affected its shape. By varying the depth and position of each slice, a complex, painterly set of calligraphic lines developed, ones that look much like brushstrokes.
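The slicing described here can be pictured with a small sketch. The data format is an assumption: a scan reduced to a bare list of (x, y, z) points, whereas the artists’ actual files also carry the photographic texture map.

```python
# Sketch of slicing a scanned body into layer "drawings" by depth.
# Assumes the scan is a plain list of (x, y, z) points; the real
# scanner data also binds a photographic texture to the coordinates.

from collections import defaultdict

def slice_points(points, thickness):
    """Group points into horizontal slices; each slice is one drawing."""
    layers = defaultdict(list)
    for x, y, z in points:
        layers[int(z // thickness)].append((x, y))
    return [layers[key] for key in sorted(layers)]

scan = [(0.10, 0.20, 0.05), (0.30, 0.10, 0.07), (0.20, 0.40, 0.31)]
for index, layer in enumerate(slice_points(scan, thickness=0.1)):
    print("layer", index, layer)
# Varying `thickness` (or rotating the points before slicing) changes
# the character of each calligraphic "stroke".
```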

From these sections we created a short animation. Imagine looking down at a topographic map, seeing only the topmost layer, and then scanning through each stratum, one layer at a time. Altering the angle and depth of the strata greatly affects the appearance, speed and character of the resulting motion. By changing the camera view of a whole figure, and recording images from the camera’s rotation around the figure, we were able to animate a still, three-dimensional image as it rotates in space. In the completed animations the figure assembles and disassembles itself, at times dancing across the screen and at others engaging in a macabre snowfall of thin slices.

This amazing tool has allowed us to interact with the body in ways unimaginable until recently. We are just beginning to explore some of the issues raised in working with the spatial, temporal and calligraphic data produced by the scanner and its software.

by Bill Outcault and Lilla Locurto

Cultural Concerns in Computational Architecture

G. Holmes Perkins, 1904-2004

In September of 2004 I attended two events that reflect on each other. One was the Non-Standard Praxis conference held at MIT. The other was a memorial at the University of Pennsylvania for G. Holmes Perkins, my dean when I was an architecture student at Penn from 1959 to 1966. He had died just a couple of weeks short of his 100th birthday.

The “Non-Standard” in Non-Standard Praxis refers ambiguously to Nonstandard Analysis in mathematics and to emerging computational approaches in architecture. Some of the phrases in the titles of presentations at the conference include: performativity, topologies, virtual standardization, amorphous space, hyperbody research, immaterial limits, affective space, algorithmic flares, the digital surrational, the boundaries of an event-space interaction, bi-directional design process, and voxel space.

The architecture shown was for the most part either generated with 3D animation software or by computational algorithms. Much of it exists only as images, or it has been CNC milled from foam and coated. Much of the work is what has been categorized as “blob” architecture. As would be expected, when built buildings were presented, they were usually less radical than the unbuilt designs.

So yes, there was a lot of impracticality and pretension, and a lot of stuff that is not architecture if you define architecture as something you might be able to build with the technologies available in the foreseeable future. But nonetheless, most of the work was extremely interesting, and it suggests we are in a highly innovative period in architecture. The architects and students who presented were imaginative, knowledgeable, and committed to bringing the revolutionary power of the computer into architecture at the most fundamental levels.

But I was struck by something that was not addressed at the Non-Standard Praxis conference that was suggested by Dean Perkins’ memorial.

Perkins had been a young modern architect in the 1930s, and had been the chair of city planning at Harvard while Gropius was chair of architecture. In 1951 Perkins was brought to Penn to rebuild the school. He began with the premise that a school of design must include city planning and landscape architecture along with architecture. The landscape and planning departments he built were among the strongest of the 20th century, but the architecture department became even better known as “The Philadelphia School.” Perkins brought to the school Louis Kahn, Robert Venturi, Denise Scott Brown, Robert Geddes, Lewis Mumford, Robert Le Ricolais, Edmund Bacon, and many others. In many cases these people were not well known when Perkins hired them but developed in the supportive atmosphere he provided. And The Philadelphia School was not just personalities, but a strong curriculum based on an integrated approach to architecture and a unique synergy of culture, school, city, and profession.

Perkins had always said that architecture must have an ethical, social and aesthetic component, otherwise it is just fashion. This attitude was lacking at the Non-Standard Praxis conference and among many of today’s computational architects.

Perkins was a traditional modernist in all of the best senses. That is the modernism of clarity, social responsibility, a realization that we need to address cities and regional ecology as well as monuments, and an understanding that modernism in architecture is part of a new culture of industrialism and democracy. In short, the modernism not only of Mies, Corbu and Gropius, but also of CIAM and Americans like Lewis Mumford and Clarence Stein. Perkins and the Philadelphia School held Mumford’s position that “Architecture, properly understood, is civilization itself.” Perkins and his school based architecture on historical traditions, on our current culture, and on the criticisms of it.

Why is this approach absent among many architects today? There is a claim that Modern Architecture failed in its attempt to address social concerns, as evinced by the 1972 dynamiting of Pruitt-Igoe. But the failure of certain public housing projects is not the failure of Modern Architecture, which was in fact a major success.

Modern Architecture was an integral part of the modern culture that brought about a more egalitarian society, a radical increase in lifespan, a diminishment of infant mortality, an education for the average person that exceeds that of Renaissance nobles, and access to the secrets of the atom and the universe. Modern Architecture provided the homes, offices, factories, laboratories, schools, institutions, and cities in which all of this was realized.

Over the past two decades many architects, perhaps most notably John Hejduk and Peter Eisenman, have challenged the notion that architecture has social responsibilities. They took this position in response to movements in the 1970s that saw architecture as a branch of social advocacy. But it would be a mistake to think that Eisenman and Hejduk do not have cultural concerns. They both set out to reintroduce Architecture as the central concern of architecture, which is to say aesthetics as a concern with the deepest human meanings. For example, far from being isolated from culture, Eisenman is deeply immersed in the postmodern condition, and the various phases of his work are in response to the culture of the time, be it underlying linguistic structures, discordance, or a resolution of chaotic disjuncture.

It is this sense of setting in a culture that I found missing at Non-Standard Praxis. We are now in one of the most significant periods of cultural and technological change in history, probably greater in scope than those associated with Newton and Einstein. Developments in quantum mechanics are leading to quantum computers that gain their prodigious power through harnessing their siblings throughout the multiverse. Biotech and genetic engineering are bringing about new species and perhaps the alteration of Homo sapiens. Materials engineering and nanotechnology are altering the objects we use and how they are made. Communications technologies promise that eventually everything can be connected to everything. Cosmologists place us in ever-expanding infinities of multiple universes. And individuals will have unprecedented opportunities for education, knowledge and achievement, and the prospect of cognitive powers we cannot yet imagine.

The ribbed vault was not just a structural technique, but also a means of putting the human soul in touch with God. Perspective was not just a means of organizing pictorial space, but also a means of asserting the human observer. Industrial materials were not just economical, but also a means of finding the human place in a democratic world.

In like manner, we should consider Maya, 3D Studio MAX and computational algorithms not just as tools in themselves, but as means of engaging our new world.

Notes

  1. The Non-Standard Praxis conference was based on work exhibited at the Centre Pompidou in 2003 in a show curated by Frederic Migayrou.
  2. Brief bio of Dean G. Holmes Perkins.

by John Lobell

Genetic Architecture

“All is algorithm!”

Gregory Chaitin1

Genetic Architecture

With the dissolution of the last utopian project of Man in the name of Communism, the great specter that once haunted Europe and the rest of the world has all but vanished, leaving in its wake an ideological vacuum that is now being filled by the tentacles of globalization with its ecumenical ambition. As humanity has become mesmerized by the triumphant spell of capitalism, what remains less apparent in the aftermath of this dissolution is that the world is moving incipiently toward a threshold that is far more radical and fantastic than any utopic vision since the dawn of the Enlightenment. Once again, the world is witnessing the rumblings of a Promethean fire that is destined to irrupt into the universe of humanity, calling into question the nature and function of life–world relations as they so far have existed. These rumblings, stemming in large measure from the convergence of computation and biogenetics in the latter part of the twentieth century, have already begun to invoke gravid visions of the unthinkable: the unmasking of the primordial veil of reality.

The evolution of life and intelligence on Earth has finally reached the point where it is now deemed possible to engender something almost out of nothing.2 In principle, a universe of possible worlds based on generative principles inherent within nature and the physical universe is considered to be within the realm of the computable once quantum computing systems become a reality. For the first time, mankind is finally in possession of the power to change and transform the genetic constitution of biological species, which, without a doubt, has profound implications for the future of life on Earth. By bringing into the foreground the hidden reservoir of life in all its potential manifestations through the manipulation of the genetic code, the unmasking or the transgression of what could be considered the first principle of prohibition – the taking into possession of what was once presumed to be the power of God to create life – may lead to conditions that are so precarious and treacherous as to even threaten the future viability of the species, Homo sapiens, on Earth. At the same time, depending on how mankind navigates into the universe of possible worlds that are about to be siphoned through computation, it could once again bring forth a poetic re-enchantment of the world, one that resonates with all the attributes of a pre-modern era derived, in this instance, from the intersection of the seemingly irreconcilable domains of logos and mythos. Organically interconnected to form a new plane of immanence that is digital, computation is the modern equivalent of a global alchemical system destined to transform the world into the sphere of hyper-intelligent beings.

Yet, what is the nature of computation that is destined to change the world, including architecture? No instrumental concept or logic of implementation since the invention of the wheel has fostered so much enthusiasm and promise as computation has. Beyond the normative conception of computing machines as mere instruments for calculation, fabrication and communication, it is important to recognize the nature of the underlying ambitions of computation and its relation to architecture. As controversial and provocative as it may seem, the underlying ambitions of computation are already apparent: the embodiment of artificial life and intelligence systems through abstract machines along with biomachinic mutation of organic and inorganic substances, and, most significantly, the subsequent sublimation of physical and actual worlds into higher forms of organic intelligence by extending into the computable domain of possible worlds. At the most prosaic level, however, computation, like natural languages, deals with information in its most general form. Computation functions as manipulator of integers, graphs, programs, and many other kinds of entities. In reality, however, computation only manipulates strings of symbols that represent the objects. It should also be pointed out that, according to the late Richard Feynman, computing systems could be constructed at the atomic scale: swarms of nanobots, each functioning in accordance with a simple set of rules, could be made to infiltrate into host organisms or environments including the human body. In its simplest form, computation is a system that processes information through a discrete sequence of steps by taking the results of its preceding stage and transforming them to the next stage in accordance with a recursive function. Such an iterative procedure based on recursion has proved to be astonishingly powerful and is classified as belonging to a class of machines having universal properties.
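The “discrete sequence of steps” described above reduces to a few lines of code. This is a generic sketch of iteration under a recursive rule, not anything specific to genetic architecture; the Collatz step is an arbitrary stand-in for the transformation.

```python
# Computation in its simplest form: each stage transforms the result
# of the preceding stage. The step function is an arbitrary example.

def compute(step, state, stages):
    """Iterate a recursive rule: state(n+1) = step(state(n))."""
    for _ in range(stages):
        state = step(state)
    return state

collatz = lambda n: 3 * n + 1 if n % 2 else n // 2
print(compute(collatz, 7, 5))  # 7 -> 22 -> 11 -> 34 -> 17 -> 52
```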

The power of computation is already evident in the fact that, in less than seventy years since the inception of the Universal Turing Machine,3 it has ushered in the Information Revolution by giving rise to one of the most significant and now indispensable phenomena in the history of communication: the Internet, or what could also be characterized as the universe of the Adjacent Possible.4 Stuart Kauffman defines the Adjacent Possible as the expansion of the networks of reaction graphs within an interactive system into the neighborhood domain of connectivity, which until then remained only in a state of pure potentiality. Kauffman suggests,

“The Universe has not explored all possible kinds of people, legal systems, economies or other complex systems,” and that “autonomous Agents tend to arrange work and coordination so that they are expanding into the Adjacent Possible as fast as they can get away with it.”5
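One loose way to operationalize this definition is to read the Adjacent Possible as the one-step neighborhood of whatever has already been actualized in a reaction graph. The sketch below does just that; the graph, its node labels and the expansion loop are invented for illustration and are not drawn from Kauffman’s own models.

    # A toy reaction graph: each node lists what it can react to produce.
    graph = {
        "A": {"B", "C"},
        "B": {"D"},
        "C": {"D", "E"},
        "D": {"F"},
        "E": {"F"},
        "F": set(),
    }

    def adjacent_possible(actual, graph):
        # Nodes reachable in one step that are not yet actualized.
        frontier = set()
        for node in actual:
            frontier |= graph[node]
        return frontier - actual

    actual = {"A"}
    while True:
        possible = adjacent_possible(actual, graph)
        if not possible:
            break
        print(sorted(actual), "->", sorted(possible))
        actual |= possible   # the system expands into its adjacent possible

At each pass the actualized set grows into its own frontier, which is, in miniature, the expansion Kauffman describes.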

Like every phase transition, the Internet marks a new world order by re-configuring the planet with a virtual, albeit interactive, matrix that is becoming increasingly spatial, intelligent and autonomous: a global self-synthesizing organ bustling with neural intelligence, possibly detectable from every corner of the Milky Way and beyond. It is at the level of the construction of possible worlds that the implications for architecture are most pronounced. The thesis that will be advanced is that architecture is becoming increasingly dependent on genetic computation: the generative construction and the mutual coexistence of possible worlds within the computable domain of modal space.

Notes

  1. Chaitin, Gregory, “Leibniz, Information, Math and Physics” (2003), p. 9.
  2. Wolfram, Stephen, A New Kind of Science (Champaign: Wolfram Media, 2002), p. 41.
  3. Turing, Alan, “On Computable Numbers, with an Application to the Entscheidungsproblem,” Proceedings of the London Mathematical Society, ser. 2, vol. 42 (1936). In this paper Turing first set out the conceptual blueprint for the abstract machine now known as the Turing machine.
  4. Kauffman, Stuart, Investigations (New York: Oxford University Press, 2000), pp. 142–144. Kauffman’s concept of the Adjacent Possible was developed in the context of his investigations into the origin of life based on autocatalytic systems, which are derived from random interactions of nodes within Boolean networks.
  5. Kauffman, Investigations, op. cit.

by Karl Chu

Automason Version 1.0
http://www.futurefeeder.com/2006/08/automason-version-10/
Sun, 06 Aug 2006 17:50:34 +0000

“Now that the use of commercial geometric modelers has become normative…the discipline of architecture appears ready to resume its longer-term engagement with structured knowledge representation. This often involves the development of small-scale, ad-hoc software…”

Malcolm McCullough, from “GROCS Letter”1

“…computational irreducibility occurs whenever a physical system can act as a computer. The behavior of the system can be found only by direct simulation or observation: no general predictive procedure is possible. Computational reducibility may well be the exception rather than the rule…”

Stephen Wolfram, from “Undecidability and Intractability in Theoretical Physics”2

“…the form arising out of work performance leads to every object receiving and retaining its own…shape.”

Hugo Häring, from “The House as an Organic Structure”3

SJSU Museum of Art

Contemporary architects are judged as much by their buildings as by the sophistication of the techniques used in design and construction. A certain fascination with technology is natural to any discipline that thrives on innovation and change. While new digital tools have had an especially profound impact on the representation of architectural space, only a few buildings today are actually put together with components fabricated on a CNC mill. This situation will perhaps change in time, but for the present, construction in America (and around the world) can best be described as an impure mixture of techniques supplemented by hand using traditional materials like brick and mortar. While portable masonry robots designed for both factory and on-site applications are now in the early stages of development, there are serious reasons to doubt that their employment will render human workers obsolete. In fact, as we look back on the history of information technology, especially over the last thirty years, the opposite seems to be the case. Instead of eliminating work, automation has forced many to adopt new skills and become technically specialized as both products and production processes grow increasingly sophisticated. The narrow definition of Computer-Aided Design and Manufacturing (CAD/CAM) must be expanded to include a much larger set of tools and procedures. In addition to robotic pipe layers, concrete-pouring drones, and wall-painting automatons, we need to include new human–machine interfaces that operate on site as embedded technologies capable of changing the way buildings are made in the present (augmented craft). This should be done with an eye on the body’s connection to, rather than its displacement by, technological innovations. What’s more, we need to honor this relationship by developing new and powerfully expressive building forms.

Because so much architecture today is conceived through descriptive techniques like AutoCAD, digital processes are used mostly for design and representation; computation is rarely a direct process responsible for, and self-evident in, the work itself. The opposite extreme restricts computation to the realm of the virtual. This is especially true of the “Evolutionary Architecture” of John Frazer, where genetic scripts evolve in a disembodied, electronic space. In Frazer’s thesis, materials are a secondary concern, so that “actual processing and assembly is external to the model.”4 While natural selection and automorphosis are processes essential to the development of an organic architecture, if they are not expressed in a material form then they are neither organic nor architectural. To underscore this point, we developed a system based on the analogous operation of cellular automaton programs and masonry construction.

SJSU Museum of Art

A cellular automaton program (CA) consists of a field of discrete cells divided into small groups, or neighborhoods. Defined in terms of finite states – on or off, transparent or opaque, white or black, etc. – a CA computation evolves over time: the configuration of each neighborhood determines the state of the next generation of cells. Both complex and repetitive patterns emerge as a result of the direct relationship between parts acting together to form a larger system from the ground up. The idea of using simple programs to drive the construction of brick-and-mortar structures comes from the realization that masons work much like cellular automaton programs. By following local procedures based on laws of adjacency and iteration, the mason builds by stacking one brick at a time. This process depends on the relationship of each masonry unit to its immediate neighbors and is capable of producing complexity from very simple rules.
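To make the analogy concrete, here is a minimal sketch of an elementary one-dimensional cellular automaton in Python. Rule 30 is used purely as a familiar example of complex behavior arising from a two-state neighborhood rule; the width, the number of generations and the choice of Python are assumptions of this sketch, not part of the automason system itself.

    # Elementary cellular automaton: each cell's next state depends
    # only on its three-cell neighborhood (left, self, right).
    RULE = 30    # illustrative rule; any rule number 0-255 works
    WIDTH = 31

    def next_generation(cells):
        new = []
        for i in range(len(cells)):
            left = cells[i - 1]                 # wraps around at the edges
            right = cells[(i + 1) % len(cells)]
            index = (left << 2) | (cells[i] << 1) | right
            new.append((RULE >> index) & 1)     # look up the rule's output bit
        return new

    cells = [0] * WIDTH
    cells[WIDTH // 2] = 1                       # a single "on" brick to start
    for _ in range(15):
        print("".join("#" if c else "." for c in cells))
        cells = next_generation(cells)

Read each printed row as a course of bricks: the state of every unit is determined entirely by the three units in the course beneath it, which is exactly the local, adjacency-driven procedure a mason follows.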

In an automasonry wall, form emerges from the direct expression of its materials and the way they are assembled. This follows one of the guiding principles of modernism, but with a difference: structures driven by simple programs can be constructed without recourse to a limited inventory of pared-down, Platonic forms. The patterns created in the process are entirely natural to both the craftsman and the mathematics. With simple programs, building details obtain their complexity for free; no external agent, author or extraneous system is needed to design them. This kind of complexity is not dependent on the incessant differentiation of parts (complexity for a price) but on the application of fixed rules in a discrete system that requires only two components. A close examination of cellular automata also reveals that complex and simple, repetitive and aperiodic, symmetrical and asymmetrical patterns can all be evolved from a single coherent logic.

With simple programs, designers can push masonry to its limits while respecting the functional and tectonic constraints that define its potential for complexity. The behavior and intrinsic randomness of certain CA patterns also challenge the basic assumption that deterministic systems necessarily follow predetermined ends. (A brick does not necessarily want to become an arch.) When emergent phenomena are considered as global patterns produced from the bottom up by local interactions that spread through a system, the “existence will” of an unfolding event, process or entity becomes meaningful in a way that an archetype does not. The “pattern language” of Christopher Alexander, or the “shape grammars” used in the early 1990s to derive computer-generated forms, produced either idealized, static geometries or simple variations on a known theme. Most patterns generated from the iteration of cellular automaton programs produce evolving, teleonomic structures that cannot be described by a definite shortcut, formula or ideal type. The only way to know how a given rule will behave is to set it in motion. In this regard, Henri Atlan writes:

“A teleonomic process does not…function by virtue of final causes even though it seems as if it were oriented toward the realization of forms, that will appear only at the end of a process. What in fact determines it (i.e. a teleonomic process) are not forms as final causes but the realization of a program, as in a programmed machine whose function seems oriented toward a future state, while it is in fact causally determined by the sequence of states through which the preestablished program makes it pass.”5

With a CA masonry system, details are self-organized, whereas the overall pattern produced by the code forms a tight-fitting whole that is intentionally selected by the architect. In other words, the building’s design emerges naturally from the process of stacking bricks, while the overall pattern is constituted in response to specific design requirements. Again, the code self-organizes the parts while the manipulation of the initial conditions gives the designer power over the whole. “God” is not in the details, because the details compose themselves. An architect’s desires, his or her personal reading of the client brief, institutional protocols, functional constraints and the logic of construction merge with the rigorous operation of simple programs to produce an organic architecture that satisfies the imperatives of commodity, firmness and delight.
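This division of labor can be illustrated by extending the earlier sketch: the rule that self-organizes the details stays fixed, while only the initial row of bricks – the part the designer controls – changes. The seeds below are arbitrary, and next_generation is the function defined in the previous sketch, not a component of any actual automason release.

    def wall(seed, courses):
        # Render successive CA generations as courses of bricks.
        rows, cells = [], list(seed)
        for _ in range(courses):
            rows.append("".join("#" if c else "." for c in cells))
            cells = next_generation(cells)
        return "\n".join(rows)

    centered = [0] * 15 + [1] + [0] * 15                 # one seed brick
    paired = [0] * 8 + [1] + [0] * 13 + [1] + [0] * 8    # two seed bricks

    print(wall(centered, 8))
    print()
    print(wall(paired, 8))   # same rule, different whole

Same code, different initial conditions: the global pattern shifts while every local detail still composes itself.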

by Mike Silver

Genomic Architecture
http://www.futurefeeder.com/2006/08/genomic-architecture/
Sun, 06 Aug 2006 17:42:29 +0000
genomic1.gif

Genomic architecture is based on the manipulation of the architectural genome. Like its biological counterpart, this genome is universal and encompasses all architecture — past, present and future. At its root, this genome is defined by a unified morphological genome, a universal code for all morphologies — natural, human-made and artificial. Morphogenomics, a possible new science, deals with morphological informatics. It includes mapping the morphological genome as a basis for generative morphologies that underlie the shaping of architectural space and structure. Once mapped, the morphological genome will need to be layered with other genomes (also requiring mapping) to cover different aspects of architecture: physical (e.g. materials, construction technologies), sensorial, cognitive and behavioral. Genomic architecture, based on the layered genome, encompasses an integrated world of “artificial architecture” (used in the same sense as “artificial intelligence” and “artificial life”), a world of complexity evolving in parallel with the natural world. It is a morphologically structured network of information that determines architectural taxonomies and phylogenies, permits digital manipulation of form in the design process, and enables mass-customization in digital manufacturing.

Limits of Organic Architecture

The meaning of the term “organic architecture”, which draws its inspiration mostly from biology, keeps evolving with our increasing knowledge of nature combined with foreseeable technologies. As new technologies emerge, architecture becomes more organic in its scope, intent and realization. The upper limit to this sort of bio-mimicry would be biology itself. Buildings would grow, respond, adapt and recycle; they would self-assemble and self-organize; they would remember and be self-aware; they would evolve; and they would reproduce and die. Organic architecture, were it to attain biology, would design itself. It would also perpetuate itself. Architecture would then become “life”, and, paradoxically, buildings would no longer need architects. Organic architecture, in this limit-case scenario, would also define the end of architecture (as we define it now).

Extrapolating from projected technologies of the future, a scenario like this one is quite possible, even inevitable, but it is flawed for two reasons. First, taking biology as a goal for organic architecture assumes that such a biology (namely, existing biology) is frozen in time, since it is based on “life” as we know it at present. Extrapolating architecture from present biology ignores past and future biologies. Nature’s ongoing experiment comprises structures that are extinct, structures that exist now, and structures that have yet to appear. The definition of ‘organic’ must thus encompass all biologies: past, present and future. Second, it ignores the creation of the new, e.g. new materials (new chemistries) not found in nature, new technologies not found in nature, and new organisms (based on known or new biologies) not existing in nature. Besides new natural biologies, the term ‘organic’ must thus include artificial biology as well. This is where the line between human designs and those made by nature becomes a continuum.

Unifying Laws

What unites the natural and the human-made (including the artificial) are fundamental laws, the laws of nature. Our knowledge of both nature and human-made constructions evolves such that these laws become increasingly encompassing, tending towards the natural upper limit of a single unifying law for everything (as in the current search in physics, for example). Whether this limit is attainable is an open question. The natural and the artificial are facets of organic architecture that are joined at this fundamental level. This is true of biology and buildings alike. The morphologic possibilities within these two worlds fall within a single morphological universe governed by unifying laws of form common to both: the mathematics of space, structure and form. When physical constraints (size, material, movement, weight, stability, building method or forming process, etc.) are imposed on form, this universe shrinks through the elimination of mathematical structures that are physically unrealizable. The physics and chemistry of form delimit the morphological universe.
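A deliberately crude sketch of this pruning, under invented assumptions: the “forms” below are nothing more than (span, thickness) pairs, and the constraint is an arbitrary slenderness limit – stand-ins for the morphological universe and for real physical criteria, not a representation of Lalvani’s actual model.

    from itertools import product

    spans = [2, 4, 8, 16]           # metres; hypothetical values
    thicknesses = [0.1, 0.2, 0.4]   # metres; hypothetical values

    def physically_realizable(span, thickness, max_slenderness=40):
        # Eliminate forms whose span-to-thickness ratio is unbuildable.
        return span / thickness <= max_slenderness

    universe = list(product(spans, thicknesses))
    realizable = [f for f in universe if physically_realizable(*f)]

    print(len(universe), "mathematical forms")        # 12
    print(len(realizable), "physically realizable")   # 9

Imposing the constraint shrinks the combinatorial universe; richer constraints (material, stability, forming process) would shrink it further, exactly as described above.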

by Haresh Lalvani
