Architectural Design
Mathematics of Space
Guest-edited by George L Legendre
04|2011
Vol 81, No 4, July/August 2011
ISSN 0003-8504, Profile No 212
ISBN 978-0470-689806
SENSE AND SENSIBILIA
Since the 1960s, innovation has become one of the
sole purposes of architecture and membership of the
avant-garde an underlying motive force. Philippe
Morel of EZCT Architecture & Design Research
juxtaposes the ‘sense and sensibilia’ of mathematics,
now widely adopted by the ‘creative minorities’,
against the idealism of mid 20th-century modernism.
Philippe Morel
Figure 1. EZCT Architecture &
Design Research, Universal House
and Assembly Element, 2009–11
The concept of a fully generic and
voxel-based architecture (free of
topological constraints) comes from
EZCT’s first investigations, embodied
in the Studies on Optimisation:
Computational Chair Design using
Genetic Algorithm project (2004).
The current research, which refers
back to that of Nicholas Negroponte
and John Frazer, is oriented towards
a more precise, articulate and
constructive approach.
I can think much better about a formula than about a geometrical object because I never trust that a geometric picture, a drawing, is sufficiently generic.
— Alain Connes in an interview by Georges Skandalis and Catherine Goldstein, Newsletter of the European Mathematical Society, Issue 63, March 2007, pp 27–8.1
Returning to the Roots of Contemporary Sensibility
A brilliant intuition of the “radical” movement was that
normality as a shared value no longer existed, that all
society now constituted a set of creating minorities and
that the critical and creating methods of the avant-garde
had become the only practicable ones. The project
therefore changed its status, lost its methodological
unity and accepted innovation as the sole purpose of
creativity. For these intrinsic reasons, the “radical”
movement refused any stylistic unity, any recognisable
formal code, to act, on the contrary, as a movement
which destroyed within it any trace of the old search for
modern certainties. For this reason, it has always been difficult, if not impossible, to file it under a critical category.
— Andrea Branzi, ‘Le mouvement radical’, Architectures expérimentales 1950–2000.2
It is with these words that Andrea Branzi commented
four decades later on the general philosophy of the Italian
movements of the second avant-garde. A philosophy that, in
a pop society with omnipresent technology, recognised that
the avant-garde had become the new normality. The title of this
article, ‘Sense and Sensibilia’, evokes the significant transformations
that were taking place in the early 1960s, and is named after John L
Austin’s 1962 book of the same title3 rather than Aristotle’s classic
philosophical text (De sensu et sensibilibus). Austin’s work reasserts
the pre-eminence of ordinary language placed against a logical
background. This language, which moves on the logical ocean that
characterises Western civilisation, defines the current relationship between
‘sensible things’ and mathematics. These things now shared and
produced by ‘all the creating minorities’ are a thousand leagues
away from Modernist productions. As for their mathematics,
they have evacuated all traces of idealism related to the ‘old
search for modern certainties’.
It is thus against this context, today and not only in
architecture, that we are dealing with mathematics. It is a
context inseparable from the social transformations that
were established from 1960 onwards – the trigger for which
was the hedonistic consumption of goods and services
developed during the Second World War – and that defined
the theoretical directions taken at this time by architectural
research. Beyond their apparent incompatibility, this research
forms an extremely coherent set that can be defined as four
priority ‘topics’: 1) the type and role of permanent avant-gardism;
2) mass communication as a source of semiotic pop;
3) the dimensions of the town and its architecture which reach
the sizes necessary for their autonomisation; 4) the role of
an idealised language of Modern architecture in the face of
its ‘vulgar’ versions, and the correspondence between a new
codified language of architecture and the general codification
that has appeared in linguistics and information sciences.4
It is possible to draw up a quick genealogy of these four
research directions, which were followed and encountered to
varying degrees by the vast majority of architects active in the
1960s. The first was the principal motif of the neo avant-gardes
– led by Archizoom and Superstudio – before they turned their
attention to the widespread urban condition of global
megastructures. The second was in the US, which led the field
of mass communications, and was the object of America’s pop
architecture that was to culminate in the ‘Learning from Las
Vegas’ study by Robert Venturi, Denise Scott Brown and Steven
Izenour, published as a book in 1972.5 The third was the object
of the European critique of functionalism, from neo-regionalism to
Team X and Aldo Rossi’s Tendenza.6 The fourth was that of
Peter Eisenman who, in 1963, on finishing his doctorate,7
broached a ‘re-examination of the formal’. It was also, a decade
later, that of the humanist positivists such as Christopher Alexander.8
Of course it goes without saying that this categorisation has
in fact never been so clear. Above all, it says nothing about the
ideological positions taken by the theoretician architects. Despite
this, for an attentive observer, noticing for example that the
interest shown by Eisenman in language as such is inseparable
from a critique of functionalism, this categorisation clarifies
that which has happened since 1960, and principally that which
is understood by ‘Postmodernism’ – a Postmodernism now
increasingly appearing as an attempt to create a synthesis of the
various issues of the 1960s mentioned above. It is thus through
this propensity for summary that it is possible to perceive both
the multiple interests in current Japanese architecture (neo-
materialism, advanced engineering, information technology and
electronics) and the architecture of Coop-Himmelb(l)au or Rem
Koolhaas – the most intentionally synthetic of all. It is also thus,
although from a different perspective, that we can understand
the works of FOA or the intricacy of Greg Lynn,9 the generic
Figure 2. Philippe Morel (curator), ‘Architecture Beyond Form: The
Computational Turn’, Exhibition at the Maison de l’Architecture et
de la Ville, Marseille, 22 February to 20 April 2007
The exhibition was a reading of the last 45 years of architecture, starting from 1963
and Peter Eisenman’s PhD thesis, ‘The Formal Basis of Modern Architecture’.
nature of which is derived from a different observation to that of
Koolhaas, but indirectly very well summarised by the personal
opinion of Anthony Vidler: ‘That in which I am involved is
another type of identity of the subject, built within self-generated
spaces by software which knows nothing of the distinction
between animal and human; an identity which, at least for the
time being, is more concerned with the morphological and
topological transformations of an external skin or a shell, than by
the human dimensions of an interior.’10
The summary of the major themes of the postwar critical
re-readings of Modernism is therefore the major project of
Postmodernism. The inherent diffi culty in such a project makes
it possible to understand both the theoretical infl ation which
appeared in the 1970s – when architects attempted to link
everything to the quasi-totality of surrounding theories with
deliberately funny or sometimes unintentionally grotesque
connections – and the relative failure of this summary. Since
1990, the onset of the Internet and information technology has
made the use of conceptual and practical tools a dominant focus.
Though it has shared with Postmodernism the theorisation of
social transformations, apparent in Fredric Jameson’s analysis of
the Bonaventure Hotel of 1988,11 it has also introduced a new
preoccupation with ‘deep structure’. This theorisation was partly
carried out by, among others, Lynn and by Alejandro Zaera-Polo,
for whom the question of form was in particular the reflection
of a reading of the transformations of technological civilisation
and not the imprisonment within a new mathematical idealism.
Idealism often only offers an anachronistic update of Modernist
formal research or a new belief in the participation of architects
in the advancement of science. It is by the refusal of this
naive belief that Lynn and Zaera-Polo appear as ‘experienced
intellectuals’, the latter being ‘those who have understood that
they are not at the head of a change but in an experienced
rearguard which measures the difference and the progress of
technology in relation to the human sphere’.12
Conversely, it is due to the half-acceptance translated by the
zealous application of science – an application without any
scientific foundation or fact – that Alexander may be considered an
‘inexperienced intellectual’: an intellectual who considers the
relationships between social development and that of the
sciences within a historical invariance, deducing from the constant
validity of theorems and algorithms the permanence of sensible
things. For this belief in the logical continuity which extends from
the logico-mathematical laws to the acts had, well before Alexander,
been denounced as the pitfall par excellence of moral philosophy,
which did not prevent High-Modernism (for example, that of
Rudolf Carnap in The Logical Structure of the World)13 from running
aground. A pitfall as noted by Nietzsche in the following terms:
Socrates and Plato, great doubters and admirable
innovators were nevertheless incredibly naive in regard
to that fatal prejudice, that profound error which
maintains that “the right knowledge must necessarily be
followed by the right action”. … the contrary position is
in fact the naked reality which has been demonstrated
daily and hourly from time immemorial.14
The ‘Visible’ Is Not ‘Sensible’
As such, we cannot be anything other than surprised today
to still see in the many works broadly using algorithms and
mathematics – and which works can really do without them?
– a resurgence of an idealism in the style of Alexander. Just
as the latter’s relational graphs were influenced at the time
by progress in topology and their applications in the form
of operational research, the frenzied mathematical idealisms
and biomimicry tendencies of today’s architecture are nothing
less than a new Zeitgeist. As for the less complex approaches,
more coolly and visibly logical, although the best remind us
by their very radicalism that ‘logic is not necessarily as logical
as all that’, that we ‘use it exactly as we wish’ and that what is
important ‘is that things are logical, “in a certain way”’,15 the
Figure 3. EZCT Architecture & Design Research, Seroussi Cupboard, 2005–8
The fully scripted panel system leads to an entirely automated fabrication process.
The double-curvature surfaces of the panels are glued on to a light structure.
Everything is built from standard and cheap wood with vertical T-shape steel
reinforcements. The algorithms calculate the admissible deformation of the veneer
wood (which is supposed to be close to zero) and propose different solutions. Due to
very strict constraints, all stainless-steel hinges were conceived by EZCT.
majority of them come down to an opportunistic Postmodern
inclusion of an additional variable: a dose of easy spectacular
computation and additional effectiveness.
While for Postmodernism there are no longer any aesthetics
with an immediate value – ‘the ideologem of elegance and chic,
of the dear form, is united with its opposite, punk art, trash,
the sordid and the wretched’16 – the mathematics of sensible
things becomes foreign to the aim that it is supposed to serve.
It is then that the Postmodern paradox appears; at each new
attempt to perceptively approach a thing, the result is invariably
the distancing from this thing. Such is the Postmodern reality
which, on the one hand by the exponential rise in the number of
images in circulation and their importance in scientific research
submerges us in the field of the apparently sensible, and on the
other by the inflation in digital data and the no less exponential
growth in the algorithmics of IT programming and mathematical
methods, freezes all of our immediate perceptions. Although
this paradox is not, by essence, Postmodern, it has reached a
degree previously unknown thanks to information technology,
which has allowed our relationship with the world to enter a
new era; and as Marshall McLuhan and Quentin Fiore point
out, not without humour: ‘Beside it, the wheel is a mere hula
hoop, even though the latter should not be entirely neglected.’17
The wheel was just a hoop insofar as even in the boldest
projections it was still only a representation of the world through
the image of a perpetual movement, a representation broadly
surpassed today in computer simulations. As observed by the
epistemologist Franck Varenne, whose recent works on the
integrative and pluri-formalised simulations in botany constitute
(using scientific practices) a rigorous critique of any mathematical
idealism, science faces two contradictions. The first, a traditional
one, is that the researcher ‘still believes he is researching the
“laws of nature” whilst in practice he is first of all contributing
to dispersing this kind of representation’;18 the other, specifically
linked to the arrival of information technology and its intrinsic
possibilities for repetitions identical to virtual experiences, is that
the latter are often preferable to any ‘real’ experience.
We should point out an amazing opinion among
engineers about the use of computer simulation in
industry – especially in aeronautics: they are more and
more convinced that in many cases, real experiments are
superfl uous. They think that a good simulation is far
better than an experiment on a prototype – apart from
the financial considerations. Indeed, when you read Von
Neumann, you see that analogue models are inferior to
digital models because of the accuracy control limitations
in the fi rst ones. Following this argument, if you consider
a prototype, or a real experiment in natural sciences, is it
anything else than an analogue model of itself? … So the
possibilities to make sophisticated and accurate measures
on this model – ie to make sophisticated real experiments
– rapidly are decreasing, while your knowledge is
increasing. These considerations are troublesome because
it sounds as if nature was not a good model of itself
and had to be replaced and simulated to be properly
questioned and tested!19
This observation of the change in current scientific practices
throws precious light on that which is known today as the
mathematics of sensible things. Indeed, added to the
reductionism which began at the end of the 19th century and
ended in the logical positivism and attempts at a complete
axiomatisation (which although proved impossible will
nevertheless have a lasting influence, for example with
Bourbakism in France) is a new distancing of the sensible specific
to information technology. This distancing is no longer based on
the Modernist abstraction as perceived in the universal grids of
Piet Mondrian or Mies van der Rohe, but on the contrary on a
new ‘logical figuration’. Although stylistic Postmodernism had
perceived the cultural nature of this figuration, recognising that
that which separated it from the void was nothing but its state of
‘capitalism transformed into image’, it had not really understood
the computational logic leading to our logical replication of the
world, the latter inheriting as much from Carnap as from the
history of scientific notation and symbolism, programming
languages or the development of material technologies20 as from
Guy Debord or Jameson. It is in this sense that the synthesis of
Postmodern architecture referred to earlier in this article is no
Figure 5. Karl Exner, Balance for
Equation, undated
During the 19th century and until the advent
of digital computers, scientists searched for
mechanical techniques to facilitate complex
calculations. From Über eine Maschine zur
Auflösung höherer Gleichungen (About a
machine for the resolution of higher-order
equations), Vienna, 1881.
Figure 4. Proposed connection between Peter Eisenman’s PhD thesis drawings (1963) and Gerrit
Mariè Mes’ Logic Diagrams (drawn in the 1960s), illustrating the Zeitgeist of the young Eisenman
Mes, a Dutch-born surgeon based in Krugersdorp, South Africa, developed this variation of
Martin Gardner’s logic diagrams in the 1960s: directed and undirected lines, and a combination
of both (a line without an arrow means that travel may be either way). While being more or
less opposed to any kind of Zeitgeist, Peter Eisenman’s work is in fact highly related to its
surrounding visual and epistemological culture – the sign of a rationalist mind. Original image
from Martin Gardner, Logic Machines and Diagrams, University of Chicago Press, 1958.
Figure 6. Philippe Morel, Visual and Diagrammatic Representation of the
Mathematics Subject Classification (MSC) Concerning Geometry, 2005
The diagram was first made to illustrate a lecture entitled ‘Some Geometries’ at
the ‘Loopholes Between Theory and Practice’ symposium at Harvard Graduate
School of Design in April 2005. The idea was to show that geometry is not a
classically homogeneous field, but a very complex and intricate landscape.
Figure 7. Alessandro Mendini, Straw Chair, 1975
A temporary and self-destroying natural chair.
Figure 8. Italian radicals and German expressionists sharing geometry
Hermann Finsterlin, Composition with building blocks, 1916, and
Studio 65, Baby-lonia, 1973.
more complete than that of the four fundamental forces in
physics, if it can be or if it is desirable. To complete it we should
add to Venturi’s irony the complete disillusions of Stanley
Tigerman,21 but we most of all should add to the work of the
Austrian and Italian radicals, to the deconstructions of the 1980s
and to the digital research of the 1990s or the urban and cultural
theorisations of Koolhaas, a veritable investigation of the
‘computational logic of late capitalism’. Such an investigation
would have recourse to algorithms and mathematics beyond an
umpteenth passive formalism in order to evaluate them on an
entirely new ‘critical’ base. This goes against the new stylistic
unity of algorithmic, parametric or biomimetic architecture, or
calls for this unity, the new ‘recognisable formal code’ which
retains the ‘trace of the old search for modern certainties’.
De Novo Nature, Life as a ‘Good Simulation’
Broaching the mathematics of sensible things is, in reality,
the same as dealing with a crucial aspect of a civilisation in
which all the artefacts and stimuli are becoming mathematical
productions. With Modernist designers who embodied
mathematics in physical objects, the problem of the sensible
remained ‘easy’, but we have to admit that it is very different
today. Mathematics and logics exist now as a pervasive physical
and immaterial environment, as ‘a new domestic landscape’, which
is perfectly exemplified by the annual production of theorems,
estimated in 1976 by Stanislaw Ulam at 200,000.22 Such a
landscape, or ecosystem, which is inseparable from information
technology, is proof of the omnipresence of information
technology mentioned above on the subject of a digital nature
which is itself a better model than the original. Thanks to this
quality, (computer) simulations are not at all a vulgar pretence;
this copy which could lead only to the search for the original – a
quest that is not just romantic but above all reactionary in its
ignorance of the reality of the facts – is a better original.
It is in this recognition of the arrival of digital reality,
a major scientific fact, that the entire difference between
stylistic and literary Postmodernism and scientific and effective
Postmodernism is played out. And this is what Eisenman had
summarised perfectly in the title of his essay ‘A Matrix in the
Jungle’ (2003), on this occasion returning to Jameson:
Several years ago, Fredric Jameson said that the
computer would be capable of giving us a new nature;
not an unnatural nature but a nature derived directly
from computerised algorithm and processes. Such
a thought means it would no longer be necessary to
look at nature with the same eyes through which Le
Corbusier observed the natural shapes of D’Arcy
Thompson. (It was precisely from the latter’s immense
body of work that Le Corbusier deduced most of the
plastic, spiralled shapes and complex proportional
relationships that produced his ‘Modulor’ system.)23
In its relationship to things, and more broadly in its relationship
to the environment, current Postmodernism allows little room
for hope. Insofar as it is not a language and therefore it ‘does not
give a representation that can be mobilised by a human spirit (a
concept)’,24 computer simulation does not lay itself open for the
linguistic research typical of the Postmodernism of the 1980s
any more than a ‘re-examination of the formal’ in the manner of
the young Eisenman, irrespective of the quality of their
transposition into a computational environment. Furthermore,
as it is itself experimental proof that nature need not search
behind the latest copies, but that it is produced afresh by our
computers, Postmodernism appears as a mirror which, to a
civilisation whose ‘only purpose is to “know”’, reflects the image of
its own knowledge. From this mirror state comes the fact that
each reproach addressed to this civilisation is invariably referred
to us as an interrogation of our own choices, at best identical, at
worst increased, a sensation felt by everyone when, for example,
we wonder how contemporary technology was able to produce
such or such a social construction, including architecture.
In this regard we cannot but recall the paradox of the
contemporary factory where the staff, workmen or engineers
work to increase the performance of the robots that make them
obsolete. This paradox was already present in the 19th century
in the very term ‘manufacture’ to designate that which is more
than mechanical, a paradox that was regularly brought to light
by Norbert Wiener25 and many others without ever finding
a satisfactory political response. Furthermore, to say that no
response is satisfactory, given the rise in importance of today’s
global, abstract and computational ambient factory which is
the deep cause of the actual crisis, is of course a euphemism.
In this framework, the mathematics of sensible things, which
have neither the elegance of minimal mathematical equations
– the Modernist ideal – nor the status of language specific to a
pure syntax, become, as stated by Peter Macapia, ‘dirty’.26 They
are beta-mathematics, the result of perpetually experimental
information technologies.
In fact, the critique of such a ‘deviance’ of mathematics in
architecture devoid of any preferred stylistic expression appears
increasingly often in two ways. First, by the expression of an
impossibility of finally reaching a rationalism which would connect
the state of our knowledge and its application in the real, an
impossibility translated by a language which appears to be
‘inarticulate, arbitrary and non-dialectic’.
This is the language of the most recent Californian
architecture which seems to ‘produce repetitions not
developments’ and of which ‘consequently the resulting
concentric scribble is (specifically through its frustrated
ambiguity) the sign of an absolute protest, which “globally”
defeats the logic of the real, by refusing to admit the possibility
of whatever logic?’27 Second, by accepting a pop and deeply
experimental computationalism in which all historical discourses
have been replaced by the gross storage capacities of the now
more than one million servers and three million computers of
the Google Grid. This grid, by being our Thucydides, embodies
the very End of History and henceforth makes the dreams
of positivist historiography come true. Everything outside
computer memories is not historical facts but literature and
dreams. It is thus not a question of rebuilding either history
or theories, but of recording or producing a new digital reality
by simulation. Here, logical positivism is accepted, as are the
resulting technologies that are solving life problems in a way
‘theorised’ by Andy Warhol:
The acquisition of my tape recorder really finished
whatever emotional life I might have had, but I was
glad to see it go. Nothing ever caused me any problems
again, because a problem signified a good recording
and when a problem turns into a good recording it’s no
longer a problem.28
At this very moment in our computation-based civilisation, the
situation is slightly different. On one side life is no more than a
good recording, but on the other side it is nothing more than
a good computer simulation; a simulation that effectively and
physically produces a (synthetic) life.29 We still have a choice.
Notes
1. Alain Connes in an Interview by Georges Skandalis and Catherine
Goldstein, Newsletter of the European Mathematical Society, EMS Publishing
House (Zurich), Issue 63, March 2007, pp 27–8.
2. Andrea Branzi, ‘Le mouvement radical’, Architectures expérimentales
1950–2000, collection from FRAC Centre, Editions HYX, June 2003.
3. John L Austin, Sense and Sensibilia, ed GJ Warnock, Oxford University
Press (Oxford), 1964.
4. I consider these four theoretical directions as an equivalent to the four
known fundamental interactions in physics: electromagnetism, strong
interaction, weak interaction and gravitation. As for physics, they ask for
a theoretical synthesis in the field of architecture and social sciences. My
research agenda is oriented towards such a synthesis.
5. Robert Venturi (with Denise Scott Brown and Steven Izenour), Learning
from Las Vegas, MIT Press (Cambridge, MA), 1972, revised 1977.
6. Aldo Rossi, L’architettura della città, 1966. Translated as The Architecture
of the City, Oppositions Books and the MIT Press (New York), 1984.
7. Peter Eisenman, The Formal Basis of Modern Architecture: Dissertation
1963, Lars Müller Publishers (Baden), 2006.
8. Christopher Alexander (with Sarah Ishikawa and Murray Silverstein), A Pattern
Language: Towns, Buildings, Constructions, Oxford University Press (Oxford), 1977.
9. The ‘Intricacy’ exhibition was curated by Greg Lynn at the Institute of
Contemporary Art, Philadelphia, and ran from 18 January to 6 April 2003.
Catalogue published by the ICA, University of Pennsylvania.
10. Anthony Vidler, ‘From Anything to Biothing’, in Cynthia Davidson (ed),
Anything, MIT Press (Cambridge, MA), 2001.
11. See, for example: Fredric Jameson, The Prison-House of Language:
A Critical Account of Structuralism and Russian Formalism, Princeton
University Press (Princeton, NJ), 1972; Postmodernism: Or, the Cultural
Logic of Late Capitalism, Duke University Press (Durham, NC), 1991; The
Geopolitical Aesthetic: Cinema and Space in the World System, Indiana
University Press (Bloomington, IN), 1992.
12. Peter Sloterdijk, ‘La révolution “pluralisée”’, interview with Peter Sloterdijk
by Arnaud Spire in Regards, No 52, December 1999.
13. Rudolf Carnap, Der Logische Aufbau der Welt, 1928. Translated as The
Logical Structure of the World and Pseudoproblems in Philosophy, trans RA
George, University of California Press (Berkeley, CA), 1967.
14. From F Nietzsche, Aurore, Second book, trans Julien Hervier, Gallimard
(Paris), 1970 (original: Morgenröte – Gedanken über die moralischen
Vorurteile, 1881).
15. Donald Judd in ‘La petite logique de Donald Judd’ (trans Pascale Haas),
interview with Catherine Millet in Artpress 119, November 1987.
16. Fredric Jameson, Signatures of the Visible, Routledge (London), 1992.
17. Marshall McLuhan and Quentin Fiore, War and Peace in the Global
Village, Bantam (New York), 1968, p 34.
18. Franck Varenne, ‘Le destin des formalismes: à propos de la forme des
plantes – Pratiques et épistémologies des modèles face à l’ordinateur’,
PhD thesis, Université Lumière – Lyon II, 29 November 2004, p 10.
Partial content of the thesis is included in Franck Varenne, Du modèle à la
simulation informatique, Vrin (Paris), 2007.
19. Franck Varenne, ‘What does a Computer Simulation Prove? The Case of
Plant Modeling at CIRAD’, in N Giambiasi and C Frydman (eds), Proceedings
of the 13th European Simulation Symposium, Marseille, France, 18–20
October 2001, SCS Europe Bvba (Ghent), 2001, pp 549–54.
20. Ray Kurzweil is one of the few with a general and historical overview of
the problem, as had Marshall McLuhan, Guy Debord and Andrea Branzi in
different ways from the 1960s.
21. It seems to me that Tigerman’s irony is the sign of a complete disillusion,
not only towards any kind of political action but also towards the cynical and
opportunistic positions of most architects.
22. Stanislaw M Ulam, Adventures of a Mathematician, Scribner’s (New
York), 1976, p 288.
23. Peter Eisenman, ‘A Matrix in the Jungle’, in Written into the Void: Selected
Writings 1990–2004, Yale University Press (New Haven, CT), 2007, p 121.
24. Franck Varenne, ‘La simulation conçue comme expérience concrète’, in
Le statut épistémologique de la simulation, actes des 10èmes journées
de Rochebrune: rencontres interdisciplinaires sur les systèmes complexes
naturels et artificiels, Editions de l’Ecole Nationale Supérieure des
Télécommunications (Paris), 2003, pp 299–313.
25. See the talks that mathematician and scientist Norbert Wiener, inventor
of cybernetics, gave to the unions in the US about the evolution of production
towards automatic factories.
26. See ‘Turbulent Grid’, arch’it, February 2007, and ‘Dirty Geometry’, Log,
Issue 10, Summer/Fall 2007.
27. The metaphysical and metaphorical principle of Ferreri ‘is inarticulate,
arbitrary and non-dialectic, to the point that it produces repetitions, not
developments and that, consequently the resulting concentric scribble
is (specifi cally through its frustrated ambiguity) the sign of an absolute
protest, which “globally” defeats the logic of the real, by refusing to admit
the possibility of whatever logic?’ Pier Paolo Pasolini, Ecrits sur le cinéma,
Editions des Cahiers du Cinéma (Paris), 2000, p 199.
28. From Andy Warhol, The Philosophy of Andy Warhol (From A to B & Back
Again), Harcourt Brace Jovanovich (New York), 1977, pp 26–7.
29. See the work by Craig Venter on synthetic DNA in, among other
numerous articles, Victoria Gill, ‘“Artificial Life” Breakthrough Announced by
Scientists’, BBC News Science & Environment, 20 May 2010 (www.bbc.
co.uk/news/10132762).
Text © 2011 John Wiley & Sons Ltd. Images: pp 122-3 © Philippe Morel; p 124 © EZCT
Architecture & Design Research 2008; pp 125(t), 128(b) Courtesy Philippe Morel; pp
126-7 © Philippe Morel, 2005; p 128(t) © Alessandro Mendini
Contributors include:
Daniel Bosia
Mark Burry
Bernard Cache
Amy Dahan-Dalmedico
Ana María Flor Ortiz
Max Kahlen
Philippe Morel
Antoine Picon
Fabien Scheurer
Dennis R Shelden
Rodia Valladares Sánchez
Michael Weinstock
Topics include:
Advanced geometry
Computational design techniques
Design engineering
History and theory of technology
applied to design
Mathematics in practice
Over the last 15 years, contemporary architecture has been
profoundly altered by the advent of computation and
information technology. The ubiquitous dissemination
of design software and numerical fabrication machinery
has re-actualised the traditional role of geometry
in architecture and opened it up to the wondrous
possibilities afforded by topology, non-Euclidean
geometry, parametric surface design and other areas of
mathematics. From the technical aspects of scripting
code to the biomorphic paradigms of form and its
associations with genetics, the impact of computation
on the discipline has been widely documented. What is
less clear, and has largely escaped scrutiny so far, is the
role mathematics itself has played in this revolution.
Hence the time has come for designers, computational
designers and engineers to tease the mathematics out
of their respective works, not to merely show how it is
done – a hard and futile challenge for the audience –
but to reflect on the roots of the process and the way it
shapes practices and intellectual agendas, while helping
define new directions. This issue of AD asks: Where do
we stand today? What is up with mathematics in design?
Who is doing the most interesting work? The impact
of mathematics on contemporary creativity is effectively
explored on its own terms.
... Carpo outlined that the continuous, spline-based work has little in common with today's computation itself, which is essentially a discrete process (Carpo 2014). A new generation of architects started to criticise the accepted notion of digital production as a form of mass-customisation of curved form, such as EZCT with the Generative Chair and Universal House (Morel 2011). Other architects, such as Jose Sanchez, began to advocate an agenda that has a certain social consciousness, attempting to democratise the process of design while also criticising the starchitect-driven model with its doubtful labour practices (Sanchez 2018). ...
Conference Paper
This paper describes a framework for discrete computational design and fabrication in the context of automation. Whereas digital design and fabrication are technical notions, automation immediately has societal and political repercussions. Automation relates to industrialisation and mechanisation, allowing the digital to be reconnected historically while bypassing the post-modern, deconstructivist, or parametric decades. Drawing on a series of built timber prototypes, this paper describes how the combined technologies of automation and discreteness enable both technical efficiencies and new architectural interest. Both projects are based on timber sheet materials, cut and folded into larger elements that are then assembled into functional structures. Both projects are also fragments of larger housing blocks. Discrete building blocks are presented from a technical perspective as occupying a space in between programmable matter and modular prefabrication. Timber is identified as an ideal material for automated discrete construction. From an architectural perspective, the paper discusses the implications of an architecture based on parts that remain autonomous from the whole.
... The work of Alfred Bemis and Leonardo Mosso present a historic precedent of a voxel-based architecture (Botazzi, 2018), as well as Frank Lloyd Wright's Textile Block houses. More recently, Philippe Morel (EZCT) developed the Universal House as a discrete voxel-based building system (Morel, 2011), while Jose Sanchez' Polyomino project proposed discrete building blocks as platforms for collaborative design (Sanchez, 2018). The interest in this approach is reinforced by the theoretical and historical work on Discreteness and Computation by Mario Carpo (Carpo, 2014). ...
Article
Using a number of built demonstrators, this paper describes a computational design and fabrication method for timber assembly, based on the notion of discreteness. This research attempts to combine aspects of the field of Digital Materials and Programmable Matter with the architectural field of Prefabrication and Modularity. While these two fields are at opposing ends of the spectrum in terms of scale and functional operation, this research proposes that many of the properties and challenges are transferable.
... As explained before, these building blocks act the same way as digital data, which means that they can be recombined, are reversible, universal and versatile. The first important precedent of this approach is EZCT's Universal House project (Morel, 2011), which proposes a physical building block that can be assembled into multiple different buildings. With a kind of dark humor, the building block itself is literally a cube or voxel, suggesting a logical and rational endpoint for architecture where all questions concerning syntax and part-to-whole relations are irrelevant. ...
Conference Paper
... Rather, they think, perceptual illusions are pervasive; in particular, they think [4] Other philosophers take the most threatening step of the Argument to be the generalizing move from illusory cases to veridical cases, cf. Austin (1962: 52); McDowell (1998: 386-37). For discussions, see Smith (2002: 25-34). ...
Article
I discuss the so-called "problem of perception" in relation to the Argument from Illusion: Can we directly perceive the external world? According to Direct Realism, perception provides direct and immediate awareness of reality. But the Argument from Illusion threatens to undermine the possibility of direct perception of the world. In The Problem of Perception (2002), A. D. Smith proposes a novel defense of Direct Realism based on a careful study of perceptual phenomenology. According to his theory, the intentionality of perception is explained in terms of three phenomenological features of perception: phenomenal three-dimensional spatiality, movement, and the Anstoss. He argues that this account of perceptual intentionality can resist a central premise of the Argument from Illusion, i.e. the "sense-datum inference." After presenting Smith's theory, I argue that he fails to distinguish two independent tasks for the direct realist, and that he underestimates the threat of the so-called "sense-datum infection." My contention is that even if Smith's theory of perceptual intentionality is correct, Direct Realism has not been saved from the Argument from Illusion. To resist the Argument from Illusion, it is not enough to merely consider how to block the sense-datum inference. The direct realist must also find a way to undermine the
... Anyone who veered too far from that ordinary use committed a cardinal philosophical sin. Much unhelpful philosophical baggage was sifted out in this manner; for instance, thanks to J. L. Austin (1962), discussions of perception are no longer lumbered by the arcane terminology of 'sense data'. However, if ordinary language philosophy debunked a certain kind of philosophical hubris (the view of the philosopher as some strange hybrid of poet and super-scientist) at its worst, it replaced it with another (the philosopher as pedantic grammar teacher arbitrating the correct uses of concepts). ...
Article
A striking feature of minimally well-functioning legal systems is their ability to decide cases which turn on morally contested concepts. A natural lawyer, for whom there is an intimate connection between law and morality, may view this as proof of an underlying moral consensus which obtains despite the superficial appearance of moral disagreement. It is more plausible, I suggest, to view this facility in terms of the legal positivist insight that law is a mechanism for social control and regulation, which operates despite the lack of moral consensus. In the present climate, there can be few concepts more contested than that of religion. On the one hand, the barbarians are at the gate, in the form of the "new atheists," such as Richard Dawkins, Daniel Dennett and Sam Harris, whose increasingly raucous diatribes against religion weigh down newspaper columns and websites. On the other, messianic zanies of all persuasions seem determined to fulfil scriptural prophecies of Apocalypse. Both sides hold wildly differing views on the nature of religious belief which in turn diverge dramatically from mainstream views. Nevertheless, in the context of this chapter, I will examine a few instances where it seems to me that the American legal system is able to work reasonably well despite the profound disagreement that bedevils American society over issues of religion and spirituality. This is all the more striking given that some of the cases involve Americans' peculiarly infantile attitude to drugs and alcohol. ISBN: 9781920899318
... In most cases I will prefer to use the term 'cognitive', which is somewhat less loaded than 'idealist'. 3. See Popper, 1969; Gregory, 1980; Austin, 1962; Swartz, 1965; Wittgenstein, 1963, to name just a few, including Kantian philosophy and all of its numerous derivatives. 4. Davis has read an early draft of this article, and his reaction is an exemplary instance of academic open-mindedness. ...
Article
Full-text available
During our century the demarcation lines between art and non-art have become vague to the extent that the continuation of art as a valuable component of culture is questionable. History of art and aesthetics have so far failed to delineate clearly those demarcation lines. Hence, an understanding of the origins of art is needed now more than ever because it may reveal the most important attributes of art in its very beginnings. This essay examines three theories which attempt to explain the origins of art from very different epistemological points of view: a naive empiricist point of view (H. Breuil), a rather simplistic cognitive point of view (E.H. Gombrich) and an extreme behaviorist point of view (W. Davis), the analysis and refutation of which comprise the major part of this essay. The analysis of these approaches to the problem shows that none offers an adequate explanation of the origins of art, mainly because each disregards either empirical or epistemological considerations or both. The behaviorist rejects all epistemological factors, but this hardly makes them immaterial; it only conceals them as implied and inevitable assumptions. An interdisciplinary approach is called for in order to elucidate the problem of the origins of art.
Article
Full-text available
In design and architecture, mathematical models commonly play a role of abstraction as well as of standardisation and prescription. Today, the rise of simulations based on object-oriented programs is making the contribution of formal models more flexible, more complex and richer. Unlike the traditional mathematical modelling inherited from materials physics or other branches of physics, agent-based simulation approaches make it possible to interweave diverse, heterogeneous, evolving representations and constraints that are sensitive to their environment, an environment considered at an ever finer, hence more local, scale, leaving ever more room for singularity rather than for standardisation and uniformity. The capacity to integrate, step by step and from the computer-aided design stage onwards, constraints that are natural and environmental as well as aesthetic or ethical makes it possible to discern in these new regimes of design an effect of accretion and concretisation, itself due to an unprecedented circumspection of those organs of formalisation and design that agent-based simulations are. In many respects this capacity, new among formalisation techniques for design, is paradoxical and counter-intuitive. It demands to be allowed to unfold without any attempt to reduce its meaning and scope from the outset. This is why we propose to promote a parti pris of computational things, in particular in the emerging fields of computational design and architecture.
Article
Full-text available
Jerry Fodor (1985) has joked that philosophers have always been prone to eccentric worries such as an anxiety about the existence of tables and chairs, but with the issue of mental representation they have found a problem that is real and crucial for progress in the cognitive sciences. However, given Fodor's 'methodological solipsism' of computational symbols and their 'formality condition', Jackendoff (1992) has facetiously asked "Why, if our understanding has no direct access to the real world, aren't we always bumping into things?" It is no accident that Jackendoff's parody recalls Samuel Johnson's famous retort to Berkeley's "ingenious sophistry" by kicking a stone. There is an acute irony in the fact that cognitive science has simply rediscovered the philosophers' traditional worry about tables and chairs. Accordingly, it is not surprising that Fodor's latest book Hume Variations endorses the classical Empiricist 'idea' idea of Locke, Berkeley and Hume. The paper explores Fodor's concept of ideas as mental objects in relation to its historical antecedents.
Article
Since the demise of the Sense-Datum Theory and Phenomenalism in the last century, Direct Realism in the philosophy of perception has enjoyed a resurgence of popularity.[1] Curiously, however, although there have been attempts in the literature to refute some of the arguments against Direct Realism, there has been as yet no systematic treatment of all eight of the main arguments against it.[2] The aim of this paper is to fill this lacuna in the literature by discussing all eight of these arguments against Direct Realism and the argumentative strategies Direct Realists may deploy to counter them.
Conference Paper
Full-text available
Through a procedure of objections and responses, we first review some of the arguments for and against the empirical character of computer simulation. At the end of this clarifying path, we propose arguments in favour of the concrete character of simulated objects in science, which legitimises speaking of them as an experiment, more specifically a concrete experiment of the second kind.
Conference Paper
Full-text available
The credibility of digital computer simulations has always been a problem. Today, through the debate on verification and validation, it has become a key issue. I will review the existing theses on that question. I will show that, due to the role of epistemological beliefs in science, no general agreement can be found on this matter. Hence, the complexity of the construction of sciences must be acknowledged. I illustrate these claims with a recent historical example. Finally, I temper this diversity by insisting on recent trends in environmental sciences and in industrial sciences.
Article
"Original submitted in August 1963." "Facsimile reprint" (colophon). Thesis (PhD), Trinity College, University of Cambridge, 1963. Includes bibliographical references (pp 355-371).
Andrea Branzi, 'Le mouvement radical', Architectures expérimentales 1950-2000, collection from FRAC Centre, Editions HYX, June 2003.
Aldo Rossi, L'architettura della città, 1966. Translated as The Architecture of the City, Oppositions Books and the MIT Press (New York), 1984.