HYLE – International Journal for Philosophy of Chemistry, Vol. 10 (2004), No. 2, 65-82.
Copyright 2004 by HYLE and Bernadette Bensaude-Vincent.
Two Cultures of Nanotechnology?
Bernadette Bensaude-Vincent
Abstract: Although many active scientists deplore the publicity about Drexler’s futuristic scenario, I will argue that the controversies it has generated are very useful, at least in one respect. They help clarify the metaphysical assumptions underlying nanotechnologies, which may prove very helpful for understanding their public and cultural impact. Both Drexler and his opponents take inspiration from living systems, which they both describe as machines. However, there is a striking contrast in their respective views of molecular machineries. This paper, based on semipopular publications, is an attempt to characterize the rival models of nanomachines and to disentangle the worldviews underpinning the uses of the biological reference on both sides. Finally, in an effort to point out the historical roots of the contrast in the concepts of nanomachines, I raise the question of a divide between two cultures of nanotechnology.
Keywords: nanotechnology, self-assembly, molecular assembler, biomimetism,
mechanism, dynamism.
1. Introduction
Over the past decade, Eric Drexler’s successful volume Engines of Creation
(1986) and the debates generated by its futuristic visions have been prominent
in drawing public attention toward nanotechnology. Most scientists active in
the field think that too much attention has been paid to this debate and they
try to distance their own ‘serious’ research programs from Drexler’s unrealistic
scenario. At least the rejection of Drexler’s rhetoric acts as a unifying principle
in the otherwise heterogeneous crowd of scientists involved in nano-initiatives.
However, as with many controversies in science, the debates about Drexler’s universal assemblers and the grey goo scenario have been extremely profitable insofar as they have helped clarify the philosophical assumptions underlying projects of nanoscience.¹
Without claiming that the future of nanotechnology
hinges on such debates, I will argue that they enlighten the public about the
cultural roots and cultural projects of nanoscientists and engineers. In this
respect, it is equally important to point out convergences and divergences be-
tween Drexler and his followers on one side and chemists such as Richard
Smalley and George Whitesides who criticized Drexler’s views of universal
assemblers on the other side.
Drexler and his opponents share a common interest in biological systems.
Already in Richard Feynman’s almost legendary prophecy, there was a quick
reference to biological material, where enormous amounts of information
could be stored in exquisitely small spaces. Since 1959 and the early days of
molecular biology, chemists, materials scientists and engineers have intensi-
fied and diversified their references to biology, even before the term ‘nano-
technology’ was coined. Bio-inspirations prevailed when the ‘bottom-up’ ap-
proach, the design of structures molecule by molecule (rather than atom by
atom) became one of the major goals of nanotechnology. In contrast to the
structures usually designed by engineers at the macrolevel, biomaterials are
built from bottom up. Life operates by bonding atoms or groups of atoms
instead of by carving a structure from raw materials. The convergence of
nanotechnologies and biotechnologies is rooted in the claim that ‘bio is
nano’, that biomaterials are structured from bottom up.
It is not my purpose to discuss the validity of such claims through a com-
parison of nature’s strategies and nanoscientists’ biomimetic attempts (see
Ball 2002). Rather, I would like to emphasize that the debate about the potentialities of nanotechnology basically boils down to the question ‘what is a nanomachine?’ However, the notion of machine is itself polysemic, so that it
can support dissimilar views of living systems and teach quite different les-
sons to nanoscientists and engineers.
2. Machine: An all-pervading metaphor
Over the past decades, the machine metaphor has invaded the language of bi-
ologists. In the early times of molecular biology, such metaphors were exclu-
sively used for DNA transcription and translation. Nowadays each entity ac-
tive in the cell is described as a machine: ribosomes are assembly lines, ATP
synthases are motors, polymerases are copy machines, proteases and proteasomes are bulldozers, membranes are electric fences, and so on (Goodsell 2003, Zhang 2003). Although biologists generally agree that living systems
are the product of evolution rather than of design, they describe them as de-
vices designed for specific tasks. Indeed, if biology can teach us about engi-
neering and manufacturing, it is because the living cell is now viewed as a fac-
tory crowded with numerous bionanomachines in action.
At the same time, in chemistry and materials science, machine metaphors
have also become prominent. One major objective of nanotechnology pro-
grams is to build nanomachines that will do a better job than conventional
machines. As they seek to design functional materials, physicists and chem-
ists readily redefine the products of their design as machines: wheelbarrow
molecules, cantilever molecules, springs, and switches are specimens of the
inventions commonly reported in materials journals.
Thus the languages of molecular biology and materials science remarkably
converge in a stream of machine metaphors. Through a continuous process of
mutual transfer of concepts and images, they have built a common paradigm
based on an artificialist view of nature. Nature is populated with nanomachines
that human technology should be able to mimic or even to surpass.
Drexler and other advocates of the nano revolution primarily find in mo-
lecular biology a reply to all nanoskeptics. The data of molecular biology are a chief argument for the feasibility of nanofabrication:²
One might doubt that artificial nanomachines could even equal the abilities of
nanomachines in the cell, if there were reason to think that cells contained some
special magic that makes them work. This is called vitalism. Biologists have aban-
doned it because they found chemical and physical explanations for every aspect
of living cells yet studied, including their motion, growth, and reproduction.
Drexler thus rejuvenated the positivist crusade of 19th-century synthetic
chemists like Marcellin Berthelot against the limits imposed by superstition
or by the metaphysical belief in a vital force. The existence of life itself is proof that nanomachines are feasible, according to Marvin Minsky of the MIT Media Lab and AI Lab:³
It seems quite strange for anyone to argue that you cannot build powerful
(but microscopic) machinery – considering that our very own cells prove that
such machines can indeed exist. And then if you look inside your cells you will
find smaller machines that cause disease. Most arguments against nanotech-
nologies are arguments against life itself.
From this quotation, it is clear that life provides more than just an invitation to
build nanomachines; it rather constitutes an imperative. Life is a source of crea-
tivity, a legitimation of the enterprise as well as a reason to believe in its future.
In Drexler’s view, nanotechnology is ‘molecular manufacture’. The notion
of molecular engineering is nothing new. As early as the 1950s, the term was
used by a number of scientists who worked for the promotion of materials
science and engineering (MSE) in American universities. Before the label
‘MSE department’ was adopted, this new branch was often referred to as
‘molecular engineering’.⁴
What is specific about Drexler’s program is the no-
tion of manufacture, which conveys the vision of mass-production that will
transform society. From the publication of his very first article in 1981,
Drexler shifted from the notion of molecular engineering to that of manufac-
ture.
This early presentation of what could be a bottom-up process was clear-
ly inspired by biology.
Biochemical systems exhibit a ‘microtechnology’ quite different from ours:
they are not built down from the macroscopic level but up from the atomic.
Biochemical microtechnology provides a beachhead at the molecular level
from which to develop new molecular systems by providing a variety of ‘tools’
and ‘devices’ to use and to copy. Building with these tools, themselves made to
atomic specifications, we can begin on the far side of the barrier facing con-
ventional microtechnology. [Drexler 1981, p. 5275]
The artificialist view of biological systems thus encouraged a project focused on imagining small machines that could ‘pick and place’ and assemble pieces on the model of robots and assembly lines in a car factory. A few years later, given the scale of operation, Drexler embarked on the fiction of self-replicating assemblers, which raised the prospect of myriads of nanoassemblers copying themselves and consuming all the resources of the earth. The now all-too-familiar grey goo scenario was a direct and logical consequence of Drexler’s choice of a manufacturing model. Although Drexler recently regretted his speculations on the grey goo, it is important to emphasize that for him engineering and technology basically consist in manufacturing.⁵
While the controversy raised by Drexler focused on the feasibility of uni-
versal assemblers, it became increasingly obvious that his opponents ques-
tioned the model of manufacture without rejecting the machine metaphor.
Significantly, George Whitesides, a professor of chemistry at Harvard Uni-
versity, developed his argument against Drexler’s molecular assemblers
in a paper entitled ‘The Once and Future Nanomachines’ (Whitesides 2001).
Whitesides contrasts human-made machines with natural machines but he
never questions the machine metaphor.
Nanoscale machines already do exist, in the form of the functional molecular
components of living cells – such as molecules of protein or RNA, aggregates
of molecules, and organelles (‘little organs’) – in enormous variety and sophis-
tication. The broader question of whether nanoscale machines exist is thus one
that was answered in the affirmative by biologists many years ago. The ques-
tion now is: What are the most interesting designs to use for future na-
nomachines? And what, if any, risks would they pose? [Whitesides 2001, p.
78]
Drexler’s molecular manufacture is depicted as an old-fashioned and outdated
model that has to be replaced by a more modern and more fashionable model
taken from living cells. Mimicking human-scale machines is both inadequate
and inefficient given the constraints of fabrication at the nanoscale. By con-
trast, mimicking the simplest cellular nanomachines is a marvelous challenge.
In other terms, the dispute between Drexler and Whitesides seems to rest
on two rival models of machinery. Both of them agree that nanotechnology
should take inspiration from living organisms, but they part company when it
comes to the ways of making those nanomachines.
3. Drexler’s mechanical machines
What is ‘life’ for Drexler and his colleagues of the Foresight Institute? From
the outset, Drexler explicitly based his plan on a close comparison between
biochemical components and the operating units of macroscopic machines as
shown in his 1981 article (Table 1).
Table 1. Comparison of macroscopic and microscopic components (source: Drexler 1981).

Technology                 | Function                        | Molecular example(s)
Struts, beams, casings     | Transmit force, hold positions  | Microtubules, cellulose, mineral structures
Cables                     | Transmit tension                | Collagen
Fasteners, glue            | Connect parts                   | Intermolecular forces
Solenoids, actuators       | Move things                     | Conformation-changing proteins, actin/myosin
Motors                     | Turn shafts                     | Flagellar motor
Drive shafts               | Transmit torque                 | Bacterial flagella
Bearings                   | Support moving parts            | Sigma bonds
Containers                 | Hold fluids                     | Vesicles
Pipes                      | Carry fluids                    | Various tubular structures
Pumps                      | Move fluids                     | Flagella, membrane proteins
Conveyor belts             | Move components                 | RNA moved by fixed ribosome (partial analog)
Clamps                     | Hold workpieces                 | Enzymatic binding sites
Tools                      | Modify workpieces               | Metallic complexes, functional groups
Production lines           | Construct devices               | Enzyme systems, ribosomes
Numerical control systems  | Store and read programs         | Genetic system
With struts, cables, fasteners, glue, motors, bearings, containers, pumps, and
clamps, Drexler’s living bodies are surprisingly reminiscent of Descartes’ ani-
mal-machines. In both cases, the living machine is made of a set of independent pieces – a few building blocks mechanically assembled by a designer.
Drexler described molecules as rigid building blocks similar to the parts of
tinker toys – whether they are Meccano or Lego construction sets. The func-
tions performed by the various pieces of molecular machinery are also essen-
tially mechanical. They position, move, transmit forces, carry, hold, store, etc.
Although Drexler declared that his molecular manufacture is the extrapolation to the smallest scale – by a process of ‘mental shrinking’ – of today’s automated factories (Drexler 2001, p. 74), his automata look like Vaucanson’s automata performing complex tasks thanks to an assembly of simple mechanisms.
Drexler is fond of the metaphor of ‘molecular hands’ manipulating nano-
objects and placing them wherever they need to go to perform the desired
function. Nanosystems are like factories engaged in a rigid framework of con-
trolled motions using the building blocks of matter as raw materials.
As in Descartes’ theory of animal-machines, the tasks to be performed by
the nanomachine, i.e. the direction of its movements, are embedded by the
designer in the mechanical devices. The assembly process itself is described
with the metaphor of “mechanosynthesis” or “the use of mechanical control
to guide the placement of molecules so as to build complex objects” (Drexler
1995, p. 6). The keyword is “molecular assembler”. This is the magic wand
that binds together the pieces in an arrangement allowing them to perform
useful tasks. Molecular assemblers are “devices able to guide chemical reac-
tions by positioning reactive molecules with atomic precision” (Drexler
2003b). They are neither specific nor individual molecules. They are de-
scribed as universal, all-purpose assemblers that can assemble all kinds of ma-
terials in the same way that ribosomes can assemble all kinds of proteins.
We know that Drexler shaped his program of molecular manufacturing
while he was a research affiliate at the MIT Space Systems Laboratory and then at the MIT Artificial Intelligence Laboratory, under the sponsorship of Marvin Minsky. It is therefore not unlikely that his program was influenced by cyberneticians’ concepts. Although Drexler’s references to von Neumann in Engines of Creation are limited to his studies on self-replicating machines, he might also have borrowed from him the notion of ‘universal assemblers’, which were able to grab components out of their location and put them together according to pro-
grammed instructions. Similarly, Drexler’s assemblers would move atoms,
place them in the right position, and selectively bind them.
Drexler’s program thus seems to combine two models of machines. On the
one hand, his description of molecular manufacture rests on classical mechan-
ics, requiring only space, matter, and motion. In this sense, his matter is like
Boyle’s uniform, catholic matter, deprived of spontaneity as well as of individu-
ality. Molecular machines, like clock mechanisms, require the hands and the
brain of a clock-maker. As Georges Canguilhem emphasized in a commentary
on Cartesian mechanism, such mechanical machines are not deprived of finali-
ty: all the teleology is concentrated at the starting point, in the act of design;
and it is naively anthropomorphic (Canguilhem 1952, pp. 113-4). Canguilhem
characterized the teleology inherent in Cartesian mechanism as ‘technological
anthropomorphism’ as opposed to ‘political anthropomorphism’. On the other
hand, Drexler implicitly refers to computational machines, but without facing the challenge of complexity that von Neumann clearly prophesied.⁶
A second major feature that Drexler retained from biological systems is
that they operate under programmatic control. He consequently shaped “a
world in which digital data can be used to control general-purpose machines
that will put the fundamental building blocks of matter in place to build al-
most everything” (Drexler 1995, p. 17). The DNA-RNA system provides the
code and the instructions for the machine to operate. Protein assembly works
according to rigid instructions, in a clean and efficient manner. Drexler’s mo-
lecular manufacture is described in stark contrast with chemical manufacture.
Conventional chemical reactions are extraordinarily messy:
Chemists today make complex molecular structures by taking smaller pieces,
putting them together, stirring, and hoping that they will fall together to make
the right product. If you imagine trying to make an automobile by taking parts,
putting them into a box, shaking, hoping that they will fall together to make a
working machine, you will conclude that it is very useful to have robots or
hands, or something like them involved in the process. [Drexler, 1995, p. 2]
Chemistry looks so primitive and dirty when compared to protein machines
that Drexler wonders how chemists, lacking the “molecular hands with which
to put the parts where they want them”, have managed to achieve such re-
markable things. In living things, then, Drexler finds a precious guide to im-
proving chemical technologies. Enzymes are his favorite model of assemblers.
“[Enzymes] assemble large molecules,” he explains, “by ‘grabbing’ small mol-
ecules from the water around them, holding them together so that a bond
forms.” In this manner they assemble DNA, proteins, and many other bio-
logical items. It should therefore be possible to put them to work on metal
ions or complex structures in order to wield molecules with the precision of
programmed machines. However, if enzymes and proteins show the way to
build nanomachines, they do not provide a perfect model for nanotechnolo-
gy. Drexler proposes to use protein machines only for the first generation of
nanomachines because they present serious flaws as engineering materials.
The amino acids of which they are composed are simply not tough enough
for the construction of nanomachines. Drexler’s ambition is to mimic life’s
devices working under genetic instructions in order to build machines more
robust than organisms.
Finally, Drexler borrowed a third concept from biology – evolution – in order to legitimize his program. Drexler advocates an evolutionary model of
technological changes, presenting human technology as the continuation of
natural evolution. Chapter 2 of Engines of Creation placed the emergence of
molecular manufacturing in a grandiose picture starting with cosmic order out of chaos, then gradually evolving towards organization, replication, and technology. Evolutionary principles guide Drexler’s foresight exercises.
They are supposed to determine what paths are open and possible as well as
the limits of technological achievements. Drexler thus uses evolutionary bi-
ology in order to ‘naturalize’ the kind of technology that he encourages. In
this respect he paved the way for Ray Kurzweil’s prophecies of spiritual ma-
chines and universal intelligence.
For Ray Kurzweil, a staunch supporter of Drexler’s program, and active
promoter of Artificial Intelligence, nanotechnology is the means, but artifi-
cial intelligence is the end. Kurzweil uses evolutionary biology in order to
‘naturalize’ the kind of technology that he encourages. According to him, it is the evolution of life itself that has tended to overcome the limitations of the human brain by inventing computational technology and that now presides over the building of nanobots. This vague notion of a process of hominization is all
Kurzweil needs to establish himself as the prophet of a new era of spiritual
machines. His argument rests on two postulates: (i) human technologies are
the continuation of biological evolution; just as the flint chipper was an ex-
tension of the human hand, so the nanorobot extends the human brain; (ii)
exponential growth is the feature of any evolutionary process of which tech-
nology is a primary example (Moore’s law). The logical conclusion of this syllogism is that the golden age of nanotechnology will arrive within a couple of decades as an unavoidable future. Because it is the continuation of the natural process of evolution, we have no choice in the matter. We must simply accept it and adapt our society to a world shared with nanobots.⁷
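The rhetorical force of postulate (ii) comes from simple arithmetic. The sketch below is my own back-of-the-envelope illustration, not anything taken from Kurzweil (the doubling times are assumptions chosen for illustration): a quantity that doubles every eighteen months to two years is multiplied by thousands within a ‘couple of decades’.

```python
# Illustrative arithmetic only (not from the article): how a Moore's-law-style
# doubling compounds over a few decades.

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Factor by which an exponentially doubling quantity multiplies."""
    return 2.0 ** (years / doubling_time_years)

if __name__ == "__main__":
    for doubling_time in (1.5, 2.0):      # assumed doubling periods, in years
        for horizon in (10, 20, 30):      # time horizons, in years
            factor = growth_factor(horizon, doubling_time)
            print(f"doubling every {doubling_time} yr, after {horizon} yr: x{factor:,.0f}")
```

Twenty years of doubling every eighteen months already amounts to a factor of roughly ten thousand; this compounding is the arithmetic behind the ‘couple of decades’ horizon of the prophecy.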
To sum up this section, Drexler and his supporters have developed a con-
cept of machine that combines an old mechanistic model inherited from Car-
tesian mechanics – a passive matter moved by external agents – with a more recent computational model of machines inherited from cybernetics. Both
the mechanistic model and the cybernetic one rest on the assumption of a
blind mechanism operating without intentionality under the control of a pro-
gram. Biological evolution itself is conceived of as a blind mechanism operat-
ed and controlled by an all-powerful algorithm.
4. The Dynamic Model
A quite different perspective is conveyed by the chemists who vigorously
criticized Drexler’s model of machine. George Whitesides’ frequent use of the term ‘art’ in his papers on nanotechnology epitomizes their approach to the field.⁸
Nanostructures belong to ‘art’ both in the Aristotelian sense of
technê, or design for specific purposes, and in the sense of skill, since they re-
quire the invention of astute and unconventional methods of nanofabrica-
tion. For chemists, the age of nanotechnology is not exactly a radical break.
After all, building molecular architectures is what chemistry has done for
many centuries and chemists took inspiration from living structures before
the term nanotechnology became fashionable. In 1978, for example, bio-
inspiration led to the creation of a new branch of chemistry – supramolecular
chemistry – whose aim is to obtain molecular recognition without the help of the genetic code, through chemical processes that mimic the selectivity of biological processes. According to Jean-Marie Lehn, who coined the term ‘supramolecular chemistry’, “it is one of the chemist’s major motivations to see that biology successfully made highly complex properties on a molecular basis.”⁹
In their bio-inspiration, materials chemists are less concerned with genetic
programs and genetic engineering than with the stuff of which living things
are made. Their main purpose is to understand what is unique about biologi-
cal materials both in their structure and in the dynamics of their development
and morphogenesis.¹⁰ Living organisms are models for nanodesign first and
foremost because they present materials adapted, by design, to a set of per-
formances.
Like Drexler, materials scientists and engineers have shaped an artificialist
view of nature. For them, biological evolution is a kind of engineer designing
efficient systems. Unlike Drexler and Kurzweil, however, they assume that nature is an unsurpassable engineer. Nature is not so much a model of order as a model of ingenuity (ingenium). It is a wizard, an astute designer playing tricks with its own laws. For instance, Richard Smalley, who was awarded the Nobel Prize for the discovery of C60, describes the works of nature in superlative and playful terms:
Nature has played the game at this level [the nanoscale] for billions of years,
building stuff with atomic precision. Every living thing is made of cells that are
chock-full of nanomachines – proteins, DNA, RNA, etc. – each jiggling around
in the water of the cell, rubbing up against other molecules, going about the
business of life. Each one is perfect right down to the last atom. The workings
are so exquisite that changing the location or identity of any atom would cause
damage. [Smalley 1999]
In trying to understand the tricks used by nature to solve her ‘engineering
problems’, materials chemists received three major lessons from biology.
First, biomaterials are interesting because they are never homogeneous.
Whereas engineered materials are usually processed for a single property, bi-
omaterials are multifunctional composite structures. The interest of materials scientists, especially chemists working on high-performance composites, is to
learn something about the art of associating heterogeneous structures from
nature itself. In their effort to design composite structures at the molecular
level, they either turned their attention to such familiar materials as wood,
bone, or mucus, or to mollusk shells, insect cuticles, spider silk, etc. These composite structures – associating hard and soft, combining inorganic and organic components, and capable of high performance – appeared to be ideal
models for human technology for various reasons. They are models of func-
tional diversity, being adapted for a variety of tasks including growth, repair,
and recycling. Unlike Drexler’s machines with rigid parts, each designed for one specific function, biological nanomachines may not be mechani-
cally robust and they may not have optimal performances, but they offer a
good compromise between properties for different environments. The key to
success of living organisms does not lie in a single engineered building block
that concentrates all the instructions or information for operating the machine.
Rather, biology teaches chemists that success comes with improving the art of
mixing heterogeneous components and working out elegant solutions to com-
plex problems. Consequently, the focus is less on the ultimate components of
matter than on the relations between them. Interfaces and surfaces are crucial
because they determine the properties of the components of composite materi-
als and how they work together. Nanochemistry distinguishes itself from the
culture of purity and high vacuum chambers by advancing an impure process of
composition and hybridization that mimics natural materials. Biology does not
provide a model of highly concentrated information as suggested by Feyn-
man’s famous talk. It is a model of interaction and composition. Nature chal-
lenges nanomaterials scientists to design a composite displaying more proper-
ties than the sum of the properties of its components. In this case biology pro-
vides a model of emergence.
The major objections raised by Whitesides and Smalley concern Drexler’s
view of universal assemblers. Drexler saw in enzymes the model of universal assemblers, a sort of molecular hands capable of moving parts to the right position for assembly. This assertion has provoked the skepticism of chemists, who are well aware of the constraints of atomic reactivity. Smalley (2001) raised two objections: not only would ‘molecular fingers’ obviously take up too much space and prevent the closeness needed for reactions at the nanoscale (the ‘fat fingers’ problem); but they would also adhere to the atom
being moved, making it impossible to move a building block where you want
it to go (the ‘sticky fingers’ problem). Drexler replied to these objections in
an open letter:
My proposal is, and always has been, to guide molecular synthesis of complex structures by mechanically positioning reactive molecules, not by manipulating atoms. This proposal has been defended successfully again and again, in journal articles, in my MIT doctoral thesis […] [Drexler 2003]
He complained that Smalley attempted to undermine his scientific credentials
and that for positioning reactive molecules no computer-controlled “Smalley
fingers” are required. Smalley responded by asking, “So, if the assembler
doesn’t use fingers, what does it use?” If there is some kind of enzyme or ri-
bosome in self-replicating nanorobots, he reasoned, then there should be wa-
ter inside because enzymes and ribosomes can only work in water where they
find all the nutrients necessary for living systems. Since there is no possibility
of fine chemistry without solvent, Smalley denied that nanorobots working
in high-vacuum are chemically plausible. As Philip Ball (2003) noticed, “It is
becoming increasingly clear that the debate about the ultimate scope and pos-
sibilities of nanotech revolves around questions of basic chemistry”.
For Whitesides, Drexler’s program to force chemical reactions by placing
the reagents in the right position is useless. “Fabrication based on the assem-
bler is not, in my opinion, a workable strategy and thus not a concern. For
the foreseeable future, we have nothing to fear about the grey goo.” (White-
sides 2001, p. 83) Materials chemists simply dismiss Drexler’s scenario because their main objective is to dispense with assemblers altogether, by means of self-assembly. The height of their ‘art’ consists in making heterogeneous components spontaneously converge on the right location and assemble into larger aggregates without any external intervention. In fact, neither manipulation of the molecules nor programming of the machines is required, because the components move by themselves. A fascinating perspective was opened
up by George Whitesides (1995):
Our world is populated with machines, non-living entities assembled by human beings from components that humankind has made […] In the 21st century, scientists will introduce a manufacturing strategy based on machines and
materials that virtually make themselves; what is called self-assembly is easiest
to define by what it is not. A self-assembling process is one in which humans
are not actively involved, in which atoms, molecules, aggregates of molecules
and components arrange themselves into ordered, functioning entities without
human intervention […] People may design the process, and they may launch
it, but once under way it proceeds according to its own internal plan, either
toward an energetically stable form or toward some system whose form and
function are encoded in its parts.
To be sure, Whitesides provides here only a negative definition of self-assembly, but this does not mean that it is an obscure process that chemists do not understand. Many processes are explored to make variants of nature’s highly directional self-assembly. Chemists use templates such as mesoporous silica, or they conduct synthesis in compartments (Ball 2002, pp. 25-26). They take advantage of all possible resources of chemistry and thermodynamics in an effort to mobilize all sorts of interactions between atoms and molecules. Instead of using covalent bonds like traditional organic chemists, they make use of weak interactions such as hydrogen bonds, van der Waals forces, and electrostatic interactions. They use microfluidics and surfactants in order to produce self-assembled monolayers which, in turn, permit them to move from atomic- and molecular-level structure to macroscopic properties.
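What ‘taking advantage of thermodynamics’ means can be stated in one standard textbook relation, added here purely for illustration (it is not spelled out in the article): an assembly step that proceeds spontaneously at constant temperature and pressure must lower the Gibbs free energy of the whole system, components and solvent included.

```latex
% Textbook criterion for a spontaneous self-assembly step (illustrative addition,
% not from the article):
\Delta G_{\text{assembly}} \;=\; \Delta H \;-\; T\,\Delta S \;<\; 0
```

The ordering of the assembling components lowers their own configurational entropy, so the balance must be paid by the enthalpy of the many weak bonds formed and, frequently, by entropy released to the solvent.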
Self-assembly presupposes that the instructions for assembly are integral
to the material components themselves or that they are embedded in their re-
lations. Matter can no longer be viewed as a passive receptacle upon which in-
formation is imprinted from the outside because self-assembly rests on spon-
taneous reactions between materials. Molecules have an inherent activity, an
intrinsic dynamis allowing the construction of a variety of geometrical shapes
(helix, spiral, etc.). It is not an obscure and mysterious vital force, a breath, or
animus that would come from the outside to give life to inanimate matter. It is
more like Claude Bernard’s inner force guiding phenomena generated by
physico-chemical causes. But ironically, it is the reductionist approach of molecular biology – the understanding of the mechanisms of molecular recognition as well as the process of morphogenesis – that eventually allowed chemists to develop such emergentist views of molecular architectures.¹¹
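How far local interactions alone can take the place of an external assembler can be illustrated with a deliberately minimal toy model. The sketch below is my own illustration, not a model drawn from Whitesides or from the supramolecular literature: particles diffuse at random on a lattice, moves are accepted with the Metropolis rule under a weak nearest-neighbour attraction, and ordered aggregates appear without any ‘hand’ placing anything.

```python
# Toy lattice model of self-assembly (an illustrative, assumption-laden sketch,
# not from the article): random diffusion plus a short-range attraction is
# enough to turn a scattered initial state into compact aggregates.
import math
import random

SIZE = 20          # lattice is SIZE x SIZE with periodic boundaries
N_PARTICLES = 60
EPSILON = 1.0      # energy gained per nearest-neighbour contact (arbitrary units)
KT = 0.4           # thermal energy; low values favour aggregation
STEPS = 200_000

def neighbours(site):
    x, y = site
    return [((x + 1) % SIZE, y), ((x - 1) % SIZE, y),
            (x, (y + 1) % SIZE), (x, (y - 1) % SIZE)]

def contacts(site, occupied):
    """Number of occupied sites adjacent to `site`."""
    return sum(1 for n in neighbours(site) if n in occupied)

def total_contacts(occupied):
    """Total number of particle-particle contacts, each counted once."""
    return sum(contacts(s, occupied) for s in occupied) // 2

random.seed(0)
occupied = set()
while len(occupied) < N_PARTICLES:                 # random initial configuration
    occupied.add((random.randrange(SIZE), random.randrange(SIZE)))
print("contacts before:", total_contacts(occupied))

for _ in range(STEPS):
    site = random.choice(tuple(occupied))
    target = random.choice(neighbours(site))
    if target in occupied:
        continue                                   # target site already taken
    old = contacts(site, occupied)                 # contacts lost by leaving `site`
    occupied.remove(site)
    new = contacts(target, occupied)               # contacts gained at `target`
    delta_e = EPSILON * (old - new)                # binding lowers the energy
    if delta_e <= 0 or random.random() < math.exp(-delta_e / KT):
        occupied.add(target)                       # accept the move
    else:
        occupied.add(site)                         # reject: put the particle back
print("contacts after: ", total_contacts(occupied))
```

Running the sketch shows the number of particle-particle contacts rising sharply from the random initial state: the ‘assembly instructions’ reside entirely in the local interaction energies, which is the point the chemists quoted here make against the picture of an external assembler.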
A third contrast between the chemists’ and Drexler’s views of na-
nomachines resides in their attention to complexity. Here, this term is taken
in a weak sense, referring to non-linear processes. Complexity became a prob-
lem when chemists started to examine the behavior of single molecules instead
of dealing with Avogadro numbers of molecules. How molecules cooperate to produce the average properties and behavior of familiar macroscopic chemicals became a puzzling question (Whitesides & Ismagilov 1999). In fact, chemists had suspected that nanoparticles behave differently from macroscopic chemical substances long before the coming of nanoscience.¹² Gold, usually characterized by its yellow color, becomes red when processed into nanospheres. More generally, the color of metal and semiconductor nanoparticles
depends on their size, a property commonly used in the glass industry. Today,
it is also used to design magnetic materials with iron/platinum colloids, an ap-
plication that has rendered colloid synthesis a highly sophisticated and prom-
ising domain of nanochemistry (Evans & Wennerstrom 1999). Given this
long-standing attention to size-sensitive properties, the discovery that carbon nanotubes can be either metallic or semiconducting depending on their size and geometry did not come as a revelation in theoretical chemistry. Chemists were prepared to admit that el-
ements have special properties and behavior when processed at the nanoscale.
Unlike computer scientists, who are eager to replicate conventional machines
at the nanolevel, materials scientists focus mainly on size-sensitive properties.
Their work comprises the entire hierarchy of structures in living systems,
from large molecules that assemble at the nanoscale to form organelles, to
cells, tissues, and organs that ultimately compose unique organisms. There-
fore, they cannot rely on a uniform view of nature as being the same at all
scales. While it is true that the laws of nature are universal, chemists do not
assume that they apply equally to all scales.
To sum up, chemists working on the design of nanomaterials seem to rely
on a specific underlying view of machines that revives a number of anti-
mechanistic notions. They do not deprive matter of spontaneity or dynamis;
instead of assembling prefabricated building blocks, they play with composi-
tion and interfaces; instead of inferring from the macroscale to the nanoscale, they assume a hierarchy of structures. While Drexler’s efforts are aimed at eliminating chemistry in order to work under the strict control of a program, they mobilize all possible resources of chemistry, kinetics, and thermodynamics.
5. Historical roots
Clearly engineers and chemists have two irreconcilable views of na-
nomachines. So striking is the contrast that it raises the question: are there
two cultures within the field named nanotechnology? In their revolutionary
claims, Drexler and his followers never mention earlier attempts at taking in-
spiration from life. His emphasis on the bottom-up approach creates a dis-
continuity with more traditional materials processes. Moreover, thanks to the
reference to Feynman, nanotechnology seems to be rooted in quantum physics, thus proceeding from a ‘noble’ theoretical science rather than from ‘dirty’ experimental physics or materials engineering. However, this was not the first biomimetic venture. There had been many previous attempts at mimicking living organisms at the macro- and microlevels.
Biomimetism has been a leitmotif in technology from mythical attempts –
the wings of Daedalus – up to the more recent examples like velcro. In many
technological areas, such as aeronautics, architecture, and textiles, mimicking
living things has been a common practice that has led to some brilliant results.¹³ Biomimetism is more than a handful of occasionally successful bio-
inspired inventions. It became a research program in the 20th century, initiated by D’Arcy Thompson, a zoologist who applied mathematics to the study of living shapes and physics to the study of their growth. In On Growth and Form (1992 [1942]), he argued that the different parts of an organism are optimally shaped. This book was the root of a joint approach to living organisms by biologists and engineers. Bionics (literally, ‘units of life’) was an at-
tempt to evaluate the efficiency of an organism or a machine, to measure the
structures and processes by which the ‘purposes’ or ends of the system were fulfilled.¹⁴ In the postwar period, biomimetism benefited from strong support from the US Army, naval research, and the National Institutes of Health.
The term ‘biomimesis’ was introduced in 1961 at the second symposium on
bionics by Warren S. McCulloch, a neuroscientist at the Research Laboratory of Electronics at MIT, as a generic concept. Taking the term in its most extensive sense, “the imitation of one form of life by another”, McCulloch (1962) included the mimetic strategies for avoiding enemies or catching prey that are predetermined in the genes of insects. McCulloch divided bio-
mimesis into two distinct fields, cybernetics and bionics. Cybernetics, he ar-
gued, deals with control functions rather than with mechanical work.¹⁵ It is
mainly concerned with regulation mechanisms and feedback control. By con-
trast, ‘bionics’ was defined “as an attempt to understand sufficiently well the
tricks that nature actually uses to solve her problems, this enabling us to turn
them into hardware” (ibid., p. 393). According to McCulloch, the latter requires more than interdisciplinarity: it requires new skills. He called for a novel science and a new organization of scientific research, because “one has to have a reasonable knowledge of both engineering and biology in his own head” for the purpose of understanding living systems. First, he called on logicians to join the program, because new skills in logic and mathematics are necessary to understand the complex organization of living systems. Second, he called for in-
stand the complex organization of living systems. Second, he called for in-
creasing work on the thermodynamics of open systems because the major
development that he saw coming was the understanding of natural processes
that go on along with ever-increasing entropy: how order evolves from the
inside instead of being forced upon a material after torturing it. In bionics the
emphasis was on the holistic structure of living organisms. For instance, in an
introductory paper entitled ‘Bio-logic’, Heinz von Foerster argued that the
fundamental principle in life was ‘coalition’ rather than self-reproduction.
What I call coalition is an aggregate of elements which jointly can do things
which all of them separately could never achieve. It is characterized by a su-
peradditive nonlinear composition where the whole is more than the sum of
the measure of the parts. [Foerster 1962]
Finally, McCulloch identified a third, minor trend of biomimesis: the design of artificial organisms capable of evolving and learning. At the time it was just a small group interacting with the community, but this trend has become extremely fashionable in materials science and engineering over the past decades. Materials scientists look at Nature as an unsurpassable designer of
optimal, multi-functional, and self-repairing structures (Bensaude-Vincent et
al. 2002). They are trying to understand ‘the tricks that nature actually uses to
solve her problems’, and to mimic them in order to solve their own problems.
Beyond McCulloch’s dual genealogy of biomimetism, the current divorce
between two paradigms of nanotechnology resonates with an older philo-
sophical problem. The current trend generates serious ‘epistemological risks’.
The mechanistic model may have a heuristic power for some time as it had,
for instance, in the history of medicine. However, its epistemic relevance as a
simplifying model may turn into an epistemic obstacle because it ignores the inner dynamics and powers at work both in living organisms and in technological systems. Moreover, as Georges Canguilhem suggested in a paper on ‘machine and
organism’, the mechanization of life is inseparable from a project of instru-
mentalization of life and control over nature. Descartes’ theory of animal-
machines rested on a systematic depreciation of animals in order to legitimize
their utilization as tools by humans (Canguilhem 1952, p. 111). Ethical and
epistemological issues are closely intertwined.
At this critical point, it may be helpful to go back to the ancient Greek
notion of technê.¹⁶ It is well known that, while Aristotle defined technê as a mimesis of nature, he did not hesitate to draw analogies from the arts to describe nature as a craftsman displaying the ingenuity associated with mechanics.
There is nothing new in the current artificialization of nature. Already in an-
tiquity, there were two different and occasionally conflicting views of tech-
nology. On the one hand, the arts or technai were considered as working
against nature, as contrary to nature. This meaning of the term para-physin
provided the ground for repeated condemnations of mechanics and alchemy.
On the other hand, the arts – especially agriculture, cooking, and medicine –
were considered as assisting or even improving on nature by employing the
dynameis or powers of nature. In the former perspective, the artisan, like Pla-
to’s demiurgos, builds up a world by imposing his own rules and rationality
on a passive matter. Technology is a matter of control. In the latter perspec-
tive the artisan is more like the ship-pilot at sea. He conducts or guides forces
and processes supplied by nature, thus revealing the powers inherent in matter.¹⁷
Undoubtedly, the mechanistic model of nanotechnology belongs to the demiurgic tradition. It is a technology fascinated by the control and surpassing of nature.
Nanotechnology and biotechnology are mainly concerned with the con-
trol of nature at the most basic level, i.e. the level of atomic building blocks.
It does not really matter whether the control of the molecular machinery is in
the hands of humans or in the hands of posthuman cyborgs. The grey goo
scenario is just the continuation of a long tradition of mythologies and fic-
tions ranging from Prometheus to Faust and Frankenstein. Yet there re-
mains an alternative future that could make nanotechnology more akin to ag-
riculture or traditional medicine. Susan Lindquist from the MIT Whitehead Institute once said: “About 10,000 years ago, [humans] began to domesticate plants and animals. Now it’s time to domesticate molecules.” (quoted in
Zhang 2003, p. 1177) In this case, blurring the boundary between life and
matter invites neither reductionism nor dreams of control. On the contrary,
nanoscientists dealing with isolated molecules cannot adopt the standard sub-
ject-object relation. Isolated molecules tend to become more like individuals
or partners whom science and technology try to domesticate. If scientists and
engineers were ready to behave more like farmers relying on plants and animals, or like pilots relying on the winds to guide their boats at sea, our future might be less tragic than it seems today. Sailors know that all journeys are risky and that their jobs require many precautions, because they have to negotiate with natural elements, which necessarily involves a good deal of uncertainty.
Notes
1. A major source on this controversy is the special issue of Scientific American, Sep-
tember 2001. See also the open correspondence between Richard Smalley and Eric
Drexler available on the website of the Foresight Institute; Chemical & Engineer-
ing News, December 1, 2003, Vol. 81, No. 48, pp. 37-42
2. Drexler 1986, p. 17. See also the comment posted by Lenester on Mind X
04/17/2003 on Drexler’s “An open letter to Richard Smalley” [www.kurzweilai.
net/meme/frame.html]: The very idea that something which is clearly done in na-
ture cannot also be done by us, is counter to the most basic spirit of science. It
hearkens back to an age of magical descriptions, implying that there’s some mystic
Stuff out there which is beyond our mortal ken.
3. Minsky 1995, p. 193; Rietman 2001, p. 2.
4. For instance, as early as 1956, Arthur von Hippel, professor at MIT, advocated an interdepartmental research center named ‘molecular engineering’. The emerging dis-
cipline was aimed at designing new materials on the basis of molecular understand-
ing. It comprised the structure, formation, and properties of atoms, molecules, ions,
of gases, liquids, solids, and their interfaces. Electrical, magnetic, mechanical parame-
ters were considered the most fundamental (MIT archives, AC 12, Box 71).
5. Nature, vol. 429, 10 June 2004, p. 591. See also Phoenix & Drexler 2004.
6. Dupuy 2000. Drexler did his Ph.D. in Marvin Minsky’s laboratory; Minsky had written his own doctoral thesis under von Neumann.
7. This is the conclusion of Kurzweil’s testimony quoted above: technology has always been a double-edged sword, so we simply need to implement ‘defensive technologies’ against self-replicating nanobots, in the same way as our society is defending itself against computer viruses. See also Kurzweil 1998.
8. See for instance Whitesides 1998 and Whitesides & Love 2001.
9. Lehn 2004; see also Lehn 1995.
10. See for instance Sarikaya & Aksay 1995.
11. Emergence here should be understood in thermodynamic terms as the production of higher order out of lower order, which, according to Norbert Wiener, was a major characteristic of machines and living organisms alike. Self-assembly is a process
leading from less ordered to higher thermodynamically ordered ensembles of mole-
cules or macromolecules. The resulting aggregates have new properties that could
not have been predicted from the characteristics of individual components. A major
difference lies in the fact that aggregates formed in a laboratory environment are in a
state of equilibrium, whereas in living beings most of them are out of equilibrium.
12. This phenomenon was observed in metal colloids or hydrosols by Michael Faraday in the mid-19th century and became known as the ‘Tyndall effect’ after Tyndall extended Faraday’s earlier observations. Suspended particles that are small relative to
tended Faraday’s earlier observations. Suspended particles that are small relative to
the wavelength of visible light (with radii of approximately 20 nm) are brilliantly
colored in red, green, and violet because the interaction with the incoming light is
a combination of absorption and scattering (Arribart 2004, p. 363).
13. Vogel 1998, pp. 249-75. Among the most famous examples of successful copies
are the Crystal Palace designed by Joseph Paxton whose roof allegedly copied a
giant water lily; the spinneret for extruding textile fibers inspired by the organ of
silkworms; barbed wire; and Velcro, invented by the Swiss engineer George de Mestral on the model of the hooked burrs that clung to his socks.
14. See for instance Howland 1962.
15. According to McCulloch, cybernetics emerged from the steam engine, when Jul-
ian Bigelow pointed out that it was only the information concerning the outcome
of the previous act that had to return.
16. See for instance Schiefsky (forthcoming) and Staden (forthcoming).
17. On the contrast between the two definitions of technology in the case of genet-
ically modified organisms, see for instance Larrère 2002.
References
Arribart, H.: 2004, ‘Les nanomatériaux autres que ceux des technologies de
l’information et des communications (TICS)’, in: Académie des sciences,
Académie des technologies (eds.), Nanosciences, nanotechnologies, éditions Tec
& Doc, Paris, pp. 361-382.
Ball, P.: 2002, ‘Natural Strategies for the molecular engineer’, Nanotechnology, 13, 15-28.
Ball, P.: 2003, ‘Nanotechnology in the firing line’, Nanotechweb.org, 23 December
[http://www.nanotechweb.org/articles/society/2/12/1/1].
Bensaude-Vincent, B.; Arribart, B.Y.; Sanchez, C.: 2002, ‘Chemists at the School of
Nature’, New Journal of Chemistry, 26, 1-5.
Canguilhem, G.: 1952, ‘Machine et organisme’, in: La connaissance de la vie, Hachette, Paris [quoted from the fourth edition, Vrin, Paris, 1971].
Drexler, K.E.: 1981, ‘Molecular engineering: An approach to the development of gen-
eral capabilities for molecular manipulation’, Proceedings of the National Acad-
emy of Sciences, 78, no. 9, chemistry section, 5275-78.
Drexler, K.E.: 1986, Engines of Creation, Anchor Books, New York.
Drexler, K.E.: 1992, Nanosystems. Molecular machinery, manufacturing and computa-
tion, John Wiley & Sons, New York.
Drexler, K.E.: 1995, ‘Introduction to nanotechnology’, in: Krummenacker, M. & Lewis, J. (eds.), Prospects in Nanotechnology. Proceedings of the 1st general conference on nanotechnology: developments, applications, and opportunities, November 11-14, 1992, Palo Alto, John Wiley & Sons, New York, pp. 1-20.
Drexler, K.E.: 2001, ‘Machine-Phase nanotechnology’, Scientific American, (Sept.),
66-67.
Drexler, K.E.: 2003a, ‘Open Letter’, Chemical & Engineering News, 81, no. 48, 37-42.
Drexler, K.E.: 2003b, ‘An open letter to Richard Smalley’, April 16, [published on
KurzweilAI.net].
Dupuy, J.P.: 2000, The Mechanization of the Mind, Princeton University Press, Prince-
ton N.J.
Evans, D.F.; Wennerstrom, H.: 1999, The Colloidal Domain: Where Physics, Chemis-
try, Biology and Technology Meet, John Wiley & Sons, New York.
Foerster, H.v.: 1962, ‘Bio-Logic’, in: E.E. Bernard & M.R. Kare (eds.), Biological pro-
totypes and synthetic systems, Plenum Press, New York, vol. 1., pp. 1-12
Goodsell, D.: 2003, Living Machinery. Bionanotechnology: Lessons from Nature, Wiley-Liss, New York.
Howland, H.: 1962, ‘Structural, hydraulic and economic aspects of leaf venation and shape’, in: E.E. Bernard & M.R. Kare (eds.), Biological prototypes and synthetic systems, Plenum Press, New York, vol. 1, pp. 183-192.
Kurzweil, R.: 1998, The Age of Spiritual Machines, How We Will Live, Work and Think
in the New Age of Intelligent Machines, Phoenix, New York.
Larrère, R.: 2002, ‘Agriculture: artificialisation ou manipulation de la nature?’, Cosmopolitiques, 1 (June), 158-173.
Lehn, J.M.: 2004, ‘Une chimie supramoléculaire foisonnante’, La lettre de l’Académie
des sciences, 10, 12-13.
Lehn, J.M.: 1995, Supramolecular Chemistry, VCH, Weinheim.
McCulloch, W.S.: 1962, ‘The imitation of one form of life by another – Biomimesis’,
in: E.E. Bernard & M.R. Kare (eds.), Biological prototypes and synthetic systems,
Plenum Press, New York, vol. 1, pp. 393-97.
Minsky, M.: 1995, ‘Virtual Molecular Reality’, in: Krummenacker, M. & Lewis, J.
(eds.), Prospects in Nanotechnology. Proceedings of the 1st general conference on nanotechnology: developments, applications, and opportunities, November 11-14, 1992, Palo Alto, John Wiley & Sons, New York, pp. 187-205.
Phoenix, C.; Drexler, E.: 2004, ‘Safe Exponential Manufacturing’, Nanotechnology, 15,
869-72.
Rietman, E.A.: 2001, ‘Drexler hypothesis of a universal assembler is supported not by
theoretical arguments alone but by existence proof in the form of biological
life’, in: Molecular Engineering of Nanosystems, Springer, New York & Berlin.
Sarikaya, M.; Aksay, I. (eds.): 1995, Biomimetics. Design and Processing of Materials,
AIP Press, Woodbury, New York.
Schiefsky, M.J.: forthcoming, ‘Art and Nature in Ancient Mechanics’, in: W.R. New-
man & B. Bensaude-Vincent (eds.), The Artificial and the Natural: An Ancient
Debate and its Modern Descendants, MIT Press, Cambridge, MA.
Smalley, R.E.: 1999, ‘Prepared written statement and supplemental material’, Rice
University, 22 June [http://www.house.gov/science/smalley_062299.htm].
Smalley, R.E.: 2001, ‘Of Chemistry, Love and Nanobots’, Scientific American, (Sept.),
76-77.
Staden, H.v.: forthcoming, ‘Physis and Technê in Greek Medicine’, in: W.R. Newman
& B. Bensaude-Vincent (eds.), The Artificial and the Natural: An Ancient De-
bate and its Modern Descendants, MIT Press, Cambridge, MA.
Thompson, D.: 1992 [1942], On Growth and Form, Cambridge University Press,
Cambridge.
Vogel, S.: 1998, Cats’ Paws and Catapults, Norton & Co., New York, London.
Whitesides, G.: 1995, ‘Self-Assembling Materials’, Scientific American, (Sept.), 146-9.
Whitesides, G.: 1998, ‘Nanotechnology: Art of the Possible’, Technology, MIT Magazine of Innovation, (Nov.-Dec.), pp. 8-13.
Whitesides, G.: 2001, ‘The Once and Future Nanomachines’, Scientific American, (Sept.), 78-83.
Whitesides, G.; Love, J.C.: 2001, ‘The Art of Building Small’, Scientific American, (Sept.), 38-47.
Whitesides, G.M.; Ismagilov, R.F.: 1999, ‘Complexity in Chemistry’, Science, 284, 89-92.
Zhang, S.: 2003, ‘Fabrication of novel biomaterials through molecular self-assembly’, Nature Biotechnology, 21, no. 10, 1171-78.
Bernadette Bensaude-Vincent:
Département de philosophie, Université Paris X, Avenue de la Répu-
blique 200, 92001 Nanterre Cedex, France; bensaude@u-paris10.fr
Article
Full-text available
Le but de cet article est d’étudier la question de l’appartenance ou non des romans de science-fiction au discours scientifique, au même titre que la vulgarisation scientifique ou les articles de recherche, et de voir si le concept de scénario peut mettre en lumière les enjeux scientifiques et politiques liés à cette question. L’analyse de trois exemples de récits de « glue grise » relevant du domaine spécialisé des nanotechnologies, montre qu’ils peuvent être considérés comme des scénarios explorant l’évolution future des techniques émergentes, tout en soulignant l’ambiguïté de la relation entre science et science-fiction pour les technologies novatrices.
Article
Full-text available
The term biomimicry first appeared in 1962 as a generic term including both cybernetics and bionics. It referred to all sorts of imitation of one form of life by another one, while the term “bionics”, defined as “an attempt to understand sufficiently well the tricks that nature actually uses to solve her problems”, is closer to the meaning of “biomimicry” as it has been used by materials scientists since the 1980s. Biomimetism is an umbrella covering a variety of research fields ranging from the chemistry of natural products to nanocomposites, via biomaterials and supramolecular chemistry. It is an informal movement and the concept itself is so loose that one might wonder whether biomimetism is more than a slogan forged by chemists in order to hop on the “green” bandwagon. Or could it bring a revolution into chemistry with a profound transformation of its practices? It is too early to judge, but a historical perspective helps to highlight some trends and tendencies.
Chapter
Notions of nature and art as they have been defined and redefined in Western culture, from the Hippocratic writers and Aristotle of Ancient Greece to nineteenth-century chemistry and twenty-first century biomimetics. Genetically modified food, art in the form of a phosphorescent rabbit implanted with jellyfish DNA, and robots that simulate human emotion would seem to be evidence for the blurring boundary between the natural and the artificial. Yet because the deeply rooted concept of nature functions as a cultural value, a social norm, and a moral authority, we cannot simply dismiss the distinction between art and nature as a nostalgic relic. Disentangling the cultural roots of many current debates about new technologies, the essays in this volume examine notions of nature and art as they have been defined and redefined in Western culture, from the Hippocratic writers' ideas of physis and technē and Aristotle's designation of mimetic arts to nineteenth-century chemistry and twenty-first century biomimetics. These essays—by specialists of different periods and various disciplines—reveal that the division between nature and art has been continually challenged and reassessed in Western thought. In antiquity, for example, mechanical devices were seen as working “against nature”; centuries later, Descartes not only claimed the opposite but argued that nature itself was mechanical. Nature and art, the essays show, are mutually constructed, defining and redefining themselves, partners in a continuous dance over the centuries. ContributorsBernadette Bensaude-Vincent, Horst Bredekamp, John Hedley Brooke, Dennis Des Chene, Alan Gabbey, Anthony Grafton, Roald Hoffmann, Thomas DaCosta Kaufmann, William R. Newman, Jessica Riskin, Heinrich Von Staden, Francis Wolff, Mark J. Schiefsky
Article
Le développement contemporain de la production agro-industrielle a changé la définition de l'agriculture : fondée et développée depuis les origines comme un "art du pilotage" des interactions entre l'homme et son environnement, elle serait devenue depuis un "artifice" industriel à la fois coupé de son environnement écologique et facteur de risques techniques croissant pour ce dernier. Cependant, la crise économique et la perte de confiance des consommateurs envers cette agriculture industrielle laissent penser que ce modèle productiviste perd de son hégémonie au profit du renouveau d'une agriculture "pilotée".
Chapter
I would like first to place bionics in a broader framework because there are several fields of interest which surround it on all sides. I therefore pick for my title “The Imitation of One Form of Life by Another—Biomimesis.” There is nothing new in biomimesis. It is so important in avoiding enemies and catching prey that it is determined in the genes of many insects: the walking stick, the velvet ant, and so on. It has been of enormous importance; it has given us the images of our gods and the costumes of our witch doctors. It has given us, since the wings of Daedalus, all sorts of transportation. We have mimicked and mimicked. There are a few things of which we can boast, like the wheel and the seeking of power from indirect sources—first from the sun by eating each other, so to speak, and then from steam engines and what not—all one way or another from the sun until we got fission and now fusion. But aside from sources of power, and from the wheel, most of what we have done has been an imitation. The imitation used to be primarily for our muscular chores:to pump out a mine, to carry material from one place to another, and so forth. The first major deviation from that, curiously enough, was in the steam engine itself. It was not the replacement of the boy who used to push the slide valve by a stick; the boy still did that. That gave us the regenerative cycle. It was by replacement of the boy at the throttle, the boy who controlled the rate at which the steam was admitted to the slide valve, that the first major deviation from doing mere mechanical work came about. I would like to start from that point with what seems to me the major problem we have to face in bionics, as opposed to cybernetics.
Chapter
I would like to begin by outlining the relationships between the methods of this investigation and the central bionic hypothesis.
Article