D. Baird, A. Nordmann & J. Schummer (eds.), Discovering the Nanoscale, Amsterdam: IOS Press, 2004.
Copyright © 2004 Davis Baird and Ashley Shew.
ISBN: 1-58603-467-7
Probing the History of Scanning Tunneling
Microscopy
Davis BAIRD & Ashley SHEW
Department of Philosophy, University of South Carolina
db@sc.edu
Abstract. We present a brief history of the development of scanning tunneling mi-
croscopy (STM). These microscopes, developed in 1981 by Gerd Binnig and
Heinrich Rohrer (Nobel prize 1986), are capable of imaging and manipulating at an
atomic level. STMs, and the group of instruments corporately referred to as scanning
probe microscopes that evolved from them, are part of the instrumentation that has
enabled nanotechnology. In our history we examine how these instruments have
been used (perhaps wrongly) in the “standard story” of the emergence of nanotech-
nology. Nanotechnology has developed in a context sometimes referred to as “post-
academic”, because of the increased emphasis on aspects of commercialization. We
examine how this “post-academic” context has influenced the development of these
instruments. Our history of STM shows an epistemological shift that is part of post-
academic science and nanotechnology policy.
1. In the Beginning was Little Big Blue
Figure 1. ‘The Beginning’. Courtesy: IBM Research, Almaden Research Center.
In 1990 in the journal Nature D. M. Eigler and E. K. Schweizer first published this now
well-known image of I.B.M.’s initials spelled out with 35 individual xenon atoms.[1] The
image now ‘hangs’ in I.B.M.’s ‘STM Image Gallery’ where it joins 15 other striking, and in
many ways beautiful images of the atomic world (Eigler and Schweizer 1990). The images
are made with a scanning tunneling microscope [STM], which was invented in 1981 by
Gerd Binnig and Heinrich Rohrer, both employed by I.B.M. Research in Zurich. Binnig and
Rohrer won the 1986 Nobel Prize in physics for their invention.
There is much that is remarkable about Eigler and Schweizer’s ‘IBM.’ Most immedi-
ately it is the interlocked precision technology and science allowing us to ‘see’ these indi-
vidual xenon atoms that we marvel at. But we are not just seeing them; we are placing them
just so. The image shows our hands and eyes reaching to an atomic level of precision. ‘An
atomic level of precision’ now is more commonly called ‘nanoscale precision.’ A nano-
meter, one-billionth of a meter, is roughly ten hydrogen atoms side-by-side. ‘Nanotechnol-
ogy’ is the study and exercise of hands and eyes with sufficient precision to ‘see,’ and in
some cases manipulate, individual atoms.
In I.B.M.’s STM Image Gallery, Eigler and Schweizer’s ‘IBM’ is titled, ‘The Begin-
ning.’ It is an appropriate, if immodest, title, for ‘The Beginning’ is emblematic of the be-
ginning of genuine atomic precision, genuine nanotechnology. There are ‘nano-visionaries’
who see in nanotechnology nothing short of a complete transformation in human life on
Earth, with nanotech solutions to energy, disease, pollution, even mortality. ‘IBM’ is a
crude beginning indeed.
In viewing this image one also may be struck by the notion that in the beginning was
a corporation, IBM. To be sure, nanotechnology is pursued in academic settings where the
unfettered pursuit of truth at least is the stated ideal. IBM, along with the raft of other high
tech companies that are pursuing nanotechnology, no doubt seeks truth, but not at the ex-
pense of shareholder value. Indeed, Eigler and Schweizer say of their image:
Artists have almost always needed the support of patrons (scientists too!). Here, the
artist, shortly after discovering how to move atoms with the STM, found a way to
give something back to the corporation which gave him a job when he needed one
and provided him with the tools he needed in order to be successful.
(www.almaden.ibm.com/vis/stm/gallery).
Nanotechnology, including the instruments that make it possible, such as the scanning tun-
neling microscope, is developing in a much more thoroughly integrated academic/commercial
matrix. One nanotech researcher tells us, tongue only half in cheek, that an assistant
professor probably should not get tenure unless he or she has two ‘start-ups’ to show for
him- or herself (Tour 2002). John Ziman calls this ‘post-academic science’ (Ziman 2000,
ch. 4).
We are interested here in the development of scanning tunneling microscopy, and in
particular how its development in a ‘post academic’ context impacts the design constraints
on STMs, and the various off-shoots, generically called ‘scanning probe microscopy’
[SPM]. We argue that the epistemic needs that underlie commercial development differ
from those that underlie academic development. Thus, through our examination of STM
and its relation to nanotechnology, we articulate a key epistemological difference between
‘academic’ and ‘post-academic’ science.
2. Scanning Tunneling Microscopy[2]
Scanning tunneling microscopy is conceptually simple. Imaging with STM involves mov-
ing a tip over a surface to obtain topographic information about the surface. One can com-
pare STM to Braille reading or the way the tumblers in a lock ‘read’ a key’s shape. STM
relies on the phenomenon of electron tunneling to image surfaces. Tunneling is a quantum-
mechanical phenomenon that is manifested in a current induced by a voltage differential
between the scanning tip and the sample (Chen 1993). The tunneling current depends
exponentially on the separation between the tip and the surface: the closer the tip is to
the surface, the higher the current.
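For readers who want the quantitative version, the textbook approximation (the standard result treated in sources such as Chen 1993, not a formula stated in the original chapter) for the current across a gap of width d at small bias V is

I \propto V e^{-2\kappa d}, \qquad \kappa = \sqrt{2m\phi}/\hbar,

where m is the electron mass and \phi is the local barrier height, roughly the work function of the electrodes. With \phi of a few electron volts, \kappa is of order 10 nm^{-1}, so a change in d of only 0.1 nm changes the current by nearly an order of magnitude; this extreme sensitivity is what makes atomic-scale height resolution possible.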
The components of an STM include a probe tip, a piezo-electric material that controls
the tip’s location in all three dimensions, a voltage source, a means to measure current flow
from sample to tip, and finally computing power both to transform current data into an im-
age and to control tip movement (Chen 1993). The scanning tip, which ideally is atomically
sharp, is usually made of tungsten or platinum-iridium. Typically, a topographic image is
produced by running the tip back and forth over the sample surface such that, by means of
an electronic feedback loop, the tip is moved up or down to keep the tunneling current –
and consequently the tip’s distance above the surface – at a constant value. By taking note
of the amount the tip has had to be moved up or down, a topographic image of the surface
can be produced with the aid of computer imaging software (Griffith & Kochanski 1990).
When all works right, we see on the computer screen an image that looks as though we
were looking at the landscape of atoms on the sample surface.
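To make the constant-current feedback just described concrete, here is a minimal, purely illustrative Python sketch (ours, not the authors’ or any instrument vendor’s; the decay constant, setpoint, gain, current model, and sample corrugation are all invented round numbers). It mimics the logic of the loop: measure the current, raise or lower the tip until the current matches the setpoint, and record the tip height as the topography.

import math

KAPPA = 10.0      # assumed decay constant, 1/nm (typical order of magnitude)
SETPOINT = 1.0    # target tunneling current, arbitrary units
GAIN = 0.8        # fraction of the computed correction applied per iteration

def tunneling_current(gap_nm):
    # Toy model: current falls off exponentially with the tip-surface gap,
    # normalized so that a 0.5 nm gap gives the setpoint current.
    return math.exp(-2.0 * KAPPA * (gap_nm - 0.5))

def surface_height(x_nm):
    # Invented sample: a gentle 0.1 nm corrugation with a 2.5 nm period.
    return 0.1 * math.sin(2.0 * math.pi * x_nm / 2.5)

def scan(n_points=50, step_nm=0.1):
    z = 0.6                # initial tip height above the reference plane, nm
    topography = []
    for i in range(n_points):
        x = i * step_nm
        for _ in range(20):                 # feedback iterations per pixel
            gap = z - surface_height(x)
            current = tunneling_current(gap)
            # Too much current means the tip is too close: raise it (and vice versa).
            z += GAIN * math.log(current / SETPOINT) / (2.0 * KAPPA)
        topography.append(z)                # the recorded heights form the image
    return topography

if __name__ == "__main__":
    for z in scan():
        print(round(z, 3))

In a real STM the corrections are applied through the piezoelectric scanner and the loop runs continuously in hardware; it is the record of those corrections, not the raw current, that becomes the topographic image (Griffith & Kochanski 1990).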
Although STM is simple in concept, the researchers creating it had to solve several diffi-
cult problems: precise control of the tip’s location and movement, control of vibration, and
making a tip with the necessary atomic sharpness. The tip must come within a few nanome-
ters of the surface. Finding a material that can move the tip without crashing the tip into the
surface – or worse – was a huge problem. Piezoelectric ceramics were the answer. Piezo-
electric ceramics deform only slightly when a voltage is applied across them. By
appropriately varying the voltage in the piezoelectric positioner, an STM achieves precise
control over the tip’s location over the sample. The tunneling voltage, working in conjunc-
tion with the feedback system and the piezoelectric material, allows for precise control of
the tip’s height and placement over the surface.
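A rough worked example gives a feel for the numbers involved (the sensitivity figure is an assumed round value for illustration, not one reported in this chapter or by any particular scanner manufacturer). If a piezoelectric tube scanner extends about 5 nm per volt applied, then a 1 mV change in drive voltage moves the tip by

\Delta z = s\,\Delta V = (5\ \mathrm{nm/V}) \times (0.001\ \mathrm{V}) = 0.005\ \mathrm{nm} = 5\ \mathrm{pm},

a small fraction of an atomic diameter. Controlling a voltage to the millivolt level is routine for laboratory electronics, which is why piezoelectric transducers make atomic-scale positioning practical.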
Because STM is done with such a high degree of precision, where the tip is only na-
nometers from a surface, external and internal vibrations can present substantial problems.[3]
Early STMs were operated at night with everyone silent. Vibration also can be reduced by
building the instrument with sufficient mechanical rigidity and through an appropriate con-
figuration of the piezoelectric transducers. Sometimes STMs are hung on a double bungee
cord sling to manage vibration. Further vibration isolation systems have also been made
with springs and frames (Baum 1986).
Making tips remains something of a dark art. One takes a piece of tungsten or plati-
num-iridium wire and cuts it with wire cutters, being careful to pull away from the end that
will serve as the tip. Some researchers develop a good knack for this, while others do not.
While tips are usually diagrammed as nice symmetrical ice-cream cone structures, in reality
they are messy affairs resembling a jagged mountain range. But what is crucial is that one
peak from this range be sufficiently higher than all the others and itself be atomically sharp;
it then can serve as the point through which the tunneling current passes (Myrick 2002a).
There was some lag between Binnig and Rohrer’s development of STM in 1981 and
its acceptance. Initially surface scientists were skeptical, but when Binnig and Rohrer
solved a well-known outstanding problem in surface science – the 7 × 7 reconstruction of the
crystalline silicon (111) surface – they began to take notice (Mody 2004). As the 1980s pro-
gressed, Binnig and other collaborators developed the scanning tunneling microscope in a
variety of directions, including atomic force microscopy (AFM). Because STM depends on
a current passing from sample to tip, only conducting samples could be imaged. AFM,
which Binnig, Christoph Gerber and Calvin Quate developed in 1986 (Binnig, Quate &
Gerber 1986), avoids this limitation by measuring the tiny deflections that a sharp probe
experiences when dragged over a surface. As the surface goes up in elevation, the probe is
deflected up, and this deflection can be measured. Combining measurements from the
whole surface allows researchers to produce an image of the topography of the surface.
3. Elements of the Commercial History to STM
While STM and its early siblings, AFM and the other techniques of probe microscopy,
were developed in what officially is a corporate context – IBM – the work was essentially
academic research pursued in an industrial research lab. Through most of the 1980s, STM
and AFM remained primarily of academic interest. It took some time for the technique to
catch on. There are a variety of reasons for this.[4] Some are disciplinary or structural. While
the first arena where STM could and did make a significant contribution was surface sci-
ence, neither Binnig nor Rohrer came from this academic community, and their claims for
STM were not, for this reason, immediately accepted by the surface science community.
There were epistemological hurdles to jump as well. The images that one can produce with
an STM are very nice, but on what grounds are they to be believed to be genuine images of
individual atoms? Finally there were pragmatic reasons that slowed the development and
acceptance of STM. Prior to the commercialization of STM in the late 1980s, the STM
probe was not integrated with a computer, and this made the instrument much more diffi-
cult and time consuming to use (Myrick 2002b).
These issues – disciplinary insulation, epistemological acceptability and pragmatic
ease of use – create a kind of ‘chicken and egg’ problem for the commercialization of STM
and SPM more generally. Profits require a large enough market to offset the costs of re-
search and development. Broad markets, by their nature, cross disciplinary boundaries, but
they also require instruments whose results can be relied on, and which can be used by peo-
ple other than those academics willing to spend hours coaxing the instrument to work. Fas-
cination with instrumental possibility, with pushing the limits of resolution, of what it is
possible to ‘see,’ makes for good academic research, but not for an instrument that serves
‘transparently’ or ‘instrumentally’ in the pursuit of other concerns with broad market ap-
peal. At the same time, these broad markets will not develop unless there are instruments
available ‘off the shelf.’ Such instruments are for people who are not themselves interested
in instrumental development. Navigating this chicken and egg problem is the fundamental
story of the commercialization of STM and SPM during the late 1980s and 1990s.
Figure 2. Veeco’s Story: Veeco (founded 1945); Digital Instruments (founded 1986); Park Scientific (founded 1988); TopoMatrix (founded 1990). ThermoSpectra buys Park in 1997, then buys TopoMatrix and combines Park and TopoMatrix to form ThermoMicroscopes in 1998. Veeco buys Digital Instruments in 1998, and buys ThermoMicroscopes in 2001, renaming it TM.
Although some researchers still choose to build their own STMs or SPMs, a large number of
commercial instrument makers have gone into the SPM market. By the late 1990s some
instruments could be purchased for as little as $50,000 (Amato 1997) or even less –
$15,000 – for a ‘teaching instrument.’[5] Most instrument makers are willing to customize
their instruments to the specifications of the buyer. The main players in the SPM market
have been Digital Instruments (DI) (founded in 1986), Omicron Nanotechnology (founded
in 1984), RHK Technology (founded in 1977), Park Scientific (founded in 1988), TopoMa-
trix (founded in 1990), and Molecular Imaging (founded in 1993). During the 1990s,
through a series of mergers, this diversity of individual makers has been concentrated in a
much smaller number of major players in the SPM market. See figure 2. Veeco has become
the 2,500-pound gorilla in the SPM world, and this has implications for how the instru-
ments develop. For example, Veeco’s coloring scheme – taken over from DI – has become
a de facto standard in SPM images. More generally, a smaller number of makers will lead
to more standardization and less diversity.
4. Post Academic Science
Understanding the context in which the history of STM is taking place is essential to under-
standing that history. Stated most generally, this context involves a much closer rela-
tionship between academic scientists and commercial concerns. There are a variety of
forces driving the move to ‘post-academic science,’ and a full discussion would go well
beyond the scope of this paper. Here we briefly discuss three salient points: the Bayh-Dole
act of 1980, the National Nanotechnology Initiative of 2000, and ‘nanovisionary hype.’
The Bayh-Dole Act of 1980 allowed universities to patent and collect royalties on the
fruits of research conducted with federal funds. In this way universities were pushed to
partner with the industrial sector to transfer the fruits of federally funded research in the
academy, and thereby to profit from them in the commercial sector. Bayh-Dole accelerates
‘technology transfer,’ and has had a broad impact. Prior to 1980 it was a rare event for a
university to patent – fewer than 250 patents were issued to universities per year. Now the
number of patents issued to universities each year is nearly 2,000. According to the Cornell Research
Foundation:
Academic technology transfer in FY 1999, specifically the licensing of innovations
by U.S. universities, teaching hospitals, research institutes, and patent management
firms, added about $40 billion to the U.S. economy and supported 260,000 jobs. It
has helped to spawn new businesses, create industries, and open new markets. More-
over, it has led to new products and services that save lives, reduce suffering, and im-
prove our quality of life. (Cornell Research Foundation 2001, p. 2)
Of course, in addition to these cheery consequences of Bayh-Dole are consequences for
how universities function. Bayh-Dole pushes universities toward a more corporate profit-
centered style of operation, and this is having – and will continue to have – fundamental
consequences for the way research is done (Press and Washburn 2000).
There has been a concerted effort through legislation such as Bayh-Dole to increase
the rate of technology transfer, or, put in other terms, to decrease the ‘time-to-market’ for
discoveries. The National Nanotechnology Initiative [NNI] takes another big step in this
direction. At the end of his presidency, Bill Clinton proposed the NNI, adding $225 million
to federal nanotechnology spending for FY 2001 – an 83% increase over expenditures on nanotechnology in the
previous year – and hefty budget increases projected into the first decade of the new cen-
tury (National Science and Technology Council 2000). The initiative is a large project in-
volving numerous governmental agencies. It is managed by the National Science and Tech-
nology Council, which coordinates nanotechnology initiatives at a large number of gov-
ernment agencies, including the Departments of Defense, Energy, Justice, Transportation,
Agriculture, the Environmental Protection Agency, NASA, the National Institutes of
Health, the National Institute of Standards and Technology, and the National Science
Foundation. The budget devoted to nanotechnology at these agencies in FY 2002 was $604
million, and this is projected to increase to nearly a billion dollars in FY
2004 (National Science and Technology Council 2002, p. 5; Kanellos 2004). While the U.
S. investment in nanotechnology in FY 2000 exceeded that of all other countries, in FY 2001, Ja-
pan took the lead in nanotechnology investment, and a recent publication by the European
Nanobusiness Association argues that the European Union is now investing more heavily in
nanotechnology than the United States (Roman 2002). According to a recent publication,
“Corporations, governments, universities and others are expected to spend an estimated
$8.6 billion on nanotechnology research and development in 2004, and the private sector
will account for a bigger proportion of the total” (Kanellos 2004).
It is no accident that the NNI is a nanotechnology and not a nanoscience initiative.
This was a point of discussion in its development, and those with a focus on technology
won the day (Lane 2002). While work at the nanoscale holds some interest because the be-
havior of nano-sized materials (objects 1-100 nanometers in size) cannot be explained by
current quantum mechanical models, it is the technological promise of work at the nano-
scale that is compelling. A central aim of the NNI is to quickly move nanoscientific discov-
eries into commercial development. In 2002 the Massachusetts Institute of Technology re-
ceived a 50 million dollar grant from the US Army to develop better uniforms, uniforms
that would use nanotechnology to stop bullets and toxins, to monitor the health status
of the wearer, to provide extra strength to the wearer, and to communicate with remote
sites. But, M.I.T. materials scientist Edwin Thomas notes, the Army “didn’t want just pa-
pers in Science and Nature. They wanted real stuff” (quoted in Talbot 2002, p. 46). It took
24 years to take the discovery of the semiconducting properties of germanium in 1931 to
the production of a commercial transistor in 1954; it took nine years to take the discovery
of carbon nanotubes in 1991 to the production of a commercial nanotube product in 2000
(National Science and Technology Council 2002, p. 79). Technological visionaries expect
this ‘time-to-market’ to continue to decrease, and the NNI is pushing this trend. Ray Kurz-
weil has a whole futurology divined from this kind of exponential increase in the rate of
discovery and decrease in the time for technology transfer and commercialization (Kurz-
weil 1999).
Much is expected from nanotechnology. In a recent report from the United States
Government National Nanotechnology Initiative we read: “The impact of nanotechnology
on the health, wealth, and lives of people could be at least as significant as the combined
influences of microelectronics, medical imaging, computer-aided engineering, and man-
made polymers developed in the century just past” (National Science and Technology
Council 2002, p. 11). But, relative to the predictions of some ‘nano-visionaries’ these gov-
ernmental predictions can seem modest. There are serious theoreticians who suggest that a
‘universal assembler’ is not science fiction, but less than a generation or two away (Drexler
1986, 1992). What is a ‘universal assembler’? Roughly put, it is a device that can be pro-
grammed to mechanically place individual atoms (or the assembled parts made by standard
chemistry) in specified places. Since everything in our material world consists of particular
arrangements of atoms – into molecules and thence into concatenations of bulk materials – in
theory a universal assembler should be able to make anything, and make it with atomic pre-
cision. Give the device enough raw materials, and a (no-doubt very complex) blueprint or
assembly program, and it will assemble anything you want. In theory it will be possible to
do this inexpensively and quickly: dirt in, couches, cars and carrots out. At a theoretical
level, these ‘nano-visionaries’ argue, biology provides an existence proof for such an as-
sembler: given a DNA program and the right materials and conditions provided in a womb,
our ‘biological assembler’ puts together a human baby. In his 1986 book, Engines of
Creation, written for a popular audience, Eric Drexler spelled out how we are on the verge
of being able to do biology one better. The vision is breathtaking, and if true it would radi-
cally and fundamentally transform everything.
Not surprisingly, there have been many skeptics. But in the afterword to the second,
1990, edition of Engines of Creation, Drexler remained convinced:
To summarize some indicators of technological progress: Engines speculates about
when we might reach the milestone of designing a protein molecule from scratch, but
this was actually accomplished in 1988 by William F. DeGrado of Du Pont and his
colleagues. … At IBM, John Foster’s group has observed and modified individual
molecules using the technology of the scanning tunneling microscope [work that led
to Eigler and Schweizer’s ‘IBM’]; this (or the related atomic force microscope) may
within a few years provide a positioning mechanism for a crude protoassembler.
(Drexler 1986, pp. 240-241)
Through the 1990s our understanding, and more importantly our ability in the lab to inter-
vene and control atoms, while nothing remotely like Drexler’s assembler, has moved stead-
ily ahead. In 1991 Robert F. Curl, Harold W. Kroto and Richard Smalley discovered carbon
nanotubes. These are tubular structures made of carbon atoms. Like graphite and diamond,
they are another crystalline form of molecular carbon. Carbon nanotubes are a few nano-
meters in diameter. We are steadily moving ahead on controlling the synthesis of carbon
nanotubes and on increasing their length. They have remarkable properties in terms of
strength to weight, conductivity, magnetic properties, etc. Radically new and useful materi-
als made with carbon nanotubes will be commercially available in the near term. Whether
by way of a ‘universal assembler’ that seems like science fiction or by way of more prosaic
incremental technological development, such as carbon nanotubes, nanotechnology is hav-
ing and will have a significant impact on society’s technological infrastructure.
5. The Standard Story
There is a standard story about how nanotechnology appeared, and scanning tunneling mi-
croscopy plays a central role in this story (National Science and Technology Council 2000;
Drexler, 1986). It starts with a talk Richard Feynman gave to the American Physical Soci-
ety on December 29, 1959, ‘There’s Plenty of Room at the Bottom’ (Feynman 1960). Feynman dis-
cusses how much space it would take to store written material on the nanoscale:
For each bit I allow 100 atoms. And it turns out that all of the information that man
has carefully accumulated in all the books in the world can be written in this form in a
cube of material one two-hundredth of an inch wide – which is the barest piece of
dust that can be made out by the human eye. So there is plenty of room at the bottom!
Don’t tell me about microfilm! (Feynman 1960, p. 3)
He goes on, as the standard story goes, to prophetically suggest how real progress could be
made:
We have friends in other fields – in biology, for instance. We physicists often look at
them and say, “You know the reason you fellows are making so little progress?” (Ac-
tually I don’t know any field where they are making more rapid progress than they
are in biology today.) “You should use more mathematics, like we do.” They could
answer us – but they’re polite, so I’ll answer for them: “What you should do in order
for us to make more rapid progress is to make the electron microscope 100 times bet-
ter.” (Feynman 1960, p. 5)
With such a microscope we could see individual atoms, and then we would really be able to
do things. Feynman talks about how this would help biology, how we could make miniature
computers, surgeons that one would swallow, and which would then do their work from the
inside. He discusses problems of manufacture at the nanoscale. In short, 40 years before we
began to get there, he imagined the possibilities that nanotechnology is now opening up.
And, while there have been advances on many fronts, the scanning tunneling microscope –
not quite Feynman’s electron microscope, but with some of the same abilities he talks about
– is widely hailed as the first major step down this road.
So the standard story has Feynman mapping the way to nanotechnology. First we
need a microscope. Binnig and Rohrer gave us that in 1981. Then we start to design and
manufacture on the nanoscale. Drexler’s Engines of Creation and – more fundamentally –
Nanosystems begin the design process for atomic manufacture. Eigler and Schweizer’s
‘IBM’ shows genuine atomic scale writing. Given enough time, we could imagine all the
words written in the world in a dust particle. By the beginning of the new millennium we
have the National Nanotechnology Initiative harnessing a powerful economic motivator to
push the development of nanotechnology.
There are many problems with the standard story. The electron microscope has pro-
vided atomic level resolution – in the best circumstances – since the 1950s, and it is a much
more stable instrumental technology than SPM is at this point. Dana Dunkleberger, Direc-
tor of USC’s Electron Microscopy Lab, is not impressed with SPM. He tells us that it can
take two days fiddling with an STM to get something that might be useful, whereas 10
minutes with one of his electron microscopes will produce the goods (Dunkleberger 2002).
And, indeed, the electron microscope is itself very useful in nanoscale research.
Talk to nearly any lab scientist and they will express substantial skepticism over
Drexler’s notion of a universal assembler. New York University chemist Nadrian Seeman
can construct a variety of nanoscale structures using DNA as the primary building material.
But he has been struggling with this for nearly 20 years and as he says, most of the time
you work in the lab for several months and, if you are lucky, one of 500 carefully controlled
chemical constructions will work. His methods remain biochemical, not ‘nano-engineered’
or ‘assembled’ (Seeman 1999, 2002; Liu et al. 1999; Winfree et al. 1998; Mao et al. 1999).
Despite the remarkable, but special case of Eigler and Schweizer’s ‘IBM,’ we do not have
the ability to place atoms just as we please.
The fact that there are problems with the standard story makes it all the more interest-
ing why this story is so widely reported. Drexler uses it. It is used in the narrative of the
National Nanotechnology Initiative. It is used in numerous articles that provide a potted
history of how we got to nanotechnology. Why not report advances in electron microscopy?
What is so special about STM?
As Eigler and Schweizer’s ‘IBM’ proves, STM – as opposed to electron microscopy
– is not simply an imaging technique, but a ‘touching and rearranging’ technique as well. It
is, in a sense, appropriate for Drexler to say that it may lead to a ‘proto-assembler.’ This is
central to Feynman’s vision. It is central to Drexler’s vision. It is central to the fact that we
have a national nanotechnology and not a nanoscience initiative. On this vision, nanotech-
nology is chemistry by other means. We are not just mixing, heating, stirring and generally
coaxing atoms to rearrange themselves in desirable ways – following standard chemical
practice – but we are in some sense directly touching and placing atoms. This is what is so
striking about nanotechnology and why, despite its problems as a genuine historical narra-
tive, the standard story is so compelling.
6. Post Academic Innovation
We came to write this paper as part of an effort to understand the instrumental basis for
nanotechnology. This itself is part of a larger project that seeks to show how societal under-
standing and control of this new and potentially transformative technology can and should
be informed by the instrumental and theoretical understanding and control of nanotech-
nological phenomena.[6] We were introduced to STM through ‘the standard story’ – as any-
one would be from reading the nanotechnology literature. Consequently, we were very
surprised to hear Dana Dunkleberger, Director of USC’s Electron Microscopy Lab, dismiss
probe microscopy. He called SPMs “squirrelly” (Dunkleberger 2002). There are, no doubt,
reasons for his dislike of probe microscopes to be found in his background and training,
which started in the 1960s and has focused almost exclusively on electron microscopy. But
we believe there is more here, and we close this paper considering what this ‘more’ could
be.
To put the matter in a nutshell, electron microscopy has developed to the stage where,
for the scientist and industrial researcher, it is akin to a ‘one-hour photo lab.’ The analogy
operates on several levels. First, like a one-hour photo lab, researchers can send materials to
an e/m lab and expect to get back useful results – e/m images – in fairly short order. Useful
results do not depend on the technician operating the microscope knowing much about the
source of the sample. Second, the technicians also do not have to know much about the
operation of the microscope. It is possible for them to produce good images through fairly
routine adjustments to the instrument, adjustments that can be made with a minimal knowl-
edge of the principles behind the instrument’s operation. Consequently – and third – it is
possible for any reasonably competent researcher to take a sample to an e/m lab and to get
useful results him- or herself, without extensive training and experience with the instru-
ment. Indeed, USC’s e/m lab is set up for just this kind of use.
None of this is true for probe microscopy. The instruments are finicky, requiring an
experienced hand to operate. Those using them have to have some initial understanding of
what they are looking for to get useful results, and it takes a good bit of time to get these
results. Properly interpreting the results themselves requires a nuanced understanding of the
sample under investigation and the way in which the instrument interacts with the sample.
There have been notorious misreadings of STM images, including an image presented on
the cover of Nature (Driscoll et al. 1990) that purported to show DNA, but which very
likely is an artifact (Myrick 2002a).
We can characterize the difference between electron microscopy and probe micros-
copy in terms of six points:
1. Robustness of structure;
2. Ease of operation;
3. Throughput;
4. Versatility of use;
5. Ease of reliable interpretation of the output;
6. Ability of the output to ‘stand on its own’ as ‘a fact.’
In 2002, Professor Harry Ploehn of USC’s Department of Chemical Engineering purchased
two STMs. Two graduate students were assigned to learn how to work with them so they
could be used in research applications. Both were soon broken (Myrick 2002b). This is not
to be blamed on clumsy graduate students, but rather on the state of the art of STM instru-
mentation. STMs require an experienced hand, and are easy to break in inexperienced
hands. Even then, they are difficult to use, and they take a long time to produce useful im-
ages. While STMs have been used on nearly everything under the sun (Mody 2004), they
do not regularly produce useful results across this spectrum of uses. Finally, despite the
striking successes of such images as Eigler and Schweizer’s IBM, the images that one can
get from an STM are not routinely reliable, and cannot now be interpreted independently of
a prior understanding of the sample being imaged.
From the point of view of someone with little interest in probe microscopy per se, but
for whom images – and possibly even manipulation – of atoms are a desired end, probe mi-
croscopy is deficient in regard to these six points. Among those who have been working on
SPMs since their inception, Stanford researcher Calvin Quate has recently concerned him-
self with attacking these issues:
The major limitation for scanning probe imaging and lithography is throughput. A
major thrust of the work in our group is geared toward increasing throughput by
scanning simultaneously with multiple probes all moving at high speeds. (Quate
2002).[7]
Other researchers have pointed out to us how difficult overcoming these obstacles will be
for SPM (Myrick 2002b). A significant difference between the electron microscope and
probe microscopes is the ability to radically alter the field of vision. With an electron mi-
croscope one can put a specimen in the instrument and ‘see it’ with a field of vision large
enough to allow comparison with images of the same specimen produced by more ordinary
means, such as light microscopy. Then one can ‘zoom in’ on a particular feature, producing
magnification beyond what is possible with light microscopy. This ability to ‘zoom in’ has
two epistemologically important consequences. First, it provides compelling evidence that
what the scope shows is not an artifact of the instrument. Here we can compare and cali-
brate (some of) the output of an electron microscope against the output of older and more
established light microscopes. Second, it provides those using the instrument the ability to
know where on the specimen they are looking, and this in turn provides more confidence in
the interpretation of the resulting image.
We are not here concerned with making predictions about whether or when SPMs
will be developed that resolve these issues. But we are concerned with making two points
about them. First, the success of SPMs as commercial products depends on improvements
on the six points we spell out above. Second, these points are not epistemologically neutral,
but involve developing SPMs to satisfy certain epistemological ends and not other possible
epistemological ends. Together these points articulate one respect in which ‘post academic
science,’ and in particular its instantiation in the development of SPMs, is not epistemo-
logically neutral.
There is a general term of art from the science studies literature that is used to de-
scribe resolving the six points we identify above: black boxing (Latour 1987, 1996; Baird
2004). Typically, in the science studies literature, the rhetorical strategy has been to open
up, or ‘deconstruct,’ a black-boxed theory or instrument. Our interest, however, is in the
process of closing the box, and what this means on an epistemological level. The on-going
story of SPMs is an excellent case to follow to see the epistemology of post academic sci-
ence in action.
Perhaps the most epistemologically compelling aspect to black boxing SPM is in the
interpretation of the images. Images are not neutral data. They immediately invoke our
powerful and experienced neural systems for processing and interpreting visual data. SPMs,
in terms of their epistemological basis, are not visual – and in this respect they differ
fundamentally from electron microscopy – they are tactile. But we present this “tactile
data” visually, and we do this because, as human beings, we can quickly and easily – virtu-
ally transparently – ‘know what we are seeing.’ For this reason, it is not enough to make
images from SPM data. The images have to accommodate our built-in or experientially
acquired way of understanding images. Of course, it is possible for an expert on probe mi-
croscopy to train him- or herself to ‘see the visual data’ as it ‘should be seen’ given an un-
derstanding of how the data were acquired. But, if the instrument is going to be used by
‘non-SPM-experts,’ this can pose substantial problems. Thus, the kinds of images that a
black-boxed SPM produces are significantly constrained by how humans interpret images.
Indeed, part of our interpretation of images is our ability to move and see how the visual
impression correlates with our motion. In this way SPMs are one important device for what
Alfred Nordmann describes as “inhabiting the nanoscale” (Nordmann 2004). Thus it is no
small difference that the electron microscope allows for more significant control over the
field of vision. On similar grounds, it was no small improvement in SPMs when DI devel-
oped computer-assisted digital controllers for their SPMs. These controllers allowed users
to interact with SPM images in a manner more like the way we have become accustomed to
interacting with other visual images.
One could imagine a world – indeed this was the world of the 1980s – when each
researcher who wanted to use an SPM made it him- or herself. The instrument would be
tailored to the specific research concerns because of which the researcher wished to use the
SPM in the first place, and the output of the instrument could be in any format, because the
researcher would know how the image was generated and what aspects of the output repre-
sented genuine interactions with the sample. In such a world one would expect a prolifera-
tion of SPMs varying in numerous respects from each other. In the world of commercial
SPMs, with a need for broad markets and a need to deskill the instrument – both in terms of
its use and the interpretation of its data – one expects less variation. Here, then, in the case
of the developing story of SPMs, is a significant epistemological consequence of our move
to post academic science.
Notes
1. It is important to note that ‘IBM’ was created at very low temperatures, roughly 4 degrees Kelvin, in part to control for thermal motion.
2. There is an excellent overview of the operation and history of STM as part of the Dibner Institute’s “History of Recent Science and Technology” website, where the development of scanning tunneling microscopy figures prominently in their history of materials research (hrst.mit.edu/hrs/materials/public/STM_intro). Another excellent source of information on STM/SPM, put together by John Cross, is the website www.mobot.org/jwcross/spm.
3. Binnig and Rohrer 1986.
4. Cyrus Mody explores this history very nicely in his essay in this volume (Mody 2004).
5. Burleigh Instruments, founded in 1972, started making SPMs for educational use in 1992. In a December 1992 advertisement, Burleigh advertised an Instructional STM for less than $15,000 (Burleigh Instruments 1992).
6. This is a large multidisciplinary project funded by the National Science Foundation at the University of South Carolina (www.cla.sc.edu/cpecs/nirt/index.html).
7. Quate’s work is also quoted on the Dibner Institute website (Dibner 2002).
References
Amato, I.: 1997, ‘Candid Cameras for the Nanoworld’, Science, 276, 1982-1985.
Baird, D.: 2004, Thing Knowledge, University of California Press.
Baum, R.: 1986, ‘Scanning tunneling microscope achieves atomic resolution’, Chemical and Engineering
News, 64, 22-25.
Binnig, G. & Rohrer, H.: 1986, ‘Scanning tunneling microscopy’, IBM Journal of Research and Develop-
ment, 30, 355-369.
Binnig, G.; Quate, C.F. & Gerber, C.: 1986, ‘Atomic Force Microscopy’, Physical Review Letters, 56, 930-3.
Burleigh Instruments: 1992, Advertisement, Review of Scientific Instruments, 63.
Business and Company ASAP: 2002, University of South Carolina Libraries
(online: http://web4.infotrac.galegroup.com/itw/infomark/0/1/1/purl=rc6_BCPM?sw_aep=usclibs
accessed September 10, 2002).
Chembytes Infozone: 1999, All Report, Microscopy, SPM and NSOM
(online: www.chemsoc.com/chembytes/aiireports/aii_MB27.htm, accessed August 30, 2002).
Chen, C.J.: 1993, Scanning Tunneling Microscopy, New York: Oxford University Press.
Cornell Research Foundation: 2001, Bayh-Dole Act
(online: http://www.crf.cornell.edu/bayh-dole.html ; accessed October 29, 2002).
Dibner Institute: 2002, ‘History of Recent Science and Technology’
(online: http://hrst.mit.edu/hrs/materials/public/STM_intro.htm, accessed October 29, 2002).
Drexler, K.E.: 1986, Engines of creation: The Coming Era of Nanotechnology, Garden City, NY: Anchor
Press.
Drexler, K.E.: 1992, Nanosystems: Molecular Machinery, Manufacturing and Computation, John Wiley.
Driscoll, R.J.; Youngquist M.G. & Baldeschwieler J.D.: 1990, ‘Atomic-scale imaging of DNA using scanning
tunneling microscopy’, Nature 346, 294-6.
Dunkleburger, D.: 2002, Conversation with Davis Baird and Ashley Shew, unpublished.
Economist: 1993, ‘The promise of atomic eyes’, 314, 91-92.
Eigler, D.M. & Schweizer, E.K.: 1990, ‘Positioning single atoms with a scanning tunneling microscope’,
Nature, 344, 524-526 (online on STM Image Gallery at www.almaden.ibm.com/vis/stm/atomo.html).
Fasca, C.: 1998, ‘Veeco deals for Digital Instruments’, Electronic News, 44, 1.
Feynman, R.P.: 1960, ‘There’s Plenty of Room at the Bottom’, Engineering and Science
(online: www.zyvex.com/nanotech/feynman.html ; accessed: October 29, 2002).
Flowers, M.R.: 2001, ‘RHK Technology, Inc.’, Laboratory Equipment, 38, 6.
Griffith, J.E. & Kochanski, G.P.: 1990, ‘Scanning Tunneling Microscopy’, Annual Review of Material Sci-
ence, 20, 219-44.
Hamers, R.J.: 1996, ‘Scanned Probe Microscopies in Chemistry’, Journal of Physical Chemistry, 100, 13103-
13120.
Kanellos, M.: 2004, ‘Funding to grow to $8.6 Billion’, CNET News.com (online:
http://news.com.com/Nanotech+funding+to+grow+to+%248.6+billion/2100-7337_3-5310762.html
accessed August 18, 2004).
Kurzweil, R.: 1999, Age of Spiritual Machines: when computers exceed human intelligence, New York: Viking.
Lane, N.: 2002, Science Advisor to President Clinton. Conversation with Davis Baird on July 23, 2002.
Latour, B.: 1987, Science in Action. Cambridge, MA: Harvard University Press.
Latour, B.: 1996, Pandora’s Hope, Cambridge, MA: Harvard University Press.
Liu, F.; Sha, R. & Seeman, N.: 1999, ‘Modifying the Surface Features of Two-Dimensional DNA Crystals’,
Journal of the American Chemical Society, 121, 917-922.
Mao, C.; Sun, W.; Shen, S. & Seeman, N.: 1999, ‘A Nanomechanical Device Based on the B-Z Transition of
DNA’, Nature, 397, 144-146.
Mody, C.C.M.: 2004, ‘How Probe Microscopists Became Nanotechnologists’, in: D. Baird, A. Nordmann &
J. Schummer (eds.), Discovering the Nanoscale, Amsterdam: IOS Press, pp. 119-133 (this volume).
Myrick, M.: 2002a, Presentation at University of South Carolina, September 18.
Myrick, M.: 2002b, Conversation with Davis Baird, October 11.
National Science and Technology Council, Committee on Technology, Subcommittee on Nanoscale Science,
Engineering, and Technology: 2000, National Nanotechnology Initiative: The Initiative and its Imple-
mentation Plan. (online: www.nano.gov/nni2.pdf ; accessed October 29, 2002).
National Science and Technology Council, Committee on Technology, Subcommittee on Nanoscale Science,
Engineering, and Technology: 2002, National Nanotechnology Initiative: The Initiative and its Imple-
mentation Plan, Detailed Technical Report Associated with the Supplemental Report to the President’s
FY 2003 Budget (online: www.nano.gov/nni03_aug02.pdf, accessed 29 October 2002).
Nordmann, A.: 2004. ‘Molecular Disjunctions: Staking Claims at the Nanoscale’ (this volume).
Pascal, R.: 1998, Scanning Tunneling Microscopy
(online: www.physnet.uni-hamburg.de/home/vms/pascal/index.html, accessed August 6, 2002).
Press, E. & Washburn, J.: 2002, ‘The Kept University’, Atlantic Monthly
(online: www.theatlantic.com/issues/2002/03/press.htm)
Quate Group: 2002, Quate Group, Introduction
(online: www.stanford.edu/group/quate_group/index.html ; accessed: October 29, 2002).
Roman, C.: 2002, ‘It’s Ours to Lose: An Analysis of EU Nanotechnology Funding and the Sixth Framework
Programme’, European NanoBusiness Association (online available http://www.nanoeurope.org).
Seeman, N.: 2002, Conversation with Davis Baird, August 5.
Seeman, N.: 1999, ‘DNA Engineering and its Application to Nanotechnology’, Tibtech, 437-443.
Talbot, D.: 2002, ‘Super Soldiers’, Technology Review, 46.
Tour, J.: 2002, Conversation with Davis Baird, Unpublished, July 23.
Veeco: 2002, About Veeco (online: www.veeco.com/html/about_history.asp ; accessed August 6, 2002).
Weinstein, R.: 2001, ‘Microscope focuses on small details’, Business Journal, 21, 15.
Winfree, E.; Liu, F.; Wenzler, L. & Seeman, N.: 1998, ‘Design and Self-Assembly of Two-Dimensional DNA
Crystals’, Nature, 394, 539-544.
Ziman, J.: 2000, Real Science, New York: Cambridge University Press, Chapter 4.