We present some instances of binary interactions among the social, material and semiotic systems of chemical knowledge. We highlight the relevance of the different temporalities of each system for the purposes of modelling the evolution of chemical knowledge. Finally, we discuss the relationship of chemical knowledge with other kinds of scientific knowledge.
The periodic system, which intertwines order and similarity among chemical elements, arose from knowledge about substances constituting the chemical space. Little is known, however, about how the expansion of the space contributed to the emergence of the system, formulated in the 1860s. Here, we show by analyzing the space between 1800 and 1869 that after an unstable period culminating around 1826, chemical space led the system to converge to a backbone structure clearly recognizable in the 1840s. Hence, the system was already encoded in the space for about two and a half decades before its formulation. Chemical events in 1826 and in the 1840s were driven by the discovery of new forms of combination standing the test of time. Emphasis of the space upon organic chemicals after 1830 prompted the recognition of relationships among elements participating in the organic turn and obscured some of the relationships among transition metals. To account for the role of nineteenth-century atomic weights upon the system, we introduced an algorithm to adjust the space according to different sets of weights, which allowed for estimating the periodic systems that chemists using one or the other set of weights would have obtained. Analyzing these systems, from Dalton up to Mendeleev, shows that Gmelin's atomic weights of 1843 produce systems remarkably similar to that of 1869, a similarity that was reinforced by the atomic weights of the years to come. Although our approach is computational rather than historical, we hope it can complement other tools of the history of chemistry.
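The weight-adjustment idea can be illustrated with background chemistry rather than the authors' actual algorithm: a compound's measured mass composition is fixed by experiment, but the subscripts of its empirical formula depend on which atomic weights one assumes, so changing the weight set changes the formulas, and with them the combination relations underlying the system. A minimal Python sketch with textbook numbers (Dalton's early oxygen weight of about 7, relative to H = 1, is used here as an illustrative assumption):

# Sketch: how assumed atomic weights change an empirical formula.
# Mass fractions are fixed by experiment; mole ratios are not.
# n_E is proportional to (mass fraction of E) / (assumed weight of E).

def empirical_formula(mass_fractions, atomic_weights):
    """Return smallest whole-number subscripts for the given weights."""
    moles = {el: f / atomic_weights[el] for el, f in mass_fractions.items()}
    smallest = min(moles.values())
    return {el: round(n / smallest) for el, n in moles.items()}

water = {"H": 0.1119, "O": 0.8881}       # mass fractions of water
modern = {"H": 1.008, "O": 16.0}         # modern atomic weights
dalton = {"H": 1.0, "O": 7.0}            # Dalton's early weights (assumed)

print(empirical_formula(water, modern))  # {'H': 2, 'O': 1} -> H2O
print(empirical_formula(water, dalton))  # {'H': 1, 'O': 1} -> HO, Dalton's water

The same mass-fraction data thus yield different formulas under different weight sets, which is why the choice of atomic weights reshapes the whole space and, downstream, the periodic system derived from it.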
The origins of innovation in science are typically understood using historical narratives that tend to be focused on small sets of influential authors, an approach that is rigorous but limited in scope. Here, we develop a framework for rigorously identifying innovation across an entire scientific field through automated analysis of a corpus of over 6000 documents that includes every paper published in the field of evolutionary medicine. This comprehensive approach allows us to explore statistical properties of innovation, asking where innovative ideas tend to originate within a field’s pre-existing conceptual framework. First, we develop a measure of innovation based on novelty and persistence, quantifying the collective acceptance of novel language and ideas. Second, we study the field’s conceptual landscape through a bibliographic coupling network. We find that innovations are disproportionately more likely in the periphery of the bibliographic coupling network, suggesting that the relative freedom allowed by remaining unconnected with well-established lines of research could be beneficial to creating novel and lasting change. In this way, the emergence of collective computation in scientific disciplines may have robustness–adaptability trade-offs that are similar to those found in other biosocial complex systems.
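Bibliographic coupling, the network construction mentioned above, links two papers when their reference lists overlap. A minimal sketch of the standard construction (the data and the shared-reference edge weight are illustrative assumptions; the paper's exact weighting may differ):

from itertools import combinations

# Each paper maps to the set of references it cites.
refs = {
    "paper_A": {"r1", "r2", "r3"},
    "paper_B": {"r2", "r3", "r4"},
    "paper_C": {"r9"},
}

# Two papers are bibliographically coupled if they share references;
# here the edge weight is the number of shared references.
edges = {
    (p, q): len(refs[p] & refs[q])
    for p, q in combinations(refs, 2)
    if refs[p] & refs[q]
}
print(edges)  # {('paper_A', 'paper_B'): 2} -- paper_C stays peripheral

Papers like paper_C, with no coupling edges, sit in the periphery of such a network, which is where the study locates a disproportionate share of innovations.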
Jacques-Louis David’s (1748–1825) iconic portrait of Antoine Laurent Lavoisier (1743–1794) and Marie-Anne Lavoisier (Marie-Anne Pierrette Paulze, 1758–1836) has come to epitomize a modern couple born of the Enlightenment. An analytical approach that combined macro-X-ray fluorescence with the examination and microanalysis of samples by Raman spectroscopy and scanning electron microscopy-energy dispersive X-ray spectrometry to investigate imprecise indications of changes to the composition observed by microscopy and infrared reflectography allowed the visualization of a hidden composition with a high level of detail. The results revealed that the first version depicted not the progressive, scientific-minded couple that we see today, but their other identity, that of wealthy tax collectors and fashionable luxury consumers. The first version and the changes to the composition are placed in the context of David’s mastery of the oil painting technique by examining how he concealed colorful features in the first composition by using paint mixtures that allowed for maximum coverage with thin paint layers. The limitations of the analytical techniques used are also discussed. To our knowledge, this is the first in-depth technical study of a painting by J.-L. David.
Oganesson (Og) is the last entry into the Periodic Table, completing the seventh period of elements and group 18 of the noble gases. Only five atoms of Og have been successfully produced in nuclear collision experiments, with an estimated half-life for ²⁹⁴Og of 0.69 (+0.64/−0.22) ms [1]. With such a short lifetime, chemical and physical properties inevitably have to come from accurate relativistic quantum theory. Here, we employ two complementary computational approaches, namely parallel tempering Monte Carlo (PTMC) simulations and first-principles thermodynamic integration (TI), both calibrated against a highly accurate coupled-cluster reference, to pin down the melting and boiling points of this super-heavy element. In excellent agreement, these approaches show Og to be a solid at ambient conditions with a melting point of ≈325 K. In contrast, calculations in the nonrelativistic limit yield a melting point for Og of 220 K, suggesting a gaseous state as expected for a typical noble gas element. Accordingly, relativistic effects shift the solid-to-liquid phase transition by about 100 K.
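As background on what the thermodynamic-integration step computes (this is the standard TI identity, not a detail given in the abstract): for a potential $U(\lambda)$ that interpolates between a tractable reference system ($\lambda = 0$) and the target system ($\lambda = 1$), the free-energy difference is

\Delta F = \int_0^1 \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} \, \mathrm{d}\lambda ,

and the melting point is located where the solid and liquid free-energy curves cross.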
Over the past several decades, the Nobel Prize program has slowly but steadily been modified in both transparent and opaque ways. A transparent change has been the creation of the Nobel Prize in Economic Sciences, officially known as the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel. An opaque change has been the mutation of the Nobel Prize in Chemistry into what is effectively the “Nobel Prize in Chemistry or Life Sciences.” This paper presents a detailed study of this opaque change, including evidence that the disciplines of chemistry and biochemistry today cover quite distinct and generally unrelated intellectual territory. This paper supports the evolution of the Nobel Prizes, and encourages the Nobel Prize program to move from opaque to transparent change processes for the next generations of achievement in the sciences.
We celebrate 150 years of periodic systems that reached their maturity in the 1860s. They began as pedagogical efforts to project corpuses of substances on the similarity and order relationships of the chemical elements. However, these elements are not the canned substances wrongly displayed in many periodic tables, but rather the abstract entities preserved in compound transformations. We celebrate the systems, rather than their tables or ultimate table. The periodic law, we argue, is not an all-encompassing achievement, as it does not apply to every property of all elements and compounds. Periodic systems have been generalised as ordered hypergraphs, which solves the long-standing question about the mathematical structure of the systems. In this essay, it is shown that these hypergraphs may solve current issues such as order reversals in super-heavy elements and lack of predictive power of the system. We discuss research on extending the limits of the systems in the super-heavy-atom region and draw attention to other limits: the antimatter region and the limit arising from compounds under extreme conditions. As systems depend on the known chemical substances (chemical space) and such a space grows exponentially, we wonder whether systems still aim at projecting knowledge of compounds on the relationships among the elements. We claim that systems are not based on compounds anymore, but rather on 20th century projections of the 1860s systems of elements onto systems of atoms. These projections bring about oversimplifications based on entities far removed from compounds. A linked oversimplification is the myth of vertical group similarity, which raises questions about the approaches used to locate new elements in the system. Finally, we propose bringing chemistry back to the systems by exploring similarity and order relationships of elements using the current information of the chemical space. We ponder whether 19th century periodic systems are still there or whether they have faded away, leaving us with an empty 150th celebration.
Mendeleev arrived at his first attempt at a periodic system in 1869 by classifying and ordering the known elements. Order and similarity were based on knowledge of chemical compounds, which, taken together, constitute the chemical space of 1869. Despite its importance, very little is known about the size and diversity of this space, and even less is known about its influence upon Mendeleev's periodic system. Here we show, by analysing 11,484 substances reported in the scientific literature up to 1869 and stored in the Reaxys database, that 80% of the space was accounted for by 12 elements, oxygen and hydrogen being those with the most compounds. We found that the space included more than 2,000 combinations of elements, of which 5%, made of organogenic elements, gathered half of the substances of the space. By exploring the reporting over time of compounds containing typical molecular fragments, we found that the chemical space available to Meyer and Mendeleev had a balance of organic, inorganic and organometallic compounds, which was, after 1830, drastically overpopulated by organic substances. The size and diversity of the space show that knowledge of organogenic elements sufficed to gain a panoramic idea of the space. We determined similarities among the 60 elements known by 1869, taking into account the resemblance of their combinations, and we found that Meyer's and Mendeleev's similarities for the chemical elements agree to a large extent with the similarities supported by the chemical space.
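One plausible way to formalize the similarity computation described above (an illustrative assumption, not necessarily the authors' exact measure) is to represent each element by the set of partners it is known to combine with and to compare elements by the Jaccard index of those sets:

# Sketch: element similarity from shared combination partners.
# Toy data; the real computation would be built from the 11,484 compounds.

def jaccard(a, b):
    """Jaccard index of two sets; 0.0 if both are empty."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

partners = {
    "Cl": {"H", "Na", "K", "Ag"},
    "Br": {"H", "Na", "K", "Ag"},
    "O":  {"H", "Na", "C", "S"},
}

# Halogens combine with the same partners, so they come out alike ...
print(jaccard(partners["Cl"], partners["Br"]))  # 1.0
# ... while Cl and O share only some partners.
print(jaccard(partners["Cl"], partners["O"]))   # 2/6 ≈ 0.33

From such pairwise similarities one can build a similarity network over the 60 elements known by 1869 and compare its clusters with Meyer's and Mendeleev's groupings.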
If the twentieth century, with the rise of electronic media and the internet as a mass phenomenon, stood for a fundamental upheaval in the transmission and organization of information, we now find ourselves in a transition from the information age to the network age. This confronts us with a fundamental change in the organization of knowledge and the structures of its storage, a transition that challenges computer science and the humanities alike. In my view, promising answers to these challenges arise above all from two approaches: semantic modelling and the mathematical analysis of networks. These methodological approaches have so far mostly been treated separately, although both are used to structure knowledge and at the same time to make it machine-readable, thereby opening knowledge structures to analysis with the tools of computer science. Connecting these two approaches is the guiding idea of this work. What is needed is a theory for the network- and modelling-theoretic description (NMB) of historical knowledge systems. Building on the historical-epistemology approach in the history of science, this theory rests on three interlocking network levels: the semantic network, which describes the knowledge base; the semiotic network, which represents the encoding of knowledge; and finally the network of social actors, without which the emergence and organization of knowledge cannot take place. Part I of the work introduces the mathematical, graph-theoretic and modelling-theoretic approaches far enough that their implementation in concrete applications is intelligible; the same holds for the necessary background from the history of science. Part II is devoted to the concrete implementation in four case studies. Using examples from different historical periods, combined with different research questions, it shows which tools can be used to answer these questions and which consequences the NMB approach has for interdisciplinary historical research. The case studies on the history of general relativity (ART) and on the subproject of the project on the history of the Max Planck Society (GMPG) centre on building an infrastructure that brings together different datasets on the basis of a semantic model and then makes them accessible to social network analysis. As an example of connecting the semiotic, semantic and social networks, the ART case study presents the link between bibliometric analysis and the collaboration network of the scientists involved. We also examine the effects of historical assumptions and of imprecise or missing data on the results of the network analysis. In the GMPG case study we further consider how committees (in this case commissions) can be understood as semiotic networks. The studies on the construction of the dome of Florence Cathedral and on Sacrobosco's Sphaera place their emphasis on the connection between the semiotic network, represented by written records in archives, and the semantic network; an exemplary ontology supporting this is presented.
The results of this work make clear that the approach presented helps make methods from computer science more accessible for research in the humanities; quantitative data analysis thereby becomes more intelligible even for researchers in the humanities who are primarily interested in qualitative results. It is possible, with reasonable effort, to prepare historical datasets with NMB methods and to pose and answer questions about these data, such as questions about dependency relations or the relevance of individual persons in historical processes. The work shows that no monolithic new infrastructure is necessary for this; rather, considerable progress and new insights can already be achieved by flexibly combining existing methods and transferring them to new fields of application.
Natural products research seems to be at a critical juncture in terms of its relevance to modern biological science. We have evaluated this landscape of chemical diversity to ask key questions, including the following. How has the rate of discovery of new natural products progressed over the past 70 y? Has natural product structural novelty changed as a function of time? Has the rate of novel discovery declined in recent years? Does exploring novel taxonomic space afford an advantage in terms of novel compound discovery? Is it possible to estimate how close we are to describing all of the chemical space covered by natural products? And, finally, is there still value in exploring natural products space for novel biologically active natural products?
Historians often feel that standard philosophical doctrines about the nature and development of science are not adequate for representing the real history of science. However, when philosophers of science fail to make sense of certain historical events, it is also possible that there is something wrong with the standard historical descriptions of those events, precluding any sensible explanation. If so, philosophical failure can be useful as a guide for improving historiography, and this constitutes a significant mode of productive interaction between the history and the philosophy of science. I illustrate this methodological claim through the case of the Chemical Revolution. I argue that no standard philosophical theory of scientific method can explain why European chemists made a sudden and nearly unanimous switch of allegiance from the phlogiston theory to Lavoisier's theory. A careful re-examination of the history reveals that the shift was neither so quick nor so unanimous as imagined even by many historians. In closing I offer brief reflections on how best to explain the general drift toward Lavoisier's theory that did take place.
This work provides an introduction to mathematical modeling of molecules and the resulting applications (structure generation, structure elucidation, QSAR/QSPR etc.). Most chemists have experimented with some software that represents molecules in an electronic form, and such models and applications are of increasing interest in diverse and growing fields such as drug discovery, environmental science and metabolomics. Furthermore, structure generation remains the only way to systematically create molecules that are not (yet) present in a database. This book starts with the mathematical theory behind representing molecules, explaining chemical concepts in mathematical terms and providing exercises that can be completed online. The later chapters cover applications of the theory, with detailed explanations on QSPR and QSAR investigations and finally structure elucidation combining mass spectrometry and structure generation. This book is aimed in particular at the users of structure generation methods and corresponding techniques, but also for those interested in teaching and learning mathematical chemistry, and for software designers in chemoinformatics.
This small volume brings together publications by the apothecary Friedrich Wilhelm Sertürner (1783–1841) that allow the discovery of morphine, the first alkaloid isolated from a drug, to be retraced. Although Sertürner, as was customary at the time, had completed only a purely practical apprenticeship as an apothecary, he occupied himself in his spare time with chemical-pharmaceutical investigations, which finally led him to the isolation of an active substance. The discovery, achieved in 1804/05, of a first alkaloid, that is, a basic plant constituent with considerable pharmacological activity, led some years later to a paradigm shift in pharmaceutical research and drug therapy that can be summed up in the slogan "from the plant drug to the active pharmaceutical ingredient." After Sertürner's discovery became known in France some years later, only in 1817, mainly French and German apothecaries went on to isolate further alkaloids from highly potent medicinal plants, which then, with some delay, found their way into drug therapy.
The periodic table of elements should be celebrated not only for the order it brings, but also for the fascinating stories underlying this icon of science, suggests Juris Meija.
The Periodic Table of Elements hasn't always looked like it does now, a well-organized chart arranged by atomic number. In the mid-nineteenth century, chemists believed that the elements should be sorted by atomic weight. However, the weights of many elements had been calculated incorrectly, and over time it became clear not only that the elements needed rearranging, but that the periodic table contained many gaps and omissions: there were elements yet to be discovered, and the allure of finding one had scientists rushing to fill in the blanks. Supposed "discoveries" flooded laboratories, and the debate over what did and did not belong on the periodic table reached a fever pitch. With the discovery of radioactivity, the discourse only intensified. Throughout its formation, the Periodic Table of Elements has seen false entries, good-faith errors, retractions, and dead ends. In fact, there have been more falsely proclaimed elemental discoveries throughout history than there are elements on the table as we know it today. The Lost Elements: The Periodic Table's Shadow Side collects the most notable of these instances, stretching from the nineteenth century to the present. The book tells the story of how scientists have come to understand elements by discussing the failed theories and false discoveries that shaped the path of scientific progress. We learn of early chemists' stubborn refusal to disregard alchemy as a legitimate practice, and of one German's supposed discovery of an elemental metal that breathed. As elements began to be created artificially in the twentieth century, we watch the discovery climate shift to favor the physicists rather than the chemists. Along the way, Fontani, Costa, and Orna introduce us to the key figures in the development of today's periodic table, including Lavoisier and Mendeleev. Featuring a preface from Nobel Laureate Roald Hoffmann, The Lost Elements is an expansive history of the wrong side of chemical discovery, and reveals how these errors and gaffes have helped shape the table as much as any other form of scientific progress.
The first of the transfermium elements—those elements with an atomic number greater than 100—were discovered in the 1950s, largely by the Lawrence Berkeley Laboratory (LBL) in California and the Joint Institute for Nuclear Research (JINR) in Dubna, Russia. After each new element was claimed to have been discovered by one lab, the claim was contested by the other. The International Union of Pure and Applied Chemistry (IUPAC) and the International Union of Pure and Applied Physics (IUPAP) formed a joint working group to end the controversies, the Joint Neutral Group (JNG). When that group failed to resolve the discovery disputes, another was formed, the Transfermium Working Group (TWG). Neutrality was a value important to both groups, giving them the credibility necessary to act as mediators. For the JNG in the 1970s, and the TWG in the late 1980s, neutrality had different meanings and was attempted in different ways. The extensive use of archival collections in this paper provides a more complex and nuanced look at the geopolitical and disciplinary tensions surrounding these discovery disputes and the attempts at neutrality, in its different forms, to resolve them.
Traceless solid-phase synthesis represents one of the most sophisticated synthetic strategies on insoluble supports. Compounds synthesized on solid supports can be released without a trace of the linker that was used to tether the intermediates during the synthesis. Thus, the target products are composed only of the components (atoms, functional groups) inherent to the target core structure. A wide variety of synthetic strategies have been developed to prepare products in a traceless manner, and this review is dedicated to all aspects of traceless solid-phase organic synthesis. Importantly, the synthesis does not need to be carried out on a linker designed for traceless synthesis; most of the synthetic approaches described herein were developed using standard, commercially available linkers (originally devised for solid-phase peptide synthesis). The type of structure prepared in a traceless fashion is not restricted. The individual synthetic approaches are divided into eight sections, each devoted to a different methodology for traceless synthesis. Each section consists of a brief outline of the synthetic strategy followed by a description of individual reported syntheses.
Chemical research unveils the structure of chemical space, spanned by all chemical species, as documented in more than 200 y of scientific literature, now available in electronic databases. Very little is known, however, about the large-scale patterns of this exploration. Here we show, by analyzing millions of reactions stored in the Reaxys database, that chemists have reported new compounds in an exponential fashion from 1800 to 2015 with a stable 4.4% annual growth rate, in the long run neither affected by World Wars nor affected by the introduction of new theories. Contrary to general belief, synthesis has been the means to provide new compounds since the early 19th century, well before Wöhler's synthesis of urea. The exploration of chemical space has followed three statistically distinguishable regimes. The first one included uncertain year-to-year output of organic and inorganic compounds and ended about 1860, when structural theory gave way to a century of more regular and guided production, the organic regime. The current organometallic regime is the most regular one. Analyzing the details of the synthesis process, we found that chemists have had preferences in the selection of substrates and we identified the workings of such a selection. Regarding reaction products, the discovery of new compounds has been dominated by very few elemental compositions. We anticipate that the present work serves as a starting point for more sophisticated and detailed studies of the history of chemistry.
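As a back-of-the-envelope check of what the stable 4.4% annual growth rate quoted above implies (derived arithmetic, not figures from the paper):

import math

# N(t) = N0 * (1 + r)**t for r = 0.044 (4.4% annual growth).
r = 0.044

# Doubling time: (1 + r)**T = 2  =>  T = ln 2 / ln(1 + r)
T = math.log(2) / math.log(1 + r)
print(f"doubling time ≈ {T:.1f} years")       # ≈ 16.1 years

# Over the 215 years from 1800 to 2015 the cumulative factor is
print(f"growth factor ≈ {(1 + r)**215:.2e}")  # ≈ 1.05e+04

So, at this rate, the number of known compounds doubles roughly every 16 years and multiplies by about four orders of magnitude over the whole period analyzed.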
Although extending the reactivity of a given class of molecules is relatively straightforward, the discovery of genuinely new reactivity and the molecules that result is a wholly more challenging problem. If new reactions can be considered unpredictable using current chemical knowledge, then we suggest that they are not merely new but also novel. Such a classification, however, requires an expert judge to have access to all current chemical knowledge or risks a lack of information being interpreted as unpredictability. Here, we describe how searching chemical space using automation and algorithms improves the probability of discovery. The former enables routine chemical tasks to be performed more quickly and consistently, while the latter uses algorithms to facilitate the searching of chemical knowledge databases. Experimental systems can also be developed to discover novel molecules, reactions and mechanisms by augmenting the intuition of the human expert. In order to find new chemical laws, we must seek to question current assumptions and biases. Accomplishing that involves using two areas of algorithmic approaches: algorithms to perform searches, and more general machine learning and statistical modelling algorithms to predict the chemistry under investigation. We propose that such a chemical intelligence approach is already being used and that, in the not-too-distant future, the automated chemical reactor systems controlled by these algorithms and monitored by a sensor array will be capable of navigating and searching chemical space more quickly, efficiently and, importantly, without bias. This approach promises to yield not only new molecules but also unpredictable and thus novel reactivity.
The linkages and interactions of scientific disciplines with industry, politics, and society have long been a staple in the history of science, the history of technology, and science studies. However, it is arguable that the impact of this intertwining on the epistemic and social core of scientific disciplines has not yet been sufficiently explored. Chemistry is an ideal case in point, given that it has emerged as one of the largest scientific disciplines while at the same time becoming one of the world’s most powerful technologies. Specifically, chemistry’s power lies in its ability to gain knowledge of the natural world by transforming it, along with the society in which it is embedded. The four contributions to this Focus section all address chemistry’s border permeability, based on its transformative powers; they focus on the feedback mechanisms that transformed chemistry and thereby altered the very concept of a scientific discipline. So successful has this “nonclassical approach” become that, in the opinion of the contributors to this Focus section, it is now both necessary and advisable to study the history of chemistry’s embeddedness and power in science and technology.
One-hundred fifty years ago, on the eve of German unification, about one-hundred people gathered in Berlin to found the German Chemical Society (DChG) under the charismatic leadership of August Wilhelm von Hofmann, who attracted a large international membership by promoting modern organic chemistry. By 1892, when Emil Fischer succeeded Hofmann, the DChG was the world's largest chemical society. Under Fischer the Society promoted international collaboration with foreign societies, and in 1900 it opened an impressive headquarters, the Hofmann House, where it centralized its greatly expanded literary activity including abstracts and reference publications. Yet a half-century later, after war and racial-national extremism, the house lay in ruins and the Society had ceased to exist. In remembering the Society, one may well ask why its auspicious beginning should have led to this ignominious end.
DNA-encoded chemical library technologies are increasingly being adopted in drug discovery for hit and lead generation. DNA-encoded chemistry enables the exploration of chemical spaces four to five orders of magnitude more deeply than is achievable by traditional high-throughput screening methods. Operation of this technology requires developing a range of capabilities including aqueous synthetic chemistry, building block acquisition, oligonucleotide conjugation, large-scale molecular biological transformations, selection methodologies, PCR, sequencing, sequence data analysis and the analysis of large chemistry spaces. This Review provides an overview of the development and applications of DNA-encoded chemistry, highlighting the challenges and future directions for the use of this technology.
The production of chemical compounds composed of mechanically interlocked molecules (MIMs) by acts of templation are hand-me-downs from the science of chemistry beyond the molecule, which affords molecular recognition free rein to exercise its special powers of organization in marshaling the component parts of the MIMs prior to their being transported back into the molecular world by the formation of chemical bonds. Intersecting the fields of supramolecular chemistry and chemical topology is the discipline of mechanostereochemistry. A mechanical bond is an entanglement in space between two or more component parts, such that they cannot be separated without breaking or distorting chemical bonds between atoms. It follows that a mechanical bond is as strong as the weakest participating chemical bond. Catenanes and rotaxanes are a subset of MIMs that possess mechanical bonds. While mechanomolecules have found their way into switches and motors, the molecules themselves have led to a renaissance in molecular aesthetics.
This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse.
Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.
The development of chemical theory in the nineteenth century has been relatively little studied, compared with other sciences and other periods; much remains still to be explored. One notable example is chemical atomism, and its adjuncts such as valence and structure theory. Nonexistent at the beginning of the century, a generation or two later these ideas had moved to the very center of the science, which they still inhabit. The chemical atomic theory embodies outstanding examples of paper tools that provide not only explanatory and expository functions for what is already accepted as known, but also heuristic guidance in the further construction of a science. It may be of interest, therefore, to attempt an analysis of what some recent studies have revealed about this subject, along with indications of where further historical efforts may yield additional rewards.
This paper looks at the visual and textual images of chemists in A. Cressy Morrison's Man in a Chemical World. It argues that Morrison was attempting to create a public image of an American chemist different from European chemists. Morrison and the illustrator Leon Söderston, working on behalf of the American Chemical Society, attempted to associate chemists and chemical industry with American prosperity by linking the 'man in the white lab coat' to religious and secular themes. This approach is analyzed using the concept of metonyms. Metonyms are a way of encapsulating complex ideas and associations within simple, often iconic, images in text and illustrations.
An analysis of chemical reactions used in current medicinal chemistry (2014), three decades ago (1984) and in natural product total synthesis has been conducted. The analysis revealed that of the most frequently used current synthetic reactions, none were discovered within the last twenty years, and only two in the 1980s and 1990s (Suzuki-Miyaura and Buchwald-Hartwig). This suggests an inherently high bar of impact for new synthetic reactions in drug discovery. The most frequently used reactions were amide bond formation, Suzuki-Miyaura coupling, and SNAr reactions, most likely owing to the commercial availability of reagents, high chemoselectivity and pressure to deliver. Using simple principal-moment-of-inertia (PMI) plots, we show that these practices result in the overpopulation of certain types of molecular shapes to the exclusion of others. We hope that these results will help catalyze improvements in the integration of new synthetic methodologies as well as in new library design.
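For readers unfamiliar with PMI plots: the standard construction (assumed here; this is a generic sketch, not code from the paper) computes the normalized principal moments of inertia of each molecule and places it in a triangle whose corners correspond to rod-, disc- and sphere-like shapes:

import numpy as np

def npr_coordinates(coords, masses):
    """Normalized PMI ratios (I1/I3, I2/I3) from 3D coordinates.

    coords: (N, 3) array of atomic positions; masses: (N,) array.
    Returns a point in the PMI triangle: (0, 1) rod-like,
    (0.5, 0.5) disc-like, (1, 1) sphere-like.
    """
    # Shift to the centre of mass.
    com = np.average(coords, axis=0, weights=masses)
    r = coords - com
    # Build the inertia tensor and take its eigenvalues.
    I = np.zeros((3, 3))
    for m, (x, y, z) in zip(masses, r):
        I += m * np.array([[y*y + z*z, -x*y,      -x*z],
                           [-x*y,      x*x + z*z, -y*z],
                           [-x*z,      -y*z,      x*x + y*y]])
    i1, i2, i3 = np.sort(np.linalg.eigvalsh(I))
    return i1 / i3, i2 / i3

# A linear (rod-like) toy molecule: three unit masses on a line.
coords = np.array([[-1.0, 0, 0], [0.0, 0, 0], [1.0, 0, 0]])
print(npr_coordinates(coords, np.ones(3)))   # ≈ (0.0, 1.0)

Clustering of many compounds near one corner or edge of this triangle is what the abstract describes as overpopulation of certain molecular shapes.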
In this paper we investigate the most important visual stereotypes of chemistry as they occur in current portraits of chemists, depictions of chemical plants, and images of chemical glassware and apparatus. By studying the historical origin and development of these stereotypes within the broader context of the history of art and science, and by applying aesthetic and cultural theories, we explore what these images implicitly communicate about the chemical profession to the public. We conclude that chemists, along with commercial artists, have unknowingly created a visual image of chemistry that frequently conveys negative historical associations, ranging from imposture to kitsch. Other elements of this image, however, aestheticize chemistry in a positive manner by referring to classical ideals of beauty and borrowing from revered motifs of modern art.
A Sleeping Beauty (SB) in science refers to a paper whose importance is not recognized for several years after publication. Its citation history exhibits a long hibernation period followed by a sudden spike of popularity. Previous studies suggest a relative scarcity of SBs. The reliability of this conclusion is, however, heavily dependent on identification methods based on arbitrary threshold parameters for sleeping time and number of citations, applied to small or monodisciplinary bibliographic datasets. Here we present a systematic, large-scale, and multidisciplinary analysis of the SB phenomenon in science. We introduce a parameter-free measure that quantifies the extent to which a specific paper can be considered an SB. We apply our method to 22 million scientific papers published in all disciplines of natural and social sciences over a time span longer than a century. Our results reveal that the SB phenomenon is not exceptional. There is a continuous spectrum of delayed recognition where both the hibernation period and the awakening intensity are taken into account. Although many cases of SBs can be identified by looking at monodisciplinary bibliographic data, the SB phenomenon becomes much more apparent with the analysis of multidisciplinary datasets, where we can observe many examples of papers achieving delayed yet exceptional importance in disciplines different from those where they were originally published. Our analysis emphasizes a complex feature of citation dynamics that so far has received little attention, and also provides empirical evidence against the use of short-term citation metrics in the quantification of scientific impact.
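The parameter-free measure referred to here is a "beauty coefficient" B; the formula below follows the published definition as I recall it, so treat the details as an assumption. B sums, over the years from publication to the citation peak, the gap between a straight line joining the publication-year and peak-year citation counts and the actual yearly counts, normalized by the actual counts:

def beauty_coefficient(citations):
    """B from a paper's yearly citation counts (assumed formula).

    citations[t] = citations received t years after publication.
    A large B means a long dormancy followed by a late peak.
    """
    t_m = max(range(len(citations)), key=lambda t: citations[t])  # peak year
    c_0, c_m = citations[0], citations[t_m]
    if t_m == 0:
        return 0.0
    return sum(
        ((c_m - c_0) / t_m * t + c_0 - citations[t]) / max(1, citations[t])
        for t in range(t_m + 1)
    )

# A "sleeping" profile (long dormancy, late spike) scores far higher
# than a steadily rising one with the same peak.
print(beauty_coefficient([0, 0, 0, 0, 0, 0, 0, 0, 0, 40]))        # high B (160)
print(beauty_coefficient([0, 5, 10, 15, 20, 25, 30, 35, 38, 40])) # low B (≈ -0.8)

Because B has no sleeping-time or citation-count thresholds, it yields the continuous spectrum of delayed recognition that the abstract describes.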
Everybody knows that glass is and always has been an important presence in chemical laboratories. Yet the very self-evidence of this notion tends to obscure a supremely important change in chemical practice during the early decades of the nineteenth century. This essay uses manuals of specifically chemical glassblowing published between about 1825 and 1835 to show that early nineteenth-century chemists began using glass in distinctly new ways and that their appropriation of glassblowing skill had profoundly important effects on the emerging discipline of chemistry. The new practice of chemistry in glass—exemplified in this essay by Justus Liebig’s introduction of a new item of chemical glassware for organic analysis, the Kaliapparat—transformed not merely the material culture of chemistry but also its geography, its pedagogy, and, ultimately, its institutions. Moving chemistry into glass—a change so important that it warrants the term “glassware revolution”—had far-reaching consequences.
The availability of structures and linked bioactivity data in databases is powerfully enabling for drug discovery and chemical biology. However, we now review some confounding issues with the divergent expansions of public and commercial sources of chemical structures. These are not only associated with expanding patent extraction but also increasingly large vendor collections amassed via different selection criteria between the Chemical Abstracts Service (CAS) SciFinder® and major public sources such as PubChem, ChemSpider, UniChem and others. These increasingly massive collections may include both real and virtual compounds, as well as so-called prophetic compounds from patents. We address a range of issues raised by the challenges faced resolving the NIH probe compounds. In addition we highlight the confounding of prior-art searching by virtual compounds which could impact the composition of matter patentability of a new medicinal chemistry lead. Finally, we propose some potential solutions.
In January 1865, August Kekulé published his theory of the structure of benzene, which he later reported had come to him in a daydream about a snake biting its tail. Although other theories had been postulated before 1865, Kekulé was the first to identify the correct structure. Kekulé's theory resulted in a clear understanding of aromatic compounds and thus had a major impact on the development of chemical science and industry.
This review article highlights the strategies to successfully perform an efficient solid-phase synthesis of complex peptides including posttranslational modifications, fluorescent labels, and reporters or linking groups of exceptional value for biological studies of several important diseases. The solid-phase approach is the best alternative to synthesize these peptides rapidly and in high amounts. The key aspects that need to be considered when performing a peptide synthesis in solid phase of these molecules are discussed.
In an attempt to prepare a “History of Liquid Chromatography,” it was evident, as is apparent to workers in the field, that one ends up with the generalized term “chromatography.” None of the modes of chromatography covered here is without its liquid phase or mobile phase. Therefore, the title of this history is that of “Chromatography.”
We analyze the connections of Lavoisier's system of nomenclature with Leibniz's philosophy, pointing out the resemblance between what we call the Leibnizian and Lavoisian programs. We argue that Lavoisier's contribution to chemistry is something more subtle; in so doing, we show that the system of nomenclature leads to an algebraic system of chemical sets. We show how Döbereiner and Mendeleev were able to develop this algebraic system and to find new and interesting properties for it. We point out the resemblances between the Leibnizian program and Lavoisier's legacy, particularly regarding the lingua philosophica for understanding and thinking about Nature, in this particular case chemistry. In the second part we discuss, from the linguistic viewpoint, how the Lavoisian algebraic system may be taken further to build a language, and we study the constituents of such a chemical language. Finally, we formalize some of the ideas presented here using elements of network theory and discrete mathematics.