Article

The mechanization of mind

... Inquiry on the MRP should not be confused with metaphysics, or with the important question, within classical ethics, of the role of ontology (i.e. questions about an ethics "with or without" ontology; cf. footnote 36 and [127]). ...
... Informed by the identification of knowing with making (or the idea that it is through remaking nature that a human being can perfectly know it), nano's MRP brings with it a loss of any significant distinction between the scientist and the engineer, because it identifies the search for knowledge with an intervention in, or transformation of, nature (cf. [36,41]). The fundamental philosophical roots of the project of the mechanisation of the mind can be reconstructed along a continuum that reinterprets first nature, then the human body, and finally the mind as a computational model. ...
... In Popper's opinion, these rival metaphysical research programs are basic categories of thought that operate as invisible boundaries, dictating the types of problems that scientists choose to work on, the way that they are formulated, and the kinds of solutions that are accepted. (Footnote 36: The expression 'ethics without ontology' is taken from a series of lectures by Hilary Putnam [127]. Putnam argues that 'ontology' is not meant as a synonym of 'metaphysics' but rather is a part of it: that concerning the 'science of Being'.) ...
Article
Full-text available
This paper aims to review different discourses within the emerging field of ethical reflection on nanotechnology. I will start by analysing the early stages of this debate, showing how it has been focused on searching for legitimacy for this sphere of moral inquiry. I will then characterise an ethical approach, common to many authors, which frames ethical issues in terms of risks and benefits. This approach identifies normative issues where there are conflicts of interest or where challenges to the fundamental values of our society arise. In response to the limitations of this approach, other commentators have called for more profound analysis of the limits of our knowledge, and have appealed to values, such as sustainability or responsibility, which should, they suggest, inform nanotechnological development (I will define this approach as a "sophisticated form of prudence"). After showing the ways in which these frameworks are limited, I will examine more recent developments in debates on nanoethics which call for the contextualisation of ethical discourse in its ontological, epistemic and socio-economic and political reflections. Such contextualisation thus involves inquiry into the 'metaphysical research program' (MRP) of nanotechnology/ies and analysis of the socio-economic, political and historical reality of nano. These ideas offer genuinely new insights into the kind of approach required for nanoethical reflection: they recover a sense of the present alongside the need to engage with the past, while avoiding speculation on the future. Keywords: Nanoethics; Ethical aspects of nanotechnology; Metaphysical research program; Responsible development of nanotechnology
... Mechanization conveys the idea that thought is a form of computation shared by human brains and by a class of computing machines and that fundamental elements of mental life, such as intentionality, could be understood on the basis of a physical law. Those elements have laid the foundations of cognitive science and artificial intelligence (Dupuy, 2001). However, no study to date has investigated the multidimensional nature of the ontologizing process for the Roma and Chinese target groups via structural equation modelling (SEM), and no standardized measurement of the ontologizing process with regard to the three mentioned essences (i.e. ...
... Roma, Chinese and Italian). LISREL 8.80 software was used (Jöreskog & Sörbom, 1988, 1996, 2001). The model included six latent variables (i.e. ...
Article
Full-text available
The ontologization process involves the use of social representation relating to the human–animal binary to classify ingroup and outgroup members. To date, no study has investigated the multidimensional nature (i.e. human, animal and automata) of the ontologizing process via structural equation modelling (SEM). Four hundred and twenty-one Italian participants were asked to attribute 24 positive/negative, human/animal/automata associates to each of three target groups: typical Roma/Chinese/Italian. Results showed that the proposed six-factor model (i.e. positive/negative, human/animal/automata essence) was statistically robust for each of the three groups. The Roma group was animalized by attributing more animal negative associates than any other target group, whereas the Chinese group was mainly given a robot positive essence.
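The six-factor measurement model described above can be illustrated with a small covariance-algebra sketch. This is not the authors' LISREL specification: the loading, factor-correlation and uniqueness values below are invented for illustration, and only the 24-indicator, six-factor shape follows the study.

```python
import numpy as np

# Illustrative confirmatory factor model in the spirit of the study's
# six-factor SEM: 24 observed associates load on 6 latent essences
# (positive/negative x human/animal/automata), 4 items per factor.
# All parameter values are made up for this sketch.
n_factors, items_per = 6, 4
n_items = n_factors * items_per                  # 24, as in the study

Lambda = np.zeros((n_items, n_factors))          # factor-loading matrix
for f in range(n_factors):
    Lambda[f * items_per:(f + 1) * items_per, f] = 0.7

Phi = np.full((n_factors, n_factors), 0.3)       # latent correlations
np.fill_diagonal(Phi, 1.0)
Theta = np.eye(n_items) * (1 - 0.7 ** 2)         # unique (error) variances

# Model-implied covariance of the 24 indicators; fitting the model means
# choosing parameters so Sigma matches the sample covariance matrix:
Sigma = Lambda @ Phi @ Lambda.T + Theta
```

The fit indices reported in such studies quantify the discrepancy between this implied matrix and the observed one.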
... In this way the dream comes true: everything becomes formalisable by disembodying intelligence and placing it in cyberspace, and by commodifying our bodies through reconstructive surgery (Franchi and Güzeldere, 2005:89). Human beings are now placed outside the world, while they (and their spirituality) actually belong in the world (Dupuy, 2000; Lanier, 2000). ...
... be applied. This, however, requires a culture that understands that technology can heal, but can also destroy (Stiegler in De Beer, 2012). Both Dreyfus (2001) and Rushkoff (2013) provide examples of how technology can be used to people's advantage, on condition that people understand and accept their own finite nature, but also that of technology. Dupuy (2000) and Franchi and Güzeldere (2005) ask that the scientists and thinkers occupied with creating the cyborg seek out the social sciences in order to include alternative conceptualisations in their technology. The cyborg is regarded as a basis on which thinkers from many disciplines can make a contribution ...
Article
This article reports on the influence of an important project of our time, the creation of thinking machines or cyborgs, on the essence of human spirituality. The popular work of Kurzweil was used as a starting point. Kurzweil believes that the cyborgs or man-machine hybrids will not only be a simulation of human cognition but also of consciousness and spirituality. In fact, they will be more than a simulation; they will enhance and alter consciousness and spirituality in a radical, unfathomable way. The basis of Kurzweil's argument is his law of accelerating returns, which predicts such a progress in technology that, if an algorithm for the working of the brain can be formulated, it will be possible to develop software emulating the conscious brain. In this way, current limitations experienced by human beings, including mortality, will be overcome. This assumption was investigated from an anthropological view using Janicaud's concepts. The cyborg is seen as an illustration of people's innate desire to overcome the human condition. Humanity is characterised by a constant struggle to find a balance between the superhuman and the inhuman as the extreme states of human spirituality. This also characterises human liberty - humans are free to choose. Given this characterisation of humanity, the nature of the cyborg as superhuman is then considered. Different possibilities are sketched: the superhuman who will exercise the pure will to power (awakening the inhuman) or a person with a complete naive freedom. The future is not easy to predict. Janicaud considers the view of the cyborg as superhuman to be a myth. The real danger facing humanity is the inhuman. The growth and progress of technology do not guarantee moral progress. Currently, technology seems to create a sharp divide between a privileged few and the rest. The two world wars in the previous century illustrated ways in which technology could be used to commit cold-blooded barbaric acts on a mass scale. 
In addition, biotechnology and other technological innovations could give rise to new forms of violence which can effectively be spread by new media. The inhuman is a place from where it is difficult to return. The challenge of our time is to carry the responsibility of our freedom in such a way that we can defend ourselves against our inhumanity; but in a manner that would enable us also to open up to the radical creativity and strangeness of superhumans lying dormant in us. It is clear that now, more than ever before, human spirituality needs to be as alive and rich as possible to rise to the challenge. However, the belief in technology as utopia enslaves the human spirit. We forget that we are the creators of technology and fabricate excuses for moral and intellectual abdications. Furthermore, the assumption that cognition can be mechanised or formalised leads to the disembodiment of intelligence and thought. Humans are placed outside of the world to which they belong. We become estranged from ourselves and each other. The human spirit seems to be wounded by the prevailing metaphors of disembodiment and mechanisation accompanying technological dominance. This article contributes to the call for the struggle for the re-enchantment of the human spirit. It is imperative that thinkers and innovators - leaders - create new metaphors to provide richer descriptions of humanity. Social sciences, having studied the human condition for centuries, might contribute valuable ideas. Technology can be used in this struggle, but only if human beings understand their own paradoxical nature as well as that of technology. The fortified spirit is one that accepts its mortality and fragility but takes responsibility for its freedom. In this way, meaning is re-introduced in the lives of human beings.
... A principal mover of these meetings was Warren McCulloch, to whom I will return later. The fascinating story of this group's wide-ranging and often contentious deliberations is told in books by Heims (1991) and Dupuy (1994), and in sections of Hayles's 1999 book. Transcripts of the proceedings were finally published by Pias in 2003. ...
Preprint
Metaphors of Computation and Information tended to divert attention from the intrinsic modes of neural system function, uncontaminated by the observer's role in the collection and interpretation of experimental data. Recognizing the self-referential mode of function, and the propensity for self-organization to critical states, requires a fundamental re-orientation with emphasis on the conceptual approaches of Complex System Dynamics. Accordingly, local cooperative processes, intrinsic to neural structures and of fractal nature, call for applying Fractional Calculus and models of Random Walks in Theoretical Neuroscience studies.
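As a concrete, hedged illustration of what "applying Fractional Calculus" can mean in practice, the sketch below implements the standard Grünwald-Letnikov discretisation of a fractional derivative. It is a generic numerical-analysis example, not a reconstruction of the author's models.

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights g_k = (-1)^k * binom(alpha, k),
    computed by the standard recursion. They define a discrete
    fractional derivative D^alpha f(x) ~ h^-alpha * sum_k g_k f(x - k*h)."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_derivative(f, x, alpha, h=1e-3, n=2000):
    """Approximate the fractional derivative of order alpha at x."""
    weights = gl_weights(alpha, n)
    return sum(g * f(x - k * h) for k, g in enumerate(weights)) / h ** alpha

# Sanity check: for alpha = 1 the weights collapse to [1, -1, 0, ...],
# so the scheme reduces to the ordinary backward difference, and the
# derivative of f(x) = 3x comes out approximately 3.
d = gl_derivative(lambda x: 3.0 * x, 1.0, alpha=1.0)
```

For non-integer alpha the weights decay slowly, which is exactly the long-memory property that makes fractional operators attractive for modelling anomalous (non-Markovian) neural dynamics.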
... Issues of simulation, fidelity, presence, and immersion extend back to classical antiquity, perhaps most famously in the myth of Amphitryon (e.g., Dupuy, 2008). The logical problems inherent with simulation in the context of experimental stimuli and virtual reality are not new. ...
Preprint
J. J. Gibson (1966) rejected many classical assumptions about perception but retained one that dates back to classical antiquity: the assumption of separate senses. We suggest that Gibson's retention of this assumption compromised his novel concept of perceptual systems. We argue that lawful, 1:1 specification of the animal-environment interaction, which is necessary for perception to be direct, cannot exist in individual forms of ambient energy, such as light or sound. We argue that specification exists exclusively in emergent, higher order patterns that extend across different forms of ambient energy. These emergent, higher order patterns constitute the global array. If specification exists exclusively in the global array, then direct perception cannot be based upon detection of patterns that are confined to individual forms of ambient energy and, therefore, Gibson's argument for the existence of several distinct perceptual systems cannot be correct. We argue that the senses function as a single, irreducible perceptual system that is sensitive exclusively to patterns in the global array. That is, rather than distinct perceptual systems, there exists only one perceptual system.
... Anti-aircraft guns were controlled by humans who learned to anticipate flight patterns while firing, but the idea that machines might be programmed to engage in such reflexive behaviour was tantalising. However, though a machine metaphor was pervasive and a prominent French scholar speaks of a "mechanisation of mind" in this connection (Dupuy, 2000), cybernetics was not a reductionist program: it did not propose mechanical explanations for all of human behaviour, and did not rely on simple analogies to existing or imagined computers. Rather, cybernetics was presented as a tertium quid integrating mind and machine in the mode of circular rather than unidirectional causality, which could be, and soon was, brought to other fields, for example, in G. E. Hutchinson's study of biological, chemical and physical processes in a lake populated with organisms, from which his students later developed systems and community ecology (Hutchinson, 1948). ...
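The circular causality the excerpt describes can be sketched as a closed control loop in which the aiming error feeds back to shape the next correction, the same structure as a (much simplified) anti-aircraft predictor. Everything below (the gains, the target path, the alpha-beta-style update) is an illustrative stand-in, not cybernetics-era code.

```python
def track(path, g=0.5, h=0.1):
    """Alpha-beta-style tracking loop. The aiming error feeds back to
    correct both the aim and an internal velocity estimate, so the
    tracker learns to anticipate a straight-line flight path:
    cause and effect form a loop, not a one-way chain."""
    aim, vel = 0.0, 0.0
    errors = []
    for target in path:
        error = target - aim      # effect of the last correction...
        errors.append(abs(error))
        vel += h * error          # ...updates the anticipated speed,
        aim += vel + g * error    # which causes the next correction
    return errors

# A target flying straight at constant speed: the loop locks on and
# the tracking error decays toward zero.
path = [0.1 * t for t in range(80)]
errors = track(path)
```

The point mirrors the excerpt: the loop's competence comes from the feedback structure itself, not from any "intelligent" component inside it.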
Article
Full-text available
The paper attempts to place the emergence of cognitive science (CS) as an interdisciplinary research program in historical context. A broad overview of the institutional and intellectual situations during the early postwar period is presented, focusing primarily on psychology and artificial intelligence (AI). From an institutional perspective, the paper shows that although computers and computer science were closely linked with weapons research during World War II, the postwar creation of cognitive science had no military connection, but was largely enabled by small grants from private foundations, though the RAND Corporation was involved to a limited extent. From an epistemic perspective, the paper shows: (1) that neobehaviourist learning theory was not replaced by, but flourished parallel to cognition-oriented psychology in the 1950s, because they were located in different sub-disciplines; (2) that the key theoretical inputs into CS were developed separately at first, and each group remained affiliated with the discipline or complex of disciplines from which it came. A certain tension remained at the core of the project between the machine dreams of the emerging AI community and the idea of autonomous mental processes central to cognitive psychology.
... Only on this basis is it possible to obtain reliable and universally valid knowledge, that is, to replace the impenetrable abundance of subjective impressions with a world of simple and eternal laws. In this way, the processes under investigation can be traced back to laws of motion and explained strictly deterministically according to the causal principle of cause and effect, which has led to a mechanisation of nature (Dijksterhuis, 2002; Dupuy, 2000; Giedion, 1987) ... (Gelo et al., 2019; Rieken & Gelo, 2015). ...
Article
Full-text available
The special position of psychotherapy science as an independent discipline, situated in the field of tension between nomothetic and idiographic understandings of science, is made plausible. To this end, the conventional mechanistic understanding of science and its justification are first sketched: resting on systematic observation and experiment, it is characterised primarily by the causal relation of cause and effect and is shaped by the striving for realism, objectivism, naturalism and universalism. By contrast, the idiographic understanding of science focuses on the individual case and its particularity, while at the same time asking about the possibility of generalisation. It is characterised by relativism, subjectivism/transactionalism and constructivism, but also by perspectivism, which, although occupying a mediating position, plays only a subordinate role in scientific discourse. Alongside the efficient cause, the final cause, wholeness and the analogical type of rationality are brought to bear. The authors regard the nomothetic and idiographic understandings of science as equally legitimate and therefore plead for a dialogical pluralism.
... Soft robotics thus features as a good candidate for reconciling various trends of bio-informed design. ... Second, soft robotics scientists converge with biomimetic chemists because they need the knowledge and skills of the latter. In order to translate their fascinating investigations of living machines into technological terms, they will have to use the same guiding principles for designing soft machines (Jones, 2004; Dupuy, 2000). Indeed, the model of the mind involved in early bionics has changed. ...
Article
Full-text available
Synthetic biology, materials chemistry and soft robotics are fast becoming leading disciplines within the field of practices which look to nature for inspiration and opportunities. In this article I discuss how these molecular-scale practices fit within the existing trends of bio-informed design defined at the macro level, that is, bionics, biomimetics and more specifically biomimicry. Based on the metaphysical views underlying bio-informed design practices, I argue that none of them currently fit the biomimicry model, as they are not consistently concerned with environmental sustainability. While biomimetic chemistry loosely belongs to the field of biomimetics, and soft robotics to the field of bionics, both practices have a profound impact on their respective fields, as they question the places of nature and engineers.
... The choice of random step-functions is conceptually interesting not because we can argue that random search is indeed the case in real organisms, but as a proof that dumb mechanisms can yield adaptive responses which, from the point of view of an external observer, may look quite clever. Ashby was one of the first to challenge the viewpoint that intelligent performance needs to be the result of equally intelligent mechanisms, a viewpoint that started with that battlehorse of the cybernetic era, McCulloch and Pitts' network of binary gates, and its heavy inspiration in the logical revolution of the 1930s, and continues to this day; see (Dupuy, 2000). It need not be like this, and it is quite likely that in general it is not, and this is an important lesson for anyone interested in designing robots. ...
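Ashby's point, that blind random re-parameterisation can produce behaviour an observer would call adaptive, can be sketched in a few lines. This is a toy in the spirit of the homeostat's "random step-functions", with made-up dynamics and thresholds, not Ashby's actual device.

```python
import random

def ultrastable(max_steps=100_000, limit=10.0, settle=500, seed=1):
    """Ashby-style ultrastability: a one-variable system whose feedback
    parameter w is re-drawn blindly at random (a 'random step-function')
    whenever the essential variable x leaves its viable bounds.
    There is no intelligence anywhere, yet the system ends up stable."""
    rng = random.Random(seed)
    w = rng.choice([-1.0, -0.5, 0.5, 1.0])   # feedback sign and strength
    x, resets, calm = 1.0, 0, 0
    for _ in range(max_steps):
        x *= 1.0 + 0.05 * w                  # dumb linear dynamics
        if abs(x) > limit:                   # essential variable escaped
            w = rng.choice([-1.0, -0.5, 0.5, 1.0])  # blind random search
            x = rng.uniform(0.5, 1.0)        # perturbed back into range
            resets += 1
            calm = 0
        else:
            calm += 1
            if calm >= settle:               # stayed viable long enough
                return w, resets, True
    return w, resets, False

w, resets, stable = ultrastable()
```

Only negative-feedback parameters can keep x inside its bounds for `settle` consecutive steps, so whenever `stable` is True the returned w is negative: adaptation achieved by pure trial and error.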
Article
Full-text available
The major technical and conceptual advances in cognitive science have always been connected directly or indirectly to artificial intelligence and, over the last 20 years, specifically to the field of autonomous robotics. It is thanks to the elegance and practical value of work in various regions of robotics that certain fundamental questions in cognitive science can be revisited. This is particularly the case for the question of meaning. What makes an engagement with the world meaningful for a robot? In this paper I argue for an approach that addresses this question based on a life-mind continuity perspective, supported as much by notions of embodiment, complex adaptive systems and enaction as by phenomenologically and existentially informed theorising. I concern myself with the question of what constitutes the identity of a cognitive system and answer this question in terms of autonomy, defined as operational closure. Then I examine how autonomy lays down a possibility for normative regulation of environmental interactions (agency) in terms of sense-making, the hallmark of cognition. I discuss several questions that are opened by this systemic approach, such as whether habits may be considered higher forms of autonomous identities beyond the organismic self. Some direct implications for robotics are explored by revisiting and extending models of homeostatic adaptation to distortions of the visual field.
... Cybernetics as a project of mechanisation of the mind (Dupuy, 2001) could be associated more with the emergence of the cognitive revolution than with theories of complexity. The conception of the mind as a mechanism, a logical machine, was present from the very beginnings of this discipline. ...
Book
New metaphors of mind and brain have recently emerged. The sense of action, intersubjectivity, body and context have been revalidated across multiple cognitive processes and theories. Dynamical theories of cognition form part of this conceptual turn. This book carries out an exhaustive analysis of these theories and their relation to other contemporary approaches. The author presents an ambitious research programme for a neuroscience centred on the construction of an intentional, intersubjective and contextual sense, integrating the contributions of dynamical theories with other approaches.
... 27. On cybernetics and cognitive psychology, see Dupuy (2000), Pickering (2010) and Pinto (2014). On computational metaphor, see Turner (2006: 11-39). ...
Article
The smart city has become a hegemonic notion of urban governance, transforming and supplanting planning. The first part of this article reviews current critiques of this notion. Scholars present three main arguments against the smart city: that it is incompatible with an informal character of the city, that it subjects the city to corporate power and that it reproduces social and urban inequalities. It is argued that these critiques either misunderstand how power functions in the smart city or fail to address it as a specific modality of entrepreneurial urban governance. The second part advances an alternative critique, contending that the smart city should be understood as an urban embodiment of the society of control (Deleuze). The smart city is embedded in the intellectual framework of second order cybernetics and articulates urban subjectivity in terms of data flows. Planning as a political practice is superseded by an environmental-behavioural control, in which subjectivity is articulated supra-indiv...
... Is the downloadable consciousness equal to the nous; is it consciousness at all, and if so, how is consciousness understood? Many of these definitions have been constructed to suit the definers rather than reality, if we can take Dupuy (2000) seriously. His thorough studies of cognitivism, the cognitive sciences and cybernetics show clearly, and in my view very convincingly, the extent to which the mechanisation, naturalisation and materialisation of spirit operate in these cases, which also includes the analytic philosophy of mind in its kinship with these currents. ...
Article
Full-text available
This article reflects on the vicissitudes of spirit and spirituality during the course of time. In the early phases of human history until fairly recently, spirit and spirituality were essential to human history and shamelessly confessed and promoted. Examples of this attitude are apparent in the works of Aristotle, Augustine, Spinoza, Hegel, and the philosophers of the spirit, Bachelard and Gadamer. They generally see spirit as the organizing, fundamental and forceful principle that guides and inspires humans towards many great achievements, the principle responsible for organising all special attributes and qualities of humans such as psuche, phusis and logos - science, art and religion. Gradually this self-evident and taken-for-granted presence and force of human spirituality, responsible for making human history and establishing great and exciting human achievements, has been quietly but sometimes even ruthlessly eroded in diverse ways. Its fate is multi-faceted. It is either simply forgotten or pushed aside, or active efforts of despiritualisation became effective. From another perspective it becomes emasculated and loses its force, or becomes seconded to machines up to the point of complete elimination in the sciences of various kinds, until eventually, in some circles, an aggressive antagonism towards the idea of spirit and explicit rejection of this idea emerged. Merely mentioning spirit becomes a shameful event, a laughing matter, and even a subject of jokes. What goes relatively unnoticed in these processes is the extent to which these dismissive attitudes regarding spirituality bring about catastrophic and barbarous consequences, with devastating implications for human existence, human societies, scientific endeavours, social initiatives and aesthetic expressions. 
World wars, unbridled violence of humans against humans, exploitation of humans by humans, the absence and loss of meaningful human existence, the development of symbolic misery, the emergence of dis-affected individuals, the sociopathologies that reign in societies, the darkening of the world manifested in the flight of the gods, the celebration of mediocrity, the standardisation of humans and the destruction of the earth are some of the devastating and ruinous consequences. According to some, these developments pose a serious threat to the future of the human race - humankind may be on the edge of disappearing. If this scenario is true and real, which clearly seems to be the case, deep and serious attention should be given to these developments in view of the future not only of specific individuals but of humankind as a whole. This creates the imperative that there needs to be a reawakening of spirit and spirituality; it should be reinvented for the contemporary milieu, circumstances and demands which differ extensively from previous times. What is required is the creation of a milieu that will facilitate this re-awakening and re-invention of spirituality: a deep rethinking of the notion of community; extensive efforts to pursue without hesitation the battle for intelligence; a serious embrace of comprehensive and compositional thinking; a rediscovery of the deepest possible understanding of what knowledge really means. The suggestion is that the re-invention and re-awakening of human spirituality, through attentive thoughtful listening to Being, will enable human communities to deal constructively with the huge problems with their catastrophic proportions that threaten humankind, enabled by the concurrent re-awakening and re-invention of human attitudes of friendship, love and care with socio-therapeutic consequences on a wide scale and pertinent for our unique situations and conditions. 
This calls for the deliberate, explicit and communal development on a continuous basis of "the art of being" with Being.
... This means that the inscrutable wealth of subjective impressions was replaced by a world of simpler and eternal laws. In this way, the processes that were studied could be traced back to the laws of motion and explained in a strictly deterministic way through the principle of cause and effect, which led to a mechanization of nature (Dijksterhuis 1961; Dupuy 2000; Giedion 1948; see Sect. 4.2.3). ...
Chapter
Full-text available
The aim of this chapter is to show that determinism, reductionism, and mechanism have dominated people’s lives since the early modern period and, as a consequence, have represented a monopoly in the sciences in general and in psychotherapy science in particular ever since. In addition, it will raise the issue of which other approaches to understanding reality and human beings have unjustly been forgotten - unjustly, because they might contribute to a more comprehensive understanding and examination of human life and with it also of psychotherapy. These include, as we will attempt to demonstrate in the following, intentionality, wholeness, and analogical thinking, which lay the groundwork for emerging alternative research approaches. Finally, the implications of the above for a pluralistic psychotherapy science will be presented.
... The Modern scientific worldview was based on a Newtonian/Cartesian machine or clockwork metaphor in which the world was fundamentally objective, rational, linear, deterministic, and orderly - like a machine (Capra, 1984, 1996; Matson, 1964; Morin, 1981; Peat, 2002; Russell, 1983; Toulmin, 1992). Human beings were seen as machines: we think of La Mettrie's L'Homme Machine, of course, but this kind of machine thinking about human beings continues in computer metaphors of the mind and body (Dupuy, 2000). Machine thinking was borrowed from physics and applied to all human activities. ...
Article
Full-text available
Uncertainty has become an increasingly powerful dimension of human experience in the 21st century. In this paper I explore the social and psychological dimensions of uncertainty, in the context of what have been referred to as postnormal times. Particularly in such times of transition, when an old order is failing and a new one has not yet emerged, the need for certainty feeds into authoritarianism. An appreciation of and education for creativity, complexity, and collaboration can mitigate the anxiety caused by uncertainty. Navigating the complexity of uncertain times also provides us with an opportunity to create new ways of being in the world.
... This volume offers a wide range of original material, with some emphasis on underexplored areas, such as British cybernetics, and the relationship between the mechanical mind and the arts. It is intended to complement more specific histories (such as those of the cybernetic period, including Heims 1991; Dupuy 2000) as well as more general surveys of the field (McCorduck 1979; Dyson 1997; Cordeschi 2002; and Boden's recent heroic two-volume history of cognitive science [2006]). ...
... While some of Chomsky's work played an important part in this - notably his review (Chomsky 1959) of Skinner's Verbal Behavior - and while Chomsky was supportive of early work by such psychologists as George Miller, by the 1970s Chomsky had for all practical purposes lost interest in cognitive psychology, as far as the documentary history goes. While little in the chapter is new, it is a good overview of the subject, though its primary weakness is the way in which it hews to the version of the story of the birth of cognitive science that was laid down in The Mind's New Science (Gardner 1985), a story that begins with a self-conscious coalition of researchers who met at Dartmouth and MIT in 1956; unfortunately, it fails to see the essential ways in which this movement was a continuation of the cybernetics movement (see, notably, Dupuy 2000). Like many of the authors in this book, Harris seems totally amazed by Chomsky's magisterial dismissal of entire subdomains: "Machine translation [MT] is a very low level engineering project," Chomsky told Barsky (Barsky 1997), a remark cited again by Harris (p. ...
... A principal mover of these meetings was Warren McCulloch, to whom I will return later. The fascinating story of this group's wide-ranging and often contentious deliberations is told in books by Heims (1991) and Dupuy (1994), and in sections of Hayles's (1999) book. Transcripts of the proceedings were finally published by Pias (2003). ...
... may appear in retrospect like a mésalliance (Dupuy 2000), yet the parallels to the US NBIC initiative (Ch. 2.2) are remarkable. ...
Article
Full-text available
The debates on nanoscience and technology and their convergence with other fields of research are heavily influenced by technofuturistic visions that have a long-standing tradition in Western thought. In the controversies about technofuturism the debate on converging technologies is framed as a political and cultural conflict. Posthumanist technofuturism, as both a particular set of ideas and a sociocultural movement, plays a significant role in this context. What follows is an analysis of posthumanism's role in the debates, with particular references to its crypto-religious and utopian aspects.
... This simple feedback-cum-computation model of so seemingly supernatural a phenomenon as teleology promised further solutions to hard philosophical riddles of the mind. For early cybernetics, the computation taken to be the substrate of teleology was, unlike in its successors classical AI (GOFAI) and cognitive science, conceived of as strictly mechanical [15]. The important difference is that the computational paradigm for intelligence and semantics represented by GOFAI and cognitive science took computation to consist of rule-guided manipulation of symbolic entities already endowed with meaning or gaining meaning through the syntactical operations themselves. ...
Conference Paper
Biologically inspired approaches to the design of general IT are presently flourishing. Investigating the scientific and historical roots of this tendency will serve to prepare properly for future biomimetic work. This paper explores the genealogy of the contemporary biological influence on science, design and culture in general to determine the merits of the tendency and the lessons to learn. It is argued that biomimetics rests on bona fide scientific and technical reasons for the pursuit of dynamic IT, but also on other, more external factors, and that biomimetics should differentiate the relevant from the superficial. Furthermore, the search for dynamic capacities of IT that mimicking adaptive processes can bring about is put forward as both the history and raison d'être of biomimetics.
... In the 1950s, Alex Bavelas ran a series of experiments on the effects of group structure and communication on task performance -with surprising results. In one of them, which he described to the Cybernetics group at the Interdisciplinary Macy conferences [7, 4] held in New York City between 1946 and 1953, participants were asked to pick a number from the range 0-5, to write it down on a piece of paper and to give it to the experiment moderator. The five participants in the group were supposed to generate guesses so that their total added up to 17. Participants were not allowed to communicate with each other and were not told what the other participants had guessed. ...
... By contrast, in biomimetic strategies "self" refers to a population of interconnected molecules exploring the various possibilities of collective behavior. [Notes: 51 Whitesides (2004); 52 Stengers (1985), p. 82; 53 see Dupuy (2000)] ...
Article
Full-text available
Over the past decades, self-assembly has attracted a lot of research attention and transformed the relations between chemistry, materials science and biology. The paper explores the impact of the current interest in self-assembly techniques on the traditional debate over the nature of life. The first section describes three different research programs of self-assembly in nanotechnology in order to characterize their metaphysical implications: (1) hybridization (using the building blocks of living systems for making devices and machines); (2) biomimetics (making artifacts that mimic nature); (3) integration (a composite of the two previous strategies). The second section, focused on the elusive boundary between self-assembly and self-organization, tries to map out the various positions adopted by the promoters of self-assembly on the issue of vitalism.
... He presented no justification for doing so, identifying it as a premise of the theory in his opening paragraphs. Whatever the source of the idea, it was an idea that fit well with the formalization of syntax in linguistics, and it became one of the intellectual cornerstones of computational studies of language and mind (Wiener, 1961; Cherry, 1978; Dupuy, 2001). Shannon's work, however, has more relevance to linguistics than this. ...
Article
An explanation for the uncertain progress of formalist linguistics is sought in an examination of the concept of syntax. The idea of analyzing language formally was made possible by developments in 20th century logic. It has been pointed out by many that the analogy between natural language and a formal system may be imperfect, but the objection made here is that the very concept of syntax, when applied to any non-abstract system of communication, is flawed as it is commonly used. Syntax is properly defined with respect to an individual transformation rule that might be applied to some message. Collections of syntax rules, however, are inevitably due to categories imposed by an observer, and do not correspond to functional features found in non-abstract systems. As such, these categories should not be relied upon as aids to understanding any natural system.
... A principal mover of these meetings was Warren McCulloch, to whom I will return later. The fascinating story of this group's wide-ranging and often contentious deliberations is told in books by Heims (1991) and Dupuy (1994), and in sections of Hayles's (1999) book. Transcripts of the proceedings were finally published by Pias (2003). ...
Article
Full-text available
Metaphors of Computation and Information tended to detract attention from the intrinsic modes of neural system functions, uncontaminated by the observer’s role in collection, and interpretation of experimental data. Recognizing the self-referential mode of function, and the propensity for self-organization to critical states requires a fundamentally new orientation, based on Complex System Dynamics as non-ergodic, non-stationary processes with inverse-power-law statistical distributions. Accordingly, local cooperative processes, intrinsic to neural structures, and of fractal nature, call for applying Fractional Calculus and models of Random Walks with long-term memory in Theoretical Neuroscience studies.
... In sum, the disciplines that contributed significantly to the emergence of cognitive science - e.g., AI, linguistics, psychology, philosophy, neuroscience, anthropology - to a large extent assumed a human starting point. Cybernetics, or mathematical control theory, whose homeostasis-like feedback principles were critical to the project of designing a mind, was neither biogenic nor unambiguously anthropogenic, but it also was not especially influential in the formulation of cognitivism's central tenets (Dupuy, 2000; Bindra, 1984). The anthropogenic nature of cognitivism can also be seen in its would-be rivals, all of which were much closer to a biogenic starting point. Cognitive robotics and the goal of creating artificial intelligence remain a lively research area, but computational functionalism - the license for equating biological and machine cognition - has taken serious knocks. ...
Article
The Human Stain
Research
Full-text available
The paper argues against a commitment to metaphysical necessity; semantic modalities are enough. The best approaches to elucidating the semantic modalities are (still) versions of linguistic ersatzism and fictionalism, even if only developed in parts. Within these, necessary properties and the difference between natural and semantic laws can be accounted for. The proper background theory for this is an updated version of Logical Empiricism, which
Article
In “Scientific Realism and the Issue of Variability in Behavior,” Arocha (2021) develops an acute critique of “the standard model of current research practice in psychology” (p. 376), sharply dissecting five unwarranted assumptions behind it. To address these issues, the author proposes adopting a nonpositivist philosophical basis for behavioral research: scientific realism. Behind this argumentation, however, it is implied that scientific realism is fit for becoming the metatheoretical framework for psychology because it addresses the shortcomings of the current positivist model. In this commentary, I argue that scientific realism is not fit for becoming that philosophical basis, because it is open to reducing the discipline’s subject matter—the human person—to make it fit with models that have been fruitful in other sciences. Three historical examples are presented to show the risks of adopting models from disciplines devoted to explaining other phenomena to tackle the complexity of psychology’s subject matter.
Thesis
Full-text available
There still exist many fields in which ways are to be explored to improve human-system interaction. These systems must have the capability to take advantage of the environment in order to improve interaction. This extends the capabilities of the system (machine or robot) to come closer to the natural language used by human beings. We propose a methodology to solve the multimodal interaction problem, adapted to several contexts, by defining and modelling a distributed architecture relying on W3C standards and web services (semantic agents and input/output services) working in an ambient intelligence environment. This architecture is embedded in a multi-agent system modelling technique. In order to achieve this goal, we need to model the environment using a knowledge representation and communication language (EKRL, ontology). The obtained semantic environment model is used in two main semantic inference processes: fusion and fission of events at different levels of abstraction. They are considered as two context-aware operations. The fusion operation interprets and understands the environment and detects the happening scenario. The multimodal fission operation interprets the scenario, divides it into elementary tasks, and executes these tasks, which requires the discovery, selection and composition of appropriate services in the environment to accomplish various aims. The adaptation to environmental context is based on a multilevel reinforcement learning technique. The overall architecture of fusion and fission is validated within our framework (agents, services, EKRL concentrator) by developing different performance analyses on use cases such as monitoring and assistance in daily activities at home and in the town.
Chapter
No scientific endeavor is possible without considering philosophical viewpoints, such as about truth and knowledge. This chapter reflects on these topics and will show how the ‘mechanization of the worldview’ developed, leading to the mechanization of enterprises and the elimination of moral considerations, which subsequently eliminates the very possibility of practicing the employee-centric theory of organization. Normative and ethical issues about what is good and right are, however, inevitable in the case of society and enterprises. Ultimately, philosophical viewpoints determine how social and organization science is conducted, hence how society and enterprises are perceived, studied, and arranged. Since enterprises are social entities, the philosophical foundation further provides essential viewpoints about human existence and interaction, whereby communication and language are important aspects. Philosophical viewpoints about communication and language are thus of specific importance for modeling and designing enterprises. The chapter concludes by sketching the implications of the philosophical foundation for enterprise governance and enterprise engineering.
Chapter
Reflections about the ontological foundation probe into the nature of society and subsequently into the nature of enterprises. The chapter shows how certain social theories have dominated and shaped perspectives on society and in turn on enterprises, inevitably leading to enterprise mechanization. Important concepts that capture the essence of society and enterprises are thereby largely ignored with detrimental consequences. Unlike the mechanistic perspective with its associated deterministic essence, the chapter stresses the importance of acknowledging and understanding emergence: the occurrence of unpredictable, novel phenomena in social and enterprise reality. Core sources and forces of those phenomena are outlined, as well as their profound implications for conceptualizing society and enterprises. Subsequent to outlining the conceptual model of society, the conceptual model of enterprises will be introduced, which reflects the essential facets of enterprises, aids in understanding emergence, and enables coherent and consistent enterprise design as a vital condition for adequate enterprise operational and strategic performance. Implications of the ontological foundation for enterprise governance and enterprise engineering will be sketched.
Article
Full-text available
How do we perceive virtual reality? This simple question takes the reader on a journey into the physiology of perception, following cybernetic models that regard spatial perception as an activity that presupposes bodily motion. In this discussion, head-mounted displays (HMDs) are reconstructed not only as visual but also as relational devices, closer to cybernetic augmentation of the human body than to pictorial traditions. The condition of possibility of virtual reality, as embodied by HMDs, is reconstructed as the circular coupling of machine response to human action and perception, in contrast to perception of space through the eyes.
Chapter
To consider the global protests of 1968 from the perspective of the present, especially in view of the particular intersections of media and counterculture that gave them their distinctive flavor, is of necessity to consider contemporary issues. This is true not only due to a recently renewed focus on democratic participation precipitated by developments such as Occupy Wall Street and the Arab Spring on the one hand and a growing awareness of the proliferation of digital surveillance on the other, but, more fundamentally, because processes of informational convergence in that nominally analog moment can tell us much about our current, digital age. Indeed, more recent studies focusing on the development of computing and networks from the 1960s onward highlight contiguities between bureaucratic forces of establishment control and the countercultural movements that once sought to oppose them. Such studies rightfully emphasize the links between the countercultural and technological milieux, usually noting the connections between California hippie scenes and the rising high-tech industry of Silicon Valley. Similarly, the cultural backdrop of Southern California and the products of its attendant entertainment industry also have the power to elucidate aspects of this broader symbiotic relationship. This chapter examines two such instances: the 1967 Hollywood film The President’s Analyst, directed by Theodore J. Flicker, and the career of experimental composer Joseph Byrd from the early 1960s through 1968, including the 1968 self-titled LP by his Los Angeles avant-rock group The United States of America.
Book
Full-text available
This volume offers a new and very different approach to exploring leadership, one based on the new sciences of complexity. What we are calling “Complex Systems Leadership Theory” posits that leadership can be enacted through any interaction in an organization. Far from being the sole province of managers and executives, leadership, we contend, is an emergent phenomenon within complex systems. As such, exploring the meaning and implications of “emergent” is one of the major issues taken up by the chapters in this book. Through advances in computational modeling and non-linear dynamics, the interactions which generate leadership can be “tracked” in a much more rigorous way, enabling managers to better understand and encourage those dynamics of interaction which prove to have beneficial effects on the organization. Overall, we see a Complex Systems Leadership Theory as the core of a new era in leadership studies; introducing and furthering this new era are the primary goals of the present volume.
Chapter
Full-text available
The individualization of lifestyles as it has developed only since the beginning of the twentieth century was accompanied by increased complexity at the mental level. The downside was that this made people more psychically vulnerable and created a need for specific occupations focused on the psyche, above all psychotherapy. Thus, psychotherapy is a modern phenomenon; it would have been a foreign concept in past epochs. This does not mean that mental illness and its treatment were unknown in the archaic and premodern periods, but that they required different approaches, though those, in their structure, may show similarities with today’s methods. This applies equally to the popular healing methods of shamans in archaic cultures and folk healers in the premodern period, but also to certain practices of the Christian religion and of philosophy shaped in advanced cultures. This chapter will first outline the differences between a modern and a premodern or archaic self-concept; second, the prehistoric precursors of modern psychotherapy will be described; finally, the implications of this outlook for psychotherapy research will be addressed.
Chapter
In the course of his inexplicable existence, Man bequeathed to his descendants multiple evidence, worthy of his immortal origin. As he also bequeathed vestiges of the dawn’s remnants, snowballing celestial reptiles, kites, diamonds and glances of hyacinths. Amidst moans, tears, famine, lamentations and cinder of subterranean wells.
Article
Full-text available
This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensing, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere - in his and Seely Brown's (1997) terms, 'invisible', on the horizon, 'calm' - it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing are seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges' introduction of the classical rhetorical term 'prosopopoeia' into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon's notion of a 'margin of indeterminacy' vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: (a) prosopopoietic aspects of John von Neumann's First Draft of a Report on the EDVAC from 1945; (b) Herbert Simon's notion of simulation in The Sciences of the Artificial from the 1970s; (c) Jean-Pierre Dupuy's idea of 'verum et factum convertuntur' from the 1990s. Third, it concludes with a brief re-situating of Weiser's notion of pervasive computing on the basis of this background.
Article
Full-text available
Although many active scientists deplore the publicity about Drexler’s futuristic scenario, I will argue that the controversies it has generated are very useful, at least in one respect. They help clarify the metaphysical assumptions underlying nanotechnologies, which may prove very helpful for understanding their public and cultural impact. Both Drexler and his opponents take inspiration from living systems, which they both describe as machines. However, there is a striking contrast in their respective views of molecular machineries. This paper, based on semipopular publications, is an attempt to characterize the rival models of nanomachines and to disentangle the worldviews underpinning the uses of biological reference on both sides. Finally, in an effort to point out the historical roots of the contrast in the concepts of nanomachines, I raise the question of a divide between two cultures of nanotechnology.
Article
Full-text available
My claim in this article is that the 1950 paper in which Turing describes the world-famous set-up of the Imitation Game is much richer and intriguing than the formalist ersatz coined in the early '70s under the name "Turing Test". Therefore, doing justice to the Imitation Game implies showing first that the formalist interpretation misses some crucial points in Turing's line of thought and secondly that the 1950 paper should not be understood as the Magna Charta of strong Artificial Intelligence but as a work in progress focused on the notion of Form. This has unexpected consequences about the status of Mind, and from a more general point of view, about the way we interpret the notions of Science and Language.
Chapter
Full-text available
This book explains why complex systems research is important in understanding the structure, function and dynamics of complex natural and social phenomena. It illuminates how complex collective behavior emerges from the parts of a system, due to the interaction between the system and its environment. You will learn the basic concepts and methods of complex systems research. It is shown that very different complex phenomena of nature and society can be analyzed and understood through nonlinear dynamics, since many systems in very different fields - such as physics, chemistry, biology, economics, psychology and sociology - have similar architectures. "Complexity Explained" is not highly technical or mathematical, but teaches and uses the basic mathematical notions of dynamical systems theory, making the book useful for science majors and graduate courses; it should also be readable for a more general audience - for those who ask: what are complex systems, really? © 2008 Springer-Verlag Berlin Heidelberg. All rights reserved.
Article
The most fundamental issue raised by any discussion around the ‘moralization of capitalism’ is the puzzle of second-order morality: How exactly is it possible to pass a moral judgement on our categories of moral judgement? How can our norms of morality be said to be immoral, thus calling for (re-)moralization? The answer depends on the observation that norms and interaction structures in capitalism have co-evolved, and hence can be taken neither as autonomous with respect to one another nor as obeying a hidden functionality. This implies that, paradoxically, the moralization problem cannot be solved in moral terms, but calls for a political approach, to make best use of which we need to come to terms with capitalism as a fully fledged cultural system. The ideology inherent in that cultural system can only be attacked from within the system itself, through decentralized processes of democratic decision-making rather than by mere prophetic denunciation or moral invectives. Because the particular version of the capitalist culture in which we live now is a radically contingent result of history, it makes sense to support a framework of democratic experimentalism which embeds multiple institutional experimentation within a system of experience-building and experience-formation analogous to the system of information-utilisation and information-dissemination offered by the Hayekian market. Only by thus creating the real and concrete democratic presuppositions for alternative capitalist practices can we begin to make sense of the puzzle inherent in the “moralization of capitalism” problem.
Chapter
New model potentials exist for coping with the complexities of today’s cities. These are related to the cognitive mediation role that modeling allows one to establish between the abstraction process (internal loop) and the external environment to which a model application belongs (external loop). The focus is turned to the two main aspects involved in that role, i.e. the modeling task and the technological interface. As far as the first is concerned, there are claims that model building in geography involves three main components: a syntactic component (how are the mechanisms underlying the functioning of the system accounted for?), a representational (semantic) component (what kind of urban descriptions are conveyed by the model?) and a purposive investigation project component (what is the modeling activity intended for?). As they increasingly rely on computing technology, models as cognitive mediators are not just simple, autonomous entities, but active complex objects. A model can therefore be understood as an ALC (Action, Learning, Communication) agent, capable of performing a certain course of Action, and permitting a certain Learning ability, which, because of its cognitive mediating role, Communicates with other kinds of agents (other models). This notion is then related to the various aspects of model-building in geography as originally introduced in the early seventies. These aspects are re-interpreted in light of the above characteristics. We conclude the paper with some remarks about the implications which may be derived as far as the harnessing of complexity in urban systems is concerned.
Article
Living systems are characterized by unique properties that make them resistant to the "information-processing perspective" of traditional cognitive science. This paper details those unique properties and offers a new theoretical framework for understanding the behavior of living systems. This framework leans heavily on ideas from general systems theory (specifically Bateson's interactionist perspective), semiotics, and Merleau-Ponty's phenomenology. The benefits of using this framework are illustrated with examples from two different domains: immunology and verbal interaction.
Chapter
Since there are few disruptive nanotechnological products and processes now, it would seem that ethical and societal deliberations concern what nanotechnology may bring in the future. This orientation towards the future is shown to be unnecessary and wholly inadequate to technoscientific research programs. Since these programs posit that there is something deficient or problematic about the present that will benefit from a nanotechnological solution, they posit not a future but an alternative world. Since nanotechnology is primarily a conquest of space, critiques of colonization and globalization may offer the most appropriate resources for the assessment of this alternative world.
Article
Full-text available
Purpose The purpose of this paper is to present a cybernetic way of seeing analog and digital, along with a basic vocabulary for discussing the assumptions underlying the use of both terms. Design/methodology/approach Taking analog and digital not as properties of observed phenomena but as properties of observers, I ask not what is digital or analog, but what I do when I use these terms. I analyze introspectively, and report on, what I think my assumptions are when using the two terms. Findings I develop a basic vocabulary to describe engagements that I describe as analog or digital. This vocabulary is applicable beyond technical contexts and suitable also for discussing social and creative processes. It includes a kind of observer whom I call the matchmaker. Research limitations/implications The presented research is preliminary and subjective. Originality/value While previous discussions consider analog and digital as properties of observed phenomena, they are considered here as properties of observers. The presented discussion is sufficiently abstract to account for the analog and the digital at various scales, including electronic signal processing and human interaction. The author argues that discussions of engagements described as analog or digital must account for observers of these engagements, including those who act as their matchmakers.