Chapter

An Introduction To Cybernetics

... As a subject of empirical inquiry, Convergence Research has been largely understudied. Because Convergence Research is a complex endeavor, to pursue inquiry that adequately enriches the policies and programs that promote Convergence Research, we must engage with complexity and address it with a commensurately complex response (Ashby 1956). In this paper, we have presented a system-of-systems approach to illuminate more of what Convergence Research is and how it operates. ...
Article
Full-text available
Over the past decade, Convergence Research has increasingly gained prominence as a research, development, and innovation (RDI) strategy to address grand societal challenges. However, a dearth of research-based evidence is available to aid researchers, research teams, and institutions with navigating the complexities attendant to the specifics of Convergence Research. This paper presents a multilevel research agenda that accounts for an integral understanding of Convergence Research as a complex adaptive system. Furthermore, by developing a framework that accounts for ancillary, yet essential, systems associated with Convergence Research, we enrich the agenda with a literature-steeped discussion that considers how systems-based practices of collaboration, inquiry, and context interact with the processes and products of Convergence Research. Finally, we synthesize and apply insights from the reviewed literature by providing paths for empirical exploration emphasizing systems-based practices. In so doing, we delineate an extended boundary for a research stream that both clarifies and enlarges our understanding of Convergence Research as a ‘system-of-systems’.
... 1. To attenuate the complexity/variety of the market, for example, by modelling, structuring or segmenting it, inventing new ways of clustering market objects, targeting its own activity to parts of that market only (e.g. specific segments of the market) or reframing the system-in-focus. (Footnote: originally the wording of the law was 'Only variety can destroy variety' (Ashby, 1956).) ...
Article
Management in systemic terms means to cope with complexity. Ross Ashby's law of requisite variety shows the way—maintaining the varieties of interacting systems in balance. To denote that process, we use the cybernetic concept of ‘variety engineering’, which we also formalize. It refers to processes of mutual complexity amplification and attenuation by interacting agents. The purpose of this contribution is to elicit ways of coping with complexity by means of variety engineering. The abstract concepts are illustrated by examples from ecological, social and economic contexts.
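As a point of reference for the balance described above, Ashby's law of requisite variety is often stated in logarithmic (entropy) form; the rendering below is a standard textbook formulation rather than a formula taken from this article.

\[
H(E) \;\geq\; H(D) - H(R)
\]

In words: the variety (entropy) of essential outcomes E cannot be pushed below the variety of disturbances D minus the variety of the regulator R, so only additional variety in the regulator can absorb variety in the disturbances. 'Variety engineering' then amounts to amplifying or attenuating variety on one side of that inequality or the other.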
Article
Although the Last Planner System (LPS) has been successfully used in complex construction projects, previous studies have not investigated how it supports resilient performance (RP), which is crucial for the construction industry. To address this gap, a case study of using the LPS in refurbishment building projects was conducted. The implementation of LPS was analysed in light of seven principles for the design of resilient systems. Sources of data for this analysis involved documents, semi-structured interviews, participant observation, and secondary data. The results pointed to 25 production planning and control practices that contributed to RP, including well-established LPS practices, formalised in the planning standards of the company (32% of the total); formal practices not usually considered as elements of LPS (20%); and informal practices not anticipated by company standards (48%). These findings indicate that although LPS contributes to RP, it must be complemented by other practices, either formal or informal. A set of well-established practices (e.g. hierarchical planning, identification and removal of constraints, collaborative meetings, and use of lagging and leading indicators) is the one most logically connected to the principles of design for RP. This study also offers insights into some LPS limitations (e.g. low control frequency and overemphasis on production in relation to other functional dimensions), which indicate opportunities for the development of new production planning and control approaches supportive of RP.
Chapter
This chapter analyzes the current historical situation and the prospects for its further development. It is evident that drastic changes are currently taking place in the world, reflecting a phase transition in historical development, namely, the transition from the late industrial phase to the cybernetic phase, features of which are largely determined by the ongoing Cybernetic Revolution. One could also talk about a transition from an economically organized World System to a politically organized one. Sadovnichy et al. consider the most important features of the ongoing changes and their impact on various spheres of society’s life, taking into account various scenarios of the future global evolution. In particular, Sadovnichy et al. discuss the possible transition from a liberal market capitalist society to a model of society with an equal distribution (but still preserving private initiative and the spirit of enterprise). This global transition is facilitated by the limitation of extensive growth opportunities due to objective reasons. Modern transition is also connected with the achievement of a high level of technological development, which objectively demands the transformation of social, political, and international relations to correspond to the level of technology. There are some contradictions with respect to technological development. On the one hand, the benefits of new technology implementation are significant for solving many modern problems in providing a higher quality of life, since for the first time in history, it is becoming possible to satisfy the material needs of the majority of the world’s population. On the other hand, the development of technological innovations leads to the formation of socio-technical self-regulating systems based on artificial intelligence, the introduction of which also carries with it serious risks, and their scale remains unknown. At the moment, the rate of demographic growth is declining, alleviating the problem of overpopulation of the Earth. However, the high-income countries and some of the low-income ones will experience depopulation, while on the contrary, in most Sub-Saharan African countries, fast population growth will continue for a few decades. Therefore, the optimization of demographic processes is an extremely important task. It is very important that the development of medical technologies leads to an increase in life expectancy, as well as to a change in the demographic structure of society, i.e., to global aging, which is leading to changing social and labor force structure. Therefore, ensuring the quality of life of the aging population will present a critical problem. Aging at the same time gradually reduces the acuteness of material problems and their role in motivating people’s activities, while at the same time increasing the role of ideological factors. In this regard, it makes sense to talk about the formation of a new type of society (cybernetic W-society), since the ongoing phase transition will affect all the foundations of life and the organization of society. The transition to a new model of society is inevitably interconnected with global political transformations, involving the formation of a new world order in which new principles of interaction between states will appear with a gradual rejection of competitive confrontation in favor of mutually beneficial cooperation.
Chapter
This chapter examines technologies’ current and future development in the framework of the Cybernetic Revolution—the third of the largest production (or technological) revolutions after the Agrarian and Industrial ones. The Cybernetic Revolution is a fundamental transition from industrial production to the production of services and goods based on the widespread implementation of self-regulating systems, that is, systems that can function in the absence or with minimum involvement of people and independently make complex decisions. This transition has already started and will continue up to the 2070s. The Cybernetic Revolution began its active development in the 1950s and has now finished its modernization phase. At the moment, the key technologies are information and communication technology and artificial intelligence, whose role in society is gradually increasing, and they come with benefits and potential risks. However, Grinin & Grinin assume that from the 2030s, the new—final—phase of the Cybernetic Revolution will start. Its major technological breakthroughs will lead to self-regulating systems’ formation and widespread implementation. So, Grinin & Grinin assume that new technologies will emerge. They forecast that it will be a set of technological spheres, and the MANBRIC complex/convergence is taking shape and will actively develop in the final phase of the Cybernetic Revolution (in the 2030s–2070s). The MANBRIC is an abbreviation formed from the initial letters of the seven breakthrough areas: Medicine-Additive-Nano-Bio-Robotics-Info-Cognitive technologies. These technological fields closely interact and corroborate each other and will continue to do so increasingly in the future. Due to its specific characteristics, medicine will be an integral part of the MANBRIC complex. Grinin & Grinin also offer some scenarios for further technological development. They significantly depend on the areas where technological breakthroughs will start. The main developmental scenario is presented as a breakthrough that will occur in the 2030s in the field of medicine, especially at the nexus of its new directions and some areas of the MANBRIC. There will be the introduction of innovations based on self-regulating systems in various fields of social activity (economy, medicine, biology, and socio-administrative structures). Grinin & Grinin describe the most favorable scenario and recommend how to move toward this scenario.
Article
Full-text available
The recent and important advances in bottom-up synthetic biology (SB), in particular in the field of so-called “synthetic cells” (SCs) (or “artificial cells”, or “protocells”), lead us to consider the role of wetware technologies in the “Sciences of the Artificial”, where they constitute the third pillar, alongside the better-known pillars of hardware (robotics) and software (Artificial Intelligence, AI). This article highlights how wetware approaches can help to model life and cognition from a unique perspective, complementary to robotics and AI. It is suggested that, through SB, it is possible to explore novel forms of bio-inspired technologies and systems, in particular chemical AI. Furthermore, attention is paid to the concept of semantic information and its quantification, following the strategy recently introduced by Kolchinsky and Wolpert. Semantic information, in turn, is linked to the processes of generation of “meaning”, interpreted here through the lens of autonomy and cognition in artificial systems, with an emphasis on its role in chemical ones.
Chapter
A phenotype is the set of physical characteristics of a living organism that are the product of both genetic and environmental influences. Phenotypes are typically the visible or measurable traits of an organism that can be observed and studied. A phenome is the set of observable characteristics of an individual or species that result from the interaction between their genotype and the environment. It is the collective expression of all the genes in an organism, including their interactions with the environment. Topics included in this chapter are activity-based protein profiling, phenotype microarrays seeded with microorganisms, ethomics (the quantitative study of behaviour), actimetry and, finally, modeling life via the virtual living organism (VLO).
Chapter
Just as we are often interested in events that are composed of many elementary (simple) events, in biology the objects under scrutiny are vastly complex objects composed of many individual molecules (the molecule is probably the most appropriate level of coarse graining for the systems we are dealing with). Since these components are connected together, they constitute a system. The essence of a system is that it cannot be usefully decomposed into its constituent parts; it is an integrated whole made up of interconnected parts.
Chapter
This chapter deals with the basics of information transmission, which include some or all of the following: the information source, encoding the information, transmitting it through a channel, the addition of noise, and, finally, receiving and decoding the information. Biological coding is introduced through the examples of DNA, RNA, and proteins. Further topics dealt with include the compression of information (and its use to measure the distance between different objects), ergodicity, and error correction. Noise is given particular attention, and the concept of equivocation to quantify the amount of information lost in transmission due to noise is introduced.
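To make the notion of equivocation concrete, here is a minimal sketch (not taken from the chapter) that computes, for a binary symmetric channel with crossover probability p and an equiprobable binary source, how much information the noise destroys per transmitted bit; the channel model and probabilities are illustrative assumptions.

import math

def binary_entropy(p):
    """Entropy (in bits) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def equivocation(p):
    """H(X|Y) for an equiprobable binary source over a binary symmetric
    channel with crossover probability p: the information lost to noise."""
    return binary_entropy(p)

def information_delivered(p):
    """I(X;Y) = H(X) - H(X|Y): bits actually conveyed per transmitted bit."""
    return 1.0 - binary_entropy(p)

print(equivocation(0.1))            # ~0.469 bits lost per symbol
print(information_delivered(0.1))   # ~0.531 bits received per symbol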
Chapter
“Machine” is used formally to describe the embodiment of a transformation (e.g., Eq. (3.1); cf. the automata in Sect. 12.1.1). In this formal sense, it does not have any particular connotation of animate or inanimate. The essential feature is that the internal state of the machine, together with the state of its surroundings, uniquely defines the next state to which it will go.
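A minimal sketch of this formal sense of "machine", in Python: the next state is uniquely determined by the pair (current internal state, state of the surroundings). The states and inputs below are invented for illustration and are not drawn from the chapter.

# Determinate machine: a transformation from (state, input) to next state.
transitions = {
    ("idle", "start"): "running",
    ("running", "pause"): "idle",
    ("running", "stop"): "done",
}

def step(state, surroundings):
    """Return the uniquely defined next state; undefined pairs leave the
    machine where it is (a modelling choice for this toy example)."""
    return transitions.get((state, surroundings), state)

state = "idle"
for event in ["start", "pause", "start", "stop"]:
    state = step(state, event)
print(state)  # -> "done"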
Chapter
The purpose of this and the following chapter is to give an overview of living systems, especially directed at the bioinformatician who has previously dealt purely with the computational aspects of the subject.
Chapter
What is information? We have already asserted that it is a profound, primitive (i.e., irreducible) concept. Dictionary definitions include “(desired) items of knowledge”; for example, one wishes to know the length of a piece of wood. It appears to be less than a foot long, so we measure it with our desktop ruler marked off in inches, with the result, let us say, “between six and seven inches”. This result is clearly an item of desired knowledge, hence information. We shall return to this example later.
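One way to put a number on that item of knowledge, sketched here under the assumption (made for illustration, not stated in the chapter) that each of the twelve inch-wide intervals below one foot was equally likely before measuring:

\[
I \;=\; \log_2 12 \;\approx\; 3.58 \ \text{bits},
\]

since the ruler reading narrows twelve equally likely alternatives down to one, namely "between six and seven inches".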
Article
Target-based drug discovery is the dominant paradigm of drug discovery; however, a comprehensive evaluation of its real-world efficiency is lacking. Here, a manual systematic review of about 32,000 articles and patents spanning the past 150 years demonstrates its apparent inefficiency. Analyzing the origins of all approved drugs reveals that, despite several decades of dominance, only 9.4% of small-molecule drugs have been discovered through "target-based" assays. Moreover, the therapeutic effects of even this minimal share cannot be solely attributed and reduced to their purported targets, as they depend on numerous off-target mechanisms unknowingly incorporated through phenotypic observations. The data suggest that reductionist target-based drug discovery may be a cause of the productivity crisis in drug discovery. An evidence-based approach to enhancing efficiency seems to be to prioritize, when selecting and optimizing molecules, higher-level phenotypic observations that are closer to the sought-after therapeutic effects, using tools such as artificial intelligence and machine learning.
Article
Research Summary: The multinational corporation (MNC) is a typical example of a complex organization. In this essay, we employ an established body of literature on complexity in organizations to explore and discuss the nature and consequences of complexity for global strategy and MNCs. On that basis, we develop a simple organizing framework for complexity in global strategies emphasizing the source (external and internal complexity) and type (process and structural complexity) of complexity. We use this framework to structure and discuss the six research contributions in this Special Issue. We conclude by suggesting additional avenues of research on the interface between global strategy and complexity.
Managerial Summary: Firms internationalize because they recognize business opportunities abroad and devise strategies to successfully exploit them. At the same time, managers face increasing complexity as MNCs expand internationally and engage in more unknown and dispersed operations. Not only do MNCs face considerable complexity by operating in diverse and uncertain environments, but also by managing and coordinating organizational tasks and activities spanning multiple countries. This essay discusses these challenges and corresponding strategies for MNC managers. It also provides an overview of the six research articles included in this Special Issue about complexity and MNCs.
Article
This paper explores the sustainability field from a Complex System Governance (CSG) perspective. In general, sustainability suggests maintenance at a specific rate or level. It is also frequently held as maintaining ecological balance to negate the depletion of natural resources. CSG offers sustainability a theoretically grounded, model based, and methodologically sound approach to better inform sustainability design, execution, and development for complex systems. CSG examines sustainability as an outcome‐based product resulting from effective governance of an underlying system which produces sustainability. Thus, sustainability is proposed as a ‘systems engineered product’, whose design, execution, and development will be favored by CSG systems engineering. Following an introduction, two primary objectives are pursued. First, Systems Theory is used to provide an alternative view of sustainability. Second, a perspective of sustainability is developed through the paradigm of the emerging CSG field. The paper closes with the contributions, opportunities, and challenges for deployment of CSG for enhanced development, transition, and maintenance of sustainable systems.
Article
Full-text available
What is it about our current digital technologies that seemingly makes it difficult for users to attend to what matters to them? According to the dominant narrative in the literature on the “attention economy,” a user’s lack of attention is due to the large amounts of information available in their everyday environments. I will argue that information-abundance fails to account for some of the central manifestations of distraction, such as sudden urges to check a particular information-source in the absence of perceptual information. I will use active inference, and in particular models of action selection based on the minimization of expected free energy, to develop an alternative answer to the question of what makes it difficult to attend. Besides obvious adversarial forms of inference, in which algorithms build up models of users in order to keep them scrolling, I will show that active inference provides the tools to identify a number of problematic structural features of current digital technologies: they contain limitless sources of novelty, they can be navigated by very simple and effortless motor movements, and they offer their action possibilities anywhere and anytime, independent of place or context. Moreover, recent models of motivated control show an intricate interplay between motivation and control that can explain sudden transitions in motivational state and the consequent alteration of the salience of actions. I conclude, therefore, that the challenges users encounter when engaging with digital technologies are less about information overload or inviting content than about the continuous and effortless availability of possibilities for action.
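For readers unfamiliar with the quantity the argument turns on, expected free energy for a policy \(\pi\) is commonly written as a sum of risk and ambiguity terms; the decomposition below is the standard one from the active-inference literature, not a formula reproduced from this article.

\[
G(\pi) \;=\; \underbrace{D_{\mathrm{KL}}\!\left[\,Q(o \mid \pi)\,\middle\|\,P(o)\,\right]}_{\text{risk}} \;+\; \underbrace{\mathbb{E}_{Q(s \mid \pi)}\!\left[\,\mathrm{H}\!\left[P(o \mid s)\right]\,\right]}_{\text{ambiguity}}
\]

Action selection favours policies with low \(G(\pi)\), which is how limitless novelty and effortless, ever-available actions can come to dominate what gets selected.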
Chapter
The purpose of this paper is to investigate the use of machine learning approaches to build a dictionary of terms for analyzing text for ESG content using a bag-of-words approach, where ESG stands for “environment, social and governance.” Specifically, the paper reviews experiments performed to develop a dictionary for information about the environment, focused on “carbon footprint”. We investigate using Word2Vec trained on Form 10K text and on Earnings Calls, as well as queries of ChatGPT, and compare the results. As part of the development of our dictionaries, we find that bigrams and trigrams are more likely to be found when using ChatGPT, suggesting that bigrams and trigrams provide a “better” approach for the dictionaries developed with Word2Vec. We also find that terms provided by ChatGPT were not as likely to appear in Form 10Ks or other business disclosures as those generated using Word2Vec. In addition, we explored different question approaches to ChatGPT to find different perspectives on carbon footprint, such as “reducing carbon footprint” or “negative effects of carbon footprint.” We then discuss combining the findings from each of these approaches to build a dictionary that could be used alone or with other ESG concept dictionaries.
Keywords: ESG; Carbon Footprint; Environment; Word2Vec; ChatGPT; Dictionary; Bag of Words; Ontology; Concept; Form 10K; Reducing Carbon Footprint; Hybrid Approach
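A hedged sketch of the kind of dictionary-expansion step described above; it is not the authors' code, and the corpus loader, seed terms, and similarity cutoff are placeholders. It assumes gensim 4.x and a corpus of tokenized filing sentences in which multiword expressions such as "carbon_footprint" have already been joined into single tokens.

from gensim.models import Word2Vec

# Hypothetical helper: yields lists of tokens, e.g. from Form 10K text.
sentences = load_tokenized_filing_sentences()

model = Word2Vec(sentences, vector_size=100, window=5, min_count=10, workers=4)

seed_terms = ["carbon_footprint", "emissions", "greenhouse_gas"]  # illustrative seeds
dictionary = set(seed_terms)
for seed in seed_terms:
    if seed in model.wv:
        for term, similarity in model.wv.most_similar(seed, topn=25):
            if similarity > 0.5:  # illustrative cutoff
                dictionary.add(term)

print(sorted(dictionary))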
Chapter
Heron of Alexandria (first century AD) was the first chronicler of a peculiar mechanism capable of holding the flame of oil lamps steady. Later, similar mechanisms were found in water clocks. In the eighteenth century, they reappeared in the regulator of Watt's steam engine, which drove industrial production. Since 1910, engineers have called them servomechanisms. Norbert Wiener (1948, 1950), a mathematician at MIT, realized that the theory underlying them had far wider applications, linked them to communication, and called the study of these phenomena “cybernetics … the science of control and communication in the animal and the machine” (→ Communication: History of the Idea). He derived “cybernetics” from the Greek kybernetes, or “steersman.” Before him, Ampère had used the word to designate a science of government without, however, developing the idea.
Chapter
The word “system” is widely used. We speak of planetary systems, transportation systems, nervous systems, number systems, filing systems, political systems, systems of checks and balances, systems of grammatical rules, systems of weights and measures, and so on as if they shared the same reality. Their common denominator is a multitude of component parts, depending on each other, working together in complex ways, and functioning as wholes. Beyond these commonalities, such systems exhibit at best Wittgensteinian family resemblances.
Article
Full-text available
High variety is a characteristic attribute of any material phenomena and processes involving living matter, i.e., very complex systems (VCSs). We have verified the presence of fundamental constraints on the size/shape diversity and self-organization by the example of mammalian skeleton in four orders (41 species). The properties of more than 4700 multidimensional descriptive models of VCSs were studied. A self-organization parameter R (0 ≤ R ≤ 1) was calculated for each model, and its range of variability was mainly limited to the interval from ~0.10 to ~0.31. The concepts of an abstract Ashby regulator and the Shannon–Hartley theorem were used to explain the variation in the empirical data. It has been concluded that there are significant constraints on the quality of morphological diversity regulation and the possible level of self-organization of VCSs for steady states.
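For context, the Shannon–Hartley theorem the authors invoke bounds the capacity of a band-limited noisy channel; it is reproduced below in its standard form (the paper's own derivation of the self-organization parameter R is not shown here).

\[
C \;=\; B \log_2\!\left(1 + \frac{S}{N}\right)
\]

where C is the channel capacity in bits per second, B the bandwidth in hertz, and S/N the signal-to-noise ratio.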
Article
Full-text available
Synopsis: Cells are the fundamental unit of biological organization. Although it may be easy to think of them as little more than the simple building blocks of complex organisms such as animals, single cells are capable of behaviors of remarkable apparent sophistication. This is abundantly clear when considering the diversity of form and function among the microbial eukaryotes, the protists. How might we navigate this diversity in the search for general principles of cellular behavior? Here, we review cases in which the intensive study of protists from the perspective of cellular biophysics has driven insight into broad biological questions of morphogenesis, navigation and motility, and decision making. We argue that applying such approaches to questions of evolutionary cell biology presents rich, emerging opportunities. Integrating and expanding biophysical studies across protist diversity, exploiting the unique characteristics of each organism, will enrich our understanding of general underlying principles.
Article
Full-text available
The potential role of bottom-up Synthetic Cells (SCs) in the Internet of Bio-Nano Things (IoBNT) is discussed. In particular, this perspective paper focuses on the growing interest in networks of biological and/or artificial objects at the micro- and nanoscale (cells and subcellular parts, microelectrodes, microvessels, etc.), whereby communication takes place in an unconventional manner, i.e., via chemical signaling. The resulting “molecular communication” (MC) scenario paves the way to the development of innovative technologies that have the potential to impact biotechnology, nanomedicine, and related fields. The scenario that relies on the interconnection of natural and artificial entities is briefly introduced, highlighting how Synthetic Biology (SB) plays a central role. SB allows the construction of various types of SCs that can be designed, tailored, and programmed according to specific predefined requirements. In particular, “bottom-up” SCs are briefly described by commenting on the principles of their design and fabrication and their features (in particular, the capacity to exchange chemicals with other SCs or with natural biological cells). Although bottom-up SCs still have low complexity and thus basic functionalities, here, we introduce their potential role in the IoBNT. This perspective paper aims to stimulate interest in and discussion on the presented topics. The article also includes commentaries on MC, semantic information, minimal cognition, wetware neuromorphic engineering, and chemical social robotics, with the specific potential they can bring to the IoBNT.
Chapter
Through the ages, human societies have developed systems for responding to their education-related needs and for improving participation in and management of all learning activities in their education systems. The African traditional knowledge system is one such system, and it can be used to effectively manage inclusive education institutions in African communities. African Traditional Systems are knowledge and practices that are used to address the needs of African communities and rely exclusively on practical experiences and observations handed down from generation to generation, mostly orally but also, to a limited extent, in writing. African Traditional Knowledge Systems were developed in Africa, are used by the inhabitants of African communities, and incorporate their concepts and needs. There is currently an increased recognition of African traditional knowledge systems in the management of inclusive education institutions in Africa. Africa needs to utilise its traditional knowledge systems to respond to the educational needs of its communities in ways that are grounded in local culture. African traditional knowledge systems must be used to design inclusive education management epistemologies that are African-centred and that seek to address African-centred educational problems. This chapter explores how African traditional knowledge systems can best be used to manage inclusive education in Africa.
Keywords: African traditional knowledge systems; Inclusive education; Indigenous systems
Article
Full-text available
Purpose: to present and justify a methodical approach to assessing the level of adaptability of organizational management structures to the conditions of a dynamically changing external environment. Methods: a wide range of general scientific methods is used – system analysis, synthesis, graphical interpretation of data. During the study, the method of expert assessments was used to assess the level of adaptability. In order to classify the types of management structures depending on their adaptability, a cluster analysis was carried out. Results: the article presents an approach to the definition of the concepts of "adaptation" and "adaptability" in relation to management structures. In order to develop a methodical approach to assessing the level of adaptability of organizational structures, the types of their adaptation were systematized, the main characteristics of the structures were identified and the scale was developed to assess the level of their adaptability. The use of the expert method made it possible to assess the level of adaptability of the main types of organizational structures to the conditions of a changing business space and rank them depending on this level. As a result of the cluster analysis, all the studied types of management structures were classified depending on their adaptability. Conclusions and Relevance: the developed methodical approach to assessing the level of adaptability of management structures made it possible to determine the adaptive properties of both hierarchical and organic management structures, assess their level of adaptability and identify the most adaptive among them. It has been established that the basis for the successful development of companies in the conditions of turbulent business space is the use of organic structures or changing individual parameters of hierarchical structures in order to increase their adaptability. Adaptation can be carried out using any structure, both traditional hierarchical and organic, by forming new management structures or increasing the adaptability of existing ones. Further research in this area should be devoted to the development of an effective mechanism for adapting management structures to the conditions of a changing business space.
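An illustrative sketch of the clustering step described above, using made-up adaptability scores and structure names; it is not the authors' analysis and assumes scikit-learn is available.

import numpy as np
from sklearn.cluster import KMeans

# One expert-assessed adaptability score per structure type (invented values).
structures = ["linear", "functional", "divisional", "matrix", "project", "network"]
scores = np.array([[1.5], [2.0], [2.8], [3.9], [4.2], [4.6]])

# Partition structure types into low- and high-adaptability clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
for name, label in zip(structures, labels):
    print(f"{name}: cluster {label}")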
Artificial life is a research field studying what processes and properties define life, based on a multidisciplinary approach spanning the physical, natural, and computational sciences. Artificial life aims to foster a comprehensive study of life beyond “life as we know it” and toward “life as it could be,” with theoretical, synthetic, and empirical models of the fundamental properties of living systems. While still a relatively young field, artificial life has flourished as an environment for researchers with different backgrounds, welcoming ideas and contributions from a wide range of subjects. Hybrid Life brings our attention to some of the most recent developments within the artificial life community, rooted in more traditional artificial life studies but looking at new challenges emerging from interactions with other fields. Hybrid Life aims to cover studies that can lead to an understanding, from first principles, of what systems are and how biological and artificial systems can interact and integrate to form new kinds of hybrid (living) systems, individuals, and societies. To do so, it focuses on three complementary perspectives: theories of systems and agents, hybrid augmentation, and hybrid interaction. Theories of systems and agents are used to define systems, how they differ (e.g., biological or artificial, autonomous or nonautonomous), and how multiple systems relate in order to form new hybrid systems. Hybrid augmentation focuses on implementations of systems so tightly connected that they act as a single, integrated one. Hybrid interaction is centered around interactions within a heterogeneous group of distinct living and nonliving systems. After discussing some of the major sources of inspiration for these themes, we will focus on an overview of the works that appeared in Hybrid Life special sessions, hosted by the annual Artificial Life Conference between 2018 and 2022.
Article
Full-text available
The first part of this essay relates a minimal and primordial concept of agency to be found in science and technology studies to an overall ontology of liveliness. The second part explores the relation between minimal and higher-level conceptions of agency concerning goal-orientedness and adaptation, and moves towards specifically biological concerns via a discussion of cybernetic machines.
Thesis
Full-text available
State-of-the-art research in Cybertheology, together with some proposed conceptual definitions of Cybertheology based on the three levels of Cybernetics.
Article
Full-text available
Our urban systems and their underlying sub-systems are designed to deliver only a narrow set of human-centered services, with little or no accounting or understanding of how actions undercut the resilience of social-ecological-technological systems (SETS). Embracing a SETS resilience perspective creates opportunities for novel approaches to adaptation and transformation in complex environments. We: i) frame urban systems through a perspective shift from control to entanglement, ii) position SETS thinking as novel sensemaking to create repertoires of responses commensurate with environmental complexity (i.e., requisite complexity), and iii) describe modes of SETS sensemaking for urban system structures and functions as basic tenets to build requisite complexity. SETS sensemaking is an undertaking to reflexively bring sustained adaptation, anticipatory futures, loose-fit design, and co-governance into organizational decision-making and to help reimagine institutional structures and processes as entangled SETS.
Article
This qualitative paper explores an alternative lens, informed by a complexity perspective, through which to frame the adaptive role of local implementers in multi-level governance systems. It argues that three key tenets of complexity thinking – emergence, self-organisation and co-evolution – can help better explain that role. The re-conceptualisations of local actors – from agents to stewards – and of central government – from controller to enabler – are identified as the conditions that allow intelligent actors to leverage local varieties to deliver context-specific solutions. The paper ends with actionable measures that can enhance the self-steering capacity of policy systems.
Article
Full-text available
Explaining the foundation of cognitive abilities in the processing of information by neural systems has been a concern of biophysics since McCulloch and Pitts' pioneering work within the biophysics school of Chicago in the 1940s and the interdisciplinary cyberneticist meetings of the 1950s, inseparable from the birth of computing and artificial intelligence. Since then, neural network models have traveled a long path, both in the biophysical and the computational disciplines. The biological, neurocomputational aspect reached its representational maturity with the Distributed Associative Memory models developed in the early 1970s. In this framework, the inclusion of signal-signal multiplication within neural network models was presented as a necessity to provide matrix associative memories with adaptive, context-sensitive associations, while greatly enhancing their computational capabilities. In this review, we show that several of the most successful neural network models use a form of multiplication of signals. We present several classical models that included such multiplication and the computational reasons for its inclusion. We then turn to the different proposals about the possible biophysical implementation that underlies these computational capacities. We pinpoint the important ideas put forth by different theoretical models using a tensor product representation and show that these models endow memories with the context-dependent adaptive capabilities necessary to allow for evolutionary adaptation to changing and unpredictable environments. Finally, we show how the powerful abilities of contemporary computational deep-learning models, inspired by neural networks, also depend on multiplications, and we discuss some perspectives in view of the wide panorama that has unfolded. The computational relevance of multiplication calls for the development of new avenues of research to uncover the mechanisms our nervous system uses to achieve multiplication.
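A minimal sketch (not the authors' code) of the kind of signal-signal multiplication the review discusses: a context-dependent matrix memory in which the stored key is the Kronecker (tensor) product of a cue and a context vector, so the same cue retrieves different outputs in different contexts. Vector sizes and contents are arbitrary placeholders; the contexts are chosen orthogonal so retrieval is exact in this toy case.

import numpy as np

rng = np.random.default_rng(0)

def unit(n):
    """Random unit vector of dimension n (placeholder pattern)."""
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

cue = unit(8)
out_a, out_b = unit(5), unit(5)
ctx_a, ctx_b = np.eye(6)[0], np.eye(6)[1]  # orthogonal contexts

# Memory matrix: sum of outer products output * (cue (x) context)^T.
M = np.outer(out_a, np.kron(cue, ctx_a)) + np.outer(out_b, np.kron(cue, ctx_b))

# Retrieval: multiplying cue and context selects the matching association.
recall_a = M @ np.kron(cue, ctx_a)
recall_b = M @ np.kron(cue, ctx_b)
print(np.allclose(recall_a, out_a), np.allclose(recall_b, out_b))  # True True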