Natural Computing Journal Impact Factor & Information

Publisher: Springer-Verlag

Journal description

Natural Computing is a general term referring both to computing that goes on in nature and to computing inspired by nature. When complex phenomena in nature are viewed as computational processes, our understanding both of these phenomena and of the essence of computation is enhanced, yielding valuable insights for the natural sciences and for computer science alike. Characteristic of human-designed computing inspired by nature is the metaphorical use of concepts, principles and mechanisms underlying natural systems; this type of computing includes evolutionary algorithms, neural networks, molecular computing and quantum computing.

The aim of the journal is (1) to provide a publication forum for, and to foster links and mutual understanding between, researchers from various areas of natural computing, and (2) to give researchers in the natural sciences and computer science insight into research trends in natural computing. Research in natural computing is concerned with theory, experiments, and applications, and the journal reports on each of these research lines. Moreover, the journal covers natural computing in a very broad scope: the subarea of evolutionary algorithms, for example, also covers the very active research on the boundary of biology and computer science that views evolution as a computational process, and the subarea of neural networks also covers computational trends in brain research. The journal solicits papers on all aspects of natural computing. Because of its interdisciplinary character, a special effort is made to solicit survey, review, and tutorial papers that make research trends in a given subarea more accessible to the journal's broad audience.

Current impact factor: 0.54

Impact Factor Rankings

2015 Impact Factor Available summer 2015
2013 / 2014 Impact Factor 0.539
2012 Impact Factor 0.683

Additional details

5-year impact 0.00
Cited half-life 5.50
Immediacy index 0.14
Eigenfactor 0.00
Article influence 0.00
Website Natural Computing website
Other titles Natural computing (Online)
ISSN 1567-7818
OCLC 50721116
Material type Document, Periodical, Internet resource
Document type Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Springer-Verlag

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author can archive a post-print version
  • Conditions
    • Author's pre-print on pre-print servers such as
    • Author's post-print on author's personal website immediately
    • Author's post-print on any open access repository 12 months after publication
    • Publisher's version/PDF cannot be used
    • Published source must be acknowledged
    • Must link to publisher version
    • Set phrase to accompany link to published version (see policy)
    • Articles in some journals can be made Open Access on payment of additional charge
  • Classification
    • green

Publications in this journal

  • ABSTRACT: Modelling and simulation of complex systems can create scientific research tools that allow the inaccessible dynamic aspects of systems to be explored in ways that are not possible in live systems. In some scientific contexts, there is a need to be able to create and use such simulations to explore and generate hypotheses alongside conventional laboratory research. The principled complex systems modelling and simulation (CoSMoS) approach was created to support these activities, as a response to a perceived gap in the software engineering development process for simulation. The article presents some of the software engineering motivation for CoSMoS, by exploring this perceived gap. Following from this analysis, the article considers the validation of complex systems simulators, especially where these are to be used in ongoing research.
    Natural Computing 03/2015; 14(1). DOI:10.1007/s11047-014-9462-5
  • ABSTRACT: Complex Systems Modelling and Simulation (CoSMoS) was a four-year EPSRC-funded research project at the Universities of York and Kent in the UK. As part of that project, the research team developed the CoSMoS approach to assist the building and use of fit-for-purpose computational simulations of complex systems, and initiated a series of international workshops to disseminate best practice in CoSMoS. This special issue brings together several authors from the first six workshops.
    Natural Computing 03/2015; 14(1). DOI:10.1007/s11047-015-9482-9
  • ABSTRACT: Self-organized regularities in patient arrivals and wait times have been discovered in real-world healthcare services. What remains a challenge is how to characterize those regularities by taking into account the underlying patients' or hospitals' behaviors with respect to various impact factors. This paper presents a case study addressing this challenge. Specifically, it models and simulates the cardiac surgery services in Ontario, Canada, based on the methodology of Autonomy-Oriented Computing (AOC). The developed AOC-based cardiac surgery service model (AOC-CSS model) pays special attention to how individuals' (e.g., patients' and hospitals') behaviors and interactions with respect to some key factors (i.e., geographic accessibility to services, hospital resourcefulness, and wait times) affect the dynamics and relevant patterns of patient arrivals and wait times. By experimenting with the AOC-CSS model, we observe that certain regularities in patient arrivals and wait times emerge from the simulation that are similar to those discovered in the real world. This reveals that patients' hospital-selection behaviors, hospitals' service-adjustment behaviors, and their interactions via wait times may account for the self-organized regularities of wait times in cardiac surgery services.
    Natural Computing 03/2015; 14(1):7-24. DOI:10.1007/s11047-014-9472-3
  • ABSTRACT: We first investigate the multipartite entanglement features of quantum states by means of the separable degree and the entanglement measure. We then give qualitative and quantitative descriptions of the entanglement dynamics of the quantum states in Grover's search algorithm. Our results show that for most instances (1) the separable degrees of these states and the ranges of their maximum Schmidt numbers remain invariant under the dynamics of Grover's search algorithm; and (2) the dynamics of Grover's search algorithm is almost “filled” by fully entangled states.
    Natural Computing 01/2015; DOI:10.1007/s11047-014-9481-2
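The dynamics studied in the abstract above, the repeated Grover iteration, can be reproduced with a small state-vector simulation. The sketch below shows only the standard oracle-plus-diffusion step, not the paper's separable-degree or Schmidt-number analysis; all function and variable names are illustrative.

```python
import numpy as np

def grover_success_probability(n_qubits, marked, n_iters):
    """Simulate Grover iterations on a full state vector and return the
    probability of measuring the marked basis state."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))   # uniform superposition
    for _ in range(n_iters):
        state[marked] *= -1              # oracle: flip the marked amplitude
        state = 2 * state.mean() - state # diffusion: inversion about the mean
    return state[marked] ** 2

# The success probability peaks after roughly (pi/4) * sqrt(N) iterations.
N = 2 ** 6
best_iters = int(round(np.pi / 4 * np.sqrt(N)))
p = grover_success_probability(6, marked=3, n_iters=best_iters)
```

For 6 qubits the optimal number of iterations is about 6, at which point the marked state carries almost all of the probability mass.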
  • ABSTRACT: This article introduces methods for modeling the compound granules used in algorithms that can successfully construct a mosaic from the images coming from an endoscope capsule. In order to apply the algorithm, the combined images must share a common area in which the correspondence of points is determined. This allows the transformation parameters to be determined that compensate for the movement of the capsule occurring between the moments when the mosaic images were acquired. The developed algorithm for capsule endoscopy images has proved to be faster than, and comparably accurate to, the commercial GDB-ICP algorithm.
    Natural Computing 01/2015; DOI:10.1007/s11047-014-9477-y
  • ABSTRACT: Guided by a polymath approach encompassing neuroscience, philosophy, psychology and computer science, this article describes a novel ‘cognitive’ computational mind framework for text comprehension in terms of Minsky’s ‘Society of Mind’ and ‘Emotion Machine’ theories. Following a top-down design method, we enumerate the macrocosmic elements of the model, the ‘agencies’ and memory constructs, followed by an elucidation of the working principles and synthesis concerns. Besides corroboration of the results of a dry-run test against thoughts generated by random human subjects, the completeness of the conceptualized framework has been validated by its total representation of the ‘text understanding’ functions of the human brain, the types of human memory, and emulation of the layers of the mind. A brief conceptual comparison between the architecture and existing ‘conscious’ agents is included as well. The framework, though observed here in its capacity as a text comprehender, is capable of understanding in general. A cognitive model of text comprehension, besides contributing to the ‘thinking machines’ research enterprise, is envisioned to be strategic in the design of intelligent plagiarism checkers, literature genre-cataloguers, differential diagnosis systems, and educational aids for children with reading disorders. Turing’s landmark 1950 article on computational intelligence is the principal motivator behind our research initiative.
    Natural Computing 01/2015; DOI:10.1007/s11047-014-9478-x
  • ABSTRACT: Unsupervised techniques like clustering may be used for software cost estimation in situations where parametric models are difficult to develop. This paper presents a software cost estimation model based on a modified K-Modes clustering algorithm. The aims of this paper are twofold: first, to present the modified K-Modes clustering algorithm, an enhancement of the simple K-Modes algorithm that uses a proper dissimilarity measure for mixed data types; and second, to apply the proposed K-Modes algorithm to software cost estimation. We have compared our modified K-Modes algorithm with existing algorithms on different software cost estimation datasets, and the results show the effectiveness of the proposed algorithm.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9492-7
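The K-Modes algorithm that the abstract above modifies clusters categorical records around per-attribute modes. The sketch below shows only the classic variant with simple-matching dissimilarity, not the paper's modified measure for mixed data types; the data and all names are illustrative.

```python
import random
from collections import Counter

def k_modes(rows, k, n_iters=10, seed=0):
    """Toy K-Modes: cluster categorical records by simple-matching distance.

    Distance = number of attributes on which two records disagree;
    each cluster is represented by its per-attribute mode."""
    rng = random.Random(seed)
    modes = rng.sample(rows, k)                  # initial modes: k random records
    clusters = [[] for _ in range(k)]
    for _ in range(n_iters):
        clusters = [[] for _ in range(k)]
        for row in rows:
            d = [sum(a != b for a, b in zip(row, m)) for m in modes]
            clusters[d.index(min(d))].append(row)
        for i, members in enumerate(clusters):
            if members:                          # update mode attribute-wise
                modes[i] = tuple(Counter(col).most_common(1)[0][0]
                                 for col in zip(*members))
    return modes, clusters

data = [("red", "small"), ("red", "small"), ("blue", "large"),
        ("blue", "large"), ("red", "large")]
modes, clusters = k_modes(data, k=2)
```

In a cost-estimation setting the rows would be project descriptors and a new project would be assigned the estimate associated with its nearest mode.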
  • ABSTRACT: One of the most important and challenging problems in functional genomics is how to select disease genes. In this regard, the paper presents a new computational method to identify disease genes. It judiciously integrates the information of gene expression profiles and shortest-path analysis of protein-protein interaction networks. While the \(f\)-information based maximum relevance-maximum significance framework is used to select differentially expressed genes as disease genes using gene expression profiles, the functional protein association network is used to study the mechanism of diseases. An important finding is that some \(f\)-information measures are shown to be effective for selecting relevant and significant genes from microarray data. An extensive experimental study on colorectal cancer establishes that the genes identified by the integrated method contain more colorectal cancer genes than the genes identified from the gene expression profiles alone, irrespective of the gene selection algorithm. Moreover, these genes have greater functional similarity with the reported colorectal cancer genes than the genes identified from the gene expression profiles alone. The enrichment analysis of the obtained genes reveals that they are associated with some of the important KEGG pathways. All these results indicate that the integrated method is quite promising and may become a useful tool for identifying disease genes.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9485-6
  • ABSTRACT: Heuristic search is one of the fundamental problem solving techniques in artificial intelligence, which is used in general to efficiently solve computationally hard problems in various domains, especially in planning and optimization. In this paper, we present an anytime heuristic search algorithm called anytime pack search (APS) which produces good quality solutions quickly and improves upon them over time, by focusing the exploration on a limited set of most promising nodes in each iteration. We discuss the theoretical properties of APS and show that it is complete. We also present the complexity analysis of the proposed algorithm on a tree state-space model and show that it is asymptotically of the same order as that of A*, which is a widely applied best-first search method. Furthermore, we present a parallel formulation of the proposed algorithm, called parallel anytime pack search (PAPS), which is applicable for searching tree state-spaces. We theoretically prove the completeness of PAPS. Experimental results on the sliding-tile puzzle problem, traveling salesperson problem, and single machine scheduling problem show that the proposed sequential algorithm produces much better anytime performance than some of the existing methods. Also, the proposed parallel formulation achieves super-linear speedups over the sequential method.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9490-9
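The defining property of an anytime search like the APS algorithm described above is that it can be interrupted at any point and still return the best solution found so far. The sketch below is not APS itself, only the generic anytime branch-and-bound pattern it shares, shown on a tiny traveling salesperson instance; the node budget and all names are illustrative.

```python
def anytime_tsp(dist, node_budget=10_000):
    """Toy anytime search: depth-first branch-and-bound on a small TSP.

    Every improving tour is recorded in `history`, so stopping early
    still yields the best tour found so far."""
    n = len(dist)
    best = {"cost": float("inf"), "tour": None, "history": []}
    expanded = [0]

    def dfs(tour, cost):
        if expanded[0] >= node_budget:        # interruption point
            return
        expanded[0] += 1
        if len(tour) == n:                    # close the cycle
            total = cost + dist[tour[-1]][tour[0]]
            if total < best["cost"]:
                best["cost"], best["tour"] = total, tour[:]
                best["history"].append(total) # anytime: improving solutions
            return
        for city in range(n):
            # Prune branches that already cost more than the incumbent.
            if city not in tour and cost + dist[tour[-1]][city] < best["cost"]:
                dfs(tour + [city], cost + dist[tour[-1]][city])

    dfs([0], 0)
    return best

dist = [[0, 1, 4, 6],
        [1, 0, 2, 5],
        [4, 2, 0, 3],
        [6, 5, 3, 0]]
best = anytime_tsp(dist)
```

For this instance the optimal tour 0-1-2-3-0 has cost 12, and `history` is strictly decreasing by construction, which is what gives the search its anytime profile.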
  • Natural Computing 01/2015; DOI:10.1007/s11047-014-9479-9
  • ABSTRACT: Outlier detection is an important data mining task with many contemporary applications. Clustering based methods for outlier detection try to identify the data objects that deviate from the normal data. However, the uncertainty regarding the cluster membership of an outlier object has to be handled appropriately during the clustering process. Additionally, carrying out the clustering process on data described using categorical attributes is challenging, due to the difficulty in defining requisite methods and measures dealing with such data. Addressing these issues, a novel algorithm for clustering categorical data aimed at outlier detection is proposed here by modifying the standard \(k\)-modes algorithm. The uncertainty regarding the clustering process is addressed by considering a soft computing approach based on rough sets. Accordingly, the modified clustering algorithm incorporates the lower and upper approximation properties of rough sets. The efficacy of the proposed rough \(k\)-modes clustering algorithm for outlier detection is demonstrated using various benchmark categorical data sets.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9489-2
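The rough-set idea in the abstract above can be illustrated by a single assignment step: a record clearly closest to one mode goes into that cluster's lower approximation, while a record nearly equidistant from two modes goes only into the upper approximations of both. The threshold rule and all names below are assumptions for illustration, not the paper's exact criterion.

```python
def rough_assign(row, modes, threshold=1):
    """Toy rough-set assignment step for categorical clustering.

    A record whose second-best mode is within `threshold` of the best
    (by simple-matching distance) is treated as a boundary object and
    placed only in the upper approximations of both clusters."""
    d = [sum(a != b for a, b in zip(row, m)) for m in modes]
    order = sorted(range(len(modes)), key=d.__getitem__)
    best, second = order[0], order[1]
    if d[second] - d[best] <= threshold:
        return None, {best, second}   # boundary: upper approximations only
    return best, {best}               # lower approximation of one cluster

modes = [("red", "small"), ("blue", "large")]
lower, upper = rough_assign(("red", "large"), modes)   # equidistant: boundary
```

Objects that end up only in upper approximations are natural outlier candidates, since no cluster claims them with certainty.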
  • Natural Computing 01/2015; DOI:10.1007/s11047-015-9486-5
  • ABSTRACT: The adaptive niche quantum-inspired immune clonal algorithm (ANQICA) is proposed by combining quantum coding, the immune clone operation and the niche mechanism to solve multi-modal function optimization more effectively and to make the function converge to as many extreme points as possible. The quantum coding better explores the solution space, the niche mechanism ensures that the algorithm converges to multiple extrema, and an adaptive mechanism is introduced according to the characteristics of each procedure of the algorithm to improve its effectiveness. Example analysis shows that ANQICA is better in exploration and convergence; therefore, ANQICA can be used to solve multi-modal function optimization problems effectively.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9495-4
  • ABSTRACT: This paper proposes a fast learning method for fuzzy measure determination named fuzzy extreme learning machine (FELM). Moreover, we apply it to a special application domain, unit combination strategy evaluation in real-time strategy (RTS) games. The contribution of this paper includes three aspects. First, we describe feature interaction among different unit types by fuzzy theory. Second, we develop a new set selection algorithm to represent the complex relation between input and hidden layers in extreme learning machine, in order to enable it to learn different fuzzy integrals. Finally, based on the set selection algorithm, we propose the FELM model for feature interaction description, which has an extremely fast learning speed. Experimental results on artificial benchmarks and real RTS game data show the feasibility and effectiveness of the proposed method in both accuracy and efficiency.
    Natural Computing 01/2015; DOI:10.1007/s11047-015-9484-7
  • ABSTRACT: Assuming an insecure quantum channel, a quantum computer, and an authenticated classical channel, we propose an unconditionally secure scheme for encrypting classical messages under a shared key, where attempts to eavesdrop on the ciphertext can be detected. If no eavesdropping is detected, we can securely re-use the entire key for encrypting new messages. If eavesdropping is detected, we must discard a number of key bits corresponding to the length of the message, but can re-use almost all of the rest. We show this is essentially optimal. Thus, provided the adversary does not interfere (too much) with the quantum channel, we can securely send an arbitrary number of message bits, independently of the length of the initial key. Moreover, the key-recycling mechanism only requires one-bit feedback. While ordinary quantum key distribution with a classical one time pad could be used instead to obtain a similar functionality, this would need more rounds of interaction and more communication.
    Natural Computing 12/2014; 13(4):469-486. DOI:10.1007/s11047-014-9454-5
  • ABSTRACT: Chemical reaction networks (CRNs) and DNA strand displacement systems (DSDs) are widely studied and useful models of molecular programming. However, in order for some DSDs in the literature to behave in an expected manner, the initial number of copies of some reagents is required to be fixed. In this paper we show that, when multiple copies of all initial molecules are present, general types of CRNs and DSDs fail to work correctly if the length of the shortest sequence of reactions needed to produce any given molecule exceeds a threshold that grows polynomially with attributes of the system.
    Natural Computing 12/2014; 13(4). DOI:10.1007/s11047-013-9403-8
  • ABSTRACT: The novelty of quantum cryptography is that whenever a spy tries to eavesdrop on the communication, he causes disturbances in the transmission of the message. Ultimately this unavoidable disturbance is a consequence of Heisenberg’s uncertainty principle, which limits the joint knowledge of complementary observables. We present in this paper a novel and highly speculative approach. We propose to replace Heisenberg uncertainties by another type of uncertainty, one that characterizes the knowledge of the time at which an unstable nucleus decays. Previously developed protocols of quantum cryptography make it possible to refresh a key even in the case that we do not trust the carrier of the key. This scheme might do so as well.
    Natural Computing 12/2014; 13(4). DOI:10.1007/s11047-014-9455-4
  • ABSTRACT: Entanglement is a global characteristic unique to quantum states that depends on quantum coherence and may allow one to carry out communications and information processing tasks that are either impossible or less efficient using classical states. Because environmental noise, even when entirely local in spatial extent, can fully destroy entanglement in finite time, an effect referred to as "entanglement sudden death" (ESD), it may threaten quantum information processing tasks. Although it may be possible to "distill" entanglement from a collection of noise-affected systems under appropriate circumstances, once entanglement has been completely lost no amount of distillation can recover it. It is therefore extremely important to avoid its complete destruction in times comparable to those of information processing tasks. Here, the effect of local noise on a class of entangled states used in entanglement-based quantum key distribution is considered and the threat ESD might pose to it is assessed.
    Natural Computing 12/2014; 13(4):459-467. DOI:10.1007/s11047-014-9452-7