Natural Computing (Nat Comput)

Publisher: Springer Verlag

Description

Natural Computing is a general term referring both to computing that goes on in nature and to computing inspired by nature. When complex phenomena in nature are viewed as computational processes, our understanding both of these phenomena and of the essence of computation is enhanced, yielding valuable insights into the natural sciences as well as computer science. Human-designed computing inspired by nature is characterized by the metaphorical use of concepts, principles and mechanisms underlying natural systems; this type of computing includes evolutionary algorithms, neural networks, molecular computing and quantum computing.

The aim of the journal is (1) to provide a publication forum for, and to foster links and mutual understanding between, researchers from the various areas of natural computing, and (2) to give researchers in the natural sciences and computer science insight into research trends in natural computing. Research in natural computing is concerned with theory, experiments, and applications, and the journal reports on each of these lines.

Moreover, the journal covers natural computing in a very broad scope. Thus, for example, the subarea of evolutionary algorithms also covers the very active research on the boundary of biology and computer science that views evolution as a computational process, and the subarea of neural networks also covers computational trends in brain research. The journal solicits papers on all aspects of natural computing. Because of the interdisciplinary character of the journal, a special effort will be made to solicit survey, review, and tutorial papers, which make research trends in a given subarea more accessible to the journal's broad audience.

  • Impact factor
    0.68
  • 5-year impact
    0.00
  • Cited half-life
    5.50
  • Immediacy index
    0.14
  • Eigenfactor
    0.00
  • Article influence
    0.00
  • Website
    Natural Computing website
  • Other titles
    Natural computing (Online)
  • ISSN
    1567-7818
  • OCLC
    50721116
  • Material type
    Document, Periodical, Internet resource
  • Document type
    Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Springer Verlag

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author can archive a post-print version
  • Conditions
    • Author's pre-print on pre-print servers such as arXiv.org
    • Author's post-print on author's personal website immediately
    • Author's post-print on any open access repository 12 months after publication
    • Publisher's version/PDF cannot be used
    • Published source must be acknowledged
    • Must link to publisher version
    • Set phrase to accompany link to published version (see policy)
    • Articles in some journals can be made Open Access on payment of additional charge
  • Classification
    green

Publications in this journal

  • ABSTRACT: The novelty of quantum cryptography is that whenever a spy tries to eavesdrop on the communication, he causes disturbances in the transmission of the message. Ultimately this unavoidable disturbance is a consequence of Heisenberg’s uncertainty principle, which limits the joint knowledge of complementary observables. We present in this paper a novel and highly speculative approach. We propose to replace Heisenberg uncertainties with another type of uncertainty: that which characterizes the knowledge of the time at which an unstable nucleus decays. Previously developed protocols of quantum cryptography make it possible to refresh a key even in the case that we do not trust the carrier of the key. This scheme might do so as well.
    Natural Computing 12/2014; 13(4).
  • ABSTRACT: The hydrophobic-polar (HP) model has been widely studied in the field of protein structure prediction, both for theoretical purposes and as a benchmark for new optimization strategies. In this work we present results of the recently proposed Hybrid Monte Carlo Ant Colony Optimization heuristic in the HP model using a fragment assembly-like strategy. Moreover, we extend that method by introducing a general framework for optimization in the HP model, called Local Landscape Mapping, and we test it using the pull moves set to generate solutions. We describe the heuristic and compare results obtained on well-known HP instances in the 3-dimensional cubic lattice to those obtained with standard Ant Colony Optimization and Simulated Annealing. Fragment assembly-like tests were performed using a modified objective function to prevent the creation of overlapping walks. Results show that our method performs better than the other heuristics on all benchmark instances when the fragment assembly-like strategy is used, while in the case of the pull moves-based neighborhood its performance is comparable to that of Simulated Annealing.
    Natural Computing 07/2014;
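    A minimal sketch of the HP-model objective on the 3-dimensional cubic lattice may help readers place the heuristics above; it assumes a conformation given as one integer (x, y, z) triple per residue and is an illustration, not the paper's code:

        def hp_energy(sequence, coords):
            # sequence: string over {'H', 'P'}; coords: one (x, y, z)
            # integer triple per residue (a self-avoiding walk).
            # Energy = -(number of H-H lattice contacts between residues
            # not adjacent in the chain); lower is better.
            contacts = 0
            for i in range(len(sequence)):
                for j in range(i + 2, len(sequence)):  # skip chain neighbours
                    if sequence[i] == 'H' and sequence[j] == 'H':
                        dist = sum(abs(a - b) for a, b in zip(coords[i], coords[j]))
                        if dist == 1:                  # unit lattice distance
                            contacts += 1
            return -contacts

        # A 4-residue chain folded into a square: one H-H contact, energy -1.
        print(hp_energy("HPHH", [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]))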
  • ABSTRACT: In this paper we investigate the operations of insertion and deletion performed at the ends of a string. We show that using these operations in a P systems framework (which corresponds to using specific variants of graph control), computational completeness can even be achieved with the operations of left and right insertion and deletion of only one symbol, both in the generating as well as in the accepting case.
    Natural Computing 06/2014; 13(2).
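    The four one-symbol operations studied above are easy to state concretely; a sketch (illustrative only, with hypothetical function names):

        def insert_left(w, a):  return a + w
        def insert_right(w, a): return w + a

        def delete_left(w, a):                          # applicable only if
            return w[1:] if w.startswith(a) else None   # w begins with a

        def delete_right(w, a):                         # applicable only if
            return w[:-1] if w.endswith(a) else None    # w ends with a

        w = "ba"
        w = insert_left(w, "a")     # "aba"
        w = insert_right(w, "b")    # "abab"
        w = delete_left(w, "a")     # "bab"
        print(w)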
  • ABSTRACT: The wavelength-based machine, or simply w-machine, is an optical computational model designed around the simultaneous movement of several wavelengths in a single light ray and the simultaneous effect of simple optical devices on them. In this paper, we investigate nonuniform complexity classes of the w-machine based on three complexity measures, namely size, time, and word length. We show that the class of languages which can be generated by constant size nonuniform w-machines contains infinitely many Turing-undecidable languages. Also, we show that polynomial size nonuniform w-machines generate all NP languages, and every NP-hard language requires at least polynomial time and polynomial size nonuniform w-machines to be generated. We prove that the class of languages which can be generated by polynomial size nonuniform w-machines is equal to NP/poly, and almost all languages require exponential size and polynomial time nonuniform w-machines to be generated.
    Natural Computing 06/2014; 13(2).
  • Natural Computing 06/2014;
  • ABSTRACT: The job-shop scheduling problem with operators is a very interesting problem that generalizes the classic job-shop problem in such a way that an operation must be assisted by one operator from a limited set of them. We propose a genetic algorithm to solve this problem considering makespan minimization. The genetic algorithm uses permutations with repetition to encode chromosomes and a schedule generation scheme, termed OG&T, as decoding algorithm. This combination guarantees that at least one of the chromosomes represents an optimal schedule and, at the same time, reduces the time that machines and operators are idle while an operation is available to be processed. To improve the quality of the schedules for large instances, we use Lamarckian evolution and modify the OG&T algorithm to further reduce the idle time of the machines and operators, in this case at the risk of leaving all optimal schedules out of the search space. We conducted a large experimental study showing that these improvements allow the genetic algorithm to reach high-quality solutions in a very short time, making it quite competitive with the current state-of-the-art methods.
    Natural Computing 06/2014;
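    For readers unfamiliar with the encoding, a sketch of how a permutation with repetition decodes into an operation sequence (the real OG&T decoder additionally builds an active schedule subject to machine and operator availability; names here are hypothetical):

        def decode(chromosome, n_jobs):
            # The k-th occurrence of job j in the chromosome stands for
            # the k-th operation of job j.
            next_op = [0] * n_jobs
            sequence = []
            for job in chromosome:
                sequence.append((job, next_op[job]))   # (job, operation index)
                next_op[job] += 1
            return sequence

        # Two jobs with two operations each:
        print(decode([0, 1, 1, 0], n_jobs=2))
        # -> [(0, 0), (1, 0), (1, 1), (0, 1)]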
  • ABSTRACT: Trees are a useful framework for classifying entities whose attributes are, at least partially, related through a common ancestry, such as species of organisms, family members or languages. In some common applications, such as phylogenetic trees based on DNA sequences, relatedness can be inferred from the statistical analysis of unweighted attributes. The vast majority of mutations that survive across generations are evolutionarily neutral, which means that most genetic differences between species will have accumulated independently and randomly. In these cases, it is possible to calculate the tree from a precomputed matrix of distances. In other cases, such as with anatomical traits or languages, the assumption of random and independent differences does not hold, making it necessary to consider some traits to be more relevant than others for determining how related two entities are. In this paper, we present a constraint programming approach that can enforce consistency between bounds on the relative weight of each trait and tree topologies, so that the user can best determine which sets of traits to use and how the entities are likely to be related.
    Natural Computing 06/2014;
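    A sketch of the weighted-trait dissimilarity underlying this view, where the weights are the quantities whose bounds the constraint program reasons about (purely illustrative, not the paper's formulation):

        def weighted_distance(a, b, weights):
            # Dissimilarity of two entities over discrete traits: the sum
            # of the weights of the traits on which they differ.
            return sum(w for x, y, w in zip(a, b, weights) if x != y)

        # Three binary traits, the second weighted twice as heavily:
        print(weighted_distance([1, 0, 1], [1, 1, 0], [1.0, 2.0, 1.0]))  # 3.0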
  • ABSTRACT: We study the computational complexity of the recently proposed nubots model of molecular-scale self-assembly. The model generalises the asynchronous cellular automaton to have non-local movement, where large assemblies of molecules can be moved around, analogous to millions of molecular motors in animal muscle effecting the rapid movement of macroscale arms and legs. We show that nubots is capable of simulating Boolean circuits of polylogarithmic depth and polynomial size in only polylogarithmic expected time. In computational complexity terms, we show that any problem from the complexity class NC is solved in polylogarithmic expected time on nubots that use a polynomial amount of workspace. Along the way, we give fast parallel algorithms for a number of problems including line growth, sorting, Boolean matrix multiplication and space-bounded Turing machine simulation, all using a constant number of nubot states (monomer types). Circuit depth is a well-studied notion of parallel time, and our result implies that nubots is a highly parallel model of computation in a formal sense. Asynchronous cellular automata are not capable of such parallelism, and our result shows that adding a movement primitive to such a model, to get the nubots model, drastically increases parallel processing abilities.
    Natural Computing 05/2014;
  • ABSTRACT: This paper proposes an optimization technique inspired by the endocrine system, in particular by the intrinsic mechanism of hormonal regulation. The approach is applicable to many optimization problems, such as multimodal optimization in a static environment, multimodal optimization in a dynamic environment and multi-objective optimization. The advantage of this technique is that it is intuitive and there is no need for a supplementary mechanism to deal with dynamic environments, nor for major revisions in a multi-objective context. The Endocrine Control Evolutionary Algorithm (ECEA) is described. The ECEA is able to estimate and track multiple optima in a dynamic environment. For multi-objective optimization problems, the issue of finding a good definition of optimality is solved naturally without using Pareto non-dominance in performance evaluation. Instead, the overall preference of the solution is used for fitness assignment. Without any adjustments, just by using a suitable fitness assignment, the ECEA performs well on multi-objective optimization problems.
    Natural Computing 03/2014;
  • ABSTRACT: In this paper, a new algorithm for image edge detection based on the theory of universal gravitation is proposed. The problem is represented by a discrete space in which each image pixel is considered a celestial body whose mass corresponds to the pixel’s grayscale intensity. To find the edge pixels, a number of moving agents are randomly generated and initialized throughout the image space. Artificial agents move through the space via the forces exerted by the celestial bodies located in their neighborhood, and in this way they can find the promising edge pixels. A large number of experiments are employed to determine suitable algorithm parameters and confirm the validity of the proposed algorithm. The results are also compared with conventional and soft computing based methods such as the Sobel, Canny and ant-based edge detectors. Compared to these standard techniques, our algorithm provides more accurate results over 11 test images under Baddeley’s error metric. The visual and quantitative comparisons reveal the effectiveness and robustness of the proposed algorithm.
    Natural Computing 03/2014;
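    A sketch of the gravitational step described in the abstract, treating each pixel's grey level as a mass and summing inverse-square attractions over a local window (window radius and the constant G are illustrative assumptions, not the paper's tuned parameters):

        import numpy as np

        def net_force(image, pos, radius=2, G=1.0):
            # Net gravitational pull on an agent at pos = (row, col) from
            # pixels in a (2*radius+1)^2 window; each pixel's grey level
            # is its mass. F = G * m / r^2, towards the attracting pixel.
            y0, x0 = pos
            h, w = image.shape
            force = np.zeros(2)
            for y in range(max(0, y0 - radius), min(h, y0 + radius + 1)):
                for x in range(max(0, x0 - radius), min(w, x0 + radius + 1)):
                    if (y, x) == (y0, x0):
                        continue
                    d = np.array([y - y0, x - x0], dtype=float)
                    r = np.linalg.norm(d)
                    force += G * float(image[y, x]) * d / r**3  # d/r^3 = unit/r^2
            return force  # the agent takes a step along this vector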
  • ABSTRACT: Particle swarm optimization (PSO) is a population-based algorithm for solving global optimization problems. Owing to its efficiency and simplicity, PSO has attracted much attention from researchers and many variants have been developed. Orthogonal learning particle swarm optimization (OLPSO) is a variant of PSO that relies on a new learning strategy called the orthogonal learning strategy. OLPSO differs from standard PSO in how it uses experience information: in standard PSO each particle utilizes its historical best experience and the globally best experience through linear summation, whereas in OLPSO particles can fly in better directions by constructing an efficient exemplar through orthogonal experimental design. However, the global version of orthogonal learning PSO (OLPSO-G) still has some drawbacks in solving complex multimodal optimization problems. In this paper, we propose a quadratic interpolation based OLPSO-G (QIOLPSO-G), in which a quadratic interpolation based construction strategy for the personal historical best experience is applied. Meanwhile, opposition-based learning and Gaussian mutation are also introduced to increase the diversity of the population and discourage premature convergence. Experiments are conducted on 16 benchmark problems to validate the effectiveness of the QIOLPSO-G, and comparisons are made with four typical PSO algorithms. The results show that the introduction of the three strategies does enhance the effectiveness of the algorithm.
    Natural Computing 03/2014; 13(1):17-37.
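    The core of the quadratic interpolation step is the vertex of a parabola fitted through three points; a one-dimensional sketch (how QIOLPSO-G selects the three points per dimension and guards the degenerate case is the paper's contribution and is not reproduced here):

        def quadratic_interpolation(x1, f1, x2, f2, x3, f3, eps=1e-12):
            # Abscissa of the vertex of the parabola through
            # (x1, f1), (x2, f2), (x3, f3).
            num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
            den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
            if abs(den) < eps:          # collinear points: no unique vertex
                return x2
            return 0.5 * num / den

        # Vertex of the parabola through (0, 1), (1, 0), (2, 1) is x = 1.
        print(quadratic_interpolation(0, 1, 1, 0, 2, 1))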
  • ABSTRACT: We propose a novel population-based optimization algorithm, Chaotic Evolution (CE), which uses the ergodic property of chaos to implement the exploration and exploitation functions of an evolutionary algorithm. CE introduces a mathematical mechanism into an iterative process of evolution and simulates ergodic motion in a search space with a simple principle. A control parameter, the direction factor rate, is proposed to guide the search direction in CE. It is easy to extend the algorithm's search capability by using different chaotic systems within the CE framework, so the scalability of CE is higher than that of some other evolutionary computation algorithms. A series of comparative evaluations and investigations is conducted to analyse the characteristics of the proposal. Our proposal obtains better optimization performance than differential evolution and some of its variants. We point out that chaos theory can be used not only to describe and explain non-linear systems, but also to implement a variety of optimization algorithms based on its ergodic property.
    Natural Computing 03/2014; 13(1).
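    A sketch of the basic idea, using the logistic map at r = 4 as the chaotic driver (the map choice, the update rule and the use of the direction factor rate as a sign probability are assumptions for illustration; the paper's operator may differ):

        import random

        def logistic(z):
            return 4.0 * z * (1.0 - z)        # fully chaotic for z in (0, 1)

        def chaotic_offspring(parent, best, z, direction_factor_rate=0.5):
            # One CE-style move: step from the parent towards (or away
            # from) a reference solution by a chaotic amount z.
            z = logistic(z)                   # advance the chaotic orbit
            sign = 1.0 if random.random() < direction_factor_rate else -1.0
            child = [p + sign * z * (b - p) for p, b in zip(parent, best)]
            return child, z                   # return the orbit state for reuse

        # Usage: child, z = chaotic_offspring([0.2, 0.8], [1.0, 1.0], z=0.37)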
  • ABSTRACT: Mathematical modeling has become an integral part of synthesizing gene regulatory networks. One of the common problems is the determination of parameters, which are a part of the model description. In the present work, we propose a customized genetic algorithm as a method to determine the parameters such that the underlying oscillatory system exhibits the target behavior. We propose a problem specific, adaptive fitness function evaluation and a method to quantify the effect of a single parameter on the system response. The properties of the algorithm are highlighted and confirmed on two test cases of synthetic biological oscillators.
    Natural Computing 03/2014;
  • ABSTRACT: Synthetic aperture radar (SAR) image segmentation is an important problem in the realm of image segmentation. In this study, a novel SAR image segmentation algorithm using a multi-objective evolutionary algorithm based on decomposition with non-local means denoising (MISD) is proposed. The novelty of MISD lies in two aspects: (1) an effective multi-objective method with decomposition to solve SAR image segmentation; (2) non-local means denoising to remove the noise from the SAR images while retaining the details. The multi-objective decomposition method gives MISD lower computational complexity. In order to evaluate the performance of the new method, we compared the results with three other popular segmentation approaches on four simulated and two real SAR images. In our experiments, the new method consistently finds better results, which indicates that MISD is a promising SAR image segmentation method.
    Natural Computing 03/2014;
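    A sketch of the non-local means weight used in the denoising step: pixels are averaged according to the similarity of the patches around them (patch size and the filtering parameter h are illustrative):

        import numpy as np

        def nlm_weight(image, p, q, patch=3, h=10.0):
            # Similarity of the patches centred at pixels p and q; assumes
            # both patches lie inside the image (border handling omitted).
            r = patch // 2
            P = image[p[0]-r:p[0]+r+1, p[1]-r:p[1]+r+1].astype(float)
            Q = image[q[0]-r:q[0]+r+1, q[1]-r:q[1]+r+1].astype(float)
            d2 = np.mean((P - Q) ** 2)        # mean squared patch difference
            return np.exp(-d2 / h**2)         # close to 1 for similar patches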
  • ABSTRACT: Empirical modeling, the process of developing a mathematical model of a system from experimental data, has attracted many researchers due to its wide applicability. Finding both the structure and appropriate numeric coefficients of the model is a real challenge. Genetic Programming (GP) has been applied by many practitioners to solve this problem. However, there are a number of issues which require careful attention while applying GP to empirical modeling problems. We begin by highlighting the importance of these issues, including: the computational effort of evolving a model, premature convergence, the generalization ability of an evolved model, building hierarchical models, and constant creation techniques. We survey and classify different approaches used by GP researchers to deal with these issues. We present different performance measures which are useful for reporting the results of analysis of GP runs. We hope this work will help the reader understand the key concepts and practical issues of GP and guide the selection of an appropriate approach to a particular problem.
    Natural Computing 02/2014;