Journal of Heuristics (J HEURISTICS)

Publisher: Springer-Verlag

Journal description

The field of heuristics within optimization has become extremely important as researchers and practitioners attempt to solve larger and more complex problems. With the integration of artificial intelligence principles into optimization, the field of heuristics has experienced rapid growth. The Journal of Heuristics recognizes the importance of theoretical, empirical, and experimental work related to the development of heuristics, providing a forum for advancing the state of the art in the theory and practical application of techniques for approximately solving problems that cannot be solved exactly. The journal fosters the development, understanding, and practical use of heuristic solution techniques for solving business, engineering, and societal problems. The following areas are of interest to the journal:

  • Practical Applications: papers showing the benefits achieved by a particular application (e.g., reduced costs, increased profits). These papers can deal with new application areas for which no previous solution methods exist, or with areas where previous methods have proved unsatisfactory.
  • Theoretical Developments: theoretical work on existing or new heuristics. These papers generally present mathematical results such as probabilistic analysis, convergence results, worst-case analysis, combinatorial leverage results, and theorems about special structures.
  • Decision Analysis: models that consider rational decision making with limited information. Of special interest are the interface between decision analysis and the search principles used in artificial intelligence, and the integration of processes in which information becomes available in stages or where data and inputs are subject to uncertainty.
  • Artificial Intelligence-based heuristics: applied to a wide variety of problems in optimization, classification, statistics, recognition, planning, design, and so forth. Typical approaches include genetic algorithms, neural networks, simulated annealing, and tabu search. Learning paradigms with implications for heuristic problem solving are also encouraged.
  • Computational Experimentation: computational comparisons of heuristic methods. The Journal of Heuristics provides strict guidelines under which such experiments are to be conducted, and provides access to all problem instances used for computational experiments published in this area.
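Among the typical approaches named above, simulated annealing is perhaps the simplest to sketch. The following is a minimal, generic illustration only; the objective function, neighborhood move, and cooling parameters are hypothetical choices, not taken from any paper in this journal:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, cooling=0.995, steps=5000):
    """Generic simulated annealing: always accept improving moves, and accept
    worsening moves with probability exp(-delta/T) so the search can escape
    local optima as the temperature T is gradually lowered."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        delta = fy - fx
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy usage: minimize a one-dimensional multimodal function.
random.seed(0)
cost = lambda x: (x - 3) ** 2 + 2 * math.sin(5 * x)
neighbor = lambda x: x + random.uniform(-0.5, 0.5)
best, fbest = simulated_annealing(cost, neighbor, x0=0.0)
```

The cooling schedule and acceptance rule shown here are the textbook defaults; practical implementations in the journal's papers typically tune both to the problem at hand.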

Current impact factor: 1.14

Impact Factor Rankings

2016 Impact Factor Available summer 2017
2014 / 2015 Impact Factor 1.135
2013 Impact Factor 1.359
2012 Impact Factor 1.471
2011 Impact Factor 1.262
2010 Impact Factor 1.623
2009 Impact Factor 1.264
2008 Impact Factor 1.064
2007 Impact Factor 0.644
2006 Impact Factor 0.74
2005 Impact Factor 0.551
2004 Impact Factor 1.113
2003 Impact Factor 0.633
2002 Impact Factor 0.655
2001 Impact Factor 0.404
2000 Impact Factor 0.65

Additional details

5-year impact 1.94
Cited half-life 8.50
Immediacy index 0.19
Eigenfactor 0.00
Article influence 0.83
Website Journal of Heuristics website
Other titles Journal of heuristics (Online)
ISSN 1381-1231
OCLC 41570324
Material type Document, Periodical, Internet resource
Document type Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Springer-Verlag

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author can archive a post-print version
  • Conditions
    • Author's pre-print on pre-print servers such as
    • Author's post-print on author's personal website immediately
    • Author's post-print on any open access repository 12 months after publication
    • Publisher's version/PDF cannot be used
    • Published source must be acknowledged
    • Must link to publisher version
    • Set phrase to accompany link to published version (see policy)
    • Articles in some journals can be made Open Access on payment of an additional charge
  • Classification

Publications in this journal

  • ABSTRACT: The capacitated single assignment hub location problem with modular link capacities is a variant of the classical hub location problem in which the cost of using edges is not linear but stepwise, and the hubs are restricted in terms of transit capacity rather than incoming traffic. We propose a metaheuristic algorithm based on strategic oscillation, a methodology originally introduced in the context of tabu search. Our method incorporates several designs for constructive and destructive algorithms, together with associated local search procedures, to balance diversification and intensification for an efficient search. Computational results on a large set of instances show that, in contrast to exact methods that can only solve small instances optimally, our metaheuristic is able to find high-quality solutions on larger instances in short computing times. In addition, the new method, which joins tabu search strategies with strategic oscillation, outperforms the previous tabu search implementation.
    Article · Jan 2016 · Journal of Heuristics
  • ABSTRACT: The feature selection problem aims to choose a subset of a given set of features that best represents the whole in a particular aspect, preserving the original semantics of the variables on the given samples and classes. In 2004, a new approach to feature selection was proposed, based on an NP-complete combinatorial optimisation problem called the (\(\alpha ,\beta \))-k-feature set problem. Although effective for many practical cases, which made the approach an important feature selection tool, the only existing solution method, proposed in the original paper, was found not to work well for several instances. Our work aims to fill this gap in the literature by quickly obtaining high-quality solutions for the instances that the existing approach cannot solve. We propose a heuristic based on the greedy randomised adaptive search procedure and tabu search to address this problem, together with benchmark instances to evaluate its performance. The computational results show that our method can obtain high-quality solutions for both real and the proposed artificial instances, and requires only a fraction of the computational resources required by the state-of-the-art exact and heuristic approaches based on mixed integer programming models.
    Article · Dec 2015 · Journal of Heuristics
  • ABSTRACT: In this paper, we improve two classical degree-based variable ordering heuristics, (Formula presented.) and (Formula presented.). We propose a method using the summation of constraint tightness in degree-based heuristics. We also propose two methods to calculate dynamic constraint tightness, for binary extensional constraints and non-binary intensional constraints respectively. Our work shows how constraint tightness can be practically used to guide search. We performed a number of experiments on benchmark instances. The results show that the new heuristics improve on the classical ones in both computational time and search tree nodes, and that they are more efficient than some other successful heuristics on the instances where the classical heuristics work well.
    Article · Nov 2015 · Journal of Heuristics
  • ABSTRACT: Large-scale disasters, natural or human-made, have huge consequences for people and infrastructures. After a disaster strikes, the distribution of humanitarian aid to the affected population is one of the main operations to be carried out, and several crucial decisions must be made in a short time. This paper addresses a last-mile distribution problem in disaster relief operations under insecure and uncertain conditions. A model is presented that takes into account the cost and time of operation, the security and reliability of the routes, and the equity of aid handed out. The output of the model consists of a detailed set of itineraries that can be used to build an implementable distribution plan. Given its high complexity, the resulting problem is solved using a multi-criteria metaheuristic approach. In particular, a constructive algorithm and a GRASP-based metaheuristic are developed and tested in a case study based on the 2010 Haiti earthquake.
    Article · Oct 2015 · Journal of Heuristics
  • ABSTRACT: Given a graph, the critical node detection problem can be broadly defined as identifying the minimum subset of nodes such that, if these nodes were removed, some metric of graph connectivity is minimised. In this paper, two variants of the critical node detection problem are addressed. The first is the basic critical node detection problem, where, given the maximum number of nodes that can be removed, the objective is to minimise the total number of connected nodes in the graph. The second is the cardinality-constrained critical node detection problem, where, given the maximum allowed connected graph component size, the objective is to minimise the number of nodes that must be removed to achieve this. Extensive computational experiments, using a range of sparse real-world graphs, and a comparison with previous exact results demonstrate the effectiveness of the proposed algorithms.
    Article · Oct 2015 · Journal of Heuristics
  • ABSTRACT: Digital cameras are equipped with passive autofocus mechanisms where a lens is focused using only the camera’s optical system and an algorithm for controlling the lens. The speed and accuracy of the autofocus algorithm are crucial to user satisfaction. In this paper, we address the problems of identifying the global optimum and significant local optima (or peaks) when focusing an image. We show that supervised machine learning techniques can be used to construct a passive autofocus heuristic for these problems that outperforms an existing hand-crafted heuristic and other baseline methods. In our approach, training and test data were produced using an offline simulation on a suite of 25 benchmarks and correctly labeled in a semi-automated manner. A decision tree learning algorithm was then used to induce an autofocus heuristic from the data. The automatically constructed machine-learning-based (ml-based) heuristic was compared against a previously proposed hand-crafted heuristic for autofocusing and other baseline methods. In our experiments, the ml-based heuristic had improved speed—reducing the number of iterations needed to focus by 37.9% on average in common photography settings and 22.9% on average in a more difficult focus stacking setting—while maintaining accuracy.
    Article · Oct 2015 · Journal of Heuristics
  • ABSTRACT: This paper investigates perturbation operators for variable neighborhood search (VNS) approaches for two related problems, namely the pickup and delivery traveling salesman problem with LIFO loading (TSPPDL) and FIFO loading (TSPPDF). Our study is motivated by the fact that previously published results on VNS approaches on the TSPPDL suggest that the perturbation operation has the most significant effect on solution quality. We propose a new perturbation operator for the TSPPDL that achieves better results on average than the existing best approach. We also devise new perturbation operators for the TSPPDF that combine request removal and request insertion operations, and investigate which combination of request removal and request insertion operations produces the best results. Our resultant VNS that employs our best perturbation operator outperforms the best existing TSPPDF approach on benchmark test data.
    Article · Oct 2015 · Journal of Heuristics
  • ABSTRACT: The Maximum Satisfiability (MaxSAT) problem is an optimization variant of the Satisfiability (SAT) problem. Several combinatorial optimization problems can be translated into a MaxSAT formula. Among exact MaxSAT algorithms, SAT-based MaxSAT algorithms are the best-performing approaches for real-world problems. We have extended the WPM2 algorithm with several improvements. In particular, we show that by solving some subproblems of the original MaxSAT instance we can dramatically increase the efficiency of WPM2. This led WPM2 to achieve the best overall results on industrial instances at the international MaxSAT Evaluation 2013 (MSE13). We then present additional techniques and heuristics to further exploit the information retrieved from the resolution of the subproblems, and we exhaustively analyze the impact of each improvement, which contributes to our understanding of why they work. This architecture also allows exact algorithms to be converted into efficient incomplete algorithms; the resulting solver obtained the best results on industrial instances in the incomplete track of the latest international MSE.
    Article · Sep 2015 · Journal of Heuristics
  • ABSTRACT: MOEA/D is a promising evolutionary approach for solving multi- and many-objective optimization problems. MOEA/D decomposes a multi-objective optimization problem into a number of single-objective optimization problems, each defined by a scalarizing function using a weight vector. In MOEA/D, there are several scalarizing approaches, such as the weighted Tchebycheff, reciprocal weighted Tchebycheff, weighted sum (WS), and penalty-based boundary intersection (PBI) functions. Each scalarizing function has a characteristic effect on the search performance of MOEA/D and provides a scenario for the multi-objective solution search. To broaden the applicability of the MOEA/D framework to various kinds of problems, it is important to provide a new scalarizing function with characteristics different from the conventional ones. In particular, the conventional scalarizing approaches have difficulty approximating a widely spread Pareto front in some problems. To approximate the entire Pareto front by improving the spread of solutions in the objective space, and to enhance the search performance of MOEA/D on multi- and many-objective optimization problems, we propose the inverted PBI scalarizing approach, an extension of the conventional PBI and WS. We analyze the differences between inverted PBI and other scalarizing functions, and compare the search performance of NSGA-III and five MOEA/Ds using weighted Tchebycheff, reciprocal weighted Tchebycheff, WS, PBI, and inverted PBI on many-objective knapsack problems and WFG4 problems with 2–8 objectives. The results show that the inverted PBI-based MOEA/D achieves higher search performance than the other algorithms on many-objective problems whose widely spread Pareto fronts are difficult to approximate in the objective space. We also show the robustness of the inverted PBI to Pareto front geometry by using problems with four representative Pareto fronts: concave, linear, convex, and discontinuous.
    Article · Sep 2015 · Journal of Heuristics
  • ABSTRACT: DNA-based technologies, such as nanotechnology, DNA sequencing, and DNA computing, have grown significantly in recent years. The inherent properties of DNA molecules (storage density, parallelism possibilities, energy efficiency, etc.) make these particles unique computational elements. However, DNA libraries used for computation have to fulfill strict biochemical properties to avoid undesirable reactions, because those reactions usually lead to incorrect calculations. DNA sequence design is an NP-hard problem involving several heterogeneous and conflicting biochemical design criteria, which may cause difficulties for traditional optimization methods. In this paper, we propose a multiobjective evolutionary algorithm hybridized with a local search heuristic specially designed for the problem. The proposal is a multiobjective variant of the teaching–learning-based optimization algorithm. To assess the performance of our proposal, we compare it with the well-known fast non-dominated sorting genetic algorithm and strength Pareto evolutionary algorithm 2, as well as with other approaches published in the literature. From these comparisons, we conclude that our hybrid approach is able to obtain very promising DNA sequences suitable for computation, which outperform in reliability the results generated by existing sequence design techniques.
    Article · Aug 2015 · Journal of Heuristics
  • ABSTRACT: The Cableway Location Problem (CLP) is a facility location problem usually studied as part of a hierarchical approach for the large cable yarding systems used outside of Europe. Small, adaptable cable yarding systems are used in Europe; this increases the number of possible landing sites and makes the layout problem hard to solve to optimality. Here, two approaches are presented that solve the novel European CLP (E-CLP). The methods are tested on several generated cases and one real-world case. The lateral yarding distance is introduced into the cost calculations to improve the quality of the solutions.
    Article · Jun 2015 · Journal of Heuristics
  • ABSTRACT: In this article we propose a hybrid genetic algorithm for the discrete \((r|p)\)-centroid problem. We consider the competitive facility location problem where two non-cooperating companies enter a market sequentially and compete for market share. The first decision maker, called the leader, wants to maximize his market share knowing that a follower will enter the same market. Thus, for evaluating a leader’s candidate solution, a corresponding follower’s subproblem needs to be solved, and the overall problem therefore is a bi-level optimization problem. This problem is \(\Sigma _2^P\)-hard, i.e., harder than any problem in NP (if \(\hbox {P}\not =\hbox {NP}\)). We employ a heuristic approach based on a genetic algorithm with tabu search as the local improvement procedure and a complete solution archive. The archive is used to store and convert already visited solutions in order to avoid costly unnecessary re-evaluations. Different solution evaluation methods are combined into an effective multi-level evaluation scheme. The algorithm is tested on well-known benchmark sets of both Euclidean and non-Euclidean instances as well as on larger newly created instances. Especially on the Euclidean instances, our algorithm is able to exceed previous state-of-the-art heuristic approaches in solution quality and running time in most cases.
    Article · Jun 2015 · Journal of Heuristics
  • ABSTRACT: This paper presents a detailed analysis of the scalability and parallelization of Local Search algorithms for constraint-based and SAT (Boolean satisfiability) solvers. We propose a framework to estimate the parallel performance of a given algorithm by analyzing the runtime behavior of its sequential version. Indeed, by approximating the runtime distribution of the sequential process with statistical methods, the runtime behavior of the parallel process can be predicted by a model based on order statistics. We apply this approach to study the parallel performance of a constraint-based Local Search solver (Adaptive Search), two SAT Local Search solvers (namely Sparrow and CCASAT), and a propagation-based constraint solver (Gecode, with a random labeling heuristic). We compare the performance predicted by our model to actual parallel implementations of those methods using up to 384 processes. We show that the model is accurate and predicts performance close to the empirical data. Moreover, as we study different types of problems, we observe that the experimented solvers exhibit different behaviors and that their runtime distributions can be approximated by two types of distributions: exponential (shifted and non-shifted) and lognormal. Our results show that the proposed framework estimates the runtime of the parallel algorithm with an average discrepancy of 21% w.r.t. the empirical data across all the experiments with the maximum allowed number of processors for each technique.
    Article · May 2015 · Journal of Heuristics
  • ABSTRACT: Community detection has been one of the most important problems in the field of complex networks in recent years. The majority of existing algorithms find only disjoint communities; however, communities often overlap to some extent in many real-world networks. In this paper, an improved multi-objective quantum-behaved particle swarm optimization (IMOQPSO) based on spectral clustering is proposed to detect the overlapping community structure in complex networks. Firstly, the line graph of the graph modeling the network is formed, and a spectral method is employed to extract its spectral information. Secondly, IMOQPSO is employed to solve the multi-objective optimization problem so as to resolve the separated community structure in the line graph, which corresponds to the overlapping community structure in the graph representing the network. Finally, a fine-tuning strategy is adopted to improve the accuracy of community detection. Experiments on both synthetic and real-world networks demonstrate that our method achieves covers that fit the real situation considerably better.
    Article · Apr 2015 · Journal of Heuristics
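The MOEA/D abstract above contrasts several scalarizing functions. As a minimal sketch (assuming a minimization problem with objective vector f, weight vector w, and ideal point z; the standard textbook definitions, not the paper's inverted-PBI variant), three of them can be written as:

```python
import math

def weighted_sum(f, w):
    """Weighted sum (WS): simple, but cannot reach non-convex Pareto regions."""
    return sum(wi * fi for wi, fi in zip(w, f))

def tchebycheff(f, w, z):
    """Weighted Tchebycheff: largest weighted distance to the ideal point z."""
    return max(wi * abs(fi - zi) for wi, fi, zi in zip(w, f, z))

def pbi(f, w, z, theta=5.0):
    """Penalty-based boundary intersection (PBI): d1 is the length of the
    projection of f - z onto the weight direction, d2 the perpendicular
    distance; theta penalizes deviation from the weight direction."""
    norm_w = math.sqrt(sum(wi * wi for wi in w))
    diff = [fi - zi for fi, zi in zip(f, z)]
    d1 = sum(di * wi for di, wi in zip(diff, w)) / norm_w
    d2 = math.sqrt(sum((di - d1 * wi / norm_w) ** 2
                       for di, wi in zip(diff, w)))
    return d1 + theta * d2
```

For f = [1, 2], w = [0.5, 0.5], z = [0, 0], WS and Tchebycheff give 1.5 and 1.0 respectively, while PBI additionally penalizes how far f lies off the weight direction; this directional penalty is what gives PBI-style functions different spread characteristics from WS.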
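The parallel Local Search abstract above predicts parallel runtime from the sequential runtime distribution via order statistics: running n independent copies and stopping at the first success means the parallel runtime is the minimum of n draws. For the (non-shifted) exponential case mentioned there, the minimum of n exponentials is again exponential with the mean divided by n (ideal linear speedup). A small sketch of this idea (illustrative only; function names and parameters are ours, not the paper's):

```python
import random
import statistics

def predicted_parallel_mean(mean_seq, n):
    """Order-statistics prediction for an exponential sequential runtime:
    min of n i.i.d. Exp(mean m) draws is Exp(mean m / n)."""
    return mean_seq / n

def simulate_parallel_mean(mean_seq, n, trials=20000, seed=1):
    """Empirical mean runtime of n parallel independent restarts, each
    drawn from an exponential sequential runtime distribution."""
    rng = random.Random(seed)
    return statistics.fmean(
        min(rng.expovariate(1.0 / mean_seq) for _ in range(n))
        for _ in range(trials)
    )

pred = predicted_parallel_mean(100.0, 16)  # model prediction
emp = simulate_parallel_mean(100.0, 16)    # simulated empirical mean
```

For shifted-exponential or lognormal runtime distributions (the other cases the paper reports), the minimum no longer has this closed form and speedup is sublinear, which is why the authors fit the distribution before predicting.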