Journal of Heuristics (J HEURISTICS )

Publisher: Springer Verlag

Description

The field of heuristics within the area of optimization has become extremely important as researchers and practitioners attempt to solve larger and more complex problems. With the integration of artificial intelligence principles into optimization, the field of heuristics has experienced rapid growth. The Journal of Heuristics recognizes the importance of theoretical, empirical, and experimental work related to the development of heuristics, providing a forum for advancing the state of the art in both the theory and the practical application of techniques for approximately solving problems that cannot be solved exactly. The journal fosters the development, understanding, and practical use of heuristic solution techniques for solving business, engineering, and societal problems. The following areas are of interest to the journal:

  • Practical Applications: papers showing the benefits achieved by a particular application (e.g., reduced costs, increased profits). These papers can deal with new application areas for which no previous solution methods exist, or with areas where previous methods have proved unsatisfactory.
  • Theoretical Developments: theoretical work on existing or new heuristics. These papers generally present mathematical results such as probabilistic analysis, convergence results, worst-case analysis, combinatorial leverage results, and theorems about special structures.
  • Decision Analysis: models that consider issues of rational decision making with limited information. Of special interest are the interface between decision analysis and the search principles used in artificial intelligence, and the integration of processes in which information becomes available in stages or where data and inputs are subject to uncertainty.
  • Artificial Intelligence-based heuristics: applied to a wide variety of problems in optimization, classification, statistics, recognition, planning, design, and so forth. Typical approaches include genetic algorithms, neural networks, simulated annealing, and tabu search. Learning paradigms with implications for heuristic problem solving are also encouraged.
  • Computational Experimentation: computational comparisons of heuristic methods. The Journal of Heuristics provides strict guidelines under which the experiments are to be conducted, and also provides access to all problem instances used for computational experiments published in this area.
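Among the typical approaches named above, simulated annealing is easy to sketch in a few lines. The following is an illustrative, minimal implementation (not taken from any paper in the journal); `cost` and `neighbor` are user-supplied placeholder functions, and the toy usage minimizes a one-dimensional quadratic:

```python
import math
import random

def simulated_annealing(initial, cost, neighbor, t0=1.0, cooling=0.995, steps=10000):
    """Minimal simulated annealing loop for minimizing `cost`.

    `initial` is a starting solution, `neighbor` proposes a random
    perturbation, and `t0`/`cooling` control the temperature schedule.
    """
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        candidate = neighbor(current)
        delta = cost(candidate) - current_cost
        # Accept improvements always; accept uphill moves with
        # probability exp(-delta / t) (the Metropolis criterion).
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, current_cost + delta
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost

# Toy usage: minimize (x - 3)^2 over the integers by +/-1 moves.
random.seed(7)  # for reproducibility of the example
sol, val = simulated_annealing(
    initial=50,
    cost=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.choice((-1, 1)),
)
```

The `best`/`best_cost` pair records the best solution ever visited, so the returned value does not depend on where the walk happens to end once the temperature is low.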

  • Impact factor
    1.47
  • 5-year impact
    1.80
  • Cited half-life
    9.00
  • Immediacy index
    0.17
  • Eigenfactor
    0.00
  • Article influence
    0.90
  • Website
    Journal of Heuristics website
  • Other titles
    Journal of heuristics (Online)
  • ISSN
    1381-1231
  • OCLC
    41570324
  • Material type
    Document, Periodical, Internet resource
  • Document type
    Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Springer Verlag

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author can archive a post-print version
  • Conditions
    • Author's pre-print on pre-print servers such as arXiv.org
    • Author's post-print on author's personal website immediately
    • Author's post-print on any open access repository 12 months after publication
    • Publisher's version/PDF cannot be used
    • Published source must be acknowledged
    • Must link to publisher version
    • Set phrase to accompany link to published version (see policy)
    • Articles in some journals can be made Open Access on payment of additional charge
  • Classification
    green

Publications in this journal

    ABSTRACT: We describe VarClust, a gossip-based decentralized clustering algorithm designed to support multi-mean decentralized aggregation in energy-constrained wireless sensor networks. We empirically demonstrate that VarClust is at least as accurate as, and requires less node-to-node communication (and hence consumes less energy) than, a state-of-the-art aggregation approach, affinity propagation. This superiority holds for both the clustering and aggregation phases of inference, and is demonstrated over a range of noise levels and for a range of random and small-world graph topologies.
    Journal of Heuristics 12/2014;
    ABSTRACT: This article describes a heuristic for scheduling so-called ‘modular’ projects. Exact solutions to this NP-hard problem can be obtained with existing branch-and-bound and dynamic-programming algorithms, but only for small to medium-size instances. The proposed heuristic, by contrast, can be used even for large instances, or when instances are particularly difficult because of their characteristics, such as a low network density. The proposed heuristic draws from existing results in the literature on sequential testing, which will be briefly reviewed. The performance of the heuristic is assessed over a dataset of 360 instances.
    Journal of Heuristics 12/2014;
    ABSTRACT: Protecting communication networks against failures is becoming increasingly important as they have become an integrated part of our society. Cable failures are fairly common, but it is unacceptable for a single cable failure to disconnect communication for more than a few seconds; hence protection schemes are employed. In contrast to manual intervention, automatic protection schemes such as shared backup path protection (SBPP) can recover from failure quickly and efficiently. SBPP is a simple but efficient protection scheme that can be implemented in backbone networks with technology available today. In SBPP, backup paths are planned in advance for every failure scenario in order to recover from failures quickly and efficiently. Planning SBPP is an NP-hard optimization problem, and previous work confirms that it is time-consuming to solve the problem in practice using exact methods. We present heuristic algorithms and lower bound methods for the SBPP planning problem. Experimental results show that the heuristic algorithms are able to find good quality solutions in minutes. A solution gap of less than 3.5 % was achieved for 5 of 7 benchmark instances (and a gap of less than 11 % for the remaining instances).
    Journal of Heuristics 10/2014; 20(5).
    ABSTRACT: What is the minimum tour visiting all lines in a subway network? In this paper we study the problem of constructing the shortest tour visiting all lines of a city railway system. This combinatorial optimization problem has links with the classic graph circuit problems and operations research. A broad set of fast algorithms is proposed and evaluated on simulated networks and example cities of the world. We analyze the trade-off between algorithm runtime and solution quality. Time evolution of the trade-off is also captured. Then, the algorithms are tested on a range of instances with diverse features. On the basis of the algorithm performance, measured with various quality indicators, we draw conclusions on the nature of the above combinatorial problem and the tacit assumptions made while designing the algorithms.
    Journal of Heuristics 10/2014; 20(5).
    ABSTRACT: Local search algorithms play an essential role in solving large-scale combinatorial optimization problems. Traditionally, the local search procedure is guided mainly by the objective function of the problem, and this greedy improvement paradigm carries the risk of prematurely getting trapped in low-quality attraction basins. In this study, we utilize information extracted from the relaxed problem to enhance the performance of the local search process. Taking the Lin-Kernighan-based local search (LK-search) for the p-median problem as a case study, we propose the Lagrangian relaxation Assisted Neighborhood Search (LANS). In the proposed algorithm, two new mechanisms, namely neighborhood reduction and redundancy detection, are developed. The two mechanisms exploit the information gathered from the relaxed problem to keep the search from prematurely targeting low-quality directions, and to cut short non-promising search procedures, respectively. Extensive numerical results over the benchmark instances demonstrate that LANS compares favorably with LK-search, which is among the state-of-the-art local search algorithms for the p-median problem. Furthermore, by embedding LANS into other heuristics, the best known upper bounds on several benchmark instances could be updated. In addition, run-time distribution analysis is employed to investigate why LANS works. The findings of this study confirm that the idea of improving local search by leveraging information derived from the relaxed problem is feasible and practical, and might be generalized to a broad class of combinatorial optimization problems.
    Journal of Heuristics 10/2014; 20(5).
    ABSTRACT: Over the last several decades researchers have addressed the use of statistical techniques to estimate the optimal values of difficult optimization problems. These efforts have been developed in different communities with a wide range of different applications in mind. In this paper we review the theory and applications of these approaches and discuss their strengths and weaknesses. We conclude the paper with a discussion of issues to consider when using these methods in computational experiments, and suggest directions for future research.
    Journal of Heuristics 06/2014; 20(3).
    ABSTRACT: In this paper, we study the multiobjective version of the set covering problem. To our knowledge, this problem has only been addressed in two papers before, and with two objectives and heuristic methods. We propose a new heuristic, based on the two-phase Pareto local search, with the aim of generating a good approximation of the Pareto efficient solutions. In the first phase of this method, the supported efficient solutions or a good approximation of these solutions is generated. Then, a neighborhood embedded in the Pareto local search is applied to generate non-supported efficient solutions. In order to get high quality results, two elaborate local search techniques are considered: a large neighborhood search and a variable neighborhood search. We intensively study the parameters of these two techniques. We compare our results with state-of-the-art results and we show that with our method, better results are obtained for different indicators.
    Journal of Heuristics 04/2014; 20(2).
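The two-phase Pareto local search in the abstract above hinges on Pareto dominance between objective vectors. As a hedged illustration (standard definitions, not the authors' code), a dominance test and a nondominated filter for minimization can be sketched as:

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the Pareto-efficient subset of a list of objective vectors."""
    front = []
    for p in points:
        if any(dominates(q, p) for q in points if q != p):
            continue  # p is dominated by some other point
        if p not in front:  # collapse duplicates
            front.append(p)
    return front

# (3, 3) and (4, 4) are dominated by (2, 2); the duplicate (2, 2) collapses.
front = nondominated([(1, 5), (2, 2), (5, 1), (3, 3), (2, 2), (4, 4)])
```

This quadratic filter is only for illustration; practical Pareto local search implementations maintain an incrementally updated archive instead.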
    ABSTRACT: In this paper, we propose a heuristic algorithm to solve a new variant of the partial set covering problem. In this variant, each element e_i has a gain g_i (i.e., a positive profit), each set s_j has a cost c_j (i.e., a negative profit), and each set s_j is part of a unique group G_k that has a fixed cost f_k (i.e., a negative profit). The objective is to maximize profit, and it is not necessary to cover all of the elements. We present an industrial application of the model and propose a hybrid heuristic algorithm, ITS, to solve it; ITS is an iterated-local-search algorithm that uses two levels of perturbation and a tabu-search heuristic. Whereas the first level of perturbation diversifies the search around the current local optimum, the second level performs long jumps in the search space to help escape from local optima with large basins of attraction. The proposed algorithm is evaluated on thirty real-world problems and compared to a memetic algorithm. Computational results show that most of the solutions found by ITS are either optimal or very close to optimality.
    Journal of Heuristics 04/2014; 20(2).
    ABSTRACT: The complexity status of the minimum dilation triangulation (MDT) problem for a general point set is unknown. We therefore focus on the development of approximation algorithms that find high-quality triangulations of near-minimum dilation. As an initial approach, we design a greedy strategy able to obtain, in a simple way, solutions close to the optimal ones. We also propose an operator to generate the neighborhood used in several algorithms: Local Search, Iterated Local Search, and Simulated Annealing. In addition, we present an algorithm called Random Local Search in which both good and bad solutions are accepted using the previously mentioned operator. For the experimental study we have created a set of problem instances, since no reference benchmarks for these problems were found in the literature. We use the sequential parameter optimization toolbox to tune the parameters of the SA algorithm. We compare our results with those obtained by the OV-MDT algorithm, which uses the obstacle value to sort the edges in the constructive process; this is the only algorithm available in the literature. Through experimental evaluation and statistical analysis, we assess the performance of the proposed algorithms using this operator.
    Journal of Heuristics 04/2014; 20(2).
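The dilation that the abstract above seeks to minimize is the maximum, over all vertex pairs, of the ratio between the shortest-path distance along the triangulation's edges and the straight-line distance. A small illustrative sketch (not the OV-MDT algorithm) computing this quantity with Floyd-Warshall:

```python
import math
from itertools import combinations

def dilation(points, edges):
    """Dilation (stretch factor) of a plane graph on `points`: the maximum,
    over all vertex pairs, of shortest-path distance along the edges divided
    by straight-line distance. Floyd-Warshall is fine for small sketches."""
    n = len(points)

    def euclid(i, j):
        (x1, y1), (x2, y2) = points[i], points[j]
        return math.hypot(x1 - x2, y1 - y2)

    INF = float("inf")
    d = [[INF] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
    for i, j in edges:
        d[i][j] = d[j][i] = euclid(i, j)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return max(d[i][j] / euclid(i, j) for i, j in combinations(range(n), 2))

# Unit square triangulated with the diagonal (0,0)-(1,1): the worst pair is
# the other diagonal, whose shortest path detours through a corner.
pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
tri = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
```

For the square above the dilation is 2/sqrt(2) = sqrt(2), realized by the vertex pair (1, 0) and (0, 1).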
    ABSTRACT: In this paper, we study an interpretation of the sample-based approach to chance-constrained programming problems grounded in statistical testing theory. On top of being simple and pragmatic, this approach is theoretically well founded, non-parametric, and leads to a general method for leveraging existing heuristic algorithms for the deterministic case to their chance-constrained counterparts. Throughout this paper, this algorithm design approach is illustrated on a real-world graph partitioning problem which crops up in the field of compilation for parallel systems. Extensive computational results illustrate the practical relevance of the approach, as well as the robustness of the obtained solutions.
    Journal of Heuristics 04/2014; 20(3):261-290.
    ABSTRACT: In this paper we present a three-phase heuristic for the Capacitated Location-Routing Problem. In the first stage, we apply a GRASP followed by local search procedures to construct a bundle of solutions. In the second stage, an integer-linear program (ILP) is solved taking as input the different routes belonging to the solutions of the bundle, with the objective of constructing a new solution as a combination of these routes. In the third and final stage, the same ILP is iteratively solved by column generation to improve the solutions found during the first two stages. The last two stages are based on a new model, the location-reallocation model, which generalizes the capacitated facility location problem and the reallocation model by simultaneously locating facilities and reallocating customers to routes assigned to these facilities. Extensive computational experiments show that our method is competitive with the other heuristics found in the literature, yielding the tightest average gaps on several sets of instances and being able to improve the best known feasible solutions for some of them.
    Journal of Heuristics 02/2014;
    ABSTRACT: Less-Than-Truckload (LTL) carriers generally serve geographical regions that are more localized than the inter-city line-hauls served by truckload carriers. That localization can lead to urban freight transportation routes that overlap. If trucks are traveling with less than full loads, there typically exist opportunities for carriers to collaborate over such routes. We introduce a two stage framework for LTL carrier collaboration. Our first stage involves collaboration between multiple carriers at the entrance to the city and can be formulated as a vehicle routing problem with time windows (VRPTW). We employ guided local search for solving this VRPTW. The second stage involves collaboration between carriers at transshipment facilities while executing their routes identified in phase one. For solving the second stage problem, we develop novel local search heuristics, one of which leverages integer programming to efficiently explore the union of neighborhoods defined by new problem-specific move operators. Our computational results indicate that integrating integer programming with local search results in at least an order of magnitude speed up in the second stage problem. We also perform sensitivity analysis to assess the benefits from collaboration. Our results indicate that distance savings of 7–15 % can be achieved by collaborating at the entrance to the city. Carriers involved in intra-city collaboration can further save 3–15 % in total distance traveled, and also reduce their overall route times.
    Journal of Heuristics 12/2013;
    ABSTRACT: A mobile device connects to the cell tower (base station) from which it receives the strongest signal. As the device moves it may connect to a series of towers. The process in which the device changes the base station it is connected to is called handover. A cell tower is connected to a radio network controller (RNC) which controls many of its operations, including handover. Each cell tower handles an amount of traffic and each radio network controller has capacity to handle a maximum amount of traffic from all base stations connected to it. Handovers between base stations connected to different RNCs tend to fail more often than handovers between base stations connected to the same RNC. Handover failures result in dropped connections and therefore should be minimized. The Handover Minimization Problem is to assign towers to RNCs such that RNC capacity is not violated and the number of handovers between base stations connected to different RNCs is minimized. We describe an integer programming formulation for the handover minimization problem and show that state-of-the-art integer programming solvers can solve only very small instances of the problem. We propose several randomized heuristics for finding approximate solutions of this problem, including a GRASP with path-relinking for the generalized quadratic assignment problem, a GRASP with evolutionary path-relinking, and a biased random-key genetic algorithm. Computational results are presented.
    Journal of Heuristics 12/2013;
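Several of the heuristics above are GRASP variants; the construction phase of a GRASP samples from a restricted candidate list (RCL) rather than always taking the greedy best. A minimal sketch on a toy set-cover instance (illustrative only; `alpha` and the instance are invented for the example, and a real GRASP would follow construction with local search):

```python
import random

def grasp_construct(universe, subsets, alpha=0.3):
    """Greedy randomized construction for set cover.

    At each step, candidates are scored by how many uncovered elements they
    cover; the RCL keeps those within `alpha` of the best score, and one is
    picked uniformly at random. alpha=0 is pure greedy, alpha=1 pure random.
    """
    uncovered = set(universe)
    chosen = []
    while uncovered:
        scores = {i: len(s & uncovered) for i, s in enumerate(subsets)
                  if i not in chosen and s & uncovered}
        best, worst = max(scores.values()), min(scores.values())
        threshold = best - alpha * (best - worst)
        rcl = [i for i, sc in scores.items() if sc >= threshold]
        pick = random.choice(rcl)
        chosen.append(pick)
        uncovered -= subsets[pick]
    return chosen

universe = range(6)
subsets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}, {1, 4}]
cover = grasp_construct(universe, subsets)
```

Running the construction repeatedly with different random draws yields diverse starting points, which is exactly what the subsequent local search and path-relinking phases of a GRASP exploit.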
    ABSTRACT: This paper presents ParadisEO-MO, a general-purpose software framework dedicated to the design, analysis, and implementation of local search metaheuristics. A substantial number of single-solution-based local search metaheuristics have been proposed so far, and an attempt at unifying existing approaches is presented here. Based on a fine-grained decomposition, a conceptual model is proposed and validated by regarding a number of state-of-the-art methodologies as simple variants of the same structure. This model is then incorporated into the ParadisEO-MO software framework. The framework has proven its efficiency and high flexibility by enabling the resolution of many academic and real-world optimization problems from science and industry.
    Journal of Heuristics 12/2013;
    ABSTRACT: Many real-life problems are over-constrained, so that no solution satisfying all their constraints exists. Soft constraints, with costs denoting how much the constraints are violated, are used to solve these problems. We use the edit-distance based SoftRegular constraint as an example to show that a propagation algorithm that sometimes underestimates the cost may guide the search to incorrect (non-optimal) solutions to an over-constrained problem. To compute correctly the cost for the edit-distance based SoftRegular constraint, we present a quadratic-time propagation algorithm based on dynamic programming and a proof of its correctness. We also give an improved propagation algorithm using an idea of computing the edit distance between two strings, which may also be applied to other constraints with propagators based on dynamic programming. The asymptotic time complexity of our improved propagator is always at least as good as the one of our quadratic-time propagator, but significantly better when the edit distance is small. Our propagators achieve domain consistency on the problem variables and bounds consistency on the cost variable. Our method can also be adapted for the violation measure of the edit-distance based Regular constraint for constraint-based local search.
    Journal of Heuristics 10/2013;
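The quadratic-time propagator above builds on the classical edit-distance recurrence. For background, the standard textbook dynamic program (not the paper's propagator itself) is:

```python
def edit_distance(a, b):
    """Levenshtein distance via the classical O(|a|*|b|) dynamic program.
    d[i][j] is the minimum number of insertions, deletions, and
    substitutions needed to turn a[:i] into b[:j]."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # match / substitution
    return d[m][n]
```

The improvement the abstract mentions, running faster when the edit distance is small, typically comes from restricting the table to a band around its diagonal.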
    ABSTRACT: The paper focuses on automatic programming and how to synthesize stochastic local search algorithms using automatic design of algorithms through evolution (ADATE). The main goal is to provide support for the hypothesis that automatic programming can generate competitive heuristic algorithms. A well studied and highly optimized SAT solver, G2WSAT, is used as a case study. The results indicate that automatic programming is an effective design tool for heuristics and that there may be many well studied optimization problems that could benefit from the ADATE system and the specification methodology that we describe.
    Journal of Heuristics 10/2013;
    ABSTRACT: The feedback vertex set problem (FVSP) consists in making a given directed graph acyclic by removing as few vertices as possible. In spite of the importance of this NP-hard problem, no local search approach has so far been proposed for tackling it. Building on a property of acyclic graphs, we suggest in this paper a new representation of the solutions of the FVSP (feedback sets). Thanks to this solution representation, we are able to design a local transformation (equivalent to a neighborhood) that changes a feedback set into a new one. Based on this neighborhood, we have developed a simulated annealing algorithm for the FVSP. Our experiments show that our algorithm outperforms the best existing heuristic, namely the greedy adaptive search procedure of Pardalos et al.
    Journal of Heuristics 10/2013;
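A feedback vertex set is valid exactly when deleting it leaves the directed graph acyclic, which is cheap to verify with a topological sort. A small illustrative checker (an assumed helper using Kahn's algorithm, not the paper's algorithm):

```python
from collections import deque

def is_feedback_vertex_set(n, edges, fvs):
    """Check that removing the vertices in `fvs` from the directed graph on
    vertices 0..n-1 leaves it acyclic. Kahn's algorithm: a digraph is
    acyclic iff a topological order covers every remaining vertex."""
    fvs = set(fvs)
    keep = [v for v in range(n) if v not in fvs]
    indeg = {v: 0 for v in keep}
    adj = {v: [] for v in keep}
    for u, v in edges:
        if u not in fvs and v not in fvs:
            adj[u].append(v)
            indeg[v] += 1
    queue = deque(v for v in keep if indeg[v] == 0)
    seen = 0
    while queue:
        u = queue.popleft()
        seen += 1
        for w in adj[u]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    return seen == len(keep)

# Two directed cycles 0->1->2->0 and 2->3->2 share vertex 2,
# so {2} is a feedback vertex set while {0} is not.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 2)]
```

A local search over feedback sets, such as the simulated annealing above, needs exactly this kind of feasibility test (or an incremental equivalent) when evaluating neighboring solutions.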