Article
PDF Available

Abstract

Tabu search is a “higher level” heuristic procedure for solving optimization problems, designed to guide other methods (or their component processes) to escape the trap of local optimality. Tabu search has obtained optimal and near optimal solutions to a wide variety of classical and practical problems in applications ranging from scheduling to telecommunications and from character recognition to neural networks. It uses flexible memory structures (to permit search information to be exploited more thoroughly than by rigid memory systems or memoryless systems), conditions for strategically constraining and freeing the search process (embodied in tabu restrictions and aspiration criteria), and memory functions of varying time spans for intensifying and diversifying the search (reinforcing attributes historically found good and driving the search into new regions). Tabu search can be integrated with branch-and-bound and cutting plane procedures, and it has the ability to start with a simple implementation th...
... The two-phase node exchange strategy breaks an exchange operation on a node pair into two distinct phases: a "removal phase" removes a node from the residual graph and an "add phase" adds a removed node back to the residual graph. This type of two-phase strategy is often used in conjunction with a candidate list approach in tabu search (see, e.g., [19]). In our case, the two-phase node exchange strategy first selects a component at random among the qualified large components and removes a node v from the selected component with the node weighting scheme. ...
... Our node weighting scheme follows the general penalty idea for constraint satisfaction problems, which was first used in this setting in Morris's breakout method [32]. We note that this scheme is also an instance of a tabu search frequency-based memory (see, e.g., the six frequency-based memory classes proposed earlier in [19] and their refinements in [22]). To the best of our knowledge, it is the first time that a node weight learning technique is applied to a heuristic procedure for CNP. ...
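A minimal sketch of the removal phase described in these snippets, combining random component selection with a breakout-style node weighting scheme. The data representation (a list of node-set components plus a weight map) and the function name are assumptions for illustration, not taken from the cited papers:

```python
import random

def select_and_remove(residual_components, node_weight):
    """Removal phase sketch: pick a qualified large component at
    random, then remove its highest-weight node.
    `residual_components` is a list of node sets (hypothetical
    representation); `node_weight` maps node -> learned penalty."""
    # Removal phase: choose a component at random.
    comp = random.choice(residual_components)
    # Select the node with the largest learned weight.
    v = max(comp, key=lambda n: node_weight[n])
    comp.remove(v)
    # Breakout-style update: penalize the removed node so it is
    # less likely to be re-selected soon after being added back.
    node_weight[v] += 1
    return v
```

The weight increment plays the role of the frequency-based memory mentioned above: nodes that are repeatedly involved in exchanges accumulate penalties that steer later removal choices.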
Preprint
Critical node problems involve identifying a subset of critical nodes from an undirected graph whose removal results in optimizing a pre-defined measure over the residual graph. As useful models for a variety of practical applications, these problems are computationally challenging. In this paper, we study the classic critical node problem (CNP) and introduce an effective memetic algorithm for solving CNP. The proposed algorithm combines a double backbone-based crossover operator (to generate promising offspring solutions), a component-based neighborhood search procedure (to find high-quality local optima) and a rank-based pool updating strategy (to guarantee a healthy population). Specifically, the component-based neighborhood search integrates two key techniques, i.e., a two-phase node exchange strategy and a node weighting scheme. The double backbone-based crossover extends the idea of general backbone-based crossovers. Extensive evaluations on 42 synthetic and real-world benchmark instances show that the proposed algorithm discovers 21 new upper bounds and matches 18 previous best-known upper bounds. We also demonstrate the relevance of our algorithm for effectively solving a variant of the classic CNP, called the cardinality-constrained critical node problem. Finally, we investigate the usefulness of each key algorithmic component.
... Therefore, using metaheuristic algorithms that can provide good-quality solutions with a reasonable computational effort is relevant. Different kinds of heuristics and meta-heuristics have been used for solving combinatorial optimization problems like Genetic Algorithms [28], Simulated Annealing [29], Ant Colony Optimization [30], the Gravitational Search Algorithm [31], Tabu Search [32], and Particle Swarm Optimization [33], among many others. To solve instances of the CLTPI, two metaheuristic optimization algorithms were selected: (i) the Particle Swarm Optimization algorithm (PSO) and (ii) the Biased Random-Key Genetic Algorithm (BRKGA). ...
Article
Full-text available
This paper addresses the Capacitated Location Tree Problem with Interconnections, a new combinatorial optimization problem with applications in network design. In this problem, the required facilities picked from a set of potential facilities must be opened to serve customers using a tree-shaped network. Costs and capacities are associated with the opening of facilities and the establishment of network links. Customers have a given demand that must be satisfied while respecting the facility and link capacities. The problem aims to minimize the total cost of designing a distribution network while considering facility opening costs, demand satisfaction, capacity constraints, and the creation of interconnections to enhance network resilience. A valid mixed-integer programming formulation was proposed, and an exact solution method based on it was used to solve small- and medium-sized instances. To solve larger instances, two metaheuristic approaches were used. A specific decoder procedure for the metaheuristic solution approaches was also proposed and used to help find solutions, especially for large instances. Computational experiments and results using the three solution approaches are also presented. Finally, a case study on the design of electrical transportation systems was presented and solved.
... This prevents the algorithm from becoming trapped in local optima. By maintaining a tabu list of recently visited solutions, the algorithm can explore new regions of the search space and potentially find better solutions [28,29]. ...
Article
Full-text available
Performing synchronization in public transport is one of the most challenging tasks that transport managers perform when organizing the processes of passenger servicing. Many variables characterizing existing public transport lines should be considered in the final timetable; in addition, the complexity of the transportation system, the variability in transport demand, and the stochasticity of the servicing process both in time and space have a significant influence on the result of synchronization. The synchronization problem in real-world applications does not have an exact solution, so in practice, a variety of techniques have been developed to achieve a rational solution in a reasonable time. In our paper, we classify existing approaches to solving the problem of public transport synchronization, describe the essence of the most promising methods, and study their popularity based on the most recent scientific publications. As the result of our research, we show the most promising directions for the future development of synchronization methods and their application in public transportation.
... Tabu search [70] uses special memory structures (short-term and long-term) during the search process that allow the method to go beyond local optimality to explore promising regions of the search space. ...
Preprint
Simulation Optimization (SO) refers to the optimization of an objective function subject to constraints, both of which can be evaluated through a stochastic simulation. To address specific features of a particular simulation---discrete or continuous decisions, expensive or cheap simulations, single or multiple outputs, homogeneous or heterogeneous noise---various algorithms have been proposed in the literature. As one can imagine, there exist several competing algorithms for each of these classes of problems. This document emphasizes the difficulties in simulation optimization as compared to mathematical programming, makes reference to state-of-the-art algorithms in the field, examines and contrasts the different approaches used, reviews some of the diverse applications that have been tackled by these methods, and speculates on future directions in the field.
... In particular, the use of tabus can prohibit attractive moves, or it may lead to an overall stagnation of the search process, with a lot of moves that are not allowed. Hence, several methods for revoking a tabu have been defined [21], and they are commonly known as aspiration criteria. The simplest and most commonly used aspiration criterion consists of allowing a move (even if it is tabu) if it results in a solution with an objective value better than that of the current best-known solution (since the new solution has obviously not been previously visited). ...
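The tabu list and aspiration criterion described in these snippets can be sketched as a minimal search loop. The `neighbors` and `f` interfaces, the tenure value, and the use of whole solutions (rather than move attributes) as tabu entries are simplifying assumptions for illustration:

```python
from collections import deque

def tabu_search(start, neighbors, f, tenure=5, iters=100):
    """Minimal tabu search sketch with the standard aspiration
    criterion: a tabu neighbor is accepted anyway when it improves
    on the best solution found so far."""
    current, best = start, start
    tabu = deque(maxlen=tenure)  # short-term memory of recent solutions
    for _ in range(iters):
        candidates = []
        for s in neighbors(current):
            # Aspiration: override the tabu status for a new best.
            if s not in tabu or f(s) < f(best):
                candidates.append(s)
        if not candidates:
            break  # all moves tabu and none aspirates
        current = min(candidates, key=f)
        tabu.append(current)
        if f(current) < f(best):
            best = current
    return best
```

Note that the loop always moves to the best allowed neighbor even when it worsens the objective; this, together with the tabu list, is what lets the search leave local optima rather than stall in them.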
Preprint
One of the most challenging tasks when adopting Bayesian Networks (BNs) is the one of learning their structure from data. This task is complicated by the huge search space of possible solutions, and by the fact that the problem is NP-hard. Hence, full enumeration of all the possible solutions is not always feasible and approximations are often required. However, to the best of our knowledge, a quantitative analysis of the performance and characteristics of the different heuristics to solve this problem has never been done before. For this reason, in this work, we provide a detailed comparison of many different state-of-the-arts methods for structural learning on simulated data considering both BNs with discrete and continuous variables, and with different rates of noise in the data. In particular, we investigate the performance of different widespread scores and algorithmic approaches proposed for the inference and the statistical pitfalls within them.
... Preventing recently audited principals from being reaudited, à la Tabu Search [57], is usually helpful. Principals' ranks exhibit strong auto-correlation across days because we apply aggregated scoring over overlapping windows of activity. ...
Preprint
Full-text available
We present Facade (Fast and Accurate Contextual Anomaly DEtection): a high-precision deep-learning-based anomaly detection system deployed at Google (a large technology company) as the last line of defense against insider threats since 2018. Facade is an innovative unsupervised action-context system that detects suspicious actions by considering the context surrounding each action, including relevant facts about the user and other entities involved. It is built around a new multi-modal model that is trained on corporate document access, SQL query, and HTTP/RPC request logs. To overcome the scarcity of incident data, Facade harnesses a novel contrastive learning strategy that relies solely on benign data. Its use of history and implicit social network featurization efficiently handles the frequent out-of-distribution events that occur in a rapidly changing corporate environment, and sustains Facade's high-precision performance for a full year after training. Beyond the core model, Facade contributes an innovative clustering approach based on user and action embeddings to improve detection robustness and achieve high-precision, multi-scale detection. Functionally, what sets Facade apart from existing anomaly detection systems is its high precision. It detects insider attackers with an extremely low false positive rate, lower than 0.01%. For single rogue actions, such as the illegitimate access to a sensitive document, the false positive rate is as low as 0.0003%. To the best of our knowledge, Facade is the only published insider risk anomaly detection system that helps secure such a large corporate environment.
... Derivative-free methods (DFMs), exemplified by the random search algorithms, systematically probe the solution space until certain predefined criteria are met (Brooks, 1958; Pronzato et al., 1984). Evolutionary random search algorithms further integrate mechanisms such as mutation (Holland, 1992), adaptive selection (Reeves, 1997; Whitley, 1994), simulated annealing (Bertsimas and Tsitsiklis, 1993; Skiscim and Golden, 1983), and memory structures (Glover, 1990, 1999). While DFMs can be used to optimize unknown objective functions, they require real-time access to the objective values of candidate solutions at each search step, e.g., through a simulator. ...
Preprint
Full-text available
Many challenges in science and engineering, such as drug discovery and communication network design, involve optimizing complex and expensive black-box functions across vast search spaces. Thus, it is essential to leverage existing data to avoid costly active queries of these black-box functions. To this end, while Offline Black-Box Optimization (BBO) is effective for deterministic problems, it may fall short in capturing the stochasticity of real-world scenarios. To address this, we introduce Stochastic Offline BBO (SOBBO), which tackles both black-box objectives and uncontrolled uncertainties. We propose two solutions: for large-data regimes, a differentiable surrogate allows for gradient-based optimization, while for scarce-data regimes, we directly estimate gradients under conservative field constraints, improving robustness, convergence, and data efficiency. Numerical experiments demonstrate the effectiveness of our approach on both synthetic and real-world tasks.
Article
Full-text available
We describe the development and successful implementation of a decision support system now being used by several leading firms in the architecture and space planning industries. The system, which we call SPDS (spatial programming design system), has the following characteristics: (i) user-friendly convenience features permitting architects and space planners to operate the system without being experienced programmers; (ii) interactive capabilities allowing the user to control and to manipulate relevant parameters, orchestrating conditions to which his or her intuition provides valuable input; (iii) informative and understandable graphics, providing visual displays of interconnections that the computer itself treats in a more abstract mathematical form; (iv) convenient ways to change configurations, and to carry out “what if” analyses calling on the system's decision support capabilities; (v) a collection of new methods, invisible to the user, capable of generating good solutions to the mathematical programming problems that underlie each major design component. These new methods succeed in generating high quality solutions to a collection of complex discrete, highly nonlinear problems. While these problems could only be solved in hours, or not at all, with previously existing software, the new methods obtain answers in seconds to minutes on a minicomputer. Major users, including Dalton, Dalton, Newport, and Marshal Erdwin, report numerous advantages of the system over traditional architectural design methods.
Article
Full-text available
Integer programming has benefited from many innovations in models and methods. Some of the promising directions for elaborating these innovations in the future may be viewed from a framework that links the perspectives of artificial intelligence and operations research. To demonstrate this, four key areas are examined: (1) controlled randomization, (2) learning strategies, (3) induced decomposition and (4) tabu search. Each of these is shown to have characteristics that appear usefully relevant to developments on the horizon.
Article
Full-text available
The general employee scheduling problem extends the standard shift scheduling problem by discarding key limitations such as employee homogeneity and the absence of connections across time period blocks. The resulting increased generality yields a scheduling model that applies to real world problems confronted in a wide variety of areas. The price of the increased generality is a marked increase in size and complexity over related models reported in the literature. The integer programming formulation for the general employee scheduling problem, arising in typical real world settings, contains from one million to over four million zero-one variables. By contrast, studies of special cases reported over the past decade have focused on problems involving between 100 and 500 variables. We characterize the relationship between the general employee scheduling problem and related problems, reporting computational results for a procedure that solves these more complex problems within 98–99% optimality and runs on a microcomputer. We view our approach as an integration of management science and artificial intelligence techniques. The benefits of such an integration are suggested by the fact that other zero-one scheduling implementations reported in the literature, including the one awarded the Lancaster Prize in 1984, have obtained comparable approximations of optimality only for problems from two to three orders of magnitude smaller, and then only by the use of large mainframe computers.
Article
Full-text available
Tabu search techniques are used for moving step by step towards the minimum value of a function. A tabu list of forbidden movements is updated during the iterations to avoid cycling and being trapped in local minima. Such techniques are adapted to graph coloring problems. We show that they provide near-optimal colorings of graphs with up to 1000 nodes, and their efficiency is significantly superior to that of the well-known simulated annealing algorithm.
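A Tabucol-style procedure in the spirit of this article can be sketched as follows. The graph representation (adjacency dict), parameter values, and the choice of `(node, old_color)` pairs as tabu entries are assumptions for illustration; the sketch searches for a k-coloring by repeatedly recoloring conflicting nodes:

```python
import random

def tabucol(adj, k, max_iters=10000, tenure=7, seed=0):
    """Tabu search sketch for graph k-coloring. `adj` maps each node
    to its set of neighbors (symmetric). The objective is the number
    of conflicting edges; a move recolors one endpoint of a conflict,
    and the (node, old_color) pair becomes tabu for `tenure`
    iterations unless the move reaches a new best (aspiration)."""
    rng = random.Random(seed)
    nodes = list(adj)
    color = {v: rng.randrange(k) for v in nodes}

    def conflicts(col):
        # Count each conflicting edge once.
        return sum(1 for v in nodes for u in adj[v] if u > v and col[u] == col[v])

    best = conflicts(color)
    tabu_until = {}  # (node, color) -> iteration until which the move is tabu
    for it in range(max_iters):
        if best == 0:
            break  # proper coloring found
        conflicted = [v for v in nodes if any(color[u] == color[v] for u in adj[v])]
        v = rng.choice(conflicted)
        old = color[v]
        cur = conflicts(color)
        # Change in conflict count if v is recolored from `old` to c.
        scores = {c: sum((color[u] == c) - (color[u] == old) for u in adj[v])
                  for c in range(k) if c != old}
        # Skip tabu recolorings unless they would yield a new best.
        allowed = {c: d for c, d in scores.items()
                   if tabu_until.get((v, c), -1) < it or cur + d < best}
        if not allowed:
            continue
        c = min(allowed, key=allowed.get)
        color[v] = c
        tabu_until[(v, old)] = it + tenure
        best = min(best, cur + allowed[c])
    return color, best
```

Forbidding the reverse move `(v, old)` for a few iterations is what prevents the search from immediately undoing a recoloring and cycling between the same two colorings, mirroring the role of the tabu list described in the abstract.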
Article
This survey considers emerging approaches of heuristic search for solutions to combinatorially complex problems. Such problems arise in business applications, of traditional interest to operations research, such as in manufacturing operations, financial investment, capital budgeting and resource management. Artificial intelligence is a revived approach to problem-solving that requires heuristic search intrinsically in knowledge-base operations, especially for logical and analogical reasoning mechanisms. Thus, one bilateral linkage between operations research and artificial intelligence is their common interest in solving hard problems with heuristic search. That is the focus here. But longstanding methods of directed tree search with classical problem heuristics, such as for the Traveling Salesman Problem—a paradigm for combinatorially difficult problems—are not wholly satisfactory. Thus, new approaches are needed, and it is at least stimulating that some of these are inspired by natural phenomena.
Article
A technique for finding in a graph an independent set with maximum cardinality is presented. It consists of an implicit enumeration procedure that uses, at various stages, bounds on the independence number of a subgraph; these are obtained by applying an adaptation of Tabu Search. Computational results show that Tabu Search yields a competitive algorithm: randomly generated graphs with up to 450 or 500 nodes (edge density 0.5) can be handled by this approach.