Article

The P-Centre Problem—Heuristic and Optimal Algorithms

Author: Zvi Drezner

Abstract

The p-centre problem, or minimax location-allocation problem in location theory terminology, is the following: given n demand points on the plane and a weight associated with each demand point, find p new facilities on the plane that minimize the maximum weighted Euclidean distance between each demand point and its closest new facility. We present two heuristics and an optimal algorithm that solves the problem for a given p in time polynomial in n. Computational results are presented.
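For concreteness, the objective described in the abstract can be evaluated directly. Below is a minimal Python sketch (an illustration only, not the paper's algorithm; the data and names are hypothetical) of the weighted minimax objective for a candidate set of facilities:

```python
import math

def pcentre_objective(demand, weights, facilities):
    """Weighted minimax objective: the largest weighted Euclidean distance
    from a demand point to its closest facility.

    demand     : list of (x, y) demand points
    weights    : list of positive weights, one per demand point
    facilities : list of (x, y) candidate facility locations
    """
    worst = 0.0
    for (dx, dy), w in zip(demand, weights):
        nearest = min(math.hypot(dx - fx, dy - fy) for fx, fy in facilities)
        worst = max(worst, w * nearest)
    return worst

# Tiny made-up example: three demand points, two candidate facilities.
demand = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
weights = [1.0, 2.0, 1.0]
print(pcentre_objective(demand, weights, [(1.0, 1.0), (4.0, 0.5)]))
```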


... First, the former problem seeks a maximal (profit) coverage solution while our problem seeks a full coverage solution; second, the number and shapes (in terms of the semi-major and semi-minor axis lengths) of the covering ellipses are given in the former problem, whereas in our problem neither the number nor the shapes of the covering ellipses are known a priori as they depend on the location of the foci points. As a result, the geometric analysis approach that was useful for enumerating the possible solutions of the former problem (similar ideas were also found in Church [1984], Drezner [1984] and Chazelle and Lee [1986]) is not applicable (at least to the author's knowledge) to the Elliptical Cover problem studied in this paper. Blanco and Puerto [2021] investigated the problem of determining the location of p depots to cover a finite set of demand points so that the largest weighted sum of the distances from a demand point to all depots is minimized. ...
... The best known exact algorithm is proposed by Chen and Chen [2013]. The idea is to solve a finite series of set covering problems via integer programming, similar to those proposed by Minieka [1970] and Drezner [1984] for solving the EPC problem. Numeric experiments performed by the authors showed that the optimal solution to the 2-neighbor p-center problem often coincides with the optimal solution to the EPC problem with 2p centers, by placing two centers at the same location to meet the 2-neighbor requirement. ...
... Consequently, successful algorithms invariably exploited some geometric and combinatorial insights into the problem structure, rather than relying on mathematical optimization approaches. For instance, Drezner [1984] uncovered two geometric properties of the EPC's optimal solution which ultimately led to good algorithms. One is the fact that the largest circle in the solution is defined by at most three demand points, hence there are O(n^3) possible radii with one of them being the solution radius. ...
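As an illustration of the property described in this excerpt, the candidate radii for the unweighted case can be enumerated from all pairs and all non-degenerate triples of demand points. A hedged Python sketch follows (the weighted case treated in Drezner's paper requires weighted circle constructions and is not shown):

```python
import itertools, math

def candidate_radii(points):
    """Candidate radii for the unweighted planar 1-centre subproblem:
    half the distance of every pair of points, plus the circumradius of
    every non-degenerate triple (O(n^3) values in total)."""
    radii = set()
    for (x1, y1), (x2, y2) in itertools.combinations(points, 2):
        radii.add(math.hypot(x1 - x2, y1 - y2) / 2.0)
    for (x1, y1), (x2, y2), (x3, y3) in itertools.combinations(points, 3):
        e = x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)  # twice the signed triangle area
        if abs(e) < 1e-12:
            continue  # collinear points define no circumscribed circle
        a = math.hypot(x2 - x3, y2 - y3)
        b = math.hypot(x1 - x3, y1 - y3)
        c = math.hypot(x1 - x2, y1 - y2)
        radii.add(a * b * c / (2.0 * abs(e)))  # circumradius R = abc / (4 * area)
    return sorted(radii)
```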
Article
Full-text available
Given n demand points in a geographic area, the elliptical cover problem is to determine the location of p depots (anywhere in the area) so as to minimize the maximum distance of an economical delivery trip in which a delivery vehicle starts from the nearest depot to a demand point, visits the demand point and then returns to the second nearest depot to that demand point. We show that this problem is NP-hard, and adapt Cooper's alternating locate-allocate heuristic to find locally optimal solutions for both the point-coverage and area-coverage scenarios. Experiments show that most locally optimal solutions perform similarly well, suggesting their sufficiency for practical use. The one-dimensional variant of the problem, in which the service area is reduced to a line segment, permits recursive algorithms that are more efficient than mathematical optimization approaches in practical cases. The solution also provides the best-known lower bound for the original problem at a negligible computational cost.
... Elzinga and Hearn give a geometric algorithm to solve the 1-center problem with Euclidean distances, and prove the correctness of the algorithm in [22]. Drezner [23] discusses the problem of locating a new facility among n given demand points by taking the l_p-norm distance into consideration, and proposes two heuristic algorithms and an optimal algorithm to solve the problem in [24]. Then Callaghan et al. [25] attempt to speed up the optimal method proposed in [24] by introducing neighbourhood reduction schemes and embedding a CPLEX policy. ...
... In this section, we firstly present a heuristic methodology to select sample locations for a given query zone in a 2D space, and develop efficient algorithms based on the studies of facility allocation techniques [23,24]. ...
Article
Full-text available
While promoting a business or activity in geo-social networks, the geographical distance between its location and users is critical. Therefore, the problem of Distance-Aware Influence Maximization (DAIM) has been investigated recently. The efficiency of DAIM heavily relies on the sample location selection. Specifically, the online seeding performance is sensitive to the distance between the promoted location and its nearest sample location, and the offline precomputation performance is sensitive to the number of sample locations. However, there is no work to fully study the problem of sample location selection for DAIM in geo-social networks. To do this, we first formalize the problem under a reasonable assumption that a promoted location always adheres to the distribution of users (query zone). Then, we propose two efficient location sampling approaches based on facility location analysis, which is one of the most well-studied areas of operations research, and these two approaches are denoted by Facility Location based Sampling (FLS) and Conditional Facility Location Based Sampling (CFLS), respectively. FLS conducts one-time sample location selection, and CFLS extends the one-time sample location selection to a continuous process, so that an online advertising service can be started immediately without sampling a lot of locations. Our experimental results on two real datasets demonstrate the effectiveness and efficiency of the proposed methods. Specifically, both FLS and CFLS can achieve better performance than the existing sampling methods for the DAIM problem, and CFLS can initialize the online advertising service in a matter of seconds and achieve better objective distance than FLS after sampling a large number of sample locations.
... The problem of covering all the HTs with the minimal number of cells is a derivative of the p-center problem [28], [29] in plane geometry for circular cells. The p-center problem can be defined as follows: given a set S = {s_1, s_2, ..., s_n} of n points in a plane, the p-center problem seeks the location of p centers c_1, c_2, ..., c_p such that the maximum distance of any given point from its closest center is minimized, that is: ...
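The formula at the end of this excerpt is truncated by the citation snippet; in conventional notation, the unweighted planar p-center objective it refers to is usually written as:

```latex
\min_{c_1,\dots,c_p \in \mathbb{R}^2} \; \max_{1 \le i \le n} \; \min_{1 \le j \le p} \; \lVert s_i - c_j \rVert_2
```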
... Based on the p-center algorithm in [29], we make a small modification in our p-center algorithm. Let us define some concepts that we shall need later. ...
... Step 2 and step 3 are applied alternately until neither changes the partition. The p-center algorithm in [29] only searches for the minimum distance from points to centers, and in our example it will stop at the state shown in Fig. 5(c). However, the division of subsets 2 and 3 can still be optimized, for example from Fig. 5(c) to Fig. 5(d). ...
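The alternating locate-allocate scheme referred to in these excerpts (and in Drezner's heuristic H1 cited further below) can be sketched as follows. This is a simplified, unweighted Python illustration that uses a crude iterative approximation of the 1-centre step instead of an exact minimum covering circle routine:

```python
import math, random

def approx_one_centre(points, iters=200):
    """Crude 1-centre approximation: start at the centroid and repeatedly
    move a shrinking step towards the current farthest point."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    for k in range(1, iters + 1):
        fx, fy = max(points, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
        step = 1.0 / (k + 1)
        cx += step * (fx - cx)
        cy += step * (fy - cy)
    return cx, cy

def locate_allocate(points, p, rounds=50, seed=0):
    """Cooper-style alternation adapted to the p-centre objective.
    Assumes p <= len(points); returns the final centre locations."""
    rng = random.Random(seed)
    centres = rng.sample(points, p)
    for _ in range(rounds):
        # Allocate: assign every point to its nearest centre.
        clusters = [[] for _ in range(p)]
        for x, y in points:
            j = min(range(p),
                    key=lambda i: math.hypot(x - centres[i][0], y - centres[i][1]))
            clusters[j].append((x, y))
        # Locate: re-centre each non-empty cluster.
        new_centres = [approx_one_centre(c) if c else centres[i]
                       for i, c in enumerate(clusters)]
        if new_centres == centres:
            break
        centres = new_centres
    return centres
```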
Article
Full-text available
The beam-hopping (BH) technology applied to low earth orbit (LEO) satellite communication networks is a superior choice, but the long transmission delay partly caused by data packets waiting in the queue of satellite transponders will seriously affect the user experience. To shorten the packet queueing delay, in this paper, we propose an optimization method of dynamic beam position division for LEO BH satellite communication systems. Firstly, we analyze the packet queueing delay problem in BH satellites to find out the factors related to the queueing delay, and we find that the number of beam positions is negatively correlated with the queueing delay. Then, we turn the beam position division problem into a p-center problem to try to cover all users with the least number of beam positions. The beam positions among the footprint of LEO satellites are determined dynamically by the user distribution and the traffic distribution. Finally, the performance evaluation of the proposed optimization method is carried out in real time, and the simulation shows that the beam position division optimized system we proposed can shorten the queueing delay by up to 40% compared to the benchmark system without sacrificing throughput.
... The problem has been shown to be NP-hard when p is variable, see Megiddo & Supowit (1984). For a fixed value of p the problem can be solved in polynomial time, O(n^(2p+4)), though requiring an excessive amount of computational effort especially for larger values of p, see Drezner (1984). ...
... In both the discrete and the continuous problems, Cooper's (1964) Multi-Start method, which is based on the locate-allocate principle, is often used to produce an upper bound for optimal methods or initial solutions for metaheuristics. This paper will be analysing the original continuous p-centre problem by revisiting an interesting, though originally very slow, optimal algorithm proposed thirty years ago by Drezner (1984). This older method used a subset of facility locations based on specific circles rather than demand points. ...
... We can now define a Z-maximal circle in the following way, as given by Drezner (1984). ...
Article
Full-text available
Drezner's optimal algorithm for the p-centre problem is an elegant but somewhat slow method. We suggest some technical enhancements that significantly improve the method's efficiency.
... The problem has been shown to be NP-hard when p is variable, see Megiddo and Supowit (1984). For a fixed value of p the problem can be solved in polynomial time, O(n^(2p+4)), though requiring an excessive amount of computational effort especially for larger values of p, see Drezner (1984). ...
... In both the discrete and the continuous problems, Cooper's (1964) Multi-Start method, which is based on the locate-allocate principle, is often used to produce an upper bound for optimal methods or initial solutions for metaheuristics. This paper will be analysing the original continuous p-centre problem by revisiting an interesting, though originally very slow, optimal algorithm proposed thirty years ago by Drezner (1984). This older method used a subset of facility locations based on specific circles rather than demand points. ...
... We can now define a Z-maximal circle in the following way, as given by Drezner (1984). Definition 2.3. ...
Article
Full-text available
This paper revisits an early but interesting optimal algorithm first proposed by Drezner to solve the continuous p-centre problem. The original algorithm is reexamined and efficient neighbourhood reductions which are mathematically supported are proposed to improve its overall computational performance. The revised algorithm yields a considerable reduction in computational time, reaching in some cases a decrease of 96%. This new algorithm is now able to find proven optimal solutions for large data sets with over 1300 demand points and various values of p for the first time.
... This is repeated until there is no improvement in the allocation. Drezner [6] presents two methods, namely a multi-start similar to Cooper's locate-allocate adapted to the p-center problem (referred to as H1), followed by a composite heuristic made up of H1 and a post-optimiser that allocates the critical points between the clusters (called H2). Eiselt and Charlesworth [13] propose three constructive and improvement-based heuristics. ...
... Few papers deal with exact methods for the planar p-center problem. Drezner [6] put forward an interesting idea of enumerating all the maximum sets given a threshold (the radius of the largest circle at a given iteration) to be used within a covering-based model. If the problem is feasible, the obtained feasible solution is then used to get a new threshold. ...
... The process is repeated until the covering problem has no feasible solution, leading to the current threshold being the optimal solution. Results for small instances up to n = 40 and p = 5 were tested, starting with the initial solution (threshold) found by Drezner's heuristic H2 [6]. This optimal method will be revisited in the computational results section as it is found to be not as slow as originally mentioned in the literature (see subsection 5.2). ...
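A rough Python sketch of the covering-based loop described in these excerpts, with a greedy cover standing in for the exact set-covering integer programme and a deliberately simplified candidate-centre set (demand points and pair midpoints), so it yields an upper bound rather than the proven optimum:

```python
import itertools, math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_cover(points, centres, r):
    """Greedy stand-in for the exact set-covering step: repeatedly pick the
    candidate centre whose radius-r disc covers the most uncovered points."""
    uncovered = set(range(len(points)))
    chosen = []
    while uncovered:
        best = max(centres,
                   key=lambda c: sum(1 for i in uncovered if dist(points[i], c) <= r))
        newly = {i for i in uncovered if dist(points[i], best) <= r}
        if not newly:
            return None  # radius r is too small for this candidate set
        chosen.append(best)
        uncovered -= newly
    return chosen

def covering_based_pcentre(points, p):
    """Scan candidate radii (smallest first) and return the first radius at
    which a cover with at most p discs is found.  Simplified candidate set
    (demand points and pair midpoints); intended for small instances only."""
    centres = list(points) + [((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
                              for a, b in itertools.combinations(points, 2)]
    radii = sorted({dist(q, c) for q in points for c in centres})
    for r in radii:
        cover = greedy_cover(points, centres, r)
        if cover is not None and len(cover) <= p:
            return r, cover
    return None
```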
Article
A self-adaptive heuristic that incorporates a variable level of perturbation, a novel local search and a learning mechanism is proposed to solve the p-centre problem in the continuous space. Empirical results, using several large TSP-Lib data sets, some with over 1300 customers and various values of p, show that our proposed heuristic is both effective and efficient. This perturbation metaheuristic compares favourably against the optimal method on small size instances. For larger instances the algorithm outperforms both a multi-start heuristic and a discrete-based optimal approach while performing well against a recent powerful VNS approach. This is a self-adaptive method that can easily be adapted to tackle other combinatorial/global optimisation problems. For benchmarking purposes, the medium-size instances are solved optimally for the first time, though requiring a large amount of computational time. As a by-product of this research, we also report for the first time the optimal solution of the vertex p-centre problem for these TSP-Lib data sets.
... Let min-radius(S, d) be the smallest value of r such that cover(S, B, r, d) is true for some B. We often refer to this value as r_min when S and d are understood from context. Note that for the specific case of d = 1, and considering only finding the minimum radius (disregarding actor movement), this is equivalent to the Euclidean p-center problem in Operations Research [14][15][16]. ...
... The problem with this method is that the number of points on the plane is infinite. However, it was shown in [9,14,17] that there does exist a finite set of points P such that, if there exists a set B satisfying cover(S, B, r, d), then there must also exist a subset B of P satisfying cover(S, B, r, d). We refer to P as a set of possible actor locations, which is obtained as follows. ...
... In this section, we present an ILP formulation of the MRaMS problem to quantify the effect of choosing the new actor locations from the above set P A (S, A, r) of possible optimal positions (with respect to movement) vs. choosing from the canonical set P(S, r) given in [9,14,17]. It will also serve as a reference for the MRaMS heuristics that we present in Section 8. ...
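One common construction of such a finite candidate set is sketched below, under the assumption that candidate centres are the demand points themselves plus the intersection points of radius-r circles drawn around pairs of points (the canonical set referred to in the excerpt may differ in detail):

```python
import itertools, math

def candidate_locations(points, r):
    """Finite candidate set for covering with discs of radius r: every demand
    point itself, plus the two intersection points of the radius-r circles
    centred at each pair of points at most 2r apart."""
    cand = list(points)
    for (x1, y1), (x2, y2) in itertools.combinations(points, 2):
        d = math.hypot(x2 - x1, y2 - y1)
        if d == 0 or d > 2 * r:
            continue  # coincident points or pair too far apart to share a disc
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        h = math.sqrt(r * r - (d / 2.0) ** 2)   # offset along the perpendicular bisector
        ux, uy = (x2 - x1) / d, (y2 - y1) / d   # unit vector along the pair
        cand.append((mx - uy * h, my + ux * h))
        cand.append((mx + uy * h, my - ux * h))
    return cand
```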
Article
Wireless sensor and actor networks are composed of static sensors and mobile actors. We assume actors have a random initial location in the two-dimensional sensing area. The objective is to move each actor to a location such that every sensor node is within a bounded number of hops from some actor. Because sensors have limited energy, the new actor locations are chosen to minimize the transmission range of the sensors. However, actors also have a limited (although larger) power supply, and their movement depletes their resources. It follows that by carefully choosing the new actor locations, the total actor movement can be minimized. In this paper, we introduce the problem of simultaneously minimizing the required transmission range and amount of actor movement. To find a solution, we formulate the problem using an ILP framework. For the ILP solution to be feasible, we introduce a finite set of possible actor locations such that an optimal solution is guaranteed to be found within this set. We also present a heuristic for this problem. As a preliminary step, we study minimizing the transmission range necessary for multi-hop communication. Various heuristics for this smaller problem are proposed and their results are compared by simulation. The best of these heuristics is then enhanced to incorporate minimizing the movement of actors, and its performance is compared to the optimal ILP solution.
... The p-center problem generally refers to the problem of locating p facilities to cover a set of demand points so as to minimize the maximum distance from any demand point to its nearest facility. Drezner (1984) [13], Megiddo and Supowit (1984) [20] and Vijay (1985) [26] were among the first to study the problem variant in which p facilities were to be located on an Euclidean plane to cover m demand points. However, these pioneers used different names for the problem, including the "p-center problem in a plane" and the "Euclidean p-center problem". ...
Article
Full-text available
This paper introduces several means to expedite a known Voronoi heuristic method for solving a continuous p‐center location problem, which is to cover a polygon with p circles such that no circle's center lies outside the polygon, no circle's center drops inside a polygonal hole, and the radius of the largest circle is as small as possible. A key step in the Voronoi heuristic is the repeated task of determining the constrained minimum covering circle (CMCC), but the best‐known method for this task is brute‐force and inefficient. This paper develops an improved method by exploiting the convexity of a related subproblem and employing an optimized search procedure. The algorithmic enhancements help to drastically reduce the effort required for finding the CMCC, and in turn, significantly expedite the solution of the p‐center problem. On a realistic‐scale test case, the proposed algorithm ran 400× faster than the literature benchmark.
... Tighter ILP formulations than (12) were proposed, with efficient exact algorithms relying on the IP models [22,23]. Exponential exact algorithms were also designed for the continuous p-center problem [24,25]. An n^(O(√p))-time algorithm was provided for the continuous Euclidean p-center problem in the plane [26]. ...
... Lastly, we prove (25). ...
Article
Full-text available
With many efficient solutions for a multi-objective optimization problem, this paper aims to cluster the Pareto front in a given number of clusters K and to detect isolated points. K-center problems and variants are investigated with a unified formulation considering the discrete and continuous versions, partial K-center problems, and their min-sum-K-radii variants. In dimension three (or higher), this induces NP-hard complexities. In the planar case, a common optimality property is proven: non-nested optimal solutions exist. This induces a common dynamic programming algorithm running in polynomial time. Specific improvements hold for some variants, such as K-center problems and min-sum K-radii on a line. When applied to N points and allowing M < N points to be uncovered, the K-center and min-sum-K-radii variants are solvable in O(K(M+1)N log N) and O(K(M+1)N²) time, respectively. Such complexity results allow an efficient and straightforward implementation. Parallel implementations can also be designed for a practical speed-up. Their application inside multi-objective heuristics is discussed to archive partial Pareto fronts, with a special interest in partial clustering variants.
... For example, k-median and k-center are two well-known types of FL problems for public FL and emergency FL with the objectives min-sum and min-max, respectively. The NP-completeness of both of the problems (and some variations of them) has been proved [4], and many approximations and heuristic approaches have been proposed for solving them (e.g., see [5][6][7][8]). ...
Article
Full-text available
The optimal placement of healthcare facilities, including the placement of diagnostic test centers, plays a pivotal role in ensuring efficient and equitable access to healthcare services. However, the emergence of unique complexities in the context of a pandemic, exemplified by the COVID-19 crisis, has necessitated the development of customized solutions. This paper introduces a bi-objective integer linear programming model designed to achieve two key objectives: minimizing average travel time for individuals visiting testing centers and maximizing an equitable workload distribution among testing centers. This problem is NP-hard, and we propose a customized local search algorithm based on the Voronoi diagram. Additionally, we employ an ϵ-constraint approach, which leverages the Gurobi solver. We rigorously examine the effectiveness of the model and the algorithms through numerical experiments and demonstrate their capability to identify Pareto-optimal solutions. We show that while Gurobi performs efficiently in small-size instances, our proposed algorithm outperforms it in large-size instances of the problem.
... Among the most cited solution methods for this problem is a widely used heuristic that involves iteratively solving p 1-center problems, presented in [Drezner, 1984] and also cited as an alternative in [Mladenovic et al., 2003]. An approximation method for constructing a solution to the problem is presented in [Dyer and Frieze, 1985]. ...
... This indicates that it is even NP-hard to approximate the p-center problem with a relative error of less than about 15%. Drezner (1984) proposed two heuristics and an optimal algorithm for solving the EPC problem. The optimal algorithm ingeniously exploited the geometric properties of the optimal solution and boasted a time complexity of O(n^(2p+1)), that is, for a given p, the time is polynomial in n. ...
Article
Full-text available
The p-center location problem in an area is an important yet very difficult problem in location science. The objective is to determine the location of p hubs within a service area so that the distance from any point in the area to its nearest hub is as small as possible. While effective heuristic methods exist for finding good feasible solutions, research work that probes the lower bound of the problem’s objective value is still limited. This paper presents an iterative solution framework along with two optimization-based heuristics for computing and improving the lower bound, which is at the core of the problem’s difficulty. One method obtains the lower bound via solving the discrete version of the Euclidean p-center problem, and the other via solving a relatively easier clustering problem. Both methods have been validated in various test cases, and their performances can serve as a benchmark for future methodological improvements.
... The locations of the disaster relief centres have to be planned on greenfield sites. The problem is a continuous p-centre problem without considering capacities and with demand-weighted distances, which can be formulated as follows (Drezner, 1984; Drezner, 2011). In (1) and (2), the maximum demand-weighted distance between the sources and the destinations is to be minimised. These distances are to be determined for all combinations of sources and destinations with a suitable distance function. ...
Article
Full-text available
Logistical decision problems are a part of many courses in the field of logistics, management and operations research. It makes sense to illustrate these optimisation problems using case studies, which can be reproduced by students using suitable software. Often, solver add-ins in spreadsheet programs or general optimisation software are used, which on the one hand requires a high level of knowledge in Operations Research and on the other hand does not always allow an intuitive approach. This article describes the academic software LogisticsLab with which the distributors tie in with the idea of interactive decision support systems to systematically combine the experiences and intuitions of human decision-makers with the possibilities of computer-assisted modelling and optimisation of a wide range of logistical decisions.
... The cluster-level constraint considers some available information about the underlying cluster structures in the form of limitations [48], [49]. A facility location problem is a scenario similar to clustering with a cluster-level constraint; in [50] the authors propose two heuristic algorithms for facility location problems that can be considered as clustering problems with upper bounds on the radii of the clusters. In [51], the authors study two types of cluster-level constraints in a search-based agglomerative hierarchical clustering algorithm. ...
Article
Full-text available
Cluster analysis using metaheuristic algorithms has earned increasing popularity and acceptance over recent years due to the great success of these algorithms in finding high-quality clusters in complex real-world problems. This paper proposes a novel framework for automatic data clustering with the capability of generating clusters with approximately the same maximum distortion using nature-inspired binary optimization algorithms. The inherent problem with clustering using such algorithms is having a huge search space. Therefore, we propose a binary encoding scheme for the particle representation to alleviate this problem as well. The proposed clustering solution requires no prior knowledge of the number of clusters and proceeds based on re-clustering, merging, and modifying the small clusters to compensate for the distortion gap between groups with different sizes. The proposed framework's performance has been evaluated over a wide range of synthetic, real-life, and higher-dimensional datasets, first by considering four different binary optimization algorithms for the optimizer module. Then, it has been compared with multiple classical and new clustering solutions in addition to two automatic clustering techniques using continuous search space in terms of separation and compactness of the clusters by utilizing internal validity measures. The experimental results show the proposed solution is highly efficient in creating well-separated and compact clusters with approximately the same distortion. Moreover, the application of the proposed framework to correlated binary data is also reported as a case study. The presence of correlation in the binary dataset results from the similarity between data in the same category, such as repeated measurements in remote sensing, crowdsourced multi-view video uploading, and augmented reality. The simplicity, customizability, and flexibility to add additional conditions to the proposed solution, in addition to having a dynamic number of clusters, are the advantages of the proposed framework.
... 3. The objective of the p-center problem (Calik et al., 2015; Drezner, 1984; Kariv and Hakimi, 1979a) is to minimize the largest distance to all demand points. The 1-center problem is termed the minimax location problem (Drezner, 2011; Elzinga and Hearn, 1972; Sylvester, 1857, 1860). 4. One of the competitive location models is based on the assumption that customers patronize the closest facility (Drezner, 1982; Hakimi, 1983; Hotelling, 1929). ...
Preprint
We investigate a new model for partitioning a set of items into groups (clusters). The number of groups is given and the distances between items are well defined. These distances may include weights. The sum of the distances between all members of the same group is calculated for each group, and the objective is to find the partition of the set of items that minimizes the sum of these individual sums. Two problems are formulated and solved. In the first problem the number of items in each group is given. For example, all groups must have the same number of items. In the second problem there is no restriction on the number of items in each group. We propose an optimal algorithm for each of the two problems as well as a heuristic algorithm. Problems with up to 100 items and 50 groups are tested. In the majority of instances the optimal solution was found using IBM's CPLEX. The heuristic approach, which is very fast, found all confirmed optimal solutions and equal or better solutions when CPLEX was stopped after five hours. The problem with given group sizes can also be formulated and solved as a quadratic assignment problem.
... If we consider the facility locations as cluster centers, customers as data objects and the objective to be optimized in the facility location problem as a cluster dispersion measure, facility location problem with such constraints and the clustering with cluster-level constraints are very alike. For example, in [57], Drezner proposes two heuristics and an optimal algorithm for the p-center problem encountered in the facility location literature which can be considered as clustering with upper bounds on the radii of the clusters. ...
... In the case where all the target sets are singletons {a_i} for all i ∈ I, problem (GkC) reduces to the classical continuous k-center problem (see, e.g., [9, 11]): ...
Article
Full-text available
The continuous k-center problem aims at finding k balls with the smallest radius to cover a finite number of given points in R^n. In this paper, we propose and study the following generalized version of the k-center problem: Given a finite number of nonempty closed convex sets in R^n, find k balls with the smallest radius such that their union intersects all of the sets. Because of its nonsmoothness and nonconvexity, this problem is very challenging. Based on nonsmooth optimization techniques, we first derive some qualitative properties of the problem and then propose new algorithms to solve the problem. Numerical experiments are also provided to show the effectiveness of the proposed algorithms.
... To solve to optimality discrete p-center problems, tighter ILP formulations, with efficient exact algorithms, rely on the ILP models [8,18]. Exponential exact algorithms were also provided for the continuous p-center problem [9,11]. An N^(O(√p))-time algorithm was provided for the continuous Euclidean p-center problem in the plane [28]. ...
Preprint
Full-text available
This paper is motivated by real-life applications of bi-objective optimization. Having many non-dominated solutions, one wishes to cluster the Pareto front using Euclidean distances. The p-center problems, both in the discrete and continuous versions, are proven solvable in polynomial time with a common dynamic programming algorithm. Having N points to partition in K ≥ 3 clusters, the complexity is proven in O(KN log N) (resp. O(KN log² N)) time and O(KN) memory space for the continuous (resp. discrete) K-center problem. 2-center problems have complexities in O(N log N). To speed up the algorithm, parallelization issues are discussed. A posteriori, these results allow an application inside multi-objective heuristics to archive partial Pareto fronts.
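The contiguous (non-nested) clustering structure that makes this dynamic programme possible is easiest to see in one dimension. Below is a hedged Python sketch of the simplest O(KN²) version for the continuous K-centre problem on a line; the O(KN log N) bound claimed above relies on additional monotonicity arguments not shown here:

```python
def k_centre_on_line(xs, k):
    """Continuous K-centre on a line via dynamic programming over contiguous
    clusters of the sorted points (naive O(K * N^2) version).
    Returns the optimal maximum cluster radius."""
    xs = sorted(xs)
    n = len(xs)
    # best[j]: optimal max radius covering the first j points with the current
    # number of clusters; with one cluster this is half the range of xs[0..j-1].
    best = [0.0] + [(xs[j - 1] - xs[0]) / 2.0 for j in range(1, n + 1)]
    for _ in range(2, k + 1):
        new = [0.0] * (n + 1)
        for j in range(1, n + 1):
            # The last cluster covers xs[i..j-1]; earlier clusters cover xs[0..i-1].
            new[j] = min(max(best[i], (xs[j - 1] - xs[i]) / 2.0) for i in range(j))
        best = new
    return best[n]

# Tiny made-up example: two natural groups on the line.
print(k_centre_on_line([0.0, 1.0, 2.0, 10.0, 11.0], 2))  # expected 1.0
```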
... Many exact, heuristic, and metaheuristic algorithms and approaches have been proposed to solve the p-center problem, as reviewed by ReVelle et al. (2008) and de Smith, Goodchild, and Longley (2018). For example, Drezner (1984a), Daskin (2013), Calik and Tansel (2013), and Chen and Chen (2013) proposed exact algorithms to solve the p-center problem under certain conditions. Still, exact algorithms may not be able to solve the p-center problem (Dantrakul, Likasiri, & Pongvuthithum, 2014) as this problem is NP-hard (Daskin, 2013). ...
... Various heuristics and approximation algorithms have been proposed over time to solve the problem. Research is still ongoing for solving multiple variants of the problem like continuous or discrete location problems [5]. The location problem is continuous if the set of candidate locations for the facility is infinite. ...
Article
This paper aims to locate p resources in a nonconvex demand plane having n demand points. The objective of the location problem is to find the location for these p resources so that the distance from each of the n demand points to its nearest resource is minimized, thus simulating a p-center problem. We employ various geometrical structures for solving this location problem. The suggested approach is also capable of finding the optimal value of p so that all demand points have at least one resource within distance Δ, where Δ is the maximum permissible distance for emergency services. Finally, an implementation of the proposed approach is presented and it is observed that the suggested approach rapidly converges towards the optimal location.
... However, in the latter the facilities can be located anywhere in the plane. In this work, we will explore the absolute p-centre problem by revisiting an early optimal algorithm proposed by Drezner (1984) to solve this problem. ...
Article
We consider the problem of scheduling parallel jobs on heterogeneous platforms. Given a set of n jobs, where each job is described by a pair consisting of a processing time and a number of required processors, and a set of N heterogeneous platforms, each with its own number of processors, the goal is to find a schedule for all jobs on the platforms minimizing the maximum completion time. The problem is directly related to a two-dimensional multi-strip packing problem. Unless P = NP, there is no approximation algorithm with absolute ratio better than 2 for the problem. We propose an approximation algorithm with absolute ratio 2, improving on the previously best known approximation algorithms. This closes the gap between the lower bound of 2 and the best approximation ratio.
... If we consider the facility locations as cluster centers, customers as data objects and the objective to be optimized in the facility location problem as a cluster distortion measure, the facility location problem with such constraints and the clustering with cluster-level constraints are very alike. Interested readers are referred to [24] in which Drezner proposes two heuristics and an optimal algorithm for the p-center problem encountered in the facility location literature which can be considered as clustering with upper bounds on the radii of the clusters. ...
Chapter
Traditional data mining methods for clustering only use unlabeled data objects as input. The aim of such methods is to find a partition of these unlabeled data objects in order to discover the underlying structure of the data. In some cases, there may be some prior knowledge about the data in the form of (a small number of) labels or constraints. Performing traditional clustering methods by ignoring the prior knowledge may result in extracting irrelevant information for the user. Constrained clustering, i.e., clustering with side information or semi-supervised clustering, addresses this problem by incorporating prior knowledge into the clustering process to discover relevant information from the data. In this chapter, a survey of advances in the area of constrained clustering will be presented. Different types of prior knowledge considered in the literature, and clustering approaches that make use of this prior knowledge, will be reviewed.
... 1. F can be decomposed into a sum of one-dimensional functions of z_mn and d(X_n, X_m), n ∈ N, m ∈ M, as is the objective function of the well-known center problem (Drezner, 1984). Center problems are more suitable for modeling public sector problems, since the objective is to minimize the maximum weighted distance from demand points to new facilities. ...
Article
This paper presents a general modeling framework for restricted facility location problems with arbitrarily shaped forbidden regions or barriers, where regions are modeled using phi-objects. Phi-objects are an efficient tool in mathematical modeling of 2D and 3D geometric optimization problems, and are widely used in cutting and packing problems and covering problems. The paper shows that the proposed modeling framework can be applied to both median and center facility location problems, either with barriers or forbidden regions. The resulting models are either mixed-integer linear or nonlinear programming formulations, depending on the shape of the restricted region and the considered distance measure. Using the new framework, all instances from the existing literature for this class of problems are solved to optimality. The paper also introduces and optimally solves a realistic multi-facility problem instance derived from an archipelago vulnerable to earthquakes. This problem instance is significantly more complex than any other instance described in the literature.
... For Euclidean distance p-center problems, exact solution techniques are provided by CHEN (1983), DREZNER (1984a), VIJAY (1985) and CHEN-HANDLER (1987), all involving rather inefficient branch and bound schemes. Several heuristic methods exist, applicable to general norm situations in any dimension, provided the single facility problem may be efficiently solved. ...
... If a facility is located at the center of the circle, and the points represent demands, the maximum distance between the facility and a demand will be minimized. An optimal algorithm for the k-center problem on the plane was given by Drezner [9] in 1984. ...
Research
Full-text available
The k-center problem is that of choosing k vertices as centers in a weighted undirected graph in which the edge weights obey the triangle inequality so that the maximum distance of any vertex to its nearest center is minimized. The problem is NP-hard, but there is a simple greedy 2-approximation algorithm which has been shown to be optimal. We consider here the capacitated k-center problem, where additionally each vertex has a capacity, which is a bound on the number of 'clients' it can serve if it is opened as a center. Unlike the uncapacitated k-center problem, our understanding of the capacitated version is far from complete and the first constant factor approximation was given fairly recently. We mainly concern ourselves with the case when all capacities are equal, which is called the uniform capacity k-center problem. We give here an L-approximation for the uniform k-center problem where each vertex has capacity L. This is an improvement over the current best approximation ratio for L ≤ 5.
... When the service must be available for all the potential users but the budget is limited, minimizing the maximum distance between demand points and their assigned facilities seems appropriate. This approach has led to minimax formulations on the plane (Elzinga and Hearn 1972) and on a network (Hakimi 1964) that were generalized to p-center formulations (Minieka 1970; Drezner 1984, 1987; Handler 1990) when more than one facility is located. Customers are assumed to get the service from the closest facility. ...
Article
In this paper we propose a stochastic model for the location of emergency facilities. The model is formulated and analyzed. The location of one facility in the plane is optimally solved. Optimal algorithms are proposed for the location of multiple facilities on a network. Computational experiments illustrate the effectiveness of these solution procedures.
Article
In this paper we analyze a continuous version of the maximal covering location problem, in which the facilities are required to be linked by means of a given graph structure (provided that two facilities are allowed to be linked if a given distance is not exceeded). We propose a mathematical programming framework for the problem and different resolution strategies. First, we provide a Mixed Integer Non Linear Programming formulation for the problem and derive some geometrical properties that allow us to reformulate it as an equivalent pure integer linear programming problem. We propose two branch-&-cut approaches by relaxing some sets of constraints of the former formulation. We also develop a math-heuristic algorithm for the problem capable of solving instances of larger sizes. We report the results of an extensive battery of computational experiments comparing the performance of the different approaches.
Article
We study the k-center problem in a kinetic setting: given a set of continuously moving points P in the plane, determine a set of k (moving) disks that cover P at every time step, such that the disks are as small as possible at any point in time. Whereas the optimal solution over time may exhibit discontinuous changes, many practical applications require the solution to be stable: the disks must move smoothly over time. Existing results on this problem require the disks to move with a bounded speed, but this model allows positive results only for k<3. Hence, the results are limited and offer little theoretical insight. Instead, we study the topological stability of k-centers. Topological stability was recently introduced and simply requires the solution to change continuously, but may do so arbitrarily fast. We prove upper and lower bounds on the ratio between the radii of an optimal but unstable solution and the radii of a topologically stable solution—the topological stability ratio—considering various metrics and various optimization criteria. For k=2 we provide tight bounds, and for small k>2 we can obtain nontrivial lower and upper bounds. Finally, we provide an algorithm to compute the topological stability ratio in polynomial time for constant k.
Article
In this paper we define and heuristically solve the multiple weighted obnoxious facilities location problem maximizing the minimum weighted distance between facilities and a given set of communities. Each community may have a different weight because different communities may be affected differently by a facility. The distance between pairs of facilities must exceed a given minimum distance. Solving the problem by the multi-purpose non-linear solver SNOPT from random starting locations performed poorly. Three approaches are proposed to generate “good” starting solutions. The best known solutions were established by these approaches.
Article
This article introduces a variation of the p-center problem (PCP), called distributionally robust conditional vertex p-center problem. This problem differs from the conventional PCP in the sense that (i) some key centers in a given set of candidates are designated, and (ii) a distributionally robust optimization (DRO) method is developed. We present a distributionally robust chance-constrained model to formulate this problem. In terms of tractability, we propose a safe tractable approximation method to reformulate the original DRO model as mixed-integer second-order cone programs under the bounded and Gaussian perturbation ambiguous sets. We further use the branch-and-cut algorithm to solve the tractable counterpart models. The application of the DRO model is illustrated for locating emergency rescue stations in the high-speed railway network by two different sized case studies. Finally, we demonstrate the advantages of the DRO model in comparison with the traditional robust optimisation model and the nominal stochastic programming model.
Conference Paper
Full-text available
This paper proposes a mathematical model for the stochastic green capacitated p-center problem using mixed integer linear programming (MILP). A model is built considering the total emission produced by vehicles and the uncertain parameters, including the customer demand, the travel time needed by a vehicle to travel from a facility to a customer, and the capacity of a facility to satisfy the customer demand. The proposed model and method are evaluated using instances that are available in the literature. According to the computational experiments, the proposed methods produce interesting results.
Chapter
Having many non-dominated solutions in bi-objective optimization problems, this paper aims to cluster the Pareto front using Euclidean distances. The p-center problems, both in the discrete and continuous versions, become solvable with a dynamic programming algorithm. Having N points, the complexity of clustering is O(KN log N) (resp. O(KN log² N)) time and O(N) memory space for the continuous (resp. discrete) K-center problem for K ≥ 3, and O(N log N) time for such 2-center problems. Furthermore, parallel implementations allow quasi-linear speed-up for the practical applications.
Article
We consider the optimal covering of the unit square by N circles. By optimal, we mean the covering that can be done with N circles of minimum radius. Equivalently, we study the problem of the optimal placement of N points such that the maximum over all locations in the square of the distance of the location to the set of points is minimized. We propose a new algorithm that can identify optimal coverings to high precision. Numerical predictions of optimal coverings for N = 1 to 16 agree with the best known results in the literature. We use the optimal designs in approximations to two novel, related problems involving the optimal placement of curves.
Article
Let G be a connected graph and k∈N. The k‐distance domination number of G is the smallest cardinality of a set S of vertices such that every vertex of G is within distance k from some vertex of S. While for k=1, that is, for the ordinary domination number, the problem of finding asymptotically sharp upper bounds in terms of order and minimum degree of the graph has been solved, corresponding bounds for k>1 have remained elusive. In this paper, we solve this problem and present an asymptotically sharp upper bound on the k‐distance domination number of a graph in terms of its order and minimum degree, which significantly improves on bounds in the literature. We also obtain an asymptotically sharp upper bound on the p‐radius of graphs in terms of order and minimum degree. For p∈N, the p‐radius of G is defined as the smallest integer d such that there exists a set S of p vertices of G having the property that every vertex of G is within distance d of some vertex in S. We also present improved bounds for graphs of given order, minimum degree and maximum degree, for triangle‐free graphs and for graphs not containing a 4‐cycle as a subgraph.
Chapter
I started my career earning a BA in Mathematics in 1965 from the Technion, Israel Institute of Technology. I then served a mandatory service in the military for an extended period of time so I could serve at the newly established computer center as a computer programmer. The first computer, Philco 212, had a 32K memory and its machine language was TAC. Later on the computer language ALTAC (Algebraic TAC) was designed. Some years later ALTAC was modified to FORTRAN (FORmula TRANslator) which is still widely used today. Most of my programs nowadays I code in Fortran.
Chapter
Dr. Zvi Drezner’s research career has touched on many areas of location analysis. We devote the first part of this chapter to summarizing Zvi’s vast contributions to the studies of the minimax and the maximum facility location problems. His relevant publications are grouped in terms of the characteristics of the problems investigated, including space, the number of facilities to locate, and completeness of information. In particular, we provide an overview of Zvi’s work in the deterministic planar minimax problems. The second part of the chapter is our own paper on a network median problem when demand weights are independent random variables. The objective of the model proposed is to locate a single facility so as to minimize the expected total demand-weighted distance subject to a constraint on the value-at-risk (VaR). The study integrates the expectation criterion with the VaR measure and links different median models with random demand weights. Methods are suggested to identify dominant points for the optimal solution. An algorithm is developed for solving the problem.
Chapter
Neighbourhood reductions for a class of location problems known as the vertex (or discrete) and planar (or continuous) p-centre problems are presented. A brief review of these two forms of the p-centre problem is first provided, followed by those respective reduction schemes that have shown to be promising. These reduction schemes have the power of transforming optimal or near optimal methods such as metaheuristics or relaxation-based procedures, which were considered relatively slow, into efficient and exciting ones that are now able to find optimal solutions or tight lower/upper bounds for larger instances. Research highlights of neighbourhood reduction for global and combinatorial optimisation problems in general and for related location problems in particular are also given. Keywords: Neighbourhood reduction; p-centre problem; Continuous and discrete spaces; Heuristic search; Optimal methods
Article
This paper examines a special case of multi-facility location problems where the set of demand points is partitioned into a given number of subsets or clusters that can be treated as smaller independent sub-problems once the number of facilities allocated to each cluster is determined. A dynamic programming approach is developed to determine the optimal allocation of facilities to clusters. The use of clusters is presented as a new idea for designing heuristics to solve general multi-facility location problems.
Article
Balancing workload among a set of facility centers is one of the practical objectives in location problems. In this paper, we introduce a multi-objective optimization facility location problem which considers two goals: minimizing the maximum distance between each client and its closest center, and maximizing workload balance among the centers. To achieve the second goal, we define two objectives: minimizing the maximum number of clients allocated to each center, and minimizing the difference between the maximum and minimum number of clients allocated to each center. We prove the hardness of finding even one Pareto optimal solution in the resulting multi-objective problem. Also, we propose a simple iterative algorithm based on the Voronoi diagram to solve the problem. We show the efficiency of the proposed algorithm using test problems and compare the results with a robust multi-objective evolutionary algorithm.
Article
This paper aims to solve large continuous p-centre problems optimally by re-examining a recent relaxation-based algorithm. The algorithm is strengthened by adding four mathematically supported enhancements to improve its efficiency. This revised relaxation algorithm yields a massive reduction in computational time, enabling for the first time larger data-sets to be solved optimally (e.g., up to 1323 nodes). The enhanced algorithm is also shown to be flexible as it can be easily adapted to optimally solve related practical location problems that are frequently faced by senior management when making strategic decisions. These include the α-neighbour p-centre problem and the conditional p-centre problem. A scenario analysis using variable α is also performed to provide further managerial insights.
Article
In this paper we analyze nine different location models when demand points are partitioned into groups and each group defines an individual term in the definition of the objective function. We find the set of all possible optimal locations for these problems. The multiple facility case is solved using tabu search and simulated annealing metaheuristics.
Article
We develop a multi-period capacitated flow refueling location problem for electric vehicles (EVs) as the EV market responds to the charging infrastructure. The optimization model will help us determine the optimal location of level 3 chargers as well as the number of charging modules at each station over multiple time periods. Our model can also be applied to fast-filling gaseous alternative fuel vehicles under similar assumptions. We define a number of demand dynamics, including flow demand growth as a function of charging opportunities on path as well as natural demand growth independent of charging infrastructure. We also present an alternative objective function of maximizing electric vehicle demand in addition to maximizing flow coverage. A case study based on a road network around Washington, D.C., New York City, and Boston is presented to provide numerical experiments related to demand dynamics, showing the potential problems in multi-period planning.
Chapter
The minimax facility location problem (also called the one center problem) seeks to locate a facility so that the maximum distance to a set of demand points is minimized. Using Euclidean distances in the plane, this problem is equivalent to finding the center of the smallest circle enclosing all points, hence the term “center” regarding this problem. When other metrics are used, the 1-center problem is equivalent to covering all points with a shape similar to the unit ball of the metric. For example, when rectilinear distances are used, the problem is to cover all points with the smallest possible diamond.
Article
In this case study we analyse the positioning of rescue helicopters in the South Tyrol region of Northern Italy. Three rescue helicopters are co-ordinated by the "Weißes Kreuz" organisation. We were provided with the number of missions in the period of one year, listed by town (there are 116 towns). The model is a p-center location-allocation model with weighted Euclidean distance. In a first step, we analysed the current positions, computing the optimal location of each helicopter in its area of operation. We then employed several heuristics to improve the current solution. Those were based on clustering methods for allocation and subsequent optimal location with respect to demand point clusters. The obtained results show that response times to emergency calls might be considerably improved by relocating at least two of the helicopter bases.
Article
The Mexican government studies the possibility of building a new international airport (NAICM) in ZFLT or Tizayuca. The aim of this paper is to determine which is the best location site for the NAICM (ZFLT or Tizayuca), considering the maximization of the sum of expected air pax demand as the main factor. To solve such a problem, we propose a mathematical formulation with the objective of maximizing the sum of expected air pax demand, and a methodology to estimate air pax demand at each demand point based on an index to measure wealth. Results indicate that Tizayuca is the place where the NAICM should be located for a catchment area smaller than 500 km or 4 hours travel time, and ZFLT is the place where the NAICM should be constructed for a catchment area larger than 500 km or 4 hours travel time.
Article
The p-center problem seeks the location of p facilities. Each demand point receives its service from the closest facility. The objective is to minimize the maximal distance for all demand points. In this paper, the p-center location problem for demand originating in an area is investigated. This problem is equivalent to covering every point in the area by p circles with the smallest possible radius. Heuristic procedures are proposed and upper bounds on the optimal solution in a square are given. Computational results for the special case of a square area are reported. Some cases such as p=9 centers in a square yield unexpected and interesting results.