The n-queens problem is often used as a benchmark problem for AI research and in combinatorial optimization. An example is the recent article [1] in this magazine that presented a polynomial time algorithm for finding a solution. Several CPU-hours were spent finding solutions for some n up to 500,000.

... Finding the first solution as fast as possible is an alternative goal. That problem can be solved in polynomial time through conventional means [23], [24], but finding as many solutions as possible is a challenging target that makes deterministic algorithms impractical. ...

... While our primary goal was to compare which approach generated the most solutions on a fixed budget, a secondary aim of the N-Queens problem is to generate a single solution as quickly as possible. This can theoretically be performed in polynomial time using deterministic methods [23], [24], which is shown to be most efficient in our results presented in Table VI. However, when the problem size is very large, this is nowhere near as fast or efficient as stochastic approaches. ...

This paper proposes a novel algorithm for solving combinatorial optimization problems using genetic algorithms (GA) with self-adaptive mutation. We selected the N-Queens problem (8 ≤ N ≤ 32) as our benchmarking test suite, as these instances are highly multi-modal with huge numbers of global optima. Optimal static mutation probabilities for the traditional GA approach were determined for each N to serve as a best-case benchmark in our comparative analysis. Despite the unfair advantage given to the traditional GA by its optimized fixed mutation probabilities, for large problem sizes (N > 15) multi-objective analysis showed that the self-adaptive approach yielded a 65% to 584% improvement in the number of distinct solutions generated; the self-adaptive approach also produced the first distinct solution faster than the traditional GA, with a 1.90% to 70.0% speed improvement. Self-adaptive mutation control is valuable because it adjusts the mutation rate to the problem characteristics and to the stage of the search process. This is not achievable with an optimal constant mutation probability, which remains unchanged throughout the search.
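The abstract above includes no pseudocode, so the following is only a minimal sketch of the self-adaptation idea (not the authors' algorithm): each individual carries its own mutation rate, which is perturbed before being applied, so that rates that work well propagate with their carriers. All function names and parameter values here are illustrative assumptions.

```python
import math
import random

def conflicts(perm):
    """Attacking pairs on diagonals; the permutation encoding already
    rules out row and column attacks."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(perm[i] - perm[j]) == j - i)

def mutate(ind, rng, tau=0.3):
    perm, rate = ind
    # Self-adaptation: perturb the individual's own mutation rate first,
    # then apply the new rate; good rates spread with their carriers.
    rate = min(0.9, max(0.01, rate * math.exp(tau * rng.gauss(0, 1))))
    perm = perm[:]
    for i in range(len(perm)):
        if rng.random() < rate:        # swap mutation keeps it a permutation
            j = rng.randrange(len(perm))
            perm[i], perm[j] = perm[j], perm[i]
    return perm, rate

def solve(n, pop_size=50, max_gens=5000, seed=0):
    rng = random.Random(seed)
    pop = []
    for _ in range(pop_size):
        p = list(range(n))
        rng.shuffle(p)
        pop.append((p, 0.2))           # every individual starts at rate 0.2
    for _ in range(max_gens):
        pop.sort(key=lambda ind: conflicts(ind[0]))
        if conflicts(pop[0][0]) == 0:
            return pop[0][0]
        parents = pop[:pop_size // 2]  # truncation selection, mutation only
        pop = parents + [mutate(rng.choice(parents), rng)
                         for _ in range(pop_size - len(parents))]
    return None
```

With the fixed seed, `solve(8)` should return a conflict-free permutation well within the generation budget; a traditional GA would instead use one constant, pre-tuned mutation probability for the whole run.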

... There are several strategies for solving the n-Queens puzzle [AY89], [BS09], [BD75], [BM02], [Eng07], [ET92], [Ber91]. For instance, there is a simple way to extend a solution for dimension (n−1)×(n−1) to dimension n×n. Figure 1 shows how a stair-stepped solution is extended from n = 8 to n = 9 by adding a queen in a corner square of the chessboard. ...

This paper presents a new method for placing n queens on an n×n chessboard such that no two queens directly threaten one another, given that several immovable queens already occupy established positions on the board. The method is first applied to the 8-Queens puzzle on a classical chessboard and then to the n-Queens completion puzzle. Furthermore, it allows finding repetitive patterns of solutions for any n.

... 4. For the case n ≥ 4, a solution to the n-queens problem can be computed using the following formulas (Bernhardsson, 1991; Hoffman, Loessi & Moore, 1969): ...
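The formulas cited here (Bernhardsson, 1991; Hoffman, Loessi & Moore, 1969) admit a compact implementation. The sketch below follows the commonly reproduced even/odd-list recipe for n ≥ 4; the exact case split is transcribed from secondary accounts and should be checked against the original papers.

```python
def is_solution(p):
    """p[i] = row (1-based) of the queen in column i+1."""
    n = len(p)
    return (sorted(p) == list(range(1, n + 1))
            and len({i - p[i - 1] for i in range(1, n + 1)}) == n
            and len({i + p[i - 1] for i in range(1, n + 1)}) == n)

def explicit_queens(n):
    """Closed-form n-queens placement for n >= 4: list the even rows,
    then the odd rows, with small fix-ups in four residue classes."""
    if n < 4:
        raise ValueError("this construction needs n >= 4")
    r = n % 12
    evens = list(range(2, n + 1, 2))
    odds = list(range(1, n + 1, 2))
    if r in (3, 9):
        evens = evens[1:] + [2]                  # move 2 to the end
    if r == 8:
        for i in range(0, len(odds) - 1, 2):     # swap adjacent odd pairs
            odds[i], odds[i + 1] = odds[i + 1], odds[i]
    if r == 2:
        i1, i3 = odds.index(1), odds.index(3)
        odds[i1], odds[i3] = odds[i3], odds[i1]  # switch 1 and 3
        odds.remove(5)
        odds.append(5)                           # move 5 to the end
    if r in (3, 9):
        odds.remove(1)
        odds.append(1)                           # move 1, then 3, to the end
        odds.remove(3)
        odds.append(3)
    return evens + odds
```

For example, `explicit_queens(8)` gives the row list `[2, 4, 6, 8, 3, 1, 7, 5]`, a valid 8-queens placement. As the surrounding snippets note, such formulas produce one solution per n, not arbitrary solutions.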

We live in years in which emerging trends in education are taking a prominent role in discussions about the direction of teaching institutions. The inclusion of ICT and the proliferation of what is known as m-learning entail rethinking the ways students learn and teachers teach. This is a turbulent, constantly changing landscape, but one that opens doors to new ways of working in education: a path toward a more open, democratic and effective education.
It is not an easy path. The pace of technological progress frequently outstrips our capacity to integrate these new possibilities into meaningful theoretical frameworks and novel research proposals. Good practices and research form the two axes of this publication.
Regarding good practices, several chapters cover different innovative experiences based on the use of technology at the various levels that make up our current educational system. The reader can read about and reflect on the variety of uses technology can be given in diverse educational contexts, related to basic and cross-cutting themes that are, or should be, addressed in today's classrooms and those of the future.
In connection with the above, we also cannot overlook educational research with ICT, an aspect addressed from diverse perspectives in the different chapters of this publication. To make innovative and meaningful proposals we must analyze, compare and reflect, so rigorous academic research becomes a fundamental pillar for change.

... Because of the ease of finding a solution, the n-Queens problem has been subject to repeated controversy in AI over whether it should be used as a benchmark at all. For example, a sequence of papers argued the point in the pages of the SIGART Bulletin in the early 1990s (Sosic & Gu, 1990; Johnson, 1991; Bernhardsson, 1991; Gu, 1991; Valtorta, 1991), and then in 2014 the issue was raised again in a blog post by Smet (2014). We resolve this issue in the sense that, as an NP- and #P-Complete problem, n-Queens Completion does provide a valid benchmark problem. ...

The n-Queens problem is to place n chess queens on an n by n chessboard so that no two queens are on the same row, column or diagonal. The n-Queens Completion problem is a variant, dating to 1850, in which some queens are already placed and the solver is asked to place the rest, if possible. We show that n-Queens Completion is both NP-Complete and #P-Complete. A corollary is that any non-attacking arrangement of queens can be included as a part of a solution to a larger n-Queens problem. We introduce generators of random instances for n-Queens Completion and the closely related Blocked n-Queens and Excluded Diagonals Problem. We describe three solvers for these problems, and empirically analyse the hardness of randomly generated instances. For Blocked n-Queens and the Excluded Diagonals Problem, we show the existence of a phase transition associated with hard instances as has been seen in other NP-Complete problems, but a natural generator for n-Queens Completion did not generate consistently hard instances. The significance of this work is that the n-Queens problem has been very widely used as a benchmark in Artificial Intelligence, but conclusions on it are often disputable because of the simple complexity of the decision problem. Our results give alternative benchmarks which are hard theoretically and empirically, but for which solving techniques designed for n-Queens need minimal or no change.
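The paper describes three solvers, which are not reproduced in the snippet above. As a baseline illustration of the n-Queens Completion problem itself, here is a minimal backtracking completion solver (an illustrative sketch, not one of the paper's solvers); it assumes the pre-placed queens are mutually non-attacking.

```python
def complete_queens(n, placed):
    """Try to extend a partial placement to a full n-queens solution.
    `placed` maps row -> column (0-based) for the pre-placed queens,
    assumed mutually non-attacking. Returns a row -> column list, or
    None if no completion exists."""
    cols = set(placed.values())
    diag1 = {r - c for r, c in placed.items()}
    diag2 = {r + c for r, c in placed.items()}
    board = [placed.get(r) for r in range(n)]

    def backtrack(r):
        if r == n:
            return True
        if board[r] is not None:          # pre-placed queen: skip this row
            return backtrack(r + 1)
        for c in range(n):
            if c in cols or (r - c) in diag1 or (r + c) in diag2:
                continue
            board[r] = c
            cols.add(c); diag1.add(r - c); diag2.add(r + c)
            if backtrack(r + 1):
                return True
            board[r] = None
            cols.discard(c); diag1.discard(r - c); diag2.discard(r + c)
        return False

    return board if backtrack(0) else None
```

For instance, `complete_queens(8, {0: 0})` finds an 8-queens solution containing the corner queen, while `complete_queens(4, {0: 0})` returns None, since neither 4-queens solution uses the corner. The NP-completeness result above says that, in the worst case, no algorithm is expected to do fundamentally better than such exponential search.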

... The initial set of 100,000 particle positions and forces are stored in NVRAM, as are the resultant particle positions and forces after each time-step. N-queens: It is a lock-based, multi-threaded implementation of a recursive search algorithm for finding a solution to the n-queens problem [10]. It uses a pool of workers and work queues. ...

Byte addressable non-volatile memory (NVRAM) is likely to supplement, and perhaps eventually replace, DRAM. Applications can then persist data structures directly in memory instead of serializing them and storing them onto a durable block device. However, failures during execution can leave data structures in NVRAM unreachable or corrupt. In this paper, we present Makalu, a system that addresses non-volatile memory management. Makalu offers an integrated allocator and recovery-time garbage collector that maintains internal consistency, avoids NVRAM memory leaks, and is efficient, all in the face of failures.
We show that a careful allocator design can support a less restrictive and a much more familiar programming model than existing persistent memory allocators. Our allocator significantly reduces the per allocation persistence overhead by lazily persisting non-essential metadata and by employing a post-failure recovery-time garbage collector. Experimental results show that the resulting online speed and scalability of our allocator are comparable to well-known transient allocators, and significantly better than state-of-the-art persistent allocators.
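The n-queens benchmark described in the snippet above (a lock-based, multi-threaded recursive search with a pool of workers and work queues) can be sketched in simplified form as follows. This is an illustrative reconstruction of the worker/work-queue pattern, not the benchmark's actual code, and in CPython the GIL limits true parallelism for this CPU-bound task.

```python
import queue
import threading

def count_queens_parallel(n, workers=4):
    """Count all n-queens solutions with a pool of worker threads and a
    shared work queue: one task per first-row placement, each expanded
    by recursive search, with a lock-protected shared counter."""
    tasks = queue.Queue()
    for c in range(n):
        tasks.put((c,))                # task = tuple of columns per row
    total = [0]
    lock = threading.Lock()

    def safe(prefix, col):
        r = len(prefix)
        return all(col != pc and abs(col - pc) != r - pr
                   for pr, pc in enumerate(prefix))

    def solutions(prefix):
        if len(prefix) == n:
            return 1
        return sum(solutions(prefix + (c,))
                   for c in range(n) if safe(prefix, c))

    def worker():
        while True:
            try:
                prefix = tasks.get_nowait()
            except queue.Empty:
                return
            found = solutions(prefix)
            with lock:                 # serialize updates to the counter
                total[0] += found

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total[0]

print(count_queens_parallel(8))  # → 92
```

In a persistent-memory setting like Makalu's benchmark, the work queues and counters would live in NVRAM-allocated structures rather than ordinary heap objects.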

... While it has been well known that the n-queens problem is solvable, numerous solutions have been published since the original problem was proposed. Many of these solutions rely on providing a specific formula for placing queens or on transposing smaller solution sets to provide solutions for larger values of n (Bernhardsson, 1991 and Hoffman et al., 1969). ...

... Considering that a valid placement can generate up to 8 different solutions, which are rotations and reflections of the same placement, the number of distributions to be evaluated can be reduced. The best sequential algorithm found for this problem is based on this fact [3], [23], [24]. A brief description of this algorithm on an N×N board follows. ...
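The symmetry argument above (each placement yields up to 8 variants under rotation and reflection) can be made concrete. The following sketch enumerates the dihedral images of a solution given in permutation form; helper names are illustrative.

```python
def symmetries(perm):
    """All board symmetries of a solution given as perm[row] = column
    (0-based): four rotations and their mirror images, i.e. up to 8
    distinct variants (fewer for self-symmetric solutions)."""
    n = len(perm)

    def rotate90(p):
        # queen at (r, c) maps to (c, n-1-r)
        q = [0] * n
        for r, c in enumerate(p):
            q[c] = n - 1 - r
        return q

    def mirror(p):
        # reflect left-right: (r, c) maps to (r, n-1-c)
        return [n - 1 - c for c in p]

    out = []
    p = list(perm)
    for _ in range(4):
        out.append(tuple(p))
        out.append(tuple(mirror(p)))
        p = rotate90(p)
    return set(out)
```

An asymmetric 8-queens solution such as `[0, 4, 7, 5, 2, 6, 1, 3]` yields 8 distinct variants, which is why enumeration algorithms only search one fundamental solution per equivalence class and multiply out the rest.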

This paper analyzes the dynamic and static balancing of non-homogeneous cluster architectures, simultaneously analyzing the theoretical parallel speedup as well as the speedup experimentally obtained. Three interconnected clusters have been used, in which the machines within each cluster have homogeneous processors, though these differ among clusters. Thus, the set can be seen as a 25-processor heterogeneous cluster or as a multi-cluster scheme with subsets of homogeneous processors. A classical application (Parallel N-Queens) with a parallel solution algorithm, in which processing predominates over communication, has been chosen so as to examine the load balancing aspects (dynamic or static) in depth without the results being distorted by communication overhead. At the same time, three forms of load distribution among the processors (Direct Static, Predictive Static and Dynamic by Demand) have been studied, analyzing in each case the parallel speedup and load unbalancing with respect to problem size and the processors used.

... There are numerous analytical solutions to the n-queens problem, in which an explicit formula for the queen placement is given or solutions to smaller problems are linked together [1]. The common problem with these analytical solutions is that they can obtain only a small number of solutions from a highly restricted subset. ...

The n-queens problem is a classical combinatorial problem that has been tackled by various problem-solving strategies. There are well-known probabilistic local search algorithms that are based on a conflict-minimization heuristic and do not use any kind of backtracking. These algorithms run in polynomial time and are capable of solving even very large n-queens problems. Despite this fact, many researchers worldwide attempt to solve the problem by means of genetic algorithms. The aim of this paper is to present various approaches to this problem, describe their principles, and especially summarize the problems and difficulties they have in common. We believe that such an overview could be useful for understanding other constraint-based search problems too.

... Finding a single solution for the problem of placing n queens on an n by n grid can be done in O(n) time by the deterministic algorithm reported by Hoffman, Loessi and Moore in their 1969 article in Mathematics Magazine [1]. It was referenced in 1991 by Bernhardsson in a note to the ACM SIGART Bulletin [2]. Bernhardsson was responding to a 1990 article by Sosič and Gu in the ACM SIGART Bulletin [3], which presented a probabilistic algorithm reported to run in polynomial time. ...

This paper presents two Las Vegas algorithms to generate single solutions to the n-queens problem. One algorithm generates and improves on random permutation vectors until it achieves one that is a successful solution, while the other algorithm randomly positions queens within each row in positions not under attack from above.
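The second algorithm described above can be sketched as follows (an illustrative reconstruction, not the paper's code): queens are placed row by row in randomly chosen unattacked squares, and the whole attempt is restarted whenever a row has no safe square. As with any Las Vegas algorithm, the running time is random but a returned answer is always correct.

```python
import random

def las_vegas_queens(n, seed=0):
    """Row-by-row random placement with full restarts on dead ends.
    Returns sol with sol[r] = column of the queen in row r (0-based)."""
    rng = random.Random(seed)
    while True:                                    # restart until success
        cols, d1, d2, sol = set(), set(), set(), []
        for r in range(n):
            free = [c for c in range(n)
                    if c not in cols and (r - c) not in d1 and (r + c) not in d2]
            if not free:
                break                              # dead end: throw it all away
            c = rng.choice(free)
            sol.append(c)
            cols.add(c)
            d1.add(r - c)
            d2.add(r + c)
        else:
            return sol                             # every row got a queen
```

Tracking occupied columns and both diagonal families as sets keeps each row's safety test cheap; the cost of the algorithm is dominated by the expected number of restarts.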

... The n-queens problem is well suited for solution by backtracking algorithms. As Bernhardsson notes in [19] though, it is not necessary to find solutions with algorithms, because there are explicit ways to construct solutions. However, the n-queens problem is still an interesting test case for algorithms, and since there does not exist a closed form solution yielding all solutions for arbitrary n, computation is necessary to enumerate the solutions in general. ...

In this paper we survey known results for the n-queens problem of placing n nonattacking queens on an n×n chessboard and consider extensions of the problem, e.g. other board topologies and dimensions. For all solution constructions, we either give the construction, an outline of it, or a reference. In our analysis of the modular board, we give a simple result for finding the intersections of diagonals. We then investigate a number of open research areas for the problem, stating several existing and new conjectures. Along with the known results for n-queens that we discuss, we also give a history of the problem. In particular, we note that the first proof that n nonattacking queens can always be placed on an n×n board for n>3 is by E. Pauls, rather than by W. Ahrens who is typically cited. We have attempted in this paper to discuss all the mathematical literature in all languages on the n-queens problem. However, we look only briefly at computational approaches.

Search problems are ubiquitous. The search process is an adaptive process of cumulative performance selection. The structure of a given problem and the environment impose constraints. With the given constraints, a search process transforms a given problem from an initial state to a solution state.

In this chapter we will focus on the combination of evolutionary computation (EC) techniques and constraint satisfaction problems (CSPs). Constraint programming (CP) is another approach to dealing with constraint satisfaction problems. In fact, it is an important prelude to the work covered here, as it advocates itself as an alternative approach to programming [1]. The first step is to formulate a problem as a CSP so that techniques from CP, EC, combinations of the two (often referred to as hybrids [2, 3]), or other approaches can be deployed to solve the problem. The formulation of a problem has an impact on its complexity in terms of the effort required either to find a solution or to prove that no solution exists. It is therefore vital to spend time getting this right.
CP defines search as iterative steps over a search tree whose nodes are partial solutions in which not all variables are assigned values. The search maintains a partial solution that satisfies all variables with assigned values. Such algorithms are often classified as Davis-Putnam-Logemann-Loveland (DPLL) algorithms, after the first backtracking algorithm for solving CSPs [4]. In contrast, EC algorithms sample a space of candidate solutions in which, at each sample point, all variables are assigned values; none of these candidate solutions satisfies all constraints in the problem until a solution is found.
Another major difference is that many constraint solvers from CP are sound, whereas EC solvers are not. A solver is sound if it always finds a solution when one exists. Furthermore, most constraint solvers from CP can easily be made complete, although this is often not a desired property for a constraint solver. A constraint solver is complete if it can find every solution to a problem.

Local search is currently one of the most used methods for finding solutions to real-life problems. It is usually considered when one is interested in the final solution of the problem rather than in how the solution is reached. In this paper, the authors present an implementation of local search with Membrane Computing techniques applied to the N-queens problem as a case study. A CLIPS program inspired by the Membrane Computing design has been implemented and several experiments have been performed. The obtained results show better average times than those obtained with other Membrane Computing implementations that solve the N-queens problem.

Graphical simulation attempts to predict aspects of a system's behavior by developing an approximate model of it. Simulations have great impact on education and training. Simulation-based learning is a practical way of learning that involves building connections among what is being learned, what is important to the learner, and the situations in which it is applied. The N-Queens problem asks one to place N queens on an n×n chessboard such that no queen attacks another, i.e. no two queens occupy the same row, column or diagonal. Here we use graphical simulation to view various solutions to the N-Queens problem. The implementation is written in core Java, using the java.awt, java.lang and java.applet packages. Graphical simulation is used because the n-queens problem is comparatively complicated, and observing simulations makes it easy for any user to understand. With the help of simulation the problem is explained clearly and effectively, which helps generate interest among learners, since the problem has real-time applications, and makes learning better.

Dear Dr. Johnson: I read with interest the correspondence regarding the N-Queen problem in the April 1991 issue of the SIGART Bulletin, including your comments in the "Letter from the Editor." I would like to contribute some additional considerations.

The explicit solution for the n-queens problem mentioned in a letter from Bo Bernhardsson [2] is basically Pauls's solution as analyzed by Ahrens (see reference [1] of our previous article in the October 1990 SIGART issue). The result was in the public domain long before 1918 (not 1969). We also mentioned its weakness, namely that the class of solutions provided by analytical methods is very restricted, as Ahrens pointed out in [1]: they can provide only one particular solution to the n-queens problem, not an arbitrary one (much better explicit solutions for the n-queens problem exist). This is not the case for search methods, which can, in principle, find any solution. This distinction is crucial for practical applications of the n-queens problem.

The satisfiability (SAT) problem is a central problem in mathematical logic, computing theory, and artificial intelligence. In practice, the SAT problem is fundamental to solving many practical application problems, and methods to solve it play an important role in the development of computing theory and intelligent computing systems. There has been great interest in the design of efficient algorithms to solve the SAT problem. Over the past decade, a variety of parallel processing techniques have been developed for it. The goal of this paper is to survey this rapidly growing field, to discuss tradeoffs and limitations, and to describe ten state-of-the-art techniques, both algorithms and special-purpose architectures, for parallel processing of the satisfiability problem.

The pitting corrosion resistance of a new family of duplex stainless steels has been evaluated. These non-standard duplex stainless steels are characterised by low Ni content and high N and Mn levels. Potentiodynamic polarisation scans in NaCl solution have been carried out to determine pitting potentials. A crevice-free cell has been used to perform the electrochemical tests. An exponential equation is obtained in the regression analysis between the pitting potential and chemical composition, which allows an estimate of the pitting resistance of these new duplex stainless steels.

We present a new heuristic search for the N-queens problem, which is neither backtracking nor random search. This algorithm systematically finds a solution in linear time. It is faster than the fastest algorithm previously known. On an ordinary personal computer, it can find a solution for 3,000,000 queens in less than 5 seconds.

Backtracking search is frequently applied to solve a constraint-based search problem, but it often suffers from exponential growth of computing time. We present an alternative to backtracking search: local search with conflict minimization. We have applied this general search framework to study a benchmark constraint-based search problem, the n-queens problem. An efficient local search algorithm for the n-queens problem was implemented. This algorithm, running in linear time, does not backtrack. It is capable of finding a solution for extremely large n-queens problems. For example, on a workstation it can find a solution for 3,000,000 queens in less than 55 s.
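A conflict-minimization local search of the kind described above can be sketched as follows. This is a simplified min-conflicts variant, not the authors' implementation, and the parameter values are illustrative; it repeatedly moves a randomly chosen conflicted queen to a least-conflicted row in its column, with no backtracking.

```python
import random

def min_conflicts_queens(n, max_steps=100_000, seed=1):
    """Local search with conflict minimization: start from a random
    placement (one queen per column) and greedily repair conflicts."""
    rng = random.Random(seed)
    rows = [rng.randrange(n) for _ in range(n)]   # rows[c] = row of queen in column c

    def conflicts(col, row):
        """Queens in other columns attacking square (row, col)."""
        return sum(1 for c in range(n) if c != col and
                   (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows                           # no attacks anywhere: solved
        c = rng.choice(conflicted)
        # greedy repair: move this queen to a least-conflicted row,
        # breaking ties randomly to avoid cycling
        rows[c] = min(range(n), key=lambda r: (conflicts(c, r), rng.random()))
    return None                                   # step budget exhausted
```

The production-grade algorithms surveyed here add a much better initial placement and O(1) conflict bookkeeping per move, which is what makes million-queen instances tractable; the sketch above keeps only the core repair loop.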

This paper discusses the dynamic and static balancing of non-homogeneous cluster architectures, simultaneously analyzing the theoretical parallel speedup as well as the speedup experimentally obtained.
A classical application (Parallel N-Queens) with a parallel solution algorithm, in which processing predominates over communication, has been chosen so as to examine the load balancing aspects (dynamic or static) in depth without the results being distorted by communication overhead.
Four interconnected clusters have been used, in which the machines within each cluster have homogeneous processors, though these differ among clusters. Thus, the set can be seen as an N-processor heterogeneous cluster or as a multi-cluster scheme with 4 subsets of homogeneous processors.
At the same time, three forms of load distribution among the processors (Direct Static, Predictive Static and Dynamic by Demand) have been studied, analyzing in each case the parallel speedup and load unbalancing with respect to problem size and the processors used.

This document and other useful information can be found on the World-Wide Web (Parberry [7]). I welcome further errata and comments from readers; I prefer electronic mail, but if you must use more primitive modes of communication, full contact information can be found in [6].

This paper presents a set of new decision rules for exact search in N-Queens. Apart from new tie-breaking strategies for value and variable ordering, we introduce the notion of a 'free diagonal' for decision taking at each step of the search. With the proposed decision heuristic, the number of subproblems needed to enumerate the first K solutions (typically K = 1, 10 and 100) is greatly reduced with respect to other algorithms, and constitutes empirical evidence that the average solution density (or its inverse, the number of subproblems per solution) remains constant independent of N. Specifically, finding a valid configuration was backtrack-free in 994 cases out of 1000, an almost perfect decision ratio.
This research is part of a larger project that aims to derive new decision rules for CSP domains by evaluating, at each step, a constraint value graph Gc. N-Queens has adapted well to this strategy: domain-independent rules are inferred directly from Gc, whereas domain-dependent knowledge is represented by an induced hypergraph over Gc and computed by similar domain-independent techniques. Prior work on the Number Place problem also yielded similarly encouraging results.

The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental to solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, databases, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms, including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give a performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications...

The n-queens problem is a classical combinatorial problem in the artificial intelligence (AI) area. Since the problem has a simple and regular structure, it has been widely used as a testbed to develop and benchmark new AI search problem-solving strategies. Recently, this problem has found practical applications in VLSI testing and traffic control. Due to its inherent complexity, even the very efficient AI search algorithms developed so far can only find a solution for the n-queens problem with n up to about 100. In this paper we present a new probabilistic local search algorithm based on a gradient-based heuristic. This efficient algorithm is capable of finding a solution for extremely large n-queens problems. We give execution statistics for this algorithm with n up to 500,000.