# Solving the 8 Queens Problem on an 8x8 Board with a Genetic Algorithm

I'm taking an Artificial Intelligence course, and one type of search algorithm it covers is the Genetic Algorithm. It's very interesting because it uses the principle of evolution to find a solution to a problem.

You start with a population of initial states, and each state has a 'fitness' that indicates how close it is to a solution. Based on that fitness you cross over, or mate, two states: the higher the fitness, the higher the chance of mating.

I implemented the example given in my AI course's slides in Python.
The example tries to solve an n-queens problem, in this case 8 queens on an 8x8 chessboard.

**Briefly explained**: you need to place 8 queens on a chessboard so that none of the queens can attack each other.
(A queen can move like a rook and a bishop: horizontally, vertically and diagonally.)

A more detailed explanation is below the code.

```python
import random


class State(object):
    MUTATION_RATE = 0.03

    def __init__(self, parents=None):
        self._fitness = None
        self._probability = None
        if parents is None:
            # random initial state: one queen per column, in a random row
            self.state = [random.randint(1, 8) for _ in range(8)]
        else:
            parent1, parent2 = parents
            self.state = self.crossover(parent1, parent2)
            self.mutate()

    def fitness(self):
        if self._fitness is None:
            state = self.state
            # k queens sharing a row form k*(k-1)/2 colliding pairs
            horizontal_collisions = sum(state.count(row) - 1 for row in state) // 2
            diagonal_collisions = 0
            for i, col in enumerate(state):
                for j, other in enumerate(state):
                    # don't count a queen in a collision with herself
                    if i != j and abs(i - j) == abs(col - other):
                        diagonal_collisions += 1
            # every diagonal pair was counted twice
            diagonal_collisions //= 2
            self._fitness = 28 - (horizontal_collisions + diagonal_collisions)
        return self._fitness

    def probability(self, population):
        if self._probability is None:
            total = sum(x.fitness() for x in population)
            self._probability = self.fitness() / total
        return self._probability

    def crossover(self, parent1, parent2):
        # take the left part of parent1 and the right part of parent2
        crossover_index = random.randint(0, 8)
        return parent1.state[:crossover_index] + parent2.state[crossover_index:]

    def mutate(self):
        # each column has a small chance of being re-rolled to a random row
        for i in range(len(self.state)):
            if random.random() < State.MUTATION_RATE:
                self.state[i] = random.randint(1, 8)

    def __str__(self):
        r = '  '
        for i in range(8):
            r += ' %d' % (i + 1)
        r += '\n  ' + '--' * 8 + '\n'
        for i in range(8, 0, -1):
            r += '%d|' % i
            for queen in self.state:
                r += ' O' if queen == i else '  '
            r += '|\n'
        r += '  ' + '--' * 8 + '\n'
        return r


def pickRandomByProbability(populationByProbability):
    # roulette-wheel selection: cycle through the population until a
    # state's probability beats a freshly drawn random number
    i = 0
    while True:
        probability, state = populationByProbability[i]
        if random.random() <= probability:
            return state
        i = (i + 1) % len(populationByProbability)


def generateNextPopulation(population, n):
    newPopulation = []
    while len(newPopulation) < n:
        populationByProbability = [(x.probability(population), x) for x in population]
        parent1 = pickRandomByProbability(populationByProbability)
        # a state shouldn't mate with itself
        populationByProbability = [x for x in populationByProbability if x[1] is not parent1]
        parent2 = pickRandomByProbability(populationByProbability)
        newPopulation.append(State((parent1, parent2)))
    return newPopulation


if __name__ == '__main__':
    populationSize = 100
    generation = 1
    population = [State() for _ in range(populationSize)]
    while 28 not in [x.fitness() for x in population]:
        print("generation %d\tMax fitness: %d"
              % (generation, max(x.fitness() for x in population)))
        population = generateNextPopulation(population, populationSize)
        generation += 1
    for x in population:
        if x.fitness() == 28:
            print(x)
```

### State

The State class keeps track of a state: in this case a list of 8 integers, each ranging from 1 to 8.

Each integer represents the row of the queen in that column.
Since queens can attack vertically, this representation guarantees by default that each queen is in a different column.
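As an illustration of the encoding, here is one known 8-queens solution written as such a list:

```python
# One known 8-queens solution in the list encoding:
# index = column (left to right), value = row that column's queen sits in.
solution = [4, 2, 7, 3, 6, 8, 5, 1]

# One entry per column means two queens can never share a column,
# so only rows and diagonals still have to be checked.
assert len(solution) == 8
assert all(1 <= row <= 8 for row in solution)
assert len(set(solution)) == 8  # this particular state also has no row collisions
```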

### Fitness

A collision is two queens that are able to attack each other. This means they are in the same row, the same column or the same diagonal.

The maximum number of collisions that can occur is 28: every one of the 8 choose 2 = 28 pairs of queens attacking each other. The fitness function is higher-is-better, so we calculate it by subtracting the number of collisions in the current state from 28.

A solution would have 0 collisions and thus a fitness of 28.
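As a standalone sketch (the function below mirrors the fitness calculation in the State class, with my own names):

```python
def fitness(state):
    """28 minus the number of attacking pairs; 28 means solved."""
    # queens sharing a row: k queens in one row form k*(k-1)/2 attacking pairs
    horizontal = sum(state.count(row) - 1 for row in state) // 2
    # queens sharing a diagonal: row distance equals column distance
    diagonal = sum(
        1
        for i in range(len(state))
        for j in range(i + 1, len(state))
        if abs(state[i] - state[j]) == j - i
    )
    return 28 - horizontal - diagonal

print(fitness([4, 2, 7, 3, 6, 8, 5, 1]))  # a solution -> 28
print(fitness([1, 1, 1, 1, 1, 1, 1, 1]))  # every pair collides -> 0
```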

### Probability of mating

When we generate the next population of states we need something to base the choice of which two states get to 'mate' on.

We want the best states (the ones closest to a solution) to be the most likely to mate, so the natural heuristic for a state's chance of being allowed to mate is its fitness.
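As a toy illustration with made-up fitness values, fitness-proportional probabilities work out like this:

```python
# Hypothetical fitness values for a tiny population of three states.
fitnesses = [24, 20, 12]
total = sum(fitnesses)  # 56

# Each state's mating probability is its share of the total fitness.
probabilities = [f / total for f in fitnesses]
print(probabilities)  # roughly [0.429, 0.357, 0.214]
```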

### Crossover (mating), mutation

When we cross over two states we choose a point where we 'chop' them into two parts. We will, for example, take the left 3 columns of the first parent and the last 5 of the second parent. The child state is then the combination of these columns.

We don't want to get stuck in the same gene pool: what if the initially generated random population happens to have no queen in row 3?

Then we could never reach a solution, because a solution requires a queen in each row, and no matter how many times we cross over two states, we would never get a queen into row 3.
That's why we also apply a random mutation when crossing over two parents.

Each column in the child state has a small percentage chance of changing its value to a random number from 1 to 8.
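A minimal sketch of both steps (the crossover index is fixed at 3 here to match the 3/5 example; the real code picks it at random):

```python
import random

def crossover(parent1, parent2, index):
    # left part of parent1 glued to the right part of parent2
    return parent1[:index] + parent2[index:]

def mutate(state, rate=0.03, rng=random):
    # each column independently has a small chance of being re-rolled
    return [rng.randint(1, 8) if rng.random() < rate else row for row in state]

p1 = [1, 2, 3, 4, 5, 6, 7, 8]
p2 = [8, 7, 6, 5, 4, 3, 2, 1]
child = crossover(p1, p2, 3)
print(child)  # [1, 2, 3, 5, 4, 3, 2, 1]
child = mutate(child)
```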

### Conclusion

I've noticed that this doesn't find a solution as fast as I initially expected. It can easily take several thousand, or even tens of thousands of, generations before a solution is found.

I also haven't seen any meaningful speedup in finding a solution with a population of 100, 1000 or 10000. (I didn't do any benchmarks, just a few runs per population size.)

Maybe a solution would be found more quickly if the chance of mating weren't on a linear scale but on an exponential one.

I'm not sure if I'm expressing this correctly, but what I mean is that the mating chances of bad and good fitnesses are currently too close together. Good states should be picked far more readily than bad ones, while right now the chances differ by only a few percent, or even a few tenths of a percent, depending on the size of the population.
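One way to sketch that idea is to raise each fitness to some power before normalizing, so that good states are disproportionately more likely to mate. The power and the sample fitness values below are made up for illustration:

```python
def selection_probabilities(fitnesses, power=1):
    # raising each fitness to `power` before normalizing exaggerates
    # the differences between good and bad states when power > 1
    weights = [f ** power for f in fitnesses]
    total = sum(weights)
    return [w / total for w in weights]

fitnesses = [26, 25, 24]  # late-run populations tend to cluster like this

print(selection_probabilities(fitnesses, power=1))   # nearly uniform
print(selection_probabilities(fitnesses, power=10))  # clearly favours the best state
```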