From Theory to Practice: Implementing Genetic Algorithms
Genetic algorithms (GAs) represent one of the most fascinating intersections of biology and computer science, mimicking natural evolution to solve complex optimization problems. Building on foundational GA concepts, let's explore a practical Python implementation that tackles the classic N-Queens problem.
The N-Queens Challenge
The N-Queens problem asks us to place N chess queens on an N×N chessboard such that no two queens can attack each other. This seemingly simple puzzle becomes exponentially complex as the board size increases, making it an ideal candidate for genetic algorithm optimization.
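A natural encoding for a GA, and the one implied by the fitness function later in this article, is a permutation: `chrom[i]` holds the row of the queen in column `i`, so no two queens ever share a row or column and only diagonal conflicts need checking. The helper below is an illustrative sketch, not the article's original code:

```python
import random

def random_chromosome(chromosome_size):
    # chrom[i] = row of the queen in column i. Using a permutation
    # guarantees distinct rows and columns, so only diagonals can conflict.
    chrom = list(range(chromosome_size))
    random.shuffle(chrom)
    return chrom

board = random_chromosome(8)  # e.g. [3, 0, 6, 2, 7, 1, 4, 5]
```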
Setting Up the Genetic Algorithm Framework
The implementation begins with a straightforward entry point that accepts three crucial parameters:
- Chromosome Size: The dimensions of the chessboard (number of queens)
- Population Size: The number of candidate solutions in each generation
- Epochs: The number of evolutionary iterations
This parameter-driven approach allows for flexible experimentation with different problem scales and algorithm configurations.
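A minimal sketch of such an entry point might look like the following (the function name, defaults, and structure are assumptions for illustration, not the article's actual code):

```python
import random

def run_ga(chromosome_size=8, population_size=100, epochs=200):
    """Hypothetical entry point wiring the three parameters together."""
    # Initialize a population of random permutations (one queen per column/row).
    population = [random.sample(range(chromosome_size), chromosome_size)
                  for _ in range(population_size)]
    for epoch in range(epochs):
        pass  # evaluate fitness, select parents, mutate, replace
    return population
```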
The Fitness Function: Measuring Solution Quality
The heart of any genetic algorithm lies in its fitness function. For the N-Queens problem, the implementation uses an elegant approach:
```python
def fitness(chrom, chromosome_size):
    q = 0
    # Check diagonal conflicts
    for i1 in range(chromosome_size):
        tmp = i1 - chrom[i1]
        for i2 in range(i1 + 1, chromosome_size):
            q = q + (tmp == (i2 - chrom[i2]))
    # Check anti-diagonal conflicts
    for i1 in range(chromosome_size):
        tmp = i1 + chrom[i1]
        for i2 in range(i1 + 1, chromosome_size):
            q = q + (tmp == (i2 + chrom[i2]))
    return 1 / (q + 0.001)
```
This function counts pairs of queens that collide along a diagonal or anti-diagonal and returns a higher fitness score for solutions with fewer collisions. The reciprocal formula 1/(q + 0.001) ensures that a perfect solution (zero conflicts) achieves the maximum fitness of 1/0.001 = 1000, while even a single conflict drops the score below 1.
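We can sanity-check the function on a known solution and a deliberately bad placement (the solution vector below is one of the 92 valid 8-queens placements):

```python
def fitness(chrom, chromosome_size):
    q = 0
    for i1 in range(chromosome_size):
        tmp = i1 - chrom[i1]
        for i2 in range(i1 + 1, chromosome_size):
            q += (tmp == (i2 - chrom[i2]))
    for i1 in range(chromosome_size):
        tmp = i1 + chrom[i1]
        for i2 in range(i1 + 1, chromosome_size):
            q += (tmp == (i2 + chrom[i2]))
    return 1 / (q + 0.001)

# A valid 8-queens solution: chrom[i] is the row of the queen in column i.
solution = [4, 6, 0, 3, 1, 7, 5, 2]
print(fitness(solution, 8))   # maximum fitness, 1/0.001 = 1000

# All queens on the main diagonal: every pair conflicts, fitness near zero.
worst = list(range(8))
print(fitness(worst, 8))
```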
Evolution in Action: The Training Process
The training loop demonstrates classic genetic algorithm principles:
- Evaluation: Calculate fitness scores for all chromosomes
- Selection: Identify the best-performing parents
- Reproduction: Generate offspring through mutation
- Replacement: Replace weaker chromosomes with stronger offspring
The algorithm balances exploitation and exploration: selecting the top performers drives convergence toward good placements, while mutation injects the variation needed to escape premature convergence on suboptimal solutions.
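The article's training code isn't reproduced here, but the four steps above can be sketched as follows. The swap mutation, the elitist replacement scheme, and all function names are illustrative assumptions, not the original implementation:

```python
import random

def fitness(chrom, chromosome_size):
    # Count diagonal and anti-diagonal conflicts, as in the fitness section.
    q = sum((i1 - chrom[i1] == i2 - chrom[i2]) +
            (i1 + chrom[i1] == i2 + chrom[i2])
            for i1 in range(chromosome_size)
            for i2 in range(i1 + 1, chromosome_size))
    return 1 / (q + 0.001)

def mutate(chrom):
    # Swap two positions; this preserves the permutation property.
    child = chrom[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child

def train(chromosome_size=8, population_size=50, epochs=500, n_parents=10):
    population = [random.sample(range(chromosome_size), chromosome_size)
                  for _ in range(population_size)]
    for _ in range(epochs):
        # Evaluation: score every chromosome, best first.
        population.sort(key=lambda c: fitness(c, chromosome_size), reverse=True)
        # Selection: keep the top performers as parents.
        parents = population[:n_parents]
        # Reproduction: offspring are mutated copies of random parents.
        offspring = [mutate(random.choice(parents))
                     for _ in range(population_size - n_parents)]
        # Replacement: weaker chromosomes give way to parents + offspring.
        population = parents + offspring
        # Termination: fitness > 1 only when q == 0 (a perfect board).
        if fitness(population[0], chromosome_size) > 1:
            break
    return population[0]

best = train()
```

Keeping the parents unchanged each generation (elitism) guarantees the best fitness never decreases, which produces exactly the plateau-then-jump learning curves discussed below.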
Convergence Patterns and Learning Curves
One fascinating aspect of this implementation is its learning behavior. The algorithm often exhibits step-wise improvement, sitting on a plateau for many generations before a sudden jump in fitness. For instance, the best score might hover near 0 for 28 epochs, jump to 100, and finally reach the optimal solution around epoch 70.
This behavior reflects the nature of the search space – small improvements in queen placement can lead to dramatic fitness improvements when conflicts are resolved.
Practical Applications and Extensions
While the N-Queens problem serves as an excellent learning example, the principles demonstrated here extend to numerous real-world optimization challenges:
- Resource allocation and scheduling
- Network routing optimization
- Machine learning hyperparameter tuning
- Portfolio optimization in finance
Key Takeaways for AI Practitioners
This implementation showcases several important concepts for anyone working with evolutionary algorithms:
- Fitness Design: The fitness function directly influences solution quality and convergence speed
- Parameter Tuning: Population size and epoch count significantly impact performance
- Termination Conditions: Knowing when to stop prevents unnecessary computation
- Visualization: Learning curves provide valuable insights into algorithm behavior
Looking Forward
This Python implementation provides a solid foundation for exploring genetic algorithms. The modular design makes it easy to experiment with different selection strategies, mutation operators, and crossover techniques. Whether you're solving constraint satisfaction problems or optimizing complex systems, understanding these fundamental patterns will serve you well in your AI journey.
Based on the work by Hossein Chegini, originally published on Towards AI.