Genetic Algorithm Solver & Visualizer

Define a fitness function, tune GA parameters, and watch the population evolve in real time.

1. Problem & Fitness Function

Choose how individuals are represented internally (real-valued or binary chromosomes). The variable bounds apply to real-valued chromosomes only: each variable is initialized uniformly in [lower, upper]. The fitness function must return a numeric fitness; for real-valued chromosomes the individual is an array of numbers, and for binary chromosomes it is a string of 0/1 characters. You can use Math functions (e.g. Math.sin, Math.exp).

2. Genetic Algorithm Parameters

  • Crossover rate: probability of crossover between two parents.
  • Mutation rate: per-gene mutation probability.

3. Results & Evolution

Configure the GA and click “Run genetic algorithm”. The results panel then shows:

  • Best solution: the generation, fitness, and chromosome of the best individual found.
  • Fitness over generations: a chart of fitness per generation (blue: best fitness, gray: average fitness).
  • Current population (top 10): rank, fitness, and chromosome of the top individuals, sorted by fitness (descending).
  • Evolution log: a running log of the evolution.

How this genetic algorithm solver works

This tool implements a classic genetic algorithm (GA) with configurable chromosome representation, selection, crossover, mutation, and elitism. It is designed for experimentation and teaching: you can plug in your own fitness function and immediately see how the population evolves.

1. Representation (chromosomes)

  • Real-valued chromosomes: each individual is an array of real numbers, e.g. \([x_1, x_2, \dots, x_n]\). Use this for continuous optimization problems.
  • Binary chromosomes: each individual is a bit string, e.g. 0101011010. This is common for combinatorial problems or when mapping bits to parameters.

For real-valued chromosomes, initial values are sampled uniformly from the interval \([ \text{lower bound}, \text{upper bound} ]\). For binary chromosomes, bits are initialized randomly with equal probability of 0 or 1.
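
As an illustration, initialization for both representations could look like the sketch below (the function names and parameters such as numGenes, lower, and upper are assumptions for this example, not necessarily the tool's internals):

// Sketch: random initialization for both representations (names are illustrative).
function randomRealIndividual(numGenes, lower, upper) {
  // Each gene is drawn uniformly from [lower, upper].
  return Array.from({ length: numGenes }, () => lower + Math.random() * (upper - lower));
}

function randomBinaryIndividual(numBits) {
  // Each bit is '0' or '1' with equal probability.
  let bits = '';
  for (let i = 0; i < numBits; i++) bits += Math.random() < 0.5 ? '1' : '0';
  return bits;
}

// Example: 50 real-valued individuals with 2 genes each, bounded to [-5, 5].
const initialPopulation = Array.from({ length: 50 }, () => randomRealIndividual(2, -5, 5));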

2. Fitness function

The fitness function measures how good a candidate solution is. The GA always tries to maximize fitness. You provide the fitness function in JavaScript:

// Real-valued example: maximize f(x, y) = -(x^2 + y^2) + 4
function fitness(individual) {
  const x = individual[0];
  const y = individual[1];
  return -(x*x + y*y) + 4;
}

For binary chromosomes, the argument is a string:

// Binary example: maximize number of 1-bits
function fitness(individual) {
  let count = 0;
  for (let i = 0; i < individual.length; i++) {
    if (individual[i] === '1') count++;
  }
  return count;
}

3. Selection

  • Tournament selection: randomly sample a few individuals and pick the one with the highest fitness. This gives strong selection pressure and is robust to scaling.
  • Roulette wheel selection: each individual gets a slice of a “wheel” proportional to its fitness. Higher fitness means a higher probability of being selected. Both schemes are sketched after this list.
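
Both schemes can be sketched in a few lines, assuming the population is stored as an array of { individual, fitness } objects (this structure is an assumption for illustration, not necessarily how the tool stores it):

// Sketch: tournament and roulette wheel selection over scored individuals.
function tournamentSelect(scored, tournamentSize) {
  // Sample tournamentSize individuals at random and return the fittest one.
  let best = null;
  for (let i = 0; i < tournamentSize; i++) {
    const candidate = scored[Math.floor(Math.random() * scored.length)];
    if (best === null || candidate.fitness > best.fitness) best = candidate;
  }
  return best.individual;
}

function rouletteSelect(scored) {
  // Slices proportional to fitness; assumes non-negative fitness values.
  const total = scored.reduce((sum, s) => sum + s.fitness, 0);
  let r = Math.random() * total;
  for (const s of scored) {
    r -= s.fitness;
    if (r <= 0) return s.individual;
  }
  return scored[scored.length - 1].individual; // guard against rounding error
}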

4. Crossover and mutation

Crossover combines two parents to produce offspring. Mutation introduces random variation to avoid premature convergence.

  • Crossover rate: probability that two selected parents will be crossed. If crossover does not occur, the parents are copied directly.
  • Mutation rate: per-gene probability of mutation. For binary chromosomes, mutation flips bits. For real-valued chromosomes, small Gaussian noise is added and the result is clipped to the bounds. Both operators are sketched after this list.
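
A minimal sketch of these operators follows. Single-point crossover is assumed here for illustration; the description above does not specify which crossover operator the tool uses.

// Sketch: single-point crossover (assumed operator) plus the mutation schemes described above.
function singlePointCrossover(parentA, parentB) {
  // Works for both arrays (real-valued) and strings (binary).
  const point = 1 + Math.floor(Math.random() * (parentA.length - 1));
  const childA = parentA.slice(0, point).concat(parentB.slice(point));
  const childB = parentB.slice(0, point).concat(parentA.slice(point));
  return [childA, childB];
}

function mutateBinary(bits, mutationRate) {
  // Flip each bit independently with probability mutationRate.
  let out = '';
  for (const bit of bits) out += Math.random() < mutationRate ? (bit === '1' ? '0' : '1') : bit;
  return out;
}

function mutateReal(genes, mutationRate, lower, upper, sigma) {
  // Add Gaussian noise (Box-Muller transform) to each gene with probability mutationRate, then clip.
  return genes.map(g => {
    if (Math.random() >= mutationRate) return g;
    const noise = sigma * Math.sqrt(-2 * Math.log(1 - Math.random())) * Math.cos(2 * Math.PI * Math.random());
    return Math.min(upper, Math.max(lower, g + noise));
  });
}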

5. Elitism

Elitism copies the top k individuals directly into the next generation. This guarantees that the best fitness found so far is never lost due to random variation.
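
A minimal sketch, assuming the same scored-population structure used in the selection sketch above:

// Sketch: copy the k fittest individuals unchanged into the next generation.
function applyElitism(scored, offspring, k) {
  const elites = [...scored]
    .sort((a, b) => b.fitness - a.fitness)  // highest fitness first
    .slice(0, k)
    .map(s => s.individual);
  // Elites take the first k slots; the rest are filled with offspring.
  return elites.concat(offspring).slice(0, scored.length);
}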

Interpreting the fitness chart

  • Best fitness (blue): the highest fitness in each generation.
  • Average fitness (gray): the mean fitness of the population in each generation.

A healthy run often shows the best fitness increasing quickly at first, then plateauing as the GA converges. If both curves stagnate very early, try increasing mutation rate or population size.
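
Both curves are simple statistics of the scored population; a sketch, reusing the { individual, fitness } structure assumed earlier:

// Sketch: per-generation statistics behind the fitness chart.
function generationStats(scored) {
  const fitnesses = scored.map(s => s.fitness);
  const best = Math.max(...fitnesses);
  const average = fitnesses.reduce((sum, f) => sum + f, 0) / fitnesses.length;
  return { best, average };
}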

Tips for using genetic algorithms effectively

  • Start with moderate settings: population 30–100, mutation rate 0.01–0.05, 50–200 generations (a sample configuration is sketched after these tips).
  • Scale or shift your fitness values so that better solutions clearly have higher fitness.
  • For noisy or rugged fitness landscapes, use larger populations and more generations.
  • For binary problems, keep mutation relatively low; for real-valued problems, use small Gaussian noise.
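
For example, a starting configuration along the lines of these tips might look like the sketch below (the parameter names and the 0.8 crossover rate are illustrative defaults, not the tool's exact field names):

// Sketch: a moderate starting configuration (names are illustrative).
const gaConfig = {
  representation: 'real',   // 'real' or 'binary'
  numGenes: 2,
  lowerBound: -5,
  upperBound: 5,
  populationSize: 50,       // start in the 30-100 range
  generations: 100,         // 50-200 to start
  crossoverRate: 0.8,
  mutationRate: 0.02,       // 0.01-0.05 per gene
  elitism: 2,               // keep the 2 best individuals each generation
  selectionMethod: 'tournament',
};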

Frequently asked questions

What is a genetic algorithm?

A genetic algorithm is a population-based search method inspired by biological evolution. It maintains a set of candidate solutions, evaluates them with a fitness function, and iteratively applies selection, crossover, and mutation to evolve better solutions.

Can this solver guarantee the global optimum?

No. Genetic algorithms are heuristic methods: they often find very good solutions, but they do not guarantee the global optimum. However, by tuning parameters and running multiple times, you can often get close to the best possible value for many practical problems.

Is this tool suitable for very large or time-consuming fitness functions?

This solver runs entirely in your browser, so extremely heavy fitness functions or very large populations may be slow. For research-scale problems, you would typically implement a GA in a dedicated environment (e.g. Python, MATLAB, C++) and possibly parallelize evaluations.