**Genetic Algorithms** (GAs) are a search-based optimization technique inspired by the principles of genetics and natural selection.

They are used to find optimal or near-optimal solutions to difficult problems that would otherwise take an impractically long time to solve, and are applied to optimization and search problems in science, engineering, and machine learning.


## History of Genetic Algorithms

The genetic algorithm was introduced by John Holland in 1975, building on Darwin’s principles of natural selection.

Genetic algorithms solve optimization problems by mimicking the process of biological evolution. Starting from an initial random population of solutions, the population is advanced by selection, mutation, and crossover operators inspired by natural evolution.

The population goes through an iterative procedure, moving from state to state as the specified series of operators is applied; each iteration is called a generation.

## The Architecture of Genetic Algorithms

In a genetic algorithm, a population of candidate solutions (called individuals or phenotypes) to an optimization problem is evolved toward better solutions.

Each candidate solution has a set of properties (its chromosomes or genotype) that can be mutated and altered; solutions are usually represented as binary strings of 0s and 1s, although other encodings are also possible.

### 1. Evolution phase

Evolution is an iterative process that typically begins with a randomly generated population; the population present in each iteration is called a generation.

The fitness of every individual in the population is evaluated in each generation; fitness is typically the value of the objective function in the optimization problem being solved.

The fittest individuals are stochastically selected from the current population, and each individual’s genome is modified (recombined and possibly randomly mutated) to form a new generation.

This new set of candidate solutions is then used in the next iteration of the algorithm.

The algorithm typically terminates when either a maximum number of generations has been produced or a satisfactory fitness level has been reached for the population, i.e., the objective function has been sufficiently optimized.
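The evolution loop described above can be sketched in a few lines of Python. This is a minimal illustration for the toy "OneMax" problem (maximize the number of 1s in a bitstring); all function names and parameter values here are illustrative choices rather than any standard API, and tournament selection is used for brevity instead of the roulette-wheel scheme described later.

```python
import random

# Illustrative parameters for the OneMax toy problem.
GENOME_LEN = 20
POP_SIZE = 50
GENERATIONS = 40
MUTATION_RATE = 0.01

def fitness(genome):
    # Objective: count of 1 bits (the quantity being maximized).
    return sum(genome)

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def select(population):
    # Tournament selection: pick the fitter of two random individuals.
    a, b = random.sample(population, 2)
    return max(a, b, key=fitness)

def crossover(parent1, parent2):
    # Single-point crossover on two equal-length parents.
    point = random.randint(1, GENOME_LEN - 1)
    return parent1[:point] + parent2[point:]

def mutate(genome):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

# Evolution loop: selection -> crossover -> mutation, once per generation.
population = [random_genome() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))  # best fitness found; the value varies run to run
```

Note the fixed stopping rule here: the sketch simply runs for a set number of generations, the simplest of the termination conditions discussed below.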

### 2. Candidate solutions

Each candidate solution is traditionally represented as an array of bits. In principle, arrays of other types and structures can be used in the same way.

The main property that makes these genetic representations convenient is their fixed size, which means their parts are easily aligned and crossover operations are simple to implement.

Variable-length representations can also be used, but in this case, the implementation of crossover is more complicated.

### 3. Initialization of the population

The population size depends on the nature of the problem, but typically contains several hundred or several thousand potential solutions.

The initial population is typically generated at random, covering the entire range of possible solutions (the search space).

Occasionally, the solutions may be ‘seeded’ in areas where optimal solutions are likely to be found.
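As a sketch, random initialization (with optional seeding) for a binary encoding might look like this; the population size, genome length, and the `known_good` seed individual are illustrative assumptions:

```python
import random

POP_SIZE = 200    # illustrative: "several hundred" potential solutions
GENOME_LEN = 32   # illustrative genome length

# Random initialization: each individual is a random bitstring,
# so the population samples the whole search space.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

# Optional "seeding": overwrite a few individuals with known-good
# guesses so the search starts near promising regions.
known_good = [1] * GENOME_LEN  # hypothetical promising solution
population[0] = known_good[:]
```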

### 4. Selection

Selection is the stage of a genetic algorithm in which individual genomes are chosen from a population for later breeding (using the crossover operator).

A generic selection procedure can be described as follows:

- For each individual, the fitness function is evaluated, giving fitness values, which are then normalized. Normalization involves dividing each individual’s fitness value by the sum of all fitness values, so that the resulting values sum to 1.
- Accumulated normalized fitness values are computed: an individual’s accumulated fitness value is the sum of its own fitness value plus the fitness values of all preceding individuals. The accumulated fitness of the last individual should be 1; otherwise, something went wrong in the normalization step.
- A random number R between 0 and 1 is selected.
- The selected individual is the first one whose accumulated normalized value is greater than or equal to R.

The above algorithm can be computationally demanding for some problems. A much simpler and faster alternative uses so-called stochastic acceptance.
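The four-step procedure above (often called fitness-proportionate or "roulette wheel" selection) can be sketched as follows; the function name is an illustrative choice:

```python
import random

def roulette_select(population, fitnesses):
    # Step 1: normalize the fitness values so they sum to 1.
    total = sum(fitnesses)
    normalized = [f / total for f in fitnesses]

    # Step 2: accumulate the normalized values; the last entry
    # should equal 1 (up to floating-point rounding).
    cumulative = []
    running = 0.0
    for n in normalized:
        running += n
        cumulative.append(running)

    # Step 3: draw a random number R between 0 and 1.
    r = random.random()

    # Step 4: pick the first individual whose accumulated
    # normalized value is greater than or equal to R.
    for individual, c in zip(population, cumulative):
        if c >= r:
            return individual
    return population[-1]  # guard against rounding error in step 2
```

Individuals with higher fitness occupy a larger slice of the cumulative range, so they are proportionally more likely to be selected.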

### 5. Crossover and mutation

The next step is to generate a second-generation population of solutions from those selected, through a combination of genetic operators: **crossover** (also called recombination) and **mutation**.

For each new solution to be produced, a pair of ‘parent’ solutions is selected for breeding from the pool chosen previously.

A ‘child’ solution produced by these crossover and mutation methods typically shares many of the characteristics of its ‘parents’.
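A sketch of these two operators for equal-length bitstrings, assuming single-point crossover and bit-flip mutation (the 0.01 default rate and the function names are illustrative choices):

```python
import random

def single_point_crossover(parent1, parent2):
    # Cut both parents at the same random point and swap the tails,
    # producing two children that each mix both parents' genes.
    point = random.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

def bit_flip_mutation(genome, rate=0.01):
    # Flip each bit independently with probability `rate`.
    return [1 - bit if random.random() < rate else bit for bit in genome]
```

Because the parents have the same fixed length, the crossover point aligns cleanly, which is exactly the convenience of fixed-size representations noted earlier.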

### 6. Termination

This generational process is repeated until a termination condition is reached. Common termination conditions are:

- A solution is found that satisfies minimum criteria
- A fixed number of generations is reached
- The allocated budget (computation time/money) is exhausted
- The fitness of the highest-ranked solution has reached or is approaching a plateau, so that successive iterations no longer produce better results
- Manual inspection
- Combinations of the above
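A combined termination check mirroring several of these conditions might look like the following sketch; all thresholds and parameter names are illustrative assumptions:

```python
def should_terminate(generation, best_fitness, history,
                     max_generations=100, target_fitness=1.0,
                     plateau_window=10):
    # A good-enough solution has been found.
    if best_fitness >= target_fitness:
        return True
    # The generation budget is exhausted.
    if generation >= max_generations:
        return True
    # Plateau: no improvement over the last `plateau_window`
    # generations compared to the generation just before them.
    if (len(history) > plateau_window
            and max(history[-plateau_window:]) <= history[-plateau_window - 1]):
        return True
    return False
```

In practice the GA loop would call this once per generation, appending the best fitness of each generation to `history`.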

## Popular Genetic Algorithms

Several well-known optimization algorithms are closely related to, or often grouped with, genetic algorithms:

- Gene expression programming (GEP)
- Ant colony optimization (ACO)
- Particle swarm optimization (PSO)
- Memetic algorithm (MA), often called hybrid genetic algorithm
- Simulated annealing (SA)
- Tabu search (TS)

## Ending Notes

We’ll discuss and implement some of these in future articles. If you enjoyed this article, follow me as an author. Until then, keep coding!