My goals in this talk:
Evolutionary algorithms are great at solving a particular type of problem:
Evolutionary algorithms also perform extremely well when the environment (and thus the best solution) changes regularly.
Good uses for Evolutionary Algorithms: NP-Hard problems.
Evolutionary Algorithms help solve min-max problems and can avoid hill-climbing issues.
Evolutionary Algorithms tend to solve hill-climbing problems:
Many algorithms tend to get stuck at local maxima/minima; EAs often don't.
Evolutionary algorithms guarantee none of these properties.
Going further, evolutionary algorithms have a number of limitations:
But...they do still tend to get the job done.
Genes make up DNA sequences called genotypes. Each gene has a number of alternative forms, called alleles.
Genotypes are possible genetic structures. Given 2 alleles for each of 32 genes, we would have 2^32 or 4,294,967,296 genotypes.
BEWARE: there is not a 1:1 correspondence between genotypes and phenotypes, as phenotypes often depend upon a specific combination of alleles. When even one allele is missing, the desired effect may not appear. Close doesn't count with genetics.
Genetic algorithms were popularized by John Holland's work on the topic, particularly Adaptation in Natural and Artificial Systems.
Most simple genetic algorithms are arrays of genes with binary alleles:
We build up a population of organisms, large enough to ensure genetic diversity.
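As a minimal sketch of this in Python (the gene count, population size, and helper names are illustrative assumptions, not from the talk):

```python
import random

# Each chromosome is a fixed-length array of binary alleles (0 or 1).
GENE_COUNT = 32          # illustrative value
POPULATION_SIZE = 100    # large enough to give us some genetic diversity

def random_chromosome(length=GENE_COUNT):
    """Create one chromosome as a list of randomly chosen 0/1 alleles."""
    return [random.randint(0, 1) for _ in range(length)]

population = [random_chromosome() for _ in range(POPULATION_SIZE)]
```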
After building the population, we score each chromosome using the same fitness function:
Now each chromosome has its own score and can be compared to other chromosomes.
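For example, with a toy "one-max" fitness function (an illustrative stand-in; the real scoring function is problem-specific):

```python
def fitness(chromosome):
    """Toy fitness: count the 1 alleles. Higher is better."""
    return sum(chromosome)

# Score every chromosome with the same function so the scores are comparable.
scores = [fitness(c) for c in population]
```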
For each organism (or pair of organisms) in the next generation, determine two "parents" based on some algorithm. A simple approach is roulette wheel selection, where we draw each parent with probability proportional to its share of total fitness.
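One way roulette wheel selection might look, as a sketch assuming non-negative fitness scores and the population and scores lists from the earlier sketches:

```python
import random

def roulette_select(population, scores):
    """Pick one parent with probability proportional to its share of total fitness."""
    total = sum(scores)
    pick = random.uniform(0, total)
    running = 0.0
    for chromosome, score in zip(population, scores):
        running += score
        if running >= pick:
            return chromosome
    return population[-1]  # guard against floating-point rounding

parent_a = roulette_select(population, scores)
parent_b = roulette_select(population, scores)
```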
We then apply crossover with some probability. If that RNG pull succeeds, we cross over at some starting and ending point (also RNG determined):
Typically, there is one crossover range per organism pair. Sometimes the crossover check fails and we keep the two parents unchanged. Usually we perform crossover with p in the range [0.6, 0.8].
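A sketch of two-point crossover under those assumptions (the 0.7 probability is just one value from that typical range):

```python
import random

CROSSOVER_PROBABILITY = 0.7  # typically somewhere in [0.6, 0.8]

def crossover(parent_a, parent_b, p=CROSSOVER_PROBABILITY):
    """With probability p, swap the genes between two random cut points;
    otherwise return copies of the parents unchanged."""
    if random.random() >= p:
        return parent_a[:], parent_b[:]
    start, end = sorted(random.sample(range(len(parent_a) + 1), 2))
    child_a = parent_a[:start] + parent_b[start:end] + parent_a[end:]
    child_b = parent_b[:start] + parent_a[start:end] + parent_b[end:]
    return child_a, child_b

child_a, child_b = crossover(parent_a, parent_b)
```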
For each gene in each new organism, we apply some mutation operation:
Mutation is a tool to help us get out of local maxima, but with too much mutation you end up with random search. Usually we perform mutation with p in the range [0.0001, 0.01].
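A corresponding sketch for per-gene mutation (the 0.001 rate is one value from that typical range):

```python
import random

MUTATION_PROBABILITY = 0.001  # typically somewhere in [0.0001, 0.01]

def mutate(chromosome, p=MUTATION_PROBABILITY):
    """Flip each binary allele independently with small probability p."""
    return [1 - gene if random.random() < p else gene for gene in chromosome]

child_a = mutate(child_a)
child_b = mutate(child_b)
```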
Now we have a new organism, which is hopefully better than the old organisms in terms of solving our fitness function:
Repeat for each organism (or pair of organisms) in the next generation, then repeat the whole process for subsequent generations.
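Tying the pieces together, one generational loop might look like this (reusing the sketched helpers above; the generation count is an arbitrary assumption):

```python
def next_generation(population, scores):
    """Build a full replacement population from selected, crossed, mutated children."""
    new_population = []
    while len(new_population) < len(population):
        parent_a = roulette_select(population, scores)
        parent_b = roulette_select(population, scores)
        child_a, child_b = crossover(parent_a, parent_b)
        new_population.extend([mutate(child_a), mutate(child_b)])
    return new_population[:len(population)]

for generation in range(200):  # arbitrary stopping point
    scores = [fitness(c) for c in population]
    population = next_generation(population, scores)
```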
The Holyfield Problem comes from Evander Holyfield's Real Deal Boxing for the Sega Genesis.
You have several options to help make your boxer the best but a limited budget that prohibits you from taking everything. How do you make the right decisions?
Result: genetic algorithms let us choose the best set of options, making our boxer the best there is.
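One plausible encoding (the costs, benefit scores, and budget below are made-up illustration values, not actual game data): each allele marks whether we purchase a given training option, and the fitness function rewards total benefit while zeroing out over-budget choices.

```python
# Hypothetical option costs and benefit ratings; allele i = 1 means "buy option i".
costs    = [300, 500, 200, 750, 400, 150]
benefits = [ 40,  65,  25,  90,  50,  20]
BUDGET = 1500

def boxer_fitness(chromosome):
    """Sum the benefits of purchased options; over-budget choices score zero."""
    total_cost = sum(c for c, gene in zip(costs, chromosome) if gene)
    total_benefit = sum(b for b, gene in zip(benefits, chromosome) if gene)
    return total_benefit if total_cost <= BUDGET else 0
```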
We want to maximize returns for a portfolio while minimizing risk.
John Koza popularized genetic programming with his Genetic Programming series of books.
Genetic programming is an extension of genetic algorithm concepts. Genetic algorithms feature fixed-sized chromosomes with data-representational alleles.
Genetic programming takes the mechanisms of genetic algorithms and applies them to a higher-level problem.
Similarities:
But there's one major difference: the organisms/chromosomes are entire programs.
Suppose we want to solve a problem where the fitness function is (56 - x)^2. One candidate solution:
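As an illustration (the tuple-based tree representation here is an assumption, not the talk's encoding), a candidate program is an expression tree whose evaluated result x is plugged into the fitness function, with lower scores being better:

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(node):
    """Recursively evaluate an expression tree of (op, left, right) tuples."""
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    return OPS[op](evaluate(left), evaluate(right))

candidate = ("*", 8, 7)        # one candidate program: 8 * 7
x = evaluate(candidate)        # 56
print((56 - x) ** 2)           # fitness = 0, a perfect score here
```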
In practice, genetic programs are significantly less parsimonious.
Genetic programs are great for deriving mathematical relationships.
Genetic programs can help us derive non-linear function results, similar to a regression model. If you are regressing, beware that genetic programs will overfit the current data; they're much more useful for identifying formulas.
Another use for genetic programming is building regression trees, which use decision tree mechanics but solve the same problem as a regression model: predicting the dependent variable's value.
For example: how much should you be making?
Our regression tree was not better than a linear regression based on the attributes we selected, but was more understandable to a layman. In other cases, we can end up with better results.
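A regression tree sketch with scikit-learn (the salary-survey file and column names are hypothetical placeholders, not the dataset from the talk):

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Hypothetical survey data; categorical columns would need encoding first.
df = pd.read_csv("salary_survey.csv")
features = df[["years_experience", "hours_per_week", "years_with_employer"]]
target = df["salary"]

# A shallow tree keeps the result explainable to a layman.
tree = DecisionTreeRegressor(max_depth=4)
tree.fit(features, target)
print(tree.predict(features.head(5)))
```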
There's some similarity to evolutionary algorithms!
Reinforcement Learning:
Genetic Algorithms:
Paper by Irmouli et al.: Genetic Algorithm enhanced by Deep Reinforcement learning in parent selection mechanism and mutation
Neural networks have dominated machine learning and artificial intelligence over the past 12+ years. They tend to be significantly better at approximating functions with continuous data, such as image classification or regression techniques.
Genetic algorithms work best in broad, complicated search spaces.
Genetic algorithms provide the fitness function for a neural network to solve a Super Mario World level.
Video link: https://www.youtube.com/watch?v=qv6UVOQ0F44
Evolutionary algorithms, including genetic algorithms and genetic programming, offer up an interesting approach to navigating large hill-climbing type search spaces with adequate efficiency and relatively high likelihood of avoiding local minima/maxima.
To learn more, go here:
https://csmore.info/on/gia
And for help, contact me:
feasel@catallaxyservices.com | @feaselkl
Catallaxy Services consulting:
https://CSmore.info/contact