In this article, we'll train a car to do self-parking using a genetic algorithm.

We'll create the 1st generation of cars with random genomes that will behave something like this:

On the ≈40th generation, the cars start learning what self-parking is and start getting closer to the parking spot:

Another example with a bit more challenging starting point:

Yeah-yeah, the cars are hitting some other cars along the way, and also are not perfectly fitting the parking spot, but this is only the 40th generation since the creation of the world for them, so be merciful and give the cars some space to grow :D

You may launch the 🚕 Self-parking Car Evolution Simulator to see the evolution process directly in your browser. The simulator gives you the following opportunities:

- You may train the cars from scratch and adjust genetic parameters yourself
- You may see the trained self-parking cars in action
- You may also try to park the car manually

The genetic algorithm for this project is implemented in TypeScript. The full genetic source code will be shown in this article, but you may also find the final code examples in the Evolution Simulator repository.

We're going to use a genetic algorithm for the particular task of evolving the cars' genomes. However, this article only touches on the basics of the algorithm and is by no means a complete guide to the genetic algorithm topic.

With that said, let's dive into the details...

The Plan

Step by step, we're going to break down the high-level task of creating a self-parking car into the straightforward low-level optimization problem of finding the optimal combination of 180 bits (finding the optimal car genome).

Here is what we're going to do:

1. 💪🏻 Give the muscles (engine, steering wheel) to the car so that it could move towards the parking spot.
2. 👀 Give the eyes (sensors) to the car so that it could see the obstacles around.
3. 🧠 Give the brain to the car that will control the muscles (movements) based on what the car sees (obstacles via sensors). The brain will be simply a pure function: movements = f(sensors).
4. 🧬 Evolve the brain to make it do the right moves based on the sensors' input. This is where we will apply a genetic algorithm. Generation after generation, our brain function movements = f(sensors) will learn how to move the car towards the parking spot.

Giving the muscles to the car

To be able to move, the car needs "muscles". Let's give the car two types of muscles:

1. Engine muscle - allows the car to move ↓ back, ↑ forth, or ◎ stand still (neutral gear)
2. Steering wheel muscle - allows the car to turn ← left, → right, or ◎ go straight while moving

With these two muscles the car can perform the following movements:

In our case, the muscles are receivers of the signals that come from the brain once every 100ms (milliseconds). Based on the value of the brain's signal, the muscles act differently. We'll cover the "brain" part below, but for now, let's say that our brain may send only 3 possible signals to each muscle: -1, 0, or +1.

```typescript
type MuscleSignal = -1 | 0 | 1;
```

For example, the brain may send the signal with the value of +1 to the engine muscle, and it will start moving the car forward. The signal -1 to the engine moves the car backward. At the same time, if the brain sends the signal of -1 to the steering wheel muscle, it will turn the car to the left, etc.

Here is how the brain signal values map to the muscle actions in our case:

| Muscle         | Signal = -1 | Signal = 0 | Signal = +1 |
| -------------- | ----------- | ---------- | ----------- |
| Engine         | ↓ Backward  | ◎ Neutral  | ↑ Forward   |
| Steering wheel | ← Left      | ◎ Straight | → Right     |

You may use the Evolution Simulator and try to park the car manually to see how the car muscles work. Every time you press one of the WASD keyboard keys (or use a touch-screen joystick) you send these -1, 0, or +1 signals to the engine and steering wheel muscles.
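The actual simulator drives the car through a 3D physics engine, but the role of the two muscle signals can be illustrated with a toy kinematic model. Everything below (the car state shape, the SPEED and TURN_RATE constants) is made up for illustration and is not taken from the simulator:

```typescript
type MuscleSignal = -1 | 0 | 1;

// Toy car state: position on a 2D plane plus a heading angle in radians.
type CarState = { x: number, y: number, angle: number };

// Hypothetical constants, not from the simulator.
const SPEED = 1;       // meters per 100ms step
const TURN_RATE = 0.1; // radians per 100ms step (only effective while moving)

// One 100ms step: the engine signal moves the car along its heading,
// the wheel signal rotates the heading while the car is moving.
const step = (car: CarState, engine: MuscleSignal, wheel: MuscleSignal): CarState => {
  const angle = car.angle + engine * wheel * TURN_RATE;
  return {
    angle,
    x: car.x + engine * SPEED * Math.cos(angle),
    y: car.y + engine * SPEED * Math.sin(angle),
  };
};

// Engine +1, wheel 0: the car drives one step straight ahead.
console.log(step({ x: 0, y: 0, angle: 0 }, 1, 0)); // → { angle: 0, x: 1, y: 0 }
```

Sending a sequence of such signal pairs every 100ms is all the "driving" the brain will ever do.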
Giving the eyes to the car

Before our car learns how to do self-parking using its muscles, it needs to be able to "see" the surroundings. Let's give it 8 eyes in the form of distance sensors:

- Each sensor can detect an obstacle in a distance range of 0-4m (meters).
- Each sensor reports the latest information about the obstacles it "sees" to the car's "brain" every 100ms.
- Whenever a sensor doesn't see any obstacles, it reports the value of 0. On the contrary, if the value of the sensor is small but not zero (i.e. 0.01m), it means that the obstacle is close.

You may use the Evolution Simulator and see how the color of each sensor changes based on how close the obstacle is.

```typescript
type Sensors = number[];
```

Giving the brain to the car

At this moment, our car can "see" and "move", but there is no "coordinator" that would transform the signals from the "eyes" into the proper movements of the "muscles". We need to give the car a "brain".

Brain input

As input from the sensors, every 100ms the brain will be getting 8 float numbers, each one in the range of [0...4]. For example, the input might look like this:

```typescript
const sensors: Sensors = [s0, s1, s2, s3, s4, s5, s6, s7];
// i.e. 🧠 ← [0, 0.5, 4, 0.002, 0, 3.76, 0, 1.245]
```

Brain output

Every 100ms the brain should produce two integers as an output:

1. One number as a signal for the engine: engineSignal
2. One number as a signal for the steering wheel: wheelSignal

Each number should be of the MuscleSignal type and might take one of three values: -1, 0, or +1.

Brain formulas/functions

Keeping in mind the brain's input and output mentioned above, we may say that the brain is just a function:

```typescript
const { engineSignal, wheelSignal } = brainToMuscleSignal(
  brainFunction(sensors)
);
// i.e. { engineSignal: 0, wheelSignal: -1 } ← 🧠 ← [0, 0.5, 4, 0.002, 0, 3.76, 0, 1.245]
```
Where brainToMuscleSignal() is a function that converts a raw brain signal (any float number) to a muscle signal (a -1, 0, or +1 number) so that the muscles could understand it. We'll implement this converter function below.

The main question now is what kind of function brainFunction() should be.

To make the car smarter and its movements more sophisticated, we could go with a Multilayer Perceptron. The name is a bit scary, but this is a simple Neural Network with a basic architecture (think of it as a big formula with many parameters/coefficients).

I've covered Multilayer Perceptrons in a bit more detail in my homemade-machine-learning, machine-learning-experiments, and nano-neuron projects. You may even challenge that simple network to recognize your written digits.

However, to avoid introducing the whole new concept of Neural Networks, we'll go with a much simpler approach and use two Linear Polynomials with multiple variables (to be more precise, each polynomial will have exactly 8 variables, since we have 8 sensors), which will look something like this:

```typescript
engineSignal = brainToMuscleSignal(
  (e0 * s0) + (e1 * s1) + ... + (e7 * s7) + e8 // <- brainFunction
)

wheelSignal = brainToMuscleSignal(
  (w0 * s0) + (w1 * s1) + ... + (w7 * s7) + w8 // <- brainFunction
)
```

Where:

- [s0, s1, ..., s7] - the 8 variables, which are the 8 sensor values. These are dynamic.
- [e0, e1, ..., e8] - the 9 coefficients for the engine polynomial. The car will need to learn these, and they will be static.
- [w0, w1, ..., w8] - the 9 coefficients for the steering wheel polynomial. The car will need to learn these, and they will be static.

The cost of using this simpler function for the brain is that the car won't be able to learn some sophisticated moves and also won't generalize well or adapt well to unknown surroundings.
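To make the formula concrete, here is the engine polynomial evaluated for the example sensor input from the "Brain input" section, with engine coefficients borrowed from the example genome that appears later in this article (the numbers are purely illustrative, not learned values):

```typescript
// Example sensor input (from the "Brain input" section above).
const sensors = [0, 0.5, 4, 0.002, 0, 3.76, 0, 1.245];

// Example engine coefficients e0...e8 (e8 is the bias term).
const engineCoefficients = [17.5, 0.059, -46, 25, 156, -0.085, -0.207, -0.546, 0.071];

// raw = (e0 * s0) + (e1 * s1) + ... + (e7 * s7) + e8
const rawEngineSignal = engineCoefficients
  .slice(0, sensors.length)
  .reduce((sum, e, i) => sum + e * sensors[i], engineCoefficients[sensors.length]);

console.log(rawEngineSignal); // ≈ -184.85, a large negative raw value
```

The large negative raw value will then be squashed down to a -1 ("move backward") muscle signal by the brainToMuscleSignal() converter implemented below.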
But for our particular parking lot, and for the sake of demonstrating the work of a genetic algorithm, it should still be enough.

We may implement the generic polynomial function in the following way:

```typescript
type Coefficients = number[];

// Calculates the value of a linear polynomial based on the coefficients and variables.
const linearPolynomial = (coefficients: Coefficients, variables: number[]): number => {
  if (coefficients.length !== (variables.length + 1)) {
    throw new Error('Incompatible number of polynomial coefficients and variables');
  }
  let result = 0;
  coefficients.forEach((coefficient: number, coefficientIndex: number) => {
    if (coefficientIndex < variables.length) {
      result += coefficient * variables[coefficientIndex];
    } else {
      // The last coefficient (the bias) is added without multiplication.
      result += coefficient;
    }
  });
  return result;
};
```

The car's brain, in this case, will consist of two polynomials and will look like this:

```typescript
const engineSignal: MuscleSignal = brainToMuscleSignal(
  linearPolynomial(engineCoefficients, sensors)
);

const wheelSignal: MuscleSignal = brainToMuscleSignal(
  linearPolynomial(wheelCoefficients, sensors)
);
```

The output of the linearPolynomial() function is a float number. The brainToMuscleSignal() function needs to convert that wide range of floats to three particular integers, and it will do it in two steps:

1. Convert a float of a wide range (i.e. 0.456, 3673.45, or -280) to a float in the range of (0...1) (i.e. 0.05 or 0.86)
2. Convert a float in the range of (0...1) to one of three integer values: -1, 0, or +1. For example, floats that are close to 0 will be converted to -1, floats that are close to 0.5 will be converted to 0, and floats that are close to 1 will be converted to 1.

To do the first part of the conversion, we need to introduce a Sigmoid Function, which implements the following formula:

f(x) = 1 / (1 + e^(-x))

It converts a wide range of floats (the x axis) to float numbers with a limited range of (0...1) (the y axis). This is exactly what we need.

Here is how the conversion steps would look on the Sigmoid graph.

The implementation of the two conversion steps mentioned above would look like this:

```typescript
// Calculates the sigmoid value for a given number.
const sigmoid = (x: number): number => {
  return 1 / (1 + Math.E ** -x);
};

// Converts the sigmoid value (0...1) to the muscle signal (-1, 0, +1).
// The margin parameter is a value between 0 and 0.5:
// [0 ... (0.5 - margin) ... 0.5 ... (0.5 + margin) ... 1]
const sigmoidToMuscleSignal = (sigmoidValue: number, margin: number = 0.4): MuscleSignal => {
  if (sigmoidValue < (0.5 - margin)) {
    return -1;
  }
  if (sigmoidValue > (0.5 + margin)) {
    return 1;
  }
  return 0;
};

// Converts a raw brain signal to the muscle signal.
const brainToMuscleSignal = (rawBrainSignal: number): MuscleSignal => {
  const normalizedBrainSignal = sigmoid(rawBrainSignal);
  return sigmoidToMuscleSignal(normalizedBrainSignal);
};
```

Car's genome (DNA)

☝🏻 The main conclusion from the "Eyes", "Muscles", and "Brain" sections above should be this: the [e0, e1, ..., e8] and [w0, w1, ..., w8] coefficients define the behavior of the car. These 18 numbers together form the unique car's genome (or car's DNA).

Car genome in a decimal form

Let's join the [e0, e1, ..., e8] and [w0, w1, ..., w8] brain coefficients together to form the car's genome in a decimal form:

```typescript
// Car genome as a list of decimal numbers (coefficients).
const carGenomeBase10 = [e0, e1, ..., e8, w0, w1, ..., w8];

// i.e.
carGenomeBase10 = [17.5, 0.059, -46, 25, 156, -0.085, -0.207, -0.546, 0.071, -58, 41, 0.011, 252, -3.5, -0.017, 1.532, -360, 0.157]
```

Car genome in a binary form

Let's move one step deeper (to the level of genes) and convert the decimal numbers of the car's genome to the binary format (to plain 1s and 0s).

I've described the process of converting floating-point numbers to binary numbers in detail in the Binary representation of floating-point numbers article. You might want to check it out if the code in this section is not clear.
Here is a quick example of how a floating-point number may be converted to a 16-bit binary number (again, feel free to read the article mentioned above first if the example is confusing):

In our case, to reduce the genome length, we will convert each floating-point coefficient to a non-standard 10-bit binary number (1 sign bit, 4 exponent bits, 5 fraction bits).

We have 18 coefficients in total, and every coefficient will be converted to a 10-bit number. It means that the car's genome will be an array of 0s and 1s with a length of 18 * 10 = 180 bits.

For example, for the genome in a decimal format that was mentioned above, its binary representation would look like this:

```typescript
type Gene = 0 | 1;

type Genome = Gene[];

const genome: Genome = [
  // Engine coefficients.
  0, 1, 0, 1, 1, 0, 0, 0, 1, 1, // <- 17.5
  0, 0, 0, 1, 0, 1, 1, 1, 0, 0, // <- 0.059
  1, 1, 1, 0, 0, 0, 1, 1, 1, 0, // <- -46
  0, 1, 0, 1, 1, 1, 0, 0, 1, 0, // <- 25
  0, 1, 1, 1, 0, 0, 0, 1, 1, 1, // <- 156
  1, 0, 0, 1, 1, 0, 1, 1, 0, 0, // <- -0.085
  1, 0, 1, 0, 0, 1, 0, 1, 0, 1, // <- -0.207
  1, 0, 1, 1, 0, 0, 0, 0, 1, 1, // <- -0.546
  0, 0, 0, 1, 1, 0, 0, 1, 0, 0, // <- 0.071

  // Wheels coefficients.
  1, 1, 1, 0, 0, 1, 1, 0, 1, 0, // <- -58
  0, 1, 1, 0, 0, 0, 1, 0, 0, 1, // <- 41
  0, 0, 0, 0, 0, 0, 1, 0, 1, 0, // <- 0.011
  0, 1, 1, 1, 0, 1, 1, 1, 1, 1, // <- 252
  1, 1, 0, 0, 0, 1, 1, 0, 0, 0, // <- -3.5
  1, 0, 0, 0, 1, 0, 0, 1, 0, 0, // <- -0.017
  0, 0, 1, 1, 1, 1, 0, 0, 0, 1, // <- 1.532
  1, 1, 1, 1, 1, 0, 1, 1, 0, 1, // <- -360
  0, 0, 1, 0, 0, 0, 1, 0, 0, 0, // <- 0.157
];
```

Oh my! The binary genome looks so cryptic. But can you imagine that these 180 zeros and ones alone define how the car behaves in the parking lot! It's like you hacked someone's DNA and know what each gene means exactly. Amazing!
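The article only needs the decoding direction (shown below), but for completeness, here is a sketch of how a decimal coefficient could be encoded into this custom 10-bit format. This floatToBits10() helper is hypothetical (it is not part of the simulator's code), and it glosses over edge cases such as exact zero, which this format cannot represent:

```typescript
type Bit = 0 | 1;

// Hypothetical encoder: packs a float into the custom 10-bit format
// (1 sign bit, 4 exponent bits, 5 fraction bits). Values outside the
// representable range get clamped/distorted.
const floatToBits10 = (value: number): Bit[] => {
  const signBit: Bit = value < 0 ? 1 : 0;
  const abs = Math.abs(value);
  const exponentBias = 7; // 2 ** (4 - 1) - 1

  // Unbiased exponent: abs / 2^exponent must land in [1, 2).
  let exponent = abs === 0 ? -exponentBias : Math.floor(Math.log2(abs));
  exponent = Math.max(-exponentBias, Math.min(exponent, exponentBias + 1));

  // Quantize the remaining fraction to 5 bits.
  let fractionInt = abs === 0 ? 0 : Math.round((abs / 2 ** exponent - 1) * 2 ** 5);
  if (fractionInt === 2 ** 5) { // Rounding overflowed into the next exponent.
    fractionInt = 0;
    exponent += 1;
  }

  const bits: Bit[] = [signBit];
  for (let i = 3; i >= 0; i -= 1) {
    bits.push((((exponent + exponentBias) >> i) & 1) as Bit);
  }
  for (let i = 4; i >= 0; i -= 1) {
    bits.push(((fractionInt >> i) & 1) as Bit);
  }
  return bits;
};

// Reproduces the first rows of the binary genome above.
console.log(floatToBits10(17.5).join('')); // → '0101100011'
console.log(floatToBits10(-46).join(''));  // → '1110001110'
```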
By the way, you may see the exact values of the genome and the coefficients for the best-performing car on the Evolution Simulator dashboard:

Here is the source code that performs the conversion from the binary to the decimal format for the floating-point numbers (the brain will need it to decode the genome and to produce the muscle signals based on the genome data):

```typescript
type Bit = 0 | 1;

type Bits = Bit[];

type PrecisionConfig = {
  signBitsCount: number,
  exponentBitsCount: number,
  fractionBitsCount: number,
  totalBitsCount: number,
};

type PrecisionConfigs = {
  custom: PrecisionConfig,
};

const precisionConfigs: PrecisionConfigs = {
  // Custom-made 10-bit precision for faster evolution progress.
  custom: {
    signBitsCount: 1,
    exponentBitsCount: 4,
    fractionBitsCount: 5,
    totalBitsCount: 10,
  },
};

// Converts the binary representation of a floating-point number to a decimal float number.
function bitsToFloat(bits: Bits, precisionConfig: PrecisionConfig): number {
  const { signBitsCount, exponentBitsCount } = precisionConfig;

  // Figuring out the sign.
  const sign = (-1) ** bits[0]; // -1^1 = -1, -1^0 = 1

  // Calculating the exponent value.
  const exponentBias = 2 ** (exponentBitsCount - 1) - 1;
  const exponentBits = bits.slice(signBitsCount, signBitsCount + exponentBitsCount);
  const exponentUnbiased = exponentBits.reduce(
    (exponentSoFar: number, currentBit: Bit, bitIndex: number) => {
      const bitPowerOfTwo = 2 ** (exponentBitsCount - bitIndex - 1);
      return exponentSoFar + currentBit * bitPowerOfTwo;
    },
    0,
  );
  const exponent = exponentUnbiased - exponentBias;

  // Calculating the fraction value.
  const fractionBits = bits.slice(signBitsCount + exponentBitsCount);
  const fraction = fractionBits.reduce(
    (fractionSoFar: number, currentBit: Bit, bitIndex: number) => {
      const bitPowerOfTwo = 2 ** -(bitIndex + 1);
      return fractionSoFar + currentBit * bitPowerOfTwo;
    },
    0,
  );

  // Putting all parts together to calculate the final number.
  return sign * (2 ** exponent) * (1 + fraction);
}

// Converts the 10-bit binary representation of a floating-point number to a decimal float number.
function bitsToFloat10(bits: Bits): number {
  return bitsToFloat(bits, precisionConfigs.custom);
}
```

Brain function working with the binary genome

Previously, our brain function was working with the decimal form of the engineCoefficients and wheelCoefficients polynomial coefficients directly. However, these coefficients are now encoded in the binary form of a genome. Let's add a decodeGenome() function that will extract the coefficients from the genome, and let's rewrite our brain functions:

```typescript
// Car has 8 distance sensors.
const CAR_SENSORS_NUM = 8;

// Additional formula coefficient that is not connected to a sensor.
const BIAS_UNITS = 1;

// How many genes we need to encode each numeric parameter for the formulas.
const GENES_PER_NUMBER = precisionConfigs.custom.totalBitsCount;

// Based on 8 distance sensors we need to provide two formulas that would define the car's behavior:
// 1. Engine formula (input: 8 sensors; output: -1 (backward), 0 (neutral), +1 (forward))
// 2. Wheels formula (input: 8 sensors; output: -1 (left), 0 (straight), +1 (right))
const ENGINE_FORMULA_GENES_NUM = (CAR_SENSORS_NUM + BIAS_UNITS) * GENES_PER_NUMBER;
const WHEELS_FORMULA_GENES_NUM = (CAR_SENSORS_NUM + BIAS_UNITS) * GENES_PER_NUMBER;

// The length of the binary genome of the car.
const GENOME_LENGTH = ENGINE_FORMULA_GENES_NUM + WHEELS_FORMULA_GENES_NUM;

type DecodedGenome = {
  engineFormulaCoefficients: Coefficients,
  wheelsFormulaCoefficients: Coefficients,
}
```
```typescript
// Converts the genome from the binary form to the decimal form.
const genomeToNumbers = (genome: Genome, genesPerNumber: number): number[] => {
  if (genome.length % genesPerNumber !== 0) {
    throw new Error('Wrong number of genes in the numbers genome');
  }
  const numbers: number[] = [];
  for (let numberIndex = 0; numberIndex < genome.length; numberIndex += genesPerNumber) {
    const number: number = bitsToFloat10(genome.slice(numberIndex, numberIndex + genesPerNumber));
    numbers.push(number);
  }
  return numbers;
};

// Converts the genome from the binary form to the decimal form
// and splits the genome into two sets of coefficients (one set per muscle).
const decodeGenome = (genome: Genome): DecodedGenome => {
  const engineGenes: Gene[] = genome.slice(0, ENGINE_FORMULA_GENES_NUM);
  const wheelsGenes: Gene[] = genome.slice(
    ENGINE_FORMULA_GENES_NUM,
    ENGINE_FORMULA_GENES_NUM + WHEELS_FORMULA_GENES_NUM,
  );

  const engineFormulaCoefficients: Coefficients = genomeToNumbers(engineGenes, GENES_PER_NUMBER);
  const wheelsFormulaCoefficients: Coefficients = genomeToNumbers(wheelsGenes, GENES_PER_NUMBER);

  return {
    engineFormulaCoefficients,
    wheelsFormulaCoefficients,
  };
};

// Updated brain function for the engine muscle.
export const getEngineMuscleSignal = (genome: Genome, sensors: Sensors): MuscleSignal => {
  const { engineFormulaCoefficients: coefficients } = decodeGenome(genome);
  const rawBrainSignal = linearPolynomial(coefficients, sensors);
  return brainToMuscleSignal(rawBrainSignal);
};

// Updated brain function for the wheels muscle.
export const getWheelsMuscleSignal = (genome: Genome, sensors: Sensors): MuscleSignal => {
  const { wheelsFormulaCoefficients: coefficients } = decodeGenome(genome);
  const rawBrainSignal = linearPolynomial(coefficients, sensors);
  return brainToMuscleSignal(rawBrainSignal);
};
```

Self-parking car problem statement

☝🏻 So, finally, we've got to the point where the high-level problem of making the car park itself is broken down into the straightforward optimization problem of finding the optimal combination of 180 ones and zeros (finding a "good enough" car genome). Sounds simple, doesn't it?

Naive approach

We could approach the problem of finding the "good enough" genome in a naive way and try out all possible combinations of genes:

1. [0, ..., 0, 0], and then...
2. [0, ..., 0, 1], and then...
3. [0, ..., 1, 0], and then...
4. [0, ..., 1, 1], and then...
5. ...

But let's do some math. With 180 bits, and with each bit being equal either to 0 or 1, we would have 2^180 (or ≈1.53 * 10^54) possible combinations. Let's say we would need to give 15s to each car to see if it parks successfully or not. Let's also say that we may run a simulation for 10 cars at once. Then we would need 15 * (1.53 * 10^54) / 10 = 2.29 * 10^54 seconds, which is about 7.3 * 10^46 years. Pretty long waiting time. Just as a side thought, only ≈2.021 * 10^3 years have passed since Christ was born.

Genetic approach

We need a faster algorithm to find the optimal value of the genome. This is where the genetic algorithm comes to the rescue. We might not find the best possible genome, but there is a chance that we may find a close-to-optimal one. And, more importantly, we don't need to wait that long. With the Evolution Simulator, I was able to find a pretty good genome within 24 hours.
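The naive-approach arithmetic above is easy to sanity-check:

```typescript
// 180 bits, each either 0 or 1.
const combinations = 2 ** 180;

// 15 seconds per car, simulating 10 cars at once.
const totalSeconds = (combinations * 15) / 10;

// Seconds in a (365-day) year: 60 * 60 * 24 * 365.
const years = totalSeconds / (60 * 60 * 24 * 365);

console.log(combinations.toExponential(2)); // → 1.53e+54
console.log(years.toExponential(2));        // → 7.29e+46
```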
Genetic algorithm basics

Genetic algorithms (GA) are inspired by the process of natural selection and are commonly used to generate high-quality solutions to optimization problems by relying on biologically inspired operators such as crossover, mutation, and selection.

The problem of finding the "good enough" combination of genes for the car looks like an optimization problem, so there is a good chance that GA will help us here.

We're not going to cover the genetic algorithm in all details, but on a high level, here are the basic steps that we will need to do:

1. CREATE – the very first generation of cars can't come out of nothing, so we will generate a set of random car genomes (a set of binary arrays with the length of 180) at the very beginning. For example, we may create ~1000 cars. With a bigger population, the chances to find the optimal solution (and to find it faster) increase.

2. SELECT – we will need to select the fittest individuums from the current generation for further mating (see the next step). The fitness of each individuum will be defined based on the fitness function, which in our case will show how close the car has approached the target parking spot. The closer the car is to the parking spot, the fitter it is.

3. MATE – simply saying, we will allow the selected "♂ father-cars" to have "sex" with the selected "♀ mother-cars" so that their genomes could mix in a ~50/50 proportion and produce "♂♀ children-cars" genomes. The idea is that the children cars might get better (or worse) at self-parking by taking the best (or the worst) bits from their parents.

4. MUTATE – during the mating process, some genes may randomly mutate (1s and 0s in the child genome may flip). This may bring a wider variety of children's genomes and, thus, a wider variety of children cars' behavior. Imagine that the 1st bit was accidentally set to 0 for all ~1000 cars. The only way to try a car with the 1st bit set to 1 is through random mutations. At the same time, extensive mutations may ruin healthy genomes.

5. Go to "Step 2" unless the number of generations has reached its limit (i.e. 100 generations have passed) or unless the top-performing individuums have reached the expected fitness function value (i.e. the best car has approached the parking spot closer than 1 meter). Otherwise, quit.

Evolving the car's brain using a genetic algorithm

Before launching the genetic algorithm, let's go and create the functions for the "CREATE", "SELECT", "MATE", and "MUTATE" steps of the algorithm.

Functions for the CREATE step

The createGeneration() function will create an array of random genomes (a.k.a. a population or a generation) and will accept two parameters:

- generationSize - defines the size of the generation. This generation size will be preserved from generation to generation.
- genomeLength - defines the genome length of each individuum in the cars' population. In our case, the length of the genome will be 180.

There is a 50/50 chance for each gene of a genome to be either 0 or 1.

```typescript
type Generation = Genome[];

type GenerationParams = {
  generationSize: number,
  genomeLength: number,
};

function createGenome(length: number): Genome {
  return new Array(length)
    .fill(null)
    .map(() => (Math.random() < 0.5 ? 0 : 1));
}

function createGeneration(params: GenerationParams): Generation {
  const { generationSize, genomeLength } = params;
  return new Array(generationSize)
    .fill(null)
    .map(() => createGenome(genomeLength));
}
```

Functions for the MUTATE step

The mutate() function will mutate some genes randomly based on the mutationProbability value.

For example, if mutationProbability = 0.1, then there is a 10% chance for each gene to be mutated. Let's say we have a genome of length 10 that looks like [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]; then after the mutation, there is a chance that 1 gene will be mutated, and we may get a genome that looks like [0, 0, 0, 1, 0, 0, 0, 0, 0, 0].
```typescript
// A number between 0 and 1.
type Probability = number;

// @see: https://en.wikipedia.org/wiki/Mutation_(genetic_algorithm)
function mutate(genome: Genome, mutationProbability: Probability): Genome {
  for (let geneIndex = 0; geneIndex < genome.length; geneIndex += 1) {
    const gene: Gene = genome[geneIndex];
    const mutatedGene: Gene = gene === 0 ? 1 : 0;
    genome[geneIndex] = Math.random() < mutationProbability ? mutatedGene : gene;
  }
  return genome;
}
```

Functions for the MATE step

The mate() function will accept the father and the mother genomes, and will produce two children. We will imitate the real-world scenario and also do the mutation during the mating.

Each bit of a child's genome will be defined based on the value of the corresponding bit of the father's or the mother's genome. There is a 50/50% probability that the child will inherit the bit from the father or from the mother. For example, let's say we have genomes of length 4 (for simplicity reasons):

```
Father's genome: [0, 0, 1, 1]
Mother's genome: [0, 1, 0, 1]
                  ↓  ↓  ↓  ↓
Possible kid #1: [0, 1, 1, 1]
Possible kid #2: [0, 0, 1, 1]
```

In the example above, the mutation was not taken into account.

Here is the function implementation:

```typescript
// Performs Uniform Crossover: each bit is chosen from either parent with equal probability.
// @see: https://en.wikipedia.org/wiki/Crossover_(genetic_algorithm)
function mate(
  father: Genome,
  mother: Genome,
  mutationProbability: Probability,
): [Genome, Genome] {
  if (father.length !== mother.length) {
    throw new Error('Cannot mate different species');
  }

  const firstChild: Genome = [];
  const secondChild: Genome = [];

  // Conceive children.
  for (let geneIndex = 0; geneIndex < father.length; geneIndex += 1) {
    firstChild.push(
      Math.random() < 0.5 ? father[geneIndex] : mother[geneIndex]
    );
    secondChild.push(
      Math.random() < 0.5 ? father[geneIndex] : mother[geneIndex]
    );
  }

  return [
    mutate(firstChild, mutationProbability),
    mutate(secondChild, mutationProbability),
  ];
}
```

Functions for the SELECT step

To select the fittest individuums for further mating, we need a way to find out the fitness of each genome. To do this, we will use a so-called fitness function.

The fitness function is always related to the particular task that we're trying to solve, and it is not generic. In our case, the fitness function will measure the distance between the car and the parking spot. The closer the car is to the parking spot, the fitter it is. We will implement the fitness function a bit later, but for now, let's introduce the interface for it:

```typescript
type FitnessFunction = (genome: Genome) => number;
```

Now, let's say we have the fitness value for each individuum in the population. Let's also say that we've sorted all the individuums by their fitness values so that the first individuums are the strongest ones. How should we select the fathers and the mothers from this array? We need to do the selection in such a way that the higher the fitness value of an individuum, the higher the chances of this individuum being selected for mating. The weightedRandom() function will help us with this.
```typescript
// Picks a random item based on the item's weight.
// Items with a higher weight will be picked more often.
const weightedRandom = <T>(items: T[], weights: number[]): { item: T, index: number } => {
  if (items.length !== weights.length) {
    throw new Error('Items and weights must be of the same size');
  }

  // Preparing the cumulative weights array.
  // For example:
  // - weights = [1, 4, 3]
  // - cumulativeWeights = [1, 5, 8]
  const cumulativeWeights: number[] = [];
  for (let i = 0; i < weights.length; i += 1) {
    cumulativeWeights[i] = weights[i] + (cumulativeWeights[i - 1] || 0);
  }

  // Getting a random number in the range [0...sum(weights)].
  // For example:
  // - weights = [1, 4, 3]
  // - maxCumulativeWeight = 8
  // - range for the random number is [0...8]
  const maxCumulativeWeight = cumulativeWeights[cumulativeWeights.length - 1];
  const randomNumber = maxCumulativeWeight * Math.random();

  // Picking a random item based on its weight.
  // Items with a higher weight will be picked more often.
  for (let i = 0; i < items.length; i += 1) {
    if (cumulativeWeights[i] >= randomNumber) {
      return {
        item: items[i],
        index: i,
      };
    }
  }
  return {
    item: items[items.length - 1],
    index: items.length - 1,
  };
};
```

The usage of this function is pretty straightforward. Let's say you really like bananas and want to eat them more often than strawberries. Then you may call const fruit = weightedRandom(['banana', 'strawberry'], [9, 1]), and in ≈9 out of 10 cases the fruit variable will be equal to banana, and only in ≈1 out of 10 cases will it be equal to strawberry.

To avoid losing the best individuums (let's call them champions) during the mating process, we may also introduce a so-called longLivingChampionsPercentage parameter. For example, if longLivingChampionsPercentage = 10, then 10% of the best cars from the previous population will be carried over to the new generation. You may think of them as long-living individuums that get to see their children and even grandchildren.
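The banana/strawberry claim is easy to verify empirically. The snippet below re-states weightedRandom() (same logic as above) so that it is self-contained, and counts how often each item gets drawn:

```typescript
// Same logic as the weightedRandom() shown above, condensed to be self-contained.
const weightedRandom = <T>(items: T[], weights: number[]): { item: T, index: number } => {
  const cumulativeWeights: number[] = [];
  for (let i = 0; i < weights.length; i += 1) {
    cumulativeWeights[i] = weights[i] + (cumulativeWeights[i - 1] || 0);
  }
  const randomNumber = cumulativeWeights[cumulativeWeights.length - 1] * Math.random();
  for (let i = 0; i < items.length; i += 1) {
    if (cumulativeWeights[i] >= randomNumber) {
      return { item: items[i], index: i };
    }
  }
  return { item: items[items.length - 1], index: items.length - 1 };
};

// Draw 10000 fruits and count the bananas.
let bananas = 0;
const draws = 10000;
for (let i = 0; i < draws; i += 1) {
  if (weightedRandom(['banana', 'strawberry'], [9, 1]).item === 'banana') {
    bananas += 1;
  }
}

console.log(bananas / draws); // ≈ 0.9
```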
Here is the actual implementation of the select() function:

```typescript
// A number between 0 and 100.
type Percentage = number;

type SelectionOptions = {
  mutationProbability: Probability,
  longLivingChampionsPercentage: Percentage,
};

// @see: https://en.wikipedia.org/wiki/Selection_(genetic_algorithm)
function select(
  generation: Generation,
  fitness: FitnessFunction,
  options: SelectionOptions,
) {
  const {
    mutationProbability,
    longLivingChampionsPercentage,
  } = options;

  const newGeneration: Generation = [];

  const oldGeneration = [...generation];
  // First one - the fittest one.
  oldGeneration.sort((genomeA: Genome, genomeB: Genome): number => {
    const fitnessA = fitness(genomeA);
    const fitnessB = fitness(genomeB);
    if (fitnessA < fitnessB) {
      return 1;
    }
    if (fitnessA > fitnessB) {
      return -1;
    }
    return 0;
  });

  // Let the long-living champions continue to live in the new generation.
  const longLiversCount = Math.floor(longLivingChampionsPercentage * oldGeneration.length / 100);
  if (longLiversCount) {
    oldGeneration.slice(0, longLiversCount).forEach((longLivingGenome: Genome) => {
      newGeneration.push(longLivingGenome);
    });
  }

  // Get the data about the fitness of each individuum.
  const fitnessPerOldGenome: number[] = oldGeneration.map((genome: Genome) => fitness(genome));

  // Populate the next generation until it becomes the same size as the old generation.
  while (newGeneration.length < generation.length) {
    // Select a random father and mother from the population.
    // The fittest individuums have higher chances to be selected.
    let father: Genome | null = null;
    let fatherGenomeIndex: number | null = null;
    let mother: Genome | null = null;
    let motherGenomeIndex: number | null = null;

    // To produce children, the father and mother need each other.
    // It must be two different individuums.
    while (!father || !mother || fatherGenomeIndex === motherGenomeIndex) {
      const {
        item: randomFather,
        index: randomFatherGenomeIndex,
      } = weightedRandom<Genome>(oldGeneration, fitnessPerOldGenome);

      const {
        item: randomMother,
        index: randomMotherGenomeIndex,
      } = weightedRandom<Genome>(oldGeneration, fitnessPerOldGenome);

      father = randomFather;
      fatherGenomeIndex = randomFatherGenomeIndex;

      mother = randomMother;
      motherGenomeIndex = randomMotherGenomeIndex;
    }

    // Let the father and mother produce two children.
    const [firstChild, secondChild] = mate(father, mother, mutationProbability);

    newGeneration.push(firstChild);

    // Depending on the number of long-living champions, it is possible that
    // there will be a place for only one child, sorry.
    if (newGeneration.length < generation.length) {
      newGeneration.push(secondChild);
    }
  }

  return newGeneration;
}
```

Fitness function

The fitness of the car will be defined by the distance from the car to the parking spot. The greater the distance, the lower the fitness.

The final distance we will calculate is an average of the 4 distances from the car's wheels to the corresponding 4 corners of the parking spot. This distance we will call the loss, and it is inversely proportional to the fitness.

Calculating the distance between each wheel and each corner separately (instead of just calculating the distance from the car's center to the parking spot's center) will make the car preserve the proper orientation relative to the parking spot.

The distance between two points in space will be calculated based on the Pythagorean theorem like this:

```typescript
type NumVec3 = [number, number, number];

// Calculates the XZ distance between two points in space.
// The vertical Y distance is not taken into account.
const euclideanDistance = (from: NumVec3, to: NumVec3) => {
  const fromX = from[0];
  const fromZ = from[2];
  const toX = to[0];
  const toZ = to[2];
  return Math.sqrt((fromX - toX) ** 2 + (fromZ - toZ) ** 2);
};
```

The distance (the loss) between the car and the parking spot will be calculated like this:

```typescript
type RectanglePoints = {
  fl: NumVec3, // Front-left
  fr: NumVec3, // Front-right
  bl: NumVec3, // Back-left
  br: NumVec3, // Back-right
};

type GeometricParams = {
  wheelsPosition: RectanglePoints,
  parkingLotCorners: RectanglePoints,
};

const carLoss = (params: GeometricParams): number => {
  const { wheelsPosition, parkingLotCorners } = params;

  const { fl: flWheel, fr: frWheel, br: brWheel, bl: blWheel } = wheelsPosition;
  const { fl: flCorner, fr: frCorner, br: brCorner, bl: blCorner } = parkingLotCorners;

  const flDistance = euclideanDistance(flWheel, flCorner);
  const frDistance = euclideanDistance(frWheel, frCorner);
  const brDistance = euclideanDistance(brWheel, brCorner);
  const blDistance = euclideanDistance(blWheel, blCorner);

  return (flDistance + frDistance + brDistance + blDistance) / 4;
};
```

Since the fitness should be inversely proportional to the loss, we'll calculate it like this:

```typescript
const carFitness = (params: GeometricParams): number => {
  const loss = carLoss(params);
  // Adding +1 to avoid division by zero.
  return 1 / (loss + 1);
};
```

You may see the fitness and the loss values for a specific genome and for the current car position on the Evolution Simulator dashboard:

Launching the evolution

Let's put the evolution functions together. We're going to "create the world", launch the evolution loop, make the time go by, the generations evolve, and the cars learn how to park.

To get the fitness values of each car, we need to run a simulation of the cars' behavior in a virtual 3D world. The Evolution Simulator does exactly that - it runs the code below in the simulator, which is made with Three.js:
const GENERATION_SIZE = 1000; const LONG_LIVING_CHAMPIONS_PERCENTAGE = 6; const MUTATION_PROBABILITY = 0.04; const MAX_GENERATIONS_NUM = 40; // Fitness function. // It is like an annual doctor's checkup for the cars. const carFitnessFunction = (genome: Genome): number => { // The evolution simulator calculates and stores the fitness values for each car in the fitnessValues map. // Here we will just fetch the pre-calculated fitness value for the car in current generation. const genomeKey = genome.join(''); return fitnessValues[genomeKey]; }; // Creating the "world" with the very first cars generation. let generationIndex = 0; let generation: Generation = createGeneration({ generationSize: GENERATION_SIZE, genomeLength: GENOME_LENGTH, // <- 180 genes }); // Starting the "time". while(generationIndex < MAX_GENERATIONS_NUM) { // SIMULATION IS NEEDED HERE to pre-calculate the fitness values. // Selecting, mating, and mutating the current generation. generation = select( generation, carFitnessFunction, { mutationProbability: MUTATION_PROBABILITY, longLivingChampionsPercentage: LONG_LIVING_CHAMPIONS_PERCENTAGE, }, ); // Make the "time" go by. generationIndex += 1; } // Here we may check the fittest individuum of the latest generation. const fittestCar = generation[0]; After running the function, the array is sorted by the fitness values in descending order. Therefore, the fittest car will always be the first car in the array. select() generation of cars with random genomes will behave something like this: The 1st generation the cars start learning what the self-parking is and start getting closer to the parking spot: On the ≈40th generation, Another example with a bit more challenging starting point: The cars are hitting some other cars along the way, and also are not perfectly fitting the parking spot, but this is only the 40th generation since the creation of the world for them, so you may give the cars some more time to learn. 
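The select() function above relies on a weightedRandom() helper that picks individuums with a probability proportional to their fitness (so-called roulette-wheel selection). It isn't shown in this article, so here is a minimal sketch of how such a helper might look; this is an illustration under the assumed signature, not necessarily the exact implementation from the repository:

```typescript
// Fitness-proportional (roulette-wheel) random picker.
// Illustrative sketch: the actual helper in the Evolution Simulator
// repository may be implemented differently.
const weightedRandom = <T>(items: T[], weights: number[]): { item: T, index: number } => {
  // Total length of the "roulette wheel" (i.e. the sum of all fitness values).
  const totalWeight = weights.reduce((sum, weight) => sum + weight, 0);

  // Pick a random point on the [0, totalWeight) segment.
  let randomPoint = Math.random() * totalWeight;

  // Walk through the items, subtracting each item's weight from the random
  // point. The item within whose weight segment the point falls is the one
  // being selected. Fitter items occupy longer segments of the wheel and
  // thus get selected more often.
  for (let index = 0; index < items.length; index += 1) {
    randomPoint -= weights[index];
    if (randomPoint < 0) {
      return { item: items[index], index };
    }
  }

  // Floating-point safety net: fall back to the last item.
  return { item: items[items.length - 1], index: items.length - 1 };
};
```

For example, with weights [1, 3] the second item would be picked roughly three times more often than the first one, which is exactly the selection pressure the genetic algorithm needs.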
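To sanity-check the loss and fitness math from the previous section, here is a small worked example with made-up coordinates (the functions are re-stated in a compact form equivalent to the ones above). Every wheel is shifted by 3m along X and 4m along Z from its parking spot corner, so each wheel is exactly 5m away (a 3-4-5 triangle), the loss is 5, and the fitness is 1 / (5 + 1) ≈ 0.167:

```typescript
type NumVec3 = [number, number, number];

// XZ distance between two points (the Y coordinate is ignored).
const euclideanDistance = (from: NumVec3, to: NumVec3) =>
  Math.sqrt((from[0] - to[0]) ** 2 + (from[2] - to[2]) ** 2);

type RectanglePoints = {
  fl: NumVec3, fr: NumVec3, bl: NumVec3, br: NumVec3,
};

type GeometricParams = {
  wheelsPosition: RectanglePoints,
  parkingLotCorners: RectanglePoints,
};

// Average wheel-to-corner distance (same math as carLoss above).
const carLoss = ({ wheelsPosition, parkingLotCorners }: GeometricParams): number => {
  const distances = (['fl', 'fr', 'bl', 'br'] as const).map(
    (key) => euclideanDistance(wheelsPosition[key], parkingLotCorners[key]),
  );
  return distances.reduce((sum, distance) => sum + distance, 0) / distances.length;
};

const carFitness = (params: GeometricParams): number => 1 / (carLoss(params) + 1);

// Hypothetical 2m x 4m parking spot and a car whose every wheel is
// offset by (3, 0, 4) from the corresponding corner.
const parkingLotCorners: RectanglePoints = {
  fl: [0, 0, 0], fr: [2, 0, 0], bl: [0, 0, 4], br: [2, 0, 4],
};
const wheelsPosition: RectanglePoints = {
  fl: [3, 0, 4], fr: [5, 0, 4], bl: [3, 0, 8], br: [5, 0, 8],
};

const loss = carLoss({ wheelsPosition, parkingLotCorners });       // 5
const fitness = carFitness({ wheelsPosition, parkingLotCorners }); // ≈ 0.1667
```

A perfectly parked car (every wheel exactly on its corner) would have a loss of 0 and thus the maximum possible fitness of 1.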
From generation to generation we may see how the loss values are going down (which means the fitness values are going up). The P50 Avg Loss shows the average loss value (the average distance from the cars to the parking spot) of the 50% fittest cars. The Min Loss shows the loss value of the fittest car in each generation.

You may see that, on average, the 50% fittest cars of each generation are learning to get closer to the parking spot (from 5.5m away from the parking spot to 3.5m in 35 generations). The trend for the Min Loss values is less obvious (from 1m to 0.5m with some noise), however from the animations above you may see that the cars have learned some basic parking moves.

Conclusion

In this article, we've broken down the high-level task of creating the self-parking car to the straightforward low-level task of finding the optimal combination of 180 ones and zeroes (finding the optimal car genome).

Then we've applied the genetic algorithm to find the optimal car genome. It allowed us to get pretty good results in several hours of simulation (instead of many years of running the naive approach).

You may launch the 🚕 Self-parking Car Evolution Simulator to see the evolution process directly in your browser. The simulator gives you the following opportunities:

- You may train the cars from scratch and adjust genetic parameters by yourself
- You may see the trained self-parking cars in action
- You may also try to park the car manually

The full genetic source code that was shown in this article may also be found in the Evolution Simulator repository. If you are one of those folks who will actually count and check the number of lines to make sure there are less than 500 of them (excluding tests), please feel free to check the code here.

There are still some unresolved issues with the code and the simulator:

The car's brain is oversimplified, and it uses linear equations instead of, let's say, neural networks.
It makes the car not adaptable to new surroundings or to new parking lot types.

We don't decrease the car's fitness value when the car hits another car. Therefore the car doesn't "feel" any guilt about causing a road accident.

The evolution simulator is not stable: the same car genome may produce different fitness values, which makes the evolution less efficient.

The evolution simulator is also very heavy in terms of performance, which slows down the evolution progress, since we can't train, let's say, 1000 cars at once.

Also, the Evolution Simulator requires the browser tab to be open and active to perform the simulation.

And more...

However, the purpose of this article was to have some fun while learning how the genetic algorithm works, and not to build production-ready self-parking Teslas. So, even with the issues mentioned above, I hope you've had a good time going through the article.

Also published on https://trekhleb.dev/blog/2021/self-parking-car-evolution/.