It occurs to me that using a random number to select the parents, like a gambling roulette, might be workable. Let me explain it with an example: finding the maximum value of a function.
Generate n random individuals and calculate their function values. Call individual j "Xj", and call its function value f(Xj). Find the maximum function value and call it maxValue. The fitness of individual j is then f(Xj)/maxValue, which we call g(Xj). Calculate the fitness of every individual (individuals whose fitness is less than 0 are abandoned). The classic way to select parents is the gambling roulette: the chance of selecting Xj and Xk is g(Xj)*g(Xk)/[g(X1)+g(X2)+...+g(Xn)]^2. My idea is:
1. Randomly pick two individuals Xj and Xk.
2. Generate a random number rn in the range 0~1.
3. If rn is less than both g(Xj) and g(Xk) (the fitnesses of Xj and Xk), they are allowed to reproduce; then crossover and mutate.

Repeat steps 1-3 until the child generation is full. This way, the chance of selecting Xj and Xk is g(Xj)*g(Xk)/n^2, which is similar to the gambling roulette. Considering that the denominators of both probabilities are constants, the two are equivalent in a certain way.
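For concreteness, the fitness normalization described above might look roughly like the sketch below; the field x and the helper f(...) are placeholders I am using only for illustration, while individualArray, generation, and fitnessValue match the snippet that follows.

// Sketch of the normalization g(Xj) = f(Xj) / maxValue for one generation.
// 'x' and 'f' are placeholders; individuals with negative fitness are abandoned later.
double maxValue = Double.NEGATIVE_INFINITY;
for (Individual ind : individualArray[generation]) {
    maxValue = Math.max(maxValue, f(ind.x)); // find the largest function value in this generation
}
for (Individual ind : individualArray[generation]) {
    ind.fitnessValue = f(ind.x) / maxValue; // the best individual gets fitness 1
}

The selection and reproduction step of my implementation then looks like this: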
double randomNumToJudge = Math.random(); // generate a random number to judge against the fitness
int randomMother = (int) (Math.random() * 1000);
int randomFather = (int) (Math.random() * 1000); // randomly pick two parents
if (randomNumToJudge <= individualArray[generation][randomFather].fitnessValue
        && randomNumToJudge <= individualArray[generation][randomMother].fitnessValue) {
    // If the number is less than both parents' fitnesses, they are permitted to reproduce.
    Individual childIndividual = individualArray[generation][randomFather].crossOverAndMutate(
            individualArray[generation][randomFather], individualArray[generation][randomMother]);
    // Crossover and mutate to generate the child individual.
    individualArray[generation + 1][counter] = childIndividual; // add childIndividual to the array
    counter++; // count of individuals in the child generation
}
I have tested this approach in Java. The function is x + 10sin(5x) + 7cos(4x), x∈[0,10). I ran 100 generations with 1000 individuals per generation.
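For reference, the objective function can be written as:

// f(x) = x + 10*sin(5x) + 7*cos(4x) on the interval [0, 10)
static double f(double x) {
    return x + 10 * Math.sin(5 * x) + 7 * Math.cos(4 * x);
}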
The results are correct. In one execution, in the 100th generation, the best individual was 7.856744175554171 and the best function value was 24.855362868957645. I have tested it 10 times; every result was accurate to 10 decimal places by the 100th generation.
So, is this approach workable? Has it already been thought of by others?
Any comments are appreciated ^@^
PS: Pardon my poor English -_-
From your fitness definition f(Xj)/maxValue, I am assuming your target fitness is 1. Your algorithm will likely never fully converge (settle on a local optimum). This is because your random value range (0 to 1) does not change even when your fitnesses do.
Note that this does not mean better fitnesses are not created; they will be. But the speed at which better fitnesses are created will drop sharply, because you keep checking fitnesses against a fixed random threshold drawn from 0 to 1.
Consider this example where all fitnesses have converged to be high:
[0.95555, 0.98888, 0.92345, 0.92366]
Here, all values are very likely to satisfy randomNumToJudge <= fitness. This means every value is almost equally likely to be chosen as a parent. You do not want this - you want the best values to have a higher chance of being chosen.
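To make this concrete: for a uniform random number in the range 0 to 1, the chance that an individual passes the check randomNumToJudge <= fitness is simply its fitness value, so every individual above would pass roughly 92-99% of the time:

// P(randomNumToJudge <= g) = g for a uniform random number in [0, 1),
// so with converged fitnesses almost every candidate passes the check.
double[] converged = {0.95555, 0.98888, 0.92345, 0.92366};
for (double g : converged) {
    System.out.printf("fitness %.5f -> chance of passing the check: %.1f%%%n", g, 100 * g);
}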
Your algorithm could be amended to converge properly if you set randomNumToJudge to have a range of (median fitness in the population) to 1, though this still is not optimal.
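That amendment could be sketched like this (medianFitness is a name I am introducing; the rest reuses the variables from your snippet):

// Draw the threshold from [medianFitness, 1) instead of [0, 1), so only
// above-median individuals can pass the acceptance test.
double[] sorted = new double[individualArray[generation].length];
for (int i = 0; i < sorted.length; i++) {
    sorted[i] = individualArray[generation][i].fitnessValue;
}
java.util.Arrays.sort(sorted);
double medianFitness = sorted[sorted.length / 2];
double randomNumToJudge = medianFitness + Math.random() * (1.0 - medianFitness);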
Alternative Method
I recommend implementing the classic roulette wheel method.
The roulette wheel method assigns each individual a probability of being chosen as a parent based on how "fit" it is. Essentially, the greater the fitness, the bigger the slice of the wheel the individual occupies, and the higher the chance that a random number will land on its position on the wheel.
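Here is a self-contained sketch of roulette-wheel selection over an array of fitness values (the class and method names are mine, and negative fitnesses are assumed to have been filtered out already):

import java.util.Random;

class RouletteWheel {
    private static final Random RNG = new Random();

    // Returns the index of the selected individual. Each index i is chosen
    // with probability fitness[i] / sum(fitness), i.e. its slice of the wheel.
    static int pickParentIndex(double[] fitness) {
        double total = 0;
        for (double g : fitness) {
            total += g; // assumes all fitnesses are non-negative
        }
        double spin = RNG.nextDouble() * total; // a random position on the wheel
        double cumulative = 0;
        for (int i = 0; i < fitness.length; i++) {
            cumulative += fitness[i];
            if (spin < cumulative) {
                return i;
            }
        }
        return fitness.length - 1; // guard against floating-point rounding
    }
}

Picking the mother and the father with two independent calls to pickParentIndex gives exactly the g(Xj)*g(Xk)/[g(X1)+...+g(Xn)]^2 pairing probability mentioned in the question, while keeping the chance of selection proportional to fitness.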