Now let's think about the two basic operators in GA: crossover and mutation. For the crossover, since each individual is represented by a single binary string in its chromosome, we'll use its simplest form: single-point crossover. The procedure is very simple: having chosen one random point, we slice each individual into two parts and exchange those parts to generate two new individuals. The function takes as parameters two crom structs representing the parents and two pointers to crom structs representing the two children.

--- insert into galib.c -----
void crossover(crom par1, crom par2, crom *son1, crom *son2){
    int point, i;

    /* Let's get a point between 0 and BITS */
    point = random(0, BITS);

    /* In the first part we copy the genes
     * from parent 1 to child 1
     * and from parent 2 to child 2 */
    for(i = 0; i < point; ++i){
        son1->cromo[i] = par1.cromo[i];
        son2->cromo[i] = par2.cromo[i];
    }

    /* Here we'll "cross" their chromosomes,
     * now copying from parent 2 to child 1
     * and from parent 1 to child 2 */
    for(; i < BITS; ++i){
        son1->cromo[i] = par2.cromo[i];
        son2->cromo[i] = par1.cromo[i];
    }
}
---------cut here------------

And for the mutation we select a single point and flip its bit:

--- insert into galib.c -----
void mutation(crom *cromo){
    int point;

    /* Let's get a point between 0 and BITS */
    point = random(0, BITS);

    /* just invert the bit at the chosen point (bless the binary system) */
    cromo->cromo[point] = !cromo->cromo[point];
}
--------cut here-------------

Random tip: before we go on, a note about the C rand() function. When required to get a random number in a given range, most people tend to use the modulus operator (%). But on some machines and with some compilers this loses some of the randomness; a better way to implement the function is shown next:
------ insert into galib.c -----
int random(int start, int end){
    /* scale rand() into [0,1) first, then into the desired range.
       Note the cast before adding 1: this avoids integer overflow
       when RAND_MAX equals INT_MAX */
    return (int)((((double)rand() / ((double)RAND_MAX + 1.0)) * end) + start);
}
------------cut here------------
More information here.

Now let's work on the function that will help us select who is going to reproduce and, later, who stays alive into the next generation. For selection in this case, which is not a real-time application (in my next article/tutorial I hope to cover a case like that), I'd rather use Roulette Selection, for it converges more slowly than the elitism model and doesn't get trapped in a local optimum as easily.

The roulette works by calculating the probability of an individual being selected based on its fitness and the sum of all fitnesses in the population. So first let's write a function to calculate this probability:

-------- insert into galib.c ------
void probability(crom *pop, int size_pop){
    int i;
    double sum = 0.0, prob;
    int prob_sum = 0;

    /* Let's calculate the sum of all fitnesses */
    for(i = 0; i < size_pop; ++i){
        sum += pop[i].fitness;
    }

    /* now for each one we divide its fitness by the
     * sum of all and multiply by 100, resulting in
     * its percentage */
    for(i = 0; i < size_pop; ++i){
        prob = (double)(100 * pop[i].fitness) / sum;
        pop[i].prob = iround(prob);

        /* just in case we want to see if we really have 100%
           (and rarely we do) */
        prob_sum += iround(prob);
    }
}
------------- cut here -------------

The iround function uses a round-half-to-even algorithm ("banker's rounding") for rounding numbers, so the sum of all probabilities comes out close to 100%. You can see this function in the source code package.

The select function takes as parameters the population to select from, the number of individuals to select, and a pointer to the crom structs where the chosen ones will be stored. We also use a trick to slow down evolution when there's an individual much better than the others: we spin the roulette several times, trying to select someone not yet selected. So here is the code:

------ insert into galib.c -----
void select(crom *pop, int size_pop, int n_selections, crom *result){
    int i, j, k, choice, theone, tries = 0;
    char *h_pop;

    /* Let's use this dynamic array to avoid choosing the same individual twice */
    h_pop = (char *) malloc(sizeof(char) * size_pop);

    /* initialize it to 0, where 0 is not chosen and 1 is already chosen */
    memset((void *)h_pop, 0, size_pop);

    for(i = 0; i < n_selections; ++i){
        tries = 0;
        do{
            j = 0;
            theone = 0;

            /* 1 to 100 percent */
            choice = random(1, 100);

            /* sum the probabilities until we reach the percentage randomly chosen */
            while(theone < choice && j < size_pop)
                theone += pop[j++].prob;

            /* get back to the chosen one */
            --j;

            /* after the loop, j will hold the index of the chosen one,
             * but in case we have gone past the limit ... */
            j = j % size_pop;
            if(j < 0)
                j = 0;

        /* loop until we choose someone not chosen before,
         * or we have tried more than 20 times */
        }while(h_pop[j] && tries++ < 20);

        /* this one is now chosen */
        h_pop[j] = 1;

        /* do the copy dance */
        for(k = 0; k < BITS; ++k)
            result[i].cromo[k] = pop[j].cromo[k];

        /* only the fitness will be copied,
         * for the probability will be different */
        result[i].fitness = pop[j].fitness;
    }

    /* let's not waste memory */
    free(h_pop);
}
-------------cut here-----------

Well, all the parts of a simple GA system are ready to run; let's put the pieces together:

------- insert into galib.c -------
void ga_solve(){
    int i, j, k, gen;
    int vals[6];

    /* declaration of the population, their children,
     * an auxiliary variable and the best individual ever */
    crom pop[POP], children[CHILDREN], temp[POP+CHILDREN], best;

    /* in the beginning the best individual is the 0 one, so...
     * no need to set any more parameters since we'll just
     * use the fitness for comparison */
    best.fitness = 0;

    /* Let's generate the initial population at random
       (or pseudo-random as you wish) */
    for(i = 0; i < POP; ++i){
        for(j = 0; j < BITS; ++j){
            pop[i].cromo[j] = random(0, 2);
        }
    }

    gen = 0;

    /* repeat the reproduction steps until the max number of generations is reached */
    while(gen++ < GENS){

        /* First, let's see how good they are ... */
        for(i = 0; i < POP; ++i)
            evaluate(&pop[i]);

        /* ... and what the chance of each one is */
        probability(pop, POP);

        /* And two by two, my human zoo, they shall reproduce
         * until the desired number of children is reached */
        for(i = 0; i < CHILDREN; i += 2){
            select(pop, POP, 2, temp);
            crossover(temp[0], temp[1], &children[i], &children[i+1]);
        }

        /* Are our children good enough? */
        for(i = 0; i < CHILDREN; ++i)
            evaluate(&children[i]);

        /* Let's do some mutation experiments on our population buhuahuahua */
        i = random(0, POP);
        mutation(&pop[i]);

        /* Let's see how good our mutant is */
        evaluate(&pop[i]);

        /* Let's gather together the oldies and the newbies */
        /* first the oldies */
        for(i = 0; i < POP; ++i){
            temp[i].fitness = pop[i].fitness;
            for(j = 0; j < BITS; ++j)
                temp[i].cromo[j] = pop[i].cromo[j];
        }

        /* and now our children */
        for(k = 0; i < POP+CHILDREN; ++i, ++k){
            temp[i].fitness = children[k].fitness;
            for(j = 0; j < BITS; ++j)
                temp[i].cromo[j] = children[k].cromo[j];
        }

        /* Let's elect the best of this generation */
        for(i = 1, k = 0; i < POP+CHILDREN; ++i)
            if(temp[i].fitness > temp[k].fitness){
                decode(temp[i], vals);
                /* We are looking for someone who respects the constraints */
                if(!penalty(vals))
                    k = i;
            }

        decode(temp[k], vals);

        /* Let's store it for later */
        if(temp[k].fitness > best.fitness && !penalty(vals)){
            for(i = 0; i < BITS; ++i)
                best.cromo[i] = temp[k].cromo[i];
            best.fitness = temp[k].fitness;
        }

        /* Now the party can begin: who will live and who shall die? */
        probability(temp, POP+CHILDREN);
        select(temp, POP+CHILDREN, POP, pop);

    } /* End of this Generation */

    /* And the best individual ever was... */
    printf("The best ever fitness was: %d\n", best.fitness);
    decode(best, vals);
    printf("Values: %d %d %d %d %d %d\n",
           vals[0], vals[1], vals[2], vals[3], vals[4], vals[5]);
    printf("Objective Function: %d\n", object(vals));
    printf("Penalty: %d\n\n", penalty(vals));
}
--------------cut here-------------

Remember that all the names in uppercase are constants defined in galib.h. Here we are: we now have a complete GA system to solve an Integer Programming problem. Download the complete source code, with some helpful functions and reports, here in .ZIP format and in .TAR.GZ format. Also download the Excel spreadsheet with the problem modeled for the Solver, in .ZIP format.

As you might have noticed, this almost never converges to the global optimal solution, but it gets close: it usually stays between $11,000.00 and $12,000.00 (the optimal solution is $12,580.00). For this problem, Integer Programming solver tools are more adequate (the Excel Solver uses the branch-and-bound algorithm), but when we have less trivial problems, or complex non-linear ones, it's better to get a close result fast than to wait decades for the optimal one. You should also notice that the running time of the GA system is roughly constant with respect to the problem: no matter what kind of problem it is, the time is governed by the number of generations it loops for (and the population size), not by the size of the problem.

This is just the starting point for using GAs with optimization problems. You, the avid researcher and guru coder, might already have your fingers itching to code new things and expand this simple and humble program. You could start by experimenting with the parameters GEN, RATEN, POP and CHILDREN and fine-tuning the optimal sets; then you may try other types of crossover and mutation, as well as implementing probability rates for them to occur; next you could implement a "cfg" parser, so you won't need to re-compile every time you want to try another parameter set; later you may consider experimenting with the other methods of dealing with constraints. And last and hardest, you might generalize this GA to solve any problem with no need to change the code (you could put the problem in a text file and parse it).
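As a taste of one of those extensions, here is a sketch of rate-based mutation: instead of flipping one fixed gene per generation, each gene flips independently with a small probability. The MUT_RATE constant, the function name and the per-gene loop are my own assumptions, not part of galib.c; the crom struct and BITS mirror the tutorial's definitions.

```c
#include <stdlib.h>

#define BITS 32          /* mirrors the constant in galib.h */
#define MUT_RATE 0.01    /* assumed: 1% chance of flipping each gene */

typedef struct {
    int cromo[BITS];
    int fitness;
    int prob;
} crom;                  /* mirrors the tutorial's struct */

/* flip each gene independently with probability MUT_RATE */
void mutation_rate(crom *c){
    int i;
    for(i = 0; i < BITS; ++i){
        double r = (double)rand() / ((double)RAND_MAX + 1.0);  /* r in [0,1) */
        if(r < MUT_RATE)
            c->cromo[i] = !c->cromo[i];
    }
}
```

You would call mutation_rate on every individual each generation, rather than picking one random victim; the rate then becomes one more parameter to tune alongside POP and CHILDREN.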

Since it's a dangerous path to walk, here is a map to guide you, with all the references used in this article:

Nonlinear Programming FAQ

Integer Programming

Linear programming formulation examples

The Simplex Method

Branch and Bound

Genetic algorithms for the solution of optimisation problems

Genetic Algorithms Tutorials

Some books to enjoy

Dr. Math and the rounding numbers algorithms

Contact info for doubts or complaints: olivetti@terra.com.br

Acme™, Willy E. Coyote™ and RoadRunner™ are trademarks of Warner Bros

Remember you can visit the *Message Store* to discuss this tutorial.
Comments are always welcome!