Presentation on: "GAs and Premature Convergence" — presentation transcript:

1 GAs and Premature Convergence
- Premature convergence: GAs converge too early to a suboptimal solution
  - as the population evolves, only little new material can be produced
- Reasons for premature convergence:
  - improper selection pressure
  - insufficient population size
  - deception
  - improper representation and genetic operators

2 Motivation and Realization
- Motivation: to maintain the diversity of the evolved population and extend the explorative power of the algorithm
- Realization
  - Convergence of the population is allowed only up to a specified extent
  - Convergence is controlled at individual positions of the representation
  - Convergence rate C: specifies the maximal allowed difference between the frequency of ones and zeroes in every column of the population; ranges from 0 to PopSize/2
  - Principal condition: at no position of the representation may either ones or zeroes exceed the frequency constraint
  - A specific way of modifying the population genotype

3 Algorithm of GALCO
1. Generate the initial population
2. Choose parents
3. Create offspring
4. if (offspring better than parents) then
       replace parents with offspring
   else {
       find(replacement)
       replace_with_mask(child1, replacement)
       find(replacement)
       replace_with_mask(child2, replacement)
   }
5. if (not finished) then go to step 2

4 Operator replace_with_mask
- Mask: a vector of integer counters; it stores the number of 1s for each bit of the representation
(figure: a population with its mask)
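The mask-based replacement can be sketched as follows. This is one plausible reading of the principal condition, in which the count of ones (and of zeroes) in each column may exceed PopSize/2 by at most the convergence rate c; the function names are mine, not the authors'.

```python
def violates(mask, pop_size, c, col, new_bit, old_bit):
    """True if swapping old_bit -> new_bit in `col` breaks the constraint."""
    ones = mask[col] - old_bit + new_bit
    zeros = pop_size - ones
    half = pop_size / 2
    return ones > half + c or zeros > half + c

def replace_with_mask(pop, mask, idx, child, c):
    """Replace pop[idx] with child, flipping any child bit that would
    push its column past the convergence constraint c."""
    old = pop[idx]
    new = []
    for col, (b_new, b_old) in enumerate(zip(child, old)):
        if violates(mask, len(pop), c, col, b_new, b_old):
            b_new = 1 - b_new          # keep the column within the limit
        mask[col] += b_new - b_old     # update the counter of ones
        new.append(b_new)
    pop[idx] = new
```

With c = 0 the columns are frozen at a perfectly balanced state; with c = PopSize/2 the constraint never fires and replacement is unrestricted, matching the stated range of the convergence rate.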

5 Test problems – static
- F101(x, y)
- Deceptive function
- Hierarchical function
- Royal Road problem

6 GALCO – effect of parameter C

7 GALCO vs. SGA

8 GALCO – effect of parameter C

9 GALCO vs. SGA

10 Multimodal Optimization
(figure labels: initial population; SIGA with / without)

11 Multimodal Optimization (cont.)
(figures: initial population, GALCO, SIGA)

12 GA with real-coded binary representation (GARB)
- Pseudo-binary representation: bits are encoded by a real number r ∈ [0.0, 1.0]
  - interpretation(r) = 1 for r > 0.5; = 0 for r < 0.5
- Code redundancy
  - Example: ch1 = [ ], ch2 = [ ]; interpretation(ch1) = interpretation(ch2) = [ ]
- Gene strength: expresses the degree of stability of a gene
  - the closer to 0.5, the weaker (less stable) the gene
  - "one" genes: 0.92 > 0.86 > 0.65 > 0.62
  - "zero" genes: 0.07 > 0.19 > 0.23 > 0.41
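The pseudo-binary decoding and the gene-strength ordering above can be captured in two one-liners; the function names `interpret` and `strength` are mine:

```python
def interpret(chrom):
    """Decode a pseudo-binary chromosome: a gene > 0.5 reads as 1, else as 0."""
    return [1 if g > 0.5 else 0 for g in chrom]

def strength(gene):
    """Distance from 0.5; the larger, the more stable the gene."""
    return abs(gene - 0.5)
```

Two chromosomes with different real values but the same interpretation illustrate the code redundancy: `interpret([0.92, 0.07])` and `interpret([0.65, 0.41])` both give `[1, 0]`.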

13 Gene-strength adjustment mechanism
- Genes of chromosomes created by crossover are adjusted
  - according to their interpretation
  - and the relative frequency of ones (zeroes) at the given position in the population, P[]
  - example: P = [0.82, 0.17, 0.35, 0.68] – in the population there are 82% ones at the 1st position, 17% ones at the 2nd, 35% ones at the 3rd, and 68% ones at the 4th
- Genes that prevail in the population are weakened; the others are strengthened.

14 Strengthening and weakening of genes
- Weakening
  - gen' = gen + c·(1.0 − P[i]), when (gen < 0.5) and (P[i] < 0.5)
    (the gene reads as zero and zeroes prevail at the i-th position in the population)
  - gen' = gen − c·P[i], when (gen > 0.5) and (P[i] > 0.5)
- Strengthening
  - gen' = gen − c·P[i], when (gen < 0.5) and (P[i] > 0.5)
    (the gene reads as zero and ones prevail at the i-th position in the population)
  - gen' = gen + c·(1.0 − P[i]), when (gen > 0.5) and (P[i] < 0.5)
- The constant c determines the speed of gene adaptation: c ∈ (0.0, 0.2]
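The four update rules can be collected into one function; this is a sketch, and the final clamping to [0, 1] is my own addition:

```python
def adjust_gene(gene, p_ones, c=0.1):
    """Weaken a gene that agrees with the population majority at its
    position, strengthen one that disagrees.

    gene   -- real gene value in [0, 1]
    p_ones -- relative frequency of ones at this position, P[i]
    c      -- adaptation-rate constant, c in (0.0, 0.2]
    """
    if gene < 0.5 and p_ones < 0.5:      # zero gene, zeroes prevail: weaken
        gene += c * (1.0 - p_ones)
    elif gene > 0.5 and p_ones > 0.5:    # one gene, ones prevail: weaken
        gene -= c * p_ones
    elif gene < 0.5 and p_ones > 0.5:    # zero gene, ones prevail: strengthen
        gene -= c * p_ones
    elif gene > 0.5 and p_ones < 0.5:    # one gene, zeroes prevail: strengthen
        gene += c * (1.0 - p_ones)
    return min(1.0, max(0.0, gene))      # clamp (my addition, not in the slide)
```

For example, a zero gene of 0.4 in a column with 20% ones is weakened to 0.48, while the same gene in a column with 90% ones is strengthened to 0.31.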

15 Stabilization of promising individuals
- Offspring that are better than their parents should be more stable than the other, lower-quality solutions generated
  - Chromosomes of promising individuals are regenerated with strong genes:
    ch = (0.71, 0.45, 0.18, 0.57) → ch' = (0.97, 0.03, 0.02, 0.99)
  - Genes of promising individuals survive more generations without being changed by the weakening mechanism

16 Pseudocode for GARB
begin
    initialize(OldPop)
    repeat
        calculate P[] from OldPop
        repeat
            select Parents from OldPop
            generate Children
            adjust Children's genes
            evaluate Children
            if Child is better than Parents
                then rescale Child
            insert Children into NewPop
        until NewPop is completed
        switch OldPop and NewPop
    until termination condition
end

17 Test problems – dynamic
- Osmera's dynamic problem
  g(x, t) = 1 − exp(−200·(x − c(t))²), c(t) = 0.04·⌊t/20⌋
  - The minimum g(x, t) = 0.0 moves every 20 generations
- Oscillating knapsack problem
  - 14 objects, wᵢ = 2ⁱ, i = 0, …, 13
  - f(x) = 1 / (1 + |target − Σ wᵢxᵢ|)
  - The target oscillates between two values, differing in 9 bits of their binary representation; one of them is 2837
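The two dynamic test functions might be coded as below. The absolute value in the knapsack fitness is my assumption (the transcript's formula lost it), and since one oscillation target did not survive, the target is simply passed in as a parameter:

```python
import math

def osmera_g(x, t):
    """Osmera's dynamic function; its minimum g = 0 shifts every 20 generations."""
    c = 0.04 * (t // 20)
    return 1.0 - math.exp(-200.0 * (x - c) ** 2)

def knapsack_f(x, target):
    """Oscillating knapsack: 14 objects with weights w_i = 2**i, i = 0..13."""
    total = sum((2 ** i) * xi for i, xi in enumerate(x))
    return 1.0 / (1.0 + abs(target - total))
```

Because the weights are powers of two, each 14-bit string maps to a unique total in 0..16383, so the fitness has a single maximum of 1.0 at the string encoding the current target.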

18 Results on static problems
(figures: DF3, H-IFF, F101)

19 Results on static problems

20 Results on dynamic problems
(figure: oscillating knapsack problem)

21 Results on dynamic problems
Osmera's dynamic problem
(table: MTE and StDev, both immediately after a change of the optimum and overall, for five settings of GARB's constant c and for SGA binary, SGA Gray, and CBM-B; the SGA and CBM-B rows have N/A in the "immediately after a change" columns)
- MTE – Mean Tracking Error [%]: the mean deviation between the best individual in the population and the optimal solution, computed over all generations

22 Recovery from a homogeneous population

23 Weakness of Simple Selectorecombinative GAs
- Scale poorly on hard problems, largely as a result of their mixing behaviour
  - Inability of the SGA to correctly identify and adequately mix the appropriate BBs in subsequent generations
  - Exponential computational complexity of the SGA
- Crossover operators or other exchange mechanisms are needed that adapt to the problem at hand
  - Linkage adaptation

24 Naive approaches – the inversion operator
- Reverses the order of genes in a randomly chosen substring of the chromosome
  - (1,1) | (2,0)(3,0)(4,1) | (5,1) → after inversion: (1,1) (4,1)(3,0)(2,0) (5,1)
- Unusable, because the signal for improving linkage is outweighed by the signal for learning alleles
  - t_α < t_λ: alleles undergo more direct selection than linkage does
  - The GA settles on an optimal setting of alleles before it discovers which combinations of genes should be formed together and mixed with each other
  - Remedy: reverse the inequality to t_α > t_λ (BUT HOW?)

25 Competent GAs
- Can solve
  - hard problems (multimodal, deceptive, high degree of subsolution interaction, noise, ...)
  - quickly, accurately, reliably
- Messy GAs – mGA, fmGA, gemGA
- Linkage learning GA – LLGA
- Compact GAs – cGA, ECGA
- Bayesian optimization algorithm – BOA

26 Messy Genetic Algorithms – mGAs
- Inspiration from nature: evolution starts from the simplest forms of life
- The mGA departs from the SGA in four ways:
  - messy codings
  - messy operators
  - separation of processing into three heterogeneous phases
  - epoch-wise iteration to improve the complexity of the solution

27 mGA's codings
- Tagged alleles
  - Variable-length strings: (name_1, allele_1) … (name_N, allele_N), e.g. ((4,0) (1,1) (2,0) (4,1) (4,1) (5,1))
- Over-specification: multiple instances of a gene (gene 4)
  - Majority voting would express deceptive genes too readily
  - First-come, first-served (left-to-right expression): positional priority
- Underspecification: missing gene instances (gene 3)
  - Average schema value: the variance is too high
  - Competitive template: a solution locally optimal with respect to k-bit perturbations

28 Messy operators: cut & splice
- Cut divides a single string into two parts
- Splice joins the head of one string with the tail of another
  - When short strings are mated, the probability of cut is small → mostly the strings are just spliced, doubling their length
  - When long strings are mated, the probability of cut is large → behaves like one-point crossover
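Cut and splice on plain Python lists might look like this. It is only a sketch: a real messy GA applies the operators to tagged-allele strings and makes the cut probability grow with string length, which is not modelled here.

```python
import random

def cut(s):
    """Cut a string at a random inner point, returning (head, tail)."""
    point = random.randint(1, len(s) - 1)
    return s[:point], s[point:]

def splice(head, tail):
    """Join the head of one string with the tail of another one."""
    return head + tail
```

Splicing two uncut strings of length l yields a string of length 2l, which is why the strings' length doubles when short strings are mated and cut rarely fires.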

29 mGAs: three heterogeneous phases
- Initialization
  - Enumerative initialization of the population with all sub-strings of a certain length k ≪ l

30 Fast messy genetic algorithms – fmGAs
- Probabilistically complete enumeration
  - A population of strings of length l′ close to l is generated
  - Assumption: each string contains many different BBs of length k ≪ l

31 Gene expression messy GA – gemGA
- Messy???
  - No variable-length strings
  - No under- or over-specification
  - No left-to-right expression
- Messy use of heterogeneous phases of processing in gemGA
  - Linkage-learning phase: first identifies the linkage groups
  - Mixing phase: selection + recombination; exchanges good allele combinations within those groups to find the optimal solution

32 gemGA: The idea
- Linkage-learning phase
  - Transcription I (antimutation)
    - Each string undergoes l one-bit perturbations
    - Improvements are ignored (the bit does not belong to an optimal BB)
    - Changes that degrade the structure are marked as candidate linkage groups
    - Example: two 3-bit deceptive BBs; the perturbation that degrades fitness is marked, the one that improves it is not
  - Transcription II
    - Identifies the exact relations among the genes by checking for nonlinearities:
      IF Δf(X′ᵢ) + Δf(X′ⱼ) ≠ Δf(X′ᵢⱼ) THEN link(i, j)
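Transcription II's nonlinearity check can be sketched directly from the condition above; `detect_link` and the toy fitness function in the usage note are illustrative, not from the source:

```python
def detect_link(f, x, i, j, eps=1e-9):
    """Flag genes i and j as linked when their one-bit perturbation
    effects do not add up: df_i + df_j != df_ij."""
    def flip(s, *idx):
        s = list(s)
        for k in idx:
            s[k] = 1 - s[k]        # one-bit perturbation at each index
        return s

    base = f(x)
    df_i = f(flip(x, i)) - base
    df_j = f(flip(x, j)) - base
    df_ij = f(flip(x, i, j)) - base
    return abs(df_i + df_j - df_ij) > eps
```

For f(x) = x₀·x₁ + x₂, genes 0 and 1 interact (flipping both from 0 gains 1, while each alone gains nothing), so `detect_link` reports a link for (0, 1) but not for (0, 2).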

33 Linkage Learning GA – LLGA
- More "messy" than gemGA
  - Variable-length strings
  - Left-to-right expression
  - Always over-specified
- NO primordial or juxtapositional phase – more SGA-like
- Idea:
  - Probabilistic expression that slows down the convergence of alleles
  - Crossover that adapts linkage at the same time as alleles are exchanged

34 LLGA – Probabilistic expression
- Clockwise interpretation: (3,1)(2,0)(5,1)(1,1)(4,0)

35 LLGA – probabilistic expression (cont.)
- The allele 1 is expressed with probability δ/l and 1/l, respectively
- The allele 0 is expressed with probability (l − δ)/l and (l − 1)/l, respectively

36 LLGA: Effect of PE on BBs
- Assume a 6-bit problem in which a BB requires genes 4, 5, and 6 to take on values of 1 in a trap function
  - Initially the block 111 will be expressed roughly 1/8th of the time
  - After the linkage has evolved properly, the BB success rate increases: (6,1) (4,1) (5,1) is expressed most of the time, (4,0) (5,0) (6,0) almost never
- Extended probabilistic expression EPE-q
  - q is the number of copies of the unexpressed allele (q = 2)

37 LLGA – introns
- Introns – non-coding genes (97% of DNA is non-coding)
  - The number of introns required for proper functioning grows exponentially → compressed introns

38 Probabilistic Model-Building GAs
1. Initialize the population at random
2. Select promising solutions
3. Build a probabilistic model of the selected solutions
4. Sample the built model to generate new solutions
5. Incorporate the new solutions into the original population
6. Go to 2 (if not finished)
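The six steps above, instantiated with the simplest possible model — the univariate marginal frequencies used by UMDA, which the following slides benchmark — might be sketched as one iteration (names and the truncation selection are my choices):

```python
import random

def umda_step(pop, fitness, n_select, n_sample):
    """One PMBGA iteration with a univariate model (UMDA-style):
    select, estimate per-bit marginals, and sample new solutions."""
    # step 2: truncation selection of the most promising solutions
    selected = sorted(pop, key=fitness, reverse=True)[:n_select]
    # step 3: the model is just P(bit_i = 1) for each position i
    p = [sum(col) / len(selected) for col in zip(*selected)]
    # step 4: sample the model to generate new candidate solutions
    return [[1 if random.random() < pi else 0 for pi in p]
            for _ in range(n_sample)]
```

Repeating the step and merging the samples back into the population (step 5) drives the marginals toward the bits of the best individuals; note that such a univariate model has no way to represent the linkage that trap functions require, which is the failure mode the "UMDA performance" slide illustrates.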

39 (figure)

40 5-bit trap problem

41 UMDA performance

42 UMDA with “good” statistics

43 Extended compact GA – ECGA
- Marginal product model (MPM)
  - Groups of bits (partitions) treated as chunks
  - Partitions represent subproblems
  - OneMax: [1] [2] [3] [4] [5] [6] [7] [8] [9] [10]
  - Traps: [1 2 3 4 5] [6 7 8 9 10]

44 Learning structure in ECGA
- Two components
  - Scoring metric: minimal description length (MDL)
    - Number of bits for storing the probabilities: C_m = log₂(N) · Σᵢ 2^(Sᵢ)
    - Number of bits for storing the population using the model: C_p = N · Σᵢ E(Mᵢ)
    - Minimize C = C_m + C_p
  - Search procedure: a greedy algorithm
    - Start with one-bit groups
    - Merge the two groups giving the most improvement
    - When no more improvement is possible → finish
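The MDL metric can be sketched as below, following the formulas C_m = log₂(N)·Σᵢ 2^(Sᵢ) and C_p = N·Σᵢ E(Mᵢ), with E(Mᵢ) taken as the entropy of group i's marginal distribution (my reading of the slide; the function name is mine):

```python
import math
from collections import Counter

def mdl_cost(pop, partition):
    """Combined model + data cost C = C_m + C_p of a marginal product
    model whose groups are given by `partition` (lists of bit indices)."""
    n = len(pop)
    # model complexity: log2(N) bits per frequency, 2^Si entries per group
    c_m = sum(math.log2(n) * 2 ** len(group) for group in partition)
    # compressed-population complexity: N times each group's marginal entropy
    c_p = 0.0
    for group in partition:
        counts = Counter(tuple(ind[i] for i in group) for ind in pop)
        c_p += n * -sum((k / n) * math.log2(k / n) for k in counts.values())
    return c_m + c_p
```

On a population where two bits always agree, grouping them is cheaper than modelling them independently (the joint marginal has lower total entropy for the same model size), which is exactly the signal the greedy merge procedure exploits.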

45 ECGA model
(figure: the model [0, 2, 5] [1, 4] [3] with frequency tables of the patterns in each group, e.g. 010 and 101)

46 ECGA example

