Algorithm-Evolutionary


Changes  view on Meta::CPAN

2011-11-23  Juan Julian Merelo Guervos  <jjmerelo@gmail.com>

	* lib/Algorithm/Evolutionary/Individual/Vector.pm (clone): Change
	suggested by Cristoph Meissner, comma between fitness and
	chromosome. 

2011-02-20  Juan J. Merelo Guervós  <jjmerelo@gmail.com>

	* lib/Algorithm/Evolutionary/Op/Breeder_Diverser.pm (apply): Made
	even more diverse by not inserting the new individual if it is the
	same as the parent; refactored also to best practices.

	* lib/Algorithm/Evolutionary/Op/Uniform_Crossover_Diff.pm (apply):
	Changed to leave at least one difference without change

2011-02-19  Juan J. Merelo Guervós  <jjmerelo@gmail.com>

	* lib/Algorithm/Evolutionary/Op/Breeder_Diverser.pm (apply):
	Changed to make diversification higher: offspring is chosen in
	turn, instead of choosing it randomly, which avoids repeating.

LICENSE

TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

		     END OF TERMS AND CONDITIONS

	    How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

lib/Algorithm/Evolutionary/Op/Breeder.pm


    my $m =  new Algorithm::Evolutionary::Op::Mutation 0.5;
    my $c = new Algorithm::Evolutionary::Op::Crossover; #Classical 2-point crossover

    my $selector = new Algorithm::Evolutionary::Op::RouletteWheel $population_size; #One of the possible selectors

    my $generation = 
      new Algorithm::Evolutionary::Op::Breeder( $selector, [$m, $c] );

    my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
    my $bestIndi = $sortPop[0];
    my $previous_average = average( \@sortPop );
    $generation->apply( \@sortPop );

=head1 Base Class

L<Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Breeder part of the evolutionary algorithm; takes a population and returns another one created from the first.
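The breeding step described above can be sketched in plain Perl. This is a hypothetical stand-alone helper, not the module's actual code: it repeatedly picks a parent with a selector and applies a variation operator, collecting the offspring. The toy selector and bit-flip operator below are illustrative assumptions.

```perl
use strict;
use warnings;

# Hypothetical plain-Perl sketch of a breeder step: pick parents with a
# selector, apply a variation operator, collect the offspring.
sub breed {
    my ( $pop, $selector, $op, $offspring_count ) = @_;
    my @new_pop;
    while ( @new_pop < $offspring_count ) {
        my $parent = $selector->($pop);
        push @new_pop, $op->($parent);
    }
    return \@new_pop;
}

# Toy selector and operator for bit-string chromosomes (illustrative only)
my $random_selector = sub { my $pop = shift; return $pop->[ rand @$pop ] };
my $bitflip = sub {
    my $copy = shift;                      # copy of the parent's chromosome
    my $pos  = int rand length $copy;
    substr( $copy, $pos, 1 ) =~ tr/01/10/; # flip one random bit
    return $copy;
};
```

The real Breeder delegates selection to objects such as RouletteWheel and applies each operator according to its rate; this sketch only shows the shape of the loop.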

lib/Algorithm/Evolutionary/Op/Breeder_Diverser.pm


    my $m =  new Algorithm::Evolutionary::Op::Mutation 0.5;
    my $c = new Algorithm::Evolutionary::Op::Crossover; #Classical 2-point crossover

    my $selector = new Algorithm::Evolutionary::Op::RouletteWheel $population_size; #One of the possible selectors

    my $generation = 
      new Algorithm::Evolutionary::Op::Breeder_Diverser( $selector, [$m, $c] );

    my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
    my $bestIndi = $sortPop[0];
    my $previous_average = average( \@sortPop );
    $generation->apply( \@sortPop );

=head1 Base Class

L<Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Breeder part of the evolutionary algorithm; takes a population and

lib/Algorithm/Evolutionary/Op/DeltaTerm.pm

use strict;
use warnings;


=head1 NAME

    Algorithm::Evolutionary::Op::DeltaTerm - Termination condition for an algorithm; checks that the difference between the best fitness and a target is less than a delta

=head1 SYNOPSIS

   my $target = 1;
   my $epsilon = 0.01;
   my $dt = new Algorithm::Evolutionary::Op::DeltaTerm $target, $epsilon; 
   #$dt->apply( \@pop ) returns false when the best fitness is 1 plus/minus 0.01

=head1 Base Class

L<Algorithm::Evolutionary::Op::Base|Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Termination condition for evolutionary algorithm loops; the C<apply>
method returns false when the fitness of the first element in the array
is within the indicated difference of the target.
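The condition itself is simple; here is a plain-Perl sketch of it (a hypothetical stand-alone helper, not the module's code). An evolutionary loop would keep iterating while this returns true:

```perl
use strict;
use warnings;

# Hypothetical stand-alone version of the DeltaTerm condition: returns
# 0 (stop) once $best_fitness is within $delta of $target, and
# 1 (keep going) otherwise.
sub delta_term_continues {
    my ( $best_fitness, $target, $delta ) = @_;
    return abs( $best_fitness - $target ) <= $delta ? 0 : 1;
}
```

With $target = 1 and $delta = 0.01 as in the synopsis, the loop stops as soon as the best fitness reaches 0.99.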

lib/Algorithm/Evolutionary/Op/EDA_step.pm

      $indi->evaluate( $onemax );
      push( @pop, $indi );
    }

    my $selector = new Algorithm::Evolutionary::Op::RouletteWheel $population_size; #One of the possible selectors

    my $generation = 
      new Algorithm::Evolutionary::Op::EDA_step( $onemax, $selector, $replacement_rate );

    my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
    my $bestIndi = $sortPop[0];
    my $previous_average = average( \@sortPop );
    $generation->apply( \@sortPop );

=head1 Base Class

L<Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Estimation of Distribution Algorithms shun operators and instead try
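The description is cut off in this extract, but the idea is that EDAs estimate a probability distribution from the best individuals and sample new ones from it, instead of applying crossover and mutation. A hypothetical plain-Perl sketch of one such step on bit strings (not the module's actual code):

```perl
use strict;
use warnings;
use List::Util qw(sum);

# Hypothetical plain-Perl sketch of one EDA step on bit strings:
# estimate per-bit probabilities from the fitter half of the population,
# then sample a whole new population from that distribution.
sub eda_step {
    my ( $pop, $fitness ) = @_;    # $pop: array ref of '0'/'1' strings
    my @sorted = sort { $fitness->($b) <=> $fitness->($a) } @$pop;
    my @best   = @sorted[ 0 .. int( @sorted / 2 ) - 1 ];
    my $bits   = length $best[0];
    my @p;                         # estimated probability of a 1 per locus
    for my $i ( 0 .. $bits - 1 ) {
        $p[$i] = sum( map { substr( $_, $i, 1 ) } @best ) / @best;
    }
    return [ map {
        join '', map { rand() < $p[$_] ? 1 : 0 } 0 .. $bits - 1
    } 1 .. @$pop ];
}
```

The real EDA_step also keeps part of the old population according to the replacement rate; this sketch regenerates everything for brevity.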

lib/Algorithm/Evolutionary/Op/GeneralGeneration.pm

  my $numBits = 10;
  for ( 0..$popSize ) {
    my $indi = new Algorithm::Evolutionary::Individual::BitString $numBits ; #Creates random individual
    my $fitness = $onemax->( $indi );
    $indi->Fitness( $fitness );
    push( @pop, $indi );
  }
  my $generation = 
    new Algorithm::Evolutionary::Op::GeneralGeneration( $onemax, $selector, [$m, $c], $replacementRate );
  my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
  my $bestIndi = $sortPop[0];
  $generation->apply( \@sortPop );
 
=head1 Base Class

L<Algorithm::Evolutionary::Op::Base|Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Genetic algorithm that uses the other components. Takes as input the
operators that are going to be used, along with their priorities.

lib/Algorithm/Evolutionary/Op/Generation_Skeleton.pm


    my $m =  new Algorithm::Evolutionary::Op::Mutation 0.5;
    my $c = new Algorithm::Evolutionary::Op::Crossover; #Classical 2-point crossover

    my $selector = new Algorithm::Evolutionary::Op::RouletteWheel $population_size; #One of the possible selectors

    my $generation = 
      new Algorithm::Evolutionary::Op::Generation_Skeleton( $onemax, $selector, [$m, $c], $replacement_rate );

    my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
    my $bestIndi = $sortPop[0];
    my $previous_average = average( \@sortPop );
    $generation->apply( \@sortPop );

=head1 Base Class

L<Algorithm::Evolutionary::Op::Base>

=head1 DESCRIPTION

Skeleton class for a general single-generation (or single step) in an

lib/Algorithm/Evolutionary/Op/Tournament_Selection.pm

  my @newPop = $selector->apply( @pop ); #Creates a new population from old

=head1 Base Class

L<Algorithm::Evolutionary::Op::Selector>

=head1 DESCRIPTION

One of the possible selectors used for choosing the pool of individuals
that are going to be the parents of the following generation. Takes a
set of individuals randomly out of the population, and selects the best one.

=head1 METHODS

=cut


package Algorithm::Evolutionary::Op::Tournament_Selection;

use Carp;

lib/Algorithm/Evolutionary/Op/Tournament_Selection.pm


=cut

sub apply ($$) {
  my $self = shift;
  my $pop = shift || croak "No pop";
  my $output_size = shift || @$pop;
  my @output;
  for ( my $i = 0; $i < $output_size; $i++ ) {
    #Randomly select a few guys
    my $best = $pop->[ rand( @$pop ) ];
    for ( my $j = 1; $j < $self->{'_tournament_size'}; $j++ ) {
      my $this_one = $pop->[ rand( @$pop ) ];
      if ( $this_one->{'_fitness'} > $best->{'_fitness'} ) {
	$best = $this_one;
      }
    }
    #Keep the tournament winner
    push @output, $best;
  }
  return @output;
}
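For experimenting with the same logic outside the module, here is a hypothetical stand-alone version of the tournament loop above, operating on plain hashes with a C<_fitness> key:

```perl
use strict;
use warnings;

# Hypothetical stand-alone tournament selection, mirroring the method
# above: $tournament_size random contenders compete, the fittest wins,
# repeated $output_size times (defaulting to the population size).
sub tournament_select {
    my ( $pop, $tournament_size, $output_size ) = @_;
    $output_size ||= scalar @$pop;
    my @output;
    for ( 1 .. $output_size ) {
        my $best = $pop->[ rand @$pop ];
        for ( 2 .. $tournament_size ) {
            my $contender = $pop->[ rand @$pop ];
            $best = $contender
              if $contender->{'_fitness'} > $best->{'_fitness'};
        }
        push @output, $best;
    }
    return @output;
}
```

Note that winners are selected with replacement, so fit individuals can (and usually do) appear several times in the output pool.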

=head1 See Also

L<Algorithm::Evolutionary::Op::RouletteWheel> is another option for
selecting a pool of individuals

=head1 Copyright

lib/Algorithm/Evolutionary/Run.pm


=cut

sub random_member {
    my $self = shift;
    return $self->{'_population'}->[rand( @{$self->{'_population'}} )];
}

=head2 results()

Returns results in a hash that contains the best and median individuals,
the last individual that would survive selection, the total time so far,
and the number of evaluations.

=cut

sub results {
  my $self = shift;
  my $population_size = scalar @{$self->{'_population'}};
  my $last_good_pos = $population_size*(1-$self->{'selection_rate'});
  my $results = { best => $self->{'_population'}->[0],
		  median => $self->{'_population'}->[ $population_size / 2],
		  last_good => $self->{'_population'}->[ $last_good_pos ],
		  time =>  tv_interval( $self->{'_start_time'} ),
		  evaluations => $self->{'_fitness'}->evaluations() };
  return $results;

}

=head2 evaluated_population()

scripts/canonical-genetic-algorithm.pl

my $contador=0;
do {
  $generation->apply( \@pop );
  print "$contador : ", $pop[0]->asString(), "\n" ;
  $contador++;
} while( ($contador < $numGens) 
	 && ($pop[0]->Fitness() < $bits));


#----------------------------------------------------------#
# Show best
print "Best is:\n\t ",$pop[0]->asString()," Fitness: ",$pop[0]->Fitness(),"\n";

print "\n\n\tTime: ", tv_interval( $inicioTiempo ) , "\n";

print "\n\tEvaluations: ", $rr->evaluations(), "\n";

print "\n\tCache size ratio: ", $rr->cached_evals()/$rr->evaluations(), "\n";

=head1 SEE ALSO

scripts/rectangle-coverage.pl


  prompt% ./rectangle-coverage.pl

And change variable values from the user interface

=head1 DESCRIPTION  

A demo that combines the L<Algorithm::Evolutionary::Op::Easy> module
with L<Tk> to create a visual demo of the evolutionary algorithm. It
randomly generates a number of rectangles, and shows how the population
evolves to find the solution. The best point is shown in darkening
yellow, the rest of the population in green.

Use "Start" to start the algorithm after setting the variables,
"Finish" to stop the EA, and "Exit" to close the window.

Default values are as follows

=over

scripts/tide_bitstring.pl


=head1 SYNOPSIS

  prompt% ./tide_bitstring.pl <population> <number of generations>

or

  prompt% perl tide_bitstring.pl <population> <number of generations>

  # Shows the values of the two floating-point components of the
  # chromosome and finally the best value and fitness reached, which
  # should be as close to 1 as possible.
  

=head1 DESCRIPTION  

A simple example of how to run an evolutionary algorithm based on
Algorithm::Evolutionary. Tries to find the maximum of the bidimensional
Tide function, and outputs the x and y coordinates, along with fitness.
Best fitness is close to 1. Around 50 generations should be enough, but
the default for both population size and number of generations is 100.

scripts/tide_bitstring.pl

do {
  $generation->apply( \@pop );

  print "$contador : ", $pop[0]->asString(), "\n" ;

  $contador++;
} while( $contador < $numGens );


#----------------------------------------------------------#
# Show the best
print "Best is:\n\t ",$pop[0]->asString()," Fitness: ",$pop[0]->Fitness(),"\n";

print "\n\nTime: ". tv_interval( $inicioTiempo ) . "\n";

=head1 AUTHOR

Contributed by Pedro Castillo Valdivieso, modified by J. J. Merelo

=cut

scripts/tide_float.pl


=head1 SYNOPSIS

  prompt% ./tide_float.pl <population> <number of generations>

or

  prompt% perl tide_float.pl <population> <number of generations>

will show the values of the two floating-point components of the
chromosome and finally the best value and fitness reached, which
should be as close to 1 as possible.
  

=head1 DESCRIPTION  

A simple example of how to run an evolutionary algorithm based on
Algorithm::Evolutionary. Tries to find the maximum of the bidimensional
Tide function, and outputs the x and y coordinates, along with fitness.
Best fitness is close to 1. Around 50 generations should be enough, but
the default for both population size and number of generations is 100.

t/0200-run-cga.t

}

my $e =  new Algorithm::Evolutionary::Op::CanonicalGA $onemax;

isa_ok( $e,  "Algorithm::Evolutionary::Op::CanonicalGA");

$e->apply( \@pop);

is( scalar @pop, $population_size, "Size OK" );

my $best_fitness = $pop[0]->Fitness();

is ( $best_fitness > 1, 1, "First generation $best_fitness" );

for ( 1..5 ) {
  $e->apply( \@pop);
}

SKIP: {
  skip "Unlucky with improving fitness this time", 1 unless $pop[0]->Fitness() >= $best_fitness;
  cmp_ok(  $pop[0]->Fitness(), ">=", $best_fitness, "Improving fitness to ". $pop[0]->Fitness() );
}

t/0200-run-cga_nn.t

}

my $e =  new Algorithm::Evolutionary::Op::Canonical_GA_NN;

isa_ok( $e,  "Algorithm::Evolutionary::Op::Canonical_GA_NN");

$e->apply( \@pop);

is( scalar @pop, $population_size, "Size OK" );

my $best_fitness = $pop[0]->Fitness();

is ( $best_fitness > 1, 1, "First generation $best_fitness" );

for ( 1..10 ) {
  map( $_->evaluate( $onemax ), @pop );
  $e->apply( \@pop);
}

SKIP: {
  skip "Unlucky with improving fitness this time", 1 unless $pop[0]->Fitness() >= $best_fitness;
  cmp_ok(  $pop[0]->Fitness(), ">=", $best_fitness, "Improving fitness to ". $pop[0]->Fitness() );
}

t/0200-run-ga.t

  'selection_rate' => '0.1'
};
my $another_algorithm = new Algorithm::Evolutionary::Run $conf;
isa_ok( $another_algorithm, 'Algorithm::Evolutionary::Run' );
my $somebody = $algorithm->random_member();
isa_ok( $somebody, 'Algorithm::Evolutionary::Individual::BitString');
$another_algorithm->run();
ok( $another_algorithm->{'_counter'} == 10, "run OK" ); 
my $results = $another_algorithm->results();
cmp_ok( $results->{'evaluations'}, ">",100, "Evaluations OK" );
cmp_ok( $results->{'best'}->Fitness(), ">", 
	$results->{'last_good'}->Fitness(), "Evolution OK" );
cmp_ok( $another_algorithm->compute_average_distance( $somebody), ">", 0, "Distances" );

t/0500-generation-skel.t


my $m =  new Algorithm::Evolutionary::Op::Mutation 0.5;
my $c = new Algorithm::Evolutionary::Op::Crossover; #Classical 2-point crossover

my $selector = new Algorithm::Evolutionary::Op::RouletteWheel $population_size; #One of the possible selectors

my $generation = 
  new Algorithm::Evolutionary::Op::Generation_Skeleton( $onemax, $selector, [$m, $c], $replacement_rate );

my @sorted_pop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
my $bestIndi = $sorted_pop[0];
my $previous_average = average( \@sorted_pop );
$generation->apply( \@sorted_pop );
ok( $bestIndi->Fitness() <= $sorted_pop[0]->Fitness(),
    "Fitness improvement" ); #fitness improves, but not always
#This should have improved...
do {
  $generation->apply( \@sorted_pop );
} until ( $previous_average < average( \@sorted_pop)); #It eventually improves

my $this_average = average( \@sorted_pop );
ok( $previous_average < $this_average, "Average fitness improves" );
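The C<average> helper the test calls is not shown in this extract; presumably it computes the mean population fitness. A minimal plain-Perl sketch under that assumption:

```perl
use strict;
use warnings;
use List::Util qw(sum);

# Hypothetical average() helper, assuming it returns the mean fitness of
# a population passed as an array ref of individuals with a Fitness()
# accessor.
sub average {
    my $pop = shift;
    return sum( map { $_->Fitness() } @$pop ) / @$pop;
}
```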

t/0600-ez-moga.t

  my $indi = new Algorithm::Evolutionary::Individual::BitString 30*$number_of_bits ; #Creates random individual
  push( @pop, $indi );
}

my $e =  new Algorithm::Evolutionary::Op::Easy_MO $zdt1;

isa_ok( $e,  "Algorithm::Evolutionary::Op::Easy_MO");

$e->apply( \@pop);

my $best_fitness = $pop[0]->Fitness();

is ( $best_fitness == 1, 1, "First generation" );

$e->apply( \@pop);

is(  $pop[0]->Fitness() == 1, 1, "Improving fitness" );

=cut

t/0603-eda.t

  my $indi = new Algorithm::Evolutionary::Individual::BitString $number_of_bits ; #Creates random individual
  push( @pop, $indi );
}

my $e =  new Algorithm::Evolutionary::Op::EDA_step $rr, $replacement_rate, $population_size;

isa_ok( $e,  "Algorithm::Evolutionary::Op::EDA_step");

$e->apply( \@pop);

my $best_fitness = $pop[0]->Fitness();

cmp_ok( $best_fitness, ">", 1, "First generation $best_fitness" );

$e->apply( \@pop);

cmp_ok(  $pop[0]->Fitness(), ">=", $best_fitness, "Improving fitness ".  $pop[0]->Fitness() );

=cut

t/general.t

  my $indi = new Algorithm::Evolutionary::Individual::BitString $numBits ; #Creates random individual
  my $fitness = $onemax->( $indi );
  $indi->Fitness( $fitness );
  push( @pop, $indi );
}

#fitness
my $generation = 
  new Algorithm::Evolutionary::Op::GeneralGeneration( $onemax, $selector, [$m, $c], $replacementRate );
my @sortPop = sort { $b->Fitness() <=> $a->Fitness() } @pop;
my $bestIndi = $sortPop[0];
$generation->apply( \@sortPop );
is( $bestIndi->Fitness() <= $sortPop[0]->Fitness(), 1, "Fitness improvement" ); #fitness improves, but not always

# To be obsoleted
my $ggxml = $generation->asXML();
my $gprime =  Algorithm::Evolutionary::Op::Base->fromXML( $ggxml );
is( $gprime->{_eval}( $pop[0] ) eq $generation->{_eval}( $pop[0] ) , 1, "XML" ); #Code snippets will never be exactly the same.

#Test 33 & 34
use Algorithm::Evolutionary::Op::Easy;
my $ez = new Algorithm::Evolutionary::Op::Easy $onemax;
  
my $ezxml = $ez->asXML();
my $ezprime = Algorithm::Evolutionary::Op::Base->fromXML( $ezxml );
is( $ezprime->{_eval}( $pop[0] ) eq $ez->{_eval}( $pop[0] ) , 1, "Code snippets" ); #Code snippets will never be exactly the same.
my $oldBestFitness = $bestIndi->Fitness();
$ez->apply( \@sortPop );
is( $sortPop[0]->Fitness() >= $oldBestFitness, 1, "Fitness improving");
  
#Test 35 & 36
use Algorithm::Evolutionary::Op::GenerationalTerm;
my $g100 = new Algorithm::Evolutionary::Op::GenerationalTerm 10;
use Algorithm::Evolutionary::Op::FullAlgorithm;
my $f = new Algorithm::Evolutionary::Op::FullAlgorithm $generation, $g100;
  
my $fxml = $f->asXML();
my $txml = $f->{_terminator}->asXML();
my $fprime = Algorithm::Evolutionary::Op::Base->fromXML( $fxml );
is( $txml eq $fprime->{_terminator}->asXML() , 1, "from XML" ); 
$oldBestFitness = $bestIndi->Fitness();
for ( @sortPop ) {
  if ( !defined $_->Fitness() ) {
    my $fitness = $onemax->( $_ );
    $_->Fitness( $fitness );
  }
}
$f->apply( \@sortPop );
is( $sortPop[0]->Fitness() >= $oldBestFitness, 1, "Improving fitness");
  
=head1 Copyright


