AI-Evolve-Befunge


example.conf

# Like code_density, above, but only applies to the first generation (or the
# first after resuming a saved results file).
initial_code_density: 90

# The number of resource tokens each critter gets, per battle.
tokens: 2000

# Number of tokens each character of code costs.
code_cost: 1

# Number of tokens each clock cycle costs (per thread).
iter_cost: 2

# Number of tokens each repeat-cycle (from the "k" instruction) costs.
repeat_cost: 1

# Number of tokens each stack entry costs.
stack_cost: 1

# Number of tokens to create a new thread.
thread_cost: 10
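
These keys map directly onto the defaults used by the Critter
constructor shown later in this excerpt.  A minimal sketch of reading
them (assuming $config is an already-loaded configuration object with
the same config($key, $default) method the constructor uses):

    my $tokens      = $config->config('tokens'     , 2000);
    my $code_cost   = $config->config('code_cost'  , 1);
    my $iter_cost   = $config->config('iter_cost'  , 2);
    my $repeat_cost = $config->config('repeat_cost', 1);
    my $stack_cost  = $config->config('stack_cost' , 1);
    my $thread_cost = $config->config('thread_cost', 10);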

lib/AI/Evolve/Befunge.pm

The important thing to remember here is that results take time -
it will probably take several weeks of solid processing time before
you begin to see any promising results at all.  A lot of random code
has to be generated before the process starts producing code that
does what you want it to do.

If you don't know anything about Befunge, I recommend you read up on
that first, before trying to understand how this works.

The individuals of this population (which we call Critters) may be of
various sizes, and may make heavy or light use of threads and stacks.
Each one is issued a certain number of "tokens" (which you can think
of as blood sugar or battery power).  Just being born costs a certain
number of tokens, depending on the code size.  After that, everything
the critter does (executing a Befunge command, pushing a value onto
the stack, spawning a thread) costs a certain number of tokens as well.
When the number of tokens drops to 0, the critter dies.  So it had
better accomplish its task before that happens.
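
As a rough worked example (the costs are the example.conf defaults;
the 200-character code size is made up):

    my $tokens = 2000;
    $tokens   -= 200 * 1;            # being born: code_cost per character
    my $ticks  = int($tokens / 2);   # each clock tick costs iter_cost (2)
    print "$ticks ticks at most, if it never touches the stack\n";   # 900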

After a population fights it out for a while, the winners are chosen
(and continue to live); everyone else dies.  Then a new population is
generated from the winners, some random mutation occurs (random
generation of code, and possibly a resize of the codebase), and the
whole process starts over for the next generation.
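
A rough outline of that loop, as a sketch only (every subroutine here
is a placeholder stub, not part of this distribution's API):

    sub random_critter { return +{ code => join '', map { chr(32 + int rand 94) } 1 .. 16 } }
    sub score          { return length $_[0]{code} }        # stand-in fitness measure
    sub breed          { return +{ code => $_[0]{code} } }  # copy a winner's code
    sub mutate         { $_[0]{code} .= chr(32 + int rand 94); return $_[0] }

    my $popsize    = 40;
    my @population = map { random_critter() } 1 .. $popsize;
    for my $generation (1 .. 10) {
        my @ranked  = sort { score($b) <=> score($a) } @population;  # fight it out
        my @winners = @ranked[0 .. $popsize / 4 - 1];                 # winners live on
        @population = @winners;
        push @population, mutate(breed($winners[int rand @winners]))
            while @population < $popsize;                             # refill with mutants
    }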


lib/AI/Evolve/Befunge/Critter.pm

use IO::File;
use Carp;
use Perl6::Export::Attrs;
use Scalar::Util qw(weaken);

use base 'Class::Accessor::Fast';
__PACKAGE__->mk_accessors(
    # basic values
    qw{ boardsize codesize code color dims maxlen maxsize minsize },
    # token currency stuff
    qw{ tokens codecost itercost stackcost repeatcost threadcost },
    # other objects we manage
    qw{ blueprint physics interp }
);

use AI::Evolve::Befunge::Util;
use aliased 'AI::Evolve::Befunge::Critter::Result' => 'Result';

=head1 NAME

    AI::Evolve::Befunge::Critter - critter execution environment

lib/AI/Evolve/Befunge/Critter.pm

in the hopes that the critter can use this information to its
advantage.  The values pushed are (in order from the top of the stack
(most accessible) to the bottom (least accessible)):

    * the Physics token (implying the rules of the game/universe)
    * the number of dimensions this universe operates in
    * The number of tokens the critter has left (see LIMITS, below)
    * The   iter cost (see LIMITS, below)
    * The repeat cost (see LIMITS, below)
    * The  stack cost (see LIMITS, below)
    * The thread cost (see LIMITS, below)
    * The code size (a Vector)
    * The maximum storage size (a Vector)
    * The board size (a Vector) if operating in a boardgame universe
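
For a concrete two-dimensional board-game universe, the parameter list
looks roughly like this (a sketch only - the values are borrowed from
t/05critter.t further down, and are listed in push order, so the
Physics token ends up on top of the stack):

    my @params = (
        5, 5,       # board size
        17, 17,     # maximum storage size
        17, 17,     # code size
        10,         # thread cost
        2,          # stack cost
        1,          # repeat cost
        2,          # iter cost
        1983,       # tokens remaining
        2,          # dimensions
        ord('P'),   # Physics token
    );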

If a Critter instance will have its ->invoke() method called more
than once (for board game universes, it is called once per "turn"),
the storage model is not re-created.  The critter is responsible for
preserving enough of itself to handle multiple invocations properly.
The Language::Befunge Interpreter and Storage model are preserved,
though a new IP is created each time, and (for board game universes)

lib/AI/Evolve/Befunge/Critter.pm

keep the code under test from escaping the environment, or consuming
an unacceptable amount of resources.

Escape is prevented by disabling all file operations, I/O operations,
system commands like fork() and system(), and commands which load
(potentially insecure) external Befunge semantics modules.

Resource consumption is limited through the use of a currency system.
Each critter starts out with a certain amount of "Tokens" (the
critter's form of currency), and every action (an executed Befunge
instruction, a repeated command, a spawned thread, and so on) incurs a
cost.  When the number of tokens drops to 0, the critter
dies.  This prevents the critter from getting itself (and the rest of
the system) into trouble.
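
The op wrappers shown further down call a spend() method which returns
false once the critter can no longer afford an action.  A minimal
sketch of that idea (an illustration, not the module's actual
implementation):

    sub spend {
        my ($self, $cost) = @_;
        my $remaining = $self->tokens - $cost;
        return 0 if $remaining < 0;    # too poor; the caller reverses the IP
        $self->tokens($remaining);     # Class::Accessor::Fast setter
        return 1;
    }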

For reference, the following things are specifically tested for:

=over 4

=item Size of stacks

=item Number of stacks

=item Storage size (electric fence)

=item Number of threads

=item "k" command repeat count

=item "j" command jump count

=item "x" command dead IP checks (setting a null vector)

=back

Most of the above things will result in spending some tokens.  There

lib/AI/Evolve/Befunge/Critter.pm


If not specified, this will be pulled from the variable "code_cost" in
the config file.  If that can't be found, a default value of 1 is
used.


=item IterCost

This is the number of tokens the critter pays for each command it
runs.  It is a basic operational overhead, decremented for each clock
tick for each running thread.

If not specified, this will be pulled from the variable "iter_cost" in
the config file.  If that can't be found, a default value of 2 is
used.


=item RepeatCost

This is the number of tokens the critter pays for each time a command
is repeated (with the "k" instruction).  It makes sense for this value

lib/AI/Evolve/Befunge/Critter.pm

multiplied by the StackCost to determine the total cost.

If not specified, this will be pulled from the variable "stack_cost"
in the config file.  If that can't be found, a default value of 1 is
used.


=item ThreadCost

This is a fixed number of tokens the critter pays for spawning a new
thread.  When a new thread is created, this cost is incurred, plus the
cost of duplicating all of the thread's stacks (see StackCost, above).
The new thread will begin incurring additional costs from the
IterCost (also above) once it begins executing commands of its own.

If not specified, this will be pulled from the variable "thread_cost"
in the config file.  If that can't be found, a default value of 10 is
used.
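
Putting ThreadCost and StackCost together, the total price of spawning
a thread works out as sketched below; the numbers are only an
illustration, but the formula mirrors the wrapper code later in this
file.

    my ($thread_cost, $stack_cost) = (10, 1);    # example.conf defaults
    my @stack_sizes = (30, 12);                  # a 30-cell TOSS plus one 12-cell stack
    my $cost = $thread_cost;
    $cost   += $stack_cost * $_ for @stack_sizes;
    print "spawning this thread costs $cost tokens\n";   # 52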


=item Color

This determines the color of the player, which (for board games)
indicates which type of piece the current player is able to play.  It
has no other effect, and thus is not necessary for non-boardgame
physics models.

lib/AI/Evolve/Befunge/Critter.pm

    # args
    my $usage = 
      "Usage: $package->new(Blueprint => \$blueprint, Physics => \$physics,\n"
     ."                     Tokens => 2000, BoardSize => \$vector, Config => \$config)";
    croak $usage unless exists  $args{Config};
    $args{Tokens}     = $args{Config}->config('tokens'     , 2000) unless defined $args{Tokens};
    $args{CodeCost}   = $args{Config}->config("code_cost"  , 1   ) unless defined $args{CodeCost};
    $args{IterCost}   = $args{Config}->config("iter_cost"  , 2   ) unless defined $args{IterCost};
    $args{RepeatCost} = $args{Config}->config("repeat_cost", 1   ) unless defined $args{RepeatCost};
    $args{StackCost}  = $args{Config}->config("stack_cost" , 1   ) unless defined $args{StackCost};
    $args{ThreadCost} = $args{Config}->config("thread_cost", 10  ) unless defined $args{ThreadCost};

    croak $usage unless exists  $args{Blueprint};
    croak $usage unless exists  $args{Physics};
    croak $usage unless defined $args{Color};

    # total number of cells in the blueprint (the product of its size
    # components)
    my $codelen = 1;
    foreach my $d ($args{Blueprint}->size->get_all_components) {
        $codelen *= $d;
    }
    croak "CodeCost must be greater than 0!"   unless $args{CodeCost}   > 0;

lib/AI/Evolve/Befunge/Critter.pm

    $$self{blueprint}  = $args{Blueprint};
    $$self{boardsize}  = $args{BoardSize} if exists $args{BoardSize};
    $$self{code}       = $$self{blueprint}->code;
    $$self{codecost}   = $args{CodeCost};
    $$self{codesize}   = $$self{blueprint}->size;
    $$self{config}     = $args{Config};
    $$self{dims}       = $$self{codesize}->get_dims();
    $$self{itercost}   = $args{IterCost};
    $$self{repeatcost} = $args{RepeatCost};
    $$self{stackcost}  = $args{StackCost};
    $$self{threadcost} = $args{ThreadCost};
    $$self{tokens}     = $args{Tokens};
    if(exists($$self{boardsize})) {
        $$self{dims} = $$self{boardsize}->get_dims()
            if($$self{dims} < $$self{boardsize}->get_dims());
    }
    if($$self{codesize}->get_dims() < $$self{dims}) {
        # upgrade codesize (keep it hypercubical)
        $$self{codesize} = Language::Befunge::Vector->new(
            $$self{codesize}->get_all_components(),
            map { $$self{codesize}->get_component(0) }

lib/AI/Evolve/Befunge/Critter.pm



    # Build the initial parameter list for the critter's stack (see the
    # "values pushed" list documented above); vectors are padded out to
    # the universe's full dimensionality.
    my @params;
    my @vectors;
    push(@vectors, $$self{boardsize}) if exists $$self{boardsize};
    push(@vectors, $$self{maxsize}, $$self{codesize});
    foreach my $vec (@vectors) {
        push(@params, $vec->get_all_components());
        push(@params, 1) for($vec->get_dims()+1..$$self{dims});
    }
    push(@params, $$self{threadcost}, $$self{stackcost}, $$self{repeatcost}, 
         $$self{itercost}, $$self{tokens}, $$self{dims});
    push(@params, $self->physics->token) if defined $self->physics->token;

    $$self{interp}->set_params([@params]);

    return $self;
}
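
A usage sketch following the Usage string above (the $blueprint,
$physics and $config objects are assumed to exist already; Tokens is
optional):

    my $critter = AI::Evolve::Befunge::Critter->new(
        Blueprint => $blueprint,   # the critter's code and its size
        Physics   => $physics,     # the rules of the game/universe
        Config    => $config,      # supplies tokens and the *_cost values
        Color     => 1,            # which player this critter plays as
        Tokens    => 2000,         # optional; otherwise read from the config
    );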


=head1 METHODS

lib/AI/Evolve/Befunge/Critter.pm

    # override op_flow_repeat to impose loop count checking
    sub _op_flow_repeat_wrap {
        my ($interp) = @_;
        my $ip       = $interp->get_curip;
        my $critter  = $$interp{_ai_critter};
        my $count    = $ip->svalue(1);
        return $ip->dir_reverse unless $critter->spend($critter->repeatcost * abs($count));
        return Language::Befunge::Ops::flow_repeat(@_);
    }

    # override op_spawn_ip to impose thread count checking
    sub _op_spawn_ip_wrap {
        my ($interp) = @_;
        my $ip       = $interp->get_curip;
        my $critter  = $$interp{_ai_critter};
        my $cost     = 0;
        foreach my $stack ($ip->get_toss(), @{$ip->get_ss}) {
            $cost   += scalar @$stack;
        }
        $cost       *= $critter->stackcost;
        $cost       += $critter->threadcost;
        return $ip->dir_reverse unless $critter->spend($cost);
        # This is a hack; Storable can't deep copy our data structure.
        # It will get re-added to both parent and child, next time around.
        delete($$ip{_ai_critter});
        return Language::Befunge::Ops::spawn_ip(@_);
    }
}

1;

t/05critter.t



# Critter adds lots of useful info to the initial IP's stack
my $stack_expectations =
    [ord('P'), # physics plugin
     2,        # dimensions
     1983,     # tokens
     2,        # itercost
     1,        # repeatcost
     2,        # stackcost
     10,       # threadcost
     17, 17,   # codesize
     17, 17,   # maxsize
     5, 5,     # boardsize,
     ];
$rv = newaebc('PPPPPPPPPPPPPPPPq', 17, 1);
is_deeply([reverse @{$rv->interp->get_params}], $stack_expectations, 'Critter constructor sets the params value correctly');
push(@$stack_expectations, 0, 0, 0); # make sure nothing else is on the stack
$rv = $rv->move();
ok(!$rv->died, "did not die");
is_deeply([@AI::Evolve::Befunge::Physics::test1::p], $stack_expectations, 'Critter adds lots of useful info to the initial stack');

t/testconfig.conf

# minimal values to speed up the test suite.
migrationd_host: 127.0.0.1
migrationd_port: -1
tokens: 1100
codecost: 1
itercost: 2
repeatcost: 1
stackcost: 2
threadcost: 5
basic_value: 42
dimensions: 2
physics: ttt
popsize: 40
bygen:
    1000:
        basic_value: 67
byhost:
    host:
        popsize: 10


