lib/AI/Classifier/Text/Analyzer.pm
=head1 DESCRIPTION
Computes feature vectors for text using some heuristics, plus a word count
(using L<Text::WordCounter> by default).
The object is immutable, but some methods take a second parameter as an accumulator for the
features found in a given text.
It uses some specific values and methods that work for our case but are not guaranteed
to bring good results universally; see the source for details!
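A minimal usage sketch (the C<analyze> method and the accumulator calling convention are assumed from the description above):

    my $analyzer = AI::Classifier::Text::Analyzer->new();
    my $features = {};    # accumulator, filled in place
    $analyzer->analyze( 'aaaa bbb bb cc', $features );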
lib/AI/DecisionTree.pm
=item copy_instances(from =E<gt> $other_tree)
Allows two trees to share the same set of training instances. More
commonly, this lets you train one tree, then re-use its instances in
another tree (possibly changing the instance C<result> values using
C<set_results()>), which is much faster than re-populating the second
tree's instances from scratch.
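For example (a sketch; $dt1 is assumed to be an already-trained tree and %new_results maps instance names to new result values):

    my $dt2 = AI::DecisionTree->new;
    $dt2->copy_instances(from => $dt1);
    $dt2->set_results(\%new_results);   # re-label the shared instances
    $dt2->train;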
=item set_results(\%results)
Given a hash that relates instance names to instance result values,
lib/AI/DecisionTree.pm
=back
=head1 TO DO
All the stuff in the LIMITATIONS section. Also, revisit the pruning
algorithm to see how it can be improved.
=head1 AUTHOR
Ken Williams, ken@mathforum.org
t/02-test.t
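# Note: split in scalar context returns the number of fields, so each test
# below asserts the embedding's dimensionality (1536 comma-separated values).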
is( scalar split (/,/, $embed1), 1536, "Correct first embed length");
my $embed2 = $embed_pass->test_embedding($test_string2);
is( scalar split (/,/, $embed2), 1536, "Correct second embed length");
my $embed3 = $embed_pass->test_embedding($test_string2);
ok( $embed2 eq $embed3, "Same text - same test embedding" );
ok( $embed2 ne $embed1, "Different text - different test embedding" );
lib/AI/Evolve/Befunge.pm
codebase.
The important bits from a user's standpoint are the Population object
(which drives the main process of evolving AI), and the Physics plugin
(which implements the rules of the universe those AI live in). There
are sections below containing more detail on what these two things
are, and how they work.
=head1 POPULATIONS
lib/AI/Evolve/Befunge.pm
they will be involved in the reproduction process and may contribute to
the local gene pool.
On the server end, a script called "migrationd" is provided to accept
connections and distribute critters between nodes. The config file
specifies which server to connect to. See the CONFIG FILE section,
below.
=head1 PRACTICAL APPLICATION
inc/Module/Install/Metadata.pm
my $name = shift;
my $features = ( $self->{values}->{features} ||= [] );
my $mods;
if ( @_ == 1 and ref( $_[0] ) ) {
# The user used ->feature like ->features by passing in the second
# argument as a reference. Accommodate that.
$mods = $_[0];
} else {
$mods = \@_;
}
lib/AI/ExpertSystem/Simple/Goal.pm
=over 4
=item new( NAME, MESSAGE )
The constructor takes two arguments. The first, NAME, is the name of the attribute that, when set, will
trigger the end of the consultation. The second argument, MESSAGE, is the text that will be interpolated
as the answer for the consultation.
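For example (a sketch with illustrative values):

    my $goal = AI::ExpertSystem::Simple::Goal->new(
        'wine',                        # attribute that, once set, ends the consultation
        'The recommended wine is ...'  # message interpolated as the answer
    );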
=back
=head2 Public methods
lib/AI/ExpertSystem/Simple/Goal.pm
The correct number of arguments were supplied to the constructor, however the first argument, NAME, was undefined.
=item Goal->new() argument 2 (MESSAGE) is undefined
The correct number of arguments were supplied to the constructor, however the second argument, MESSAGE, was undefined.
=item Goal->is_goal() takes 1 argument
When the method is called it requires one argument. This message is given if more or fewer arguments were supplied.
lib/AI/FANN.pm
# gets:
$af = $ann->neuron_activation_function($layer_ix, $neuron_ix);
Important: note that on the Perl version, the optional value argument
is moved to the last position (on the C version of the C<set_> method
it is usually the second argument).
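For example (a sketch; C<FANN_SIGMOID_SYMMETRIC> stands in for any activation-function constant exported by AI::FANN):

    # sets, with the value moved to the last position:
    $ann->neuron_activation_function($layer_ix, $neuron_ix, FANN_SIGMOID_SYMMETRIC);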
=item *
Some functions have been renamed to make the naming more consistent
and to follow Perl conventions:
print "complement of a is: " . $c->as_string . "\n";
$c = $a->union($b);
print "a union b is: " . $c->as_string . "\n";
$c = $a->intersection($b);
print "a intersection b is: " . $c->as_string . "\n";
__END__
=head1 DESCRIPTION
# get list of members, sorted from least membership to greatest:
@shortest_first = B<members> $fs_tall_people;
$fs = B<new> AI::Fuzzy::Set( x1 => .3, x2 => .5, x3 => .8, x4 => 0, x5 => 1);
B<complement>, B<union>, B<intersection>
These are the fuzzy-set versions of the typical set operations.
B<equal>
Returns true if the sets have the same elements and those elements
are all equal.
0.2.0 2013-02-13
PDL awareness
Tests re-factored
requires perl 5.8.9
POD improved, PDL sections added
0.2.1 2013-02-14
Bug in test fixed (use PDL removed)
versioning of Set and Variable adapted
FuzzyInference.pm
=item operation()
This method is used to set/query the fuzzy operations. It takes at least
one argument, and at most two. The first argument specifies the logic
operation in question, and can be either C<&> for logical I<AND>,
C<|> for logical I<OR>, or C<!> for logical I<NOT>. The second
argument is used to set what method to use for the given operator.
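For example (a sketch; C<min> is assumed to be among the supported methods for C<&>):

    $fis->operation('&', 'min');        # set logical AND to use min
    my $method = $fis->operation('&');  # query the current method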
The following values are possible:
=item &
AI/Gene/Sequence.pm
##
# swaps over two sequences within the gene
# any sort of oddness can occur if regions overlap
# 0: number to perform
# 1: start of first sequence (undef for rand)
# 2: start of second sequence (undef for rand)
# 3: length of first sequence (undef for 1, 0 for rand)
# 4: length of second sequence (undef for 1, 0 for rand)
sub mutate_switch {
my $self = shift;
my $num = $_[0] || 1;
my $rt = 0;
AI/Gene/Sequence.pm
# provided for the sake of testing.
# Generates things to make up genes
# can be called with a token type to produce, or with none.
# if called with a token type, it will also be passed the original
# token as the second argument.
# should return a two element list of the token type followed by the token itself.
sub generate_token {
my $self = shift;
my $token_type = $_[0];
AI/Gene/Sequence.pm
return ($token_type) x 2;
}
# takes a string of token types to be checked for validity.
# If a mutation affects only one place, then the position of the
# mutation can be passed as a second argument.
sub valid_gene {1}
## You might also want to have methods like the following,
# they will not be called by the 'sequence' methods.
AI/Gene/Sequence.pm
=item C<mutate([num, ref to hash of probs & methods])>
This will call at random one of the other mutation methods.
It will repeat itself I<num> times. If passed a reference
to a hash as its second argument, it will use that to
decide which mutation to attempt.
This hash should contain keys which fit $1 in C<mutate_(.*)>
and values indicating the weight to be given to that method.
The module will normalise this nicely, so you do not have to.
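For example (a sketch; the weights are arbitrary and refer to the C<mutate_overwrite> and C<mutate_reverse> methods documented below):

    # ten mutations, with overwrite twice as likely as reverse
    $gene->mutate(10, { overwrite => 2, reverse => 1 });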
AI/Gene/Sequence.pm
The token will be randomly generated by the calling object's
C<generate_token> method.
=item C<mutate_overwrite([num, pos1, pos2, len])>
Copies a section of the gene (starting at I<pos1>, length I<len>)
and writes it back into the gene, overwriting current elements,
starting at I<pos2>.
=item C<mutate_reverse([num, pos, len])>
AI/Gene/Sequence.pm
This takes a sequence (starting at I<pos1> length I<len>)
from within a gene and moves
it to another position (starting at I<pos2>). Odd things might occur if the
position to move the sequence into lies within the
section to be moved, but the module will try its hardest
to cause a mutation.
=item C<mutate_duplicate([num, pos1, pos2, length])>
This copies a portion of the gene starting at I<pos1> of length
AI/Gene/Sequence.pm
=item C<mutate_switch([num, pos1, pos2, len1, len2])>
This takes two sequences within the gene and swaps them
into each other's position. The first starts at I<pos1>
with length I<len1> and the second at I<pos2> with length
I<len2>. If the two sequences overlap, then no mutation will
be attempted.
=back
part of its purpose remains meaningful.

(For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.)

These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library.

version is interface-compatible with the version that the work was made with.

c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution.

d) If distribution of the work is made by offering access to copy from a designated place, offer

the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library.

If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances.

It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice.

This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License.

12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the
generated.
=item I<$ga>-E<gt>B<evolve>(I<strategy>, ?I<num_generations>?)
This method causes the GA to evolve the population using the specified strategy.
A strategy name has to be specified as the first argument. The second argument
is optional and specifies the number of generations to evolve. It defaults to
1. See L</"STRATEGIES"> for more information on the default strategies.
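For example (a sketch; C<rouletteTwoPoint> is assumed to be one of the default strategies):

    $ga->evolve('rouletteTwoPoint', 100);   # evolve for 100 generations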
Each generation consists of the following steps:
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
c) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
lib/AI/Logic/AnswerSet.pm
my $countAS = 0;
my @answerSets = @{$_[0]};
my @second;
if($_[1]) {
@second = @{$_[1]};
}
my @third;
if($_[2]) {
@third = @{$_[2]};
lib/AI/Logic/AnswerSet.pm
my @predList;
my @pred;
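# Dispatch on the optional arguments: numbers are taken as answer-set
# indices, anything else as predicate names; the third argument (if any)
# supplies whichever list the second one did not.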
if(@second) {
if($second[0] =~ /\d+/) {
@selectedAS = @second;
if(@third) {
@predList = @third;
}
}
else {
@predList = @second;
if(@third) {
@selectedAS = @third;
}
}
}
t/00-report-prereqs.t
my @full_reports;
my @dep_errors;
my $req_hash = $HAS_CPAN_META ? $full_prereqs->as_string_hash : $full_prereqs;
# Add static includes into a fake section
for my $mod (@include) {
$req_hash->{other}{modules}{$mod} = 0;
}
for my $phase ( qw(configure build test runtime develop other) ) {
examples/mnist.pl
# The first fully-connected layer
# my $fc1 = mx->sym->FullyConnected(data => $data, name => 'fc1', num_hidden => 128);
# # Apply relu to the output of the first fully-connected layer
# my $act1 = mx->sym->Activation(data => $fc1, name => 'relu1', act_type => "relu");
# The second fully-connected layer and the corresponding activation function
my $fc2 = mx->sym->FullyConnected(data => $data, name => 'fc2', num_hidden => 64);
my $act2 = mx->sym->Activation(data => $fc2, name => 'relu2', act_type => "relu");
# The third fully-connected layer; note that the hidden size should be 10, which is the number of unique digits
my $fc3 = mx->sym->FullyConnected(data => $act2, name => 'fc3', num_hidden => 10);
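# (sketch) the excerpt cuts off here; the standard MNIST example presumably
# finishes the perceptron with a softmax output layer, e.g.:
# my $mlp = mx->sym->SoftmaxOutput(data => $fc3, name => 'softmax');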
examples/mnist.pl
return $mlp;
}
sub nn_conv {
my($data) = @_;
# Epoch[9] Batch [200] Speed: 1625.07 samples/sec Train-accuracy=0.992090
# Epoch[9] Batch [400] Speed: 1630.12 samples/sec Train-accuracy=0.992850
# Epoch[9] Train-accuracy=0.991357
# Epoch[9] Time cost=36.817
# Epoch[9] Validation-accuracy=0.988100
my $conv1= mx->symbol->Convolution(data => $data, name => 'conv1', num_filter => 20, kernel => [5,5], stride => [2,2]);
*
* \param handle handle to the KVStore
* \param node_id Can be a node group or a single node.
* kScheduler = 1, kServerGroup = 2, kWorkerGroup = 4
* \param number Output number of dead nodes
* \param timeout_sec A node that fails to send a heartbeat within {timeout_sec} seconds
* will be presumed 'dead'
*/
int MXKVStoreGetNumDeadNode(KVStoreHandle handle,
const int node_id,
int *out,
const int timeout_sec = 60);
/**
* \brief Create a RecordIO writer object
* \param uri path to file
* \param out handle pointer to the created object
inc/Module/Install/AutoInstall.pm
$self->makemaker_args( Module::AutoInstall::_make_args() );
my $class = ref($self);
$self->postamble(
"# --- $class section:\n" .
Module::AutoInstall::postamble()
);
}
sub auto_install_now {
To read about the latest features, see the Changes file.
The author invites feedback on AI::MegaHAL. If you find a bug, please send the
information described in the BUGS section below.
2) INSTALLATION
---------------
bin/from-folder.pl
ok($c->simpleMixedSearch($style,$_)) && ok($c->play($style,$_)) for
qw(atom antimatter planet);
ok(print Dumper $c->intersect($style,$_)) for
qw(atom antimatter planet);
ok(print Dumper $c->similar($style,$_)) for
qw(atom antimatter planet);
lib/AI/NNEasy.pm
What is hard in a NN is finding these I<weights>. By default L<AI::NNEasy> uses
I<backprop> as its learning algorithm. With I<backprop> it passes the inputs through
the Neural Network and adjusts the I<weights> using random numbers until it finds
a set of I<weights> that gives us the right output.
The secret of a NN is the number of hidden layers and nodes/neurons in each layer.
Basically, the best way to define the hidden layers is one layer of (INPUT_NODES+OUTPUT_NODES) nodes.
So a network with 2 input nodes and 1 output node should have 3 nodes in the hidden layer.
This definition exists because the number of inputs defines the maximal variability of
the inputs (N**2 for boolean inputs), and the output defines whether the variability is reduced by some logic restriction, as
in the XOR example, where we have 2 inputs and 1 output, so the hidden layer has 3. And as we can see in the
lib/AI/NNFlex/Feedforward.pm
}
$counter++;
}
# Now flow activation through the network starting with the second layer
foreach my $layer (@{$network->{'layers'}})
{
if ($layer eq $network->{'layers'}->[0]){next}
foreach my $node (@{$layer->{'nodes'}})
lib/AI/NaiveBayes/Classification.pm
Returns a string: the label that best suits the given document.
=item C<find_predictors()>
This method returns the C<best_category()>, as well as a list of all the predictors
along with their influence on the best category selected. The second value
returned is a list of array references, each containing a single feature (a string)
and a number describing its influence on the result, so the
second part of the result may look like this:
(
[ 'activities', 1.2511540632952 ],
[ 'over', -1.0269523272981 ],
[ 'provide', 0.8280157033269 ],
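A usage sketch based on the description above ($classification is assumed to be an AI::NaiveBayes::Classification object):

    my ($best, @predictors) = $classification->find_predictors;
    for my $p (@predictors) {
        my ($feature, $influence) = @$p;
        printf "%-12s %+.5f\n", $feature, $influence;
    }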
BackProp.pm
# $Id: BackProp.pm,v 0.89 2000/08/12 01:05:27 josiah Exp $
#
# Copyright (c) 2000 Josiah Bryan USA
#
# See AUTHOR section in pod text below for usage and distribution rights.
# See UPDATES section in pod text below for info on what has changed in this release.
#
BEGIN {
$AI::NeuralNet::BackProp::VERSION = "0.89";
}
BackProp.pm
bless $self, $type;
# If $layers is a string, then it will be numerically equal to 0, so try to load it
# as a network file.
if($layers == 0) {
# We use a "1" flag as the second argument to indicate that we want load()
# to call the new constructor to make a network the same size as in the file
# and return a reference to the network, instead of just creating the network from
# a pre-existing reference
return $self->load($layers,1);
}
BackProp.pm
return $self->{parent}->intr(($self->{palette}->[$color]->{red}+$self->{palette}->[$color]->{green}+$self->{palette}->[$color]->{blue})/3);
}
# Loads and decompresses a PCX-format 320x200, 8-bit image file and returns
# two arrays: the first is a 64000-element array in which each element contains a palette
# index, and the second is a 255-element array in which each element is a hash
# ref with the keys 'red', 'green', and 'blue', each key holding the respective color
# component for that color index in the palette.
sub load_pcx {
shift if(substr($_[0],0,4) eq 'AI::');
BackProp.pm
=head1 DESCRIPTION
AI::NeuralNet::BackProp implements a neural network similar to a feed-forward,
back-propagation network; learning via a mix of a generalization
of the Delta rule and a dissection of Hebb's rule. The actual
neurons of the network are implemented via the AI::NeuralNet::BackProp::neuron package.
You construct a new network via the new constructor:
my $net = new AI::NeuralNet::BackProp(2,3,1);
BackProp.pm
illustrates the call. Run now allows strings to be used as
input. See run() for more information.
Run returns a reference with $size elements (remember $size? $size
is what you passed as the second argument to the network
constructor). This array contains the results of the mapping. If
you ran the example exactly as shown above, $result would probably
contain (1,1) as its elements.
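A minimal sketch of that flow (dimensions and inputs are illustrative):

    my $net    = new AI::NeuralNet::BackProp(2,3,1);
    my $result = $net->run([0,1,1]);   # ref to the array of mapped outputs
    print "@$result\n";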
To make the network learn a new pattern, you simply call the learn
BackProp.pm
Now $result will contain (1,0), effectively flipping the input pattern
around. Obviously, the larger $size is, the longer it will take
to learn a pattern. Learn() returns a string in the form of
Learning took X loops and X wallclock seconds (X.XXX usr + X.XXX sys = X.XXX CPU).
With the X's replaced by time or loop values for that loop call. So,
to view the learning stats for every learn call, you can just:
print $net->learn(\@map,\@res);
If you call "$net->debug(4)" with $net being the
reference returned by the new() constructor, you will get benchmarking
information for the learn function, as well as plenty of other information output.
See notes on debug() in the METHODS section, below.
If you do call $net->debug(1), it is a good
idea to point STDIO of your script to a file, as a lot of information is output. I often
use this command line:
BackProp.pm
as calling:
$net->range($net->crunch("string of values"));
=item $net->range("first string","second string");
This is the same as calling:
$net->range($net->crunch("first string"),$net->crunch("second string"));
Or:
@range = ($net->crunch("first string"),
$net->crunch("second string"));
$net->range(\@range);
=item $net->range($value1,$value2);
BackProp.pm
Or:
@range = ($value1,$value2);
$net->range(\@range);
The second example is the same as the first example.
=item $net->benchmarked();
BackProp.pm
bad to have a pure 0 internally, because the weights cannot change a 0 when multiplied by a 0; the
product stays a 0. Yet when a weight is multiplied by 0.00001, eventually, with enough weight, it will
be able to learn. With a 0 value instead of 0.00001 or whatever, it would never be able
to add enough weight to get anything other than a 0.
The second option to allow for 0s is to enable a maximum error with the 'error' option in
learn() , learn_set() , and learn_set_rand() . This allows the network to not worry about
learning an output perfectly.
For accuracy reasons, it is recommended that you work with 0s using the random() method.
}
else if ( SvTYPE(SvRV(self)) == SVt_PVMG ) {
/*
* this should be the second pass. here we need to serialize
* the tied part not seen from the perl side.
*/
som = INT2PTR(SOM_GENERIC*,self2iv(self));
lib/AI/NeuralNet/Hopfield.pm
my $b_cols = $matrix_b->{_cols};
my $result = Math::SparseMatrix->new($a_rows, $b_cols);
if ($matrix_a->{_cols} != $matrix_b->{_rows}) {
die "To use ordinary matrix multiplication the number of columns on the first matrix must mat the number of rows on the second";
}
for (my $result_row = 1; $result_row <= $a_rows; $result_row++) {
for(my $result_col = 1; $result_col <= $b_cols; $result_col++) {
my $value = 0;
t/AI-NeuralNet-Kohonen-Visual.t
is( $net->{input}->[0]->{values}->[2],0);
is( $net->{weight_dim}, 2);
$net->train;
diag "Will automatically destroy this window in $delay seconds";
$net->{_mw}->after($delay*1000, sub{ $net->{_mw}->destroy } );
$net->main_loop;
pass;
t/AI-NeuralNet-Kohonen-Visual.t
foreach my $bmu ($net->get_results){
$net->label_map(@$bmu->[1],@$bmu->[2],"+".@$bmu->[3]);
}
diag "Will automatically destroy this window in $delay seconds";
$net->{_mw}->after($delay*1000, sub{ $net->{_mw}->destroy } );
$net->plot_map;
$net->main_loop;
lib/AI/NeuralNet/Kohonen.pm
}
=head1 PRIVATE FUNCTION _gauss_weight
Accepts two parameters: the first, C<r>, gives the distance from the mask centre;
the second, C<sigma>, specifies the width of the mask.
Returns the gaussian weight.
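For reference, the usual Gaussian form of such a weight (presumably what this function computes; check the source to confirm) is:

    exp( - ($r ** 2) / (2 * $sigma ** 2) )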
See also L<_decay_learning_rate>.
#!/usr/bin/perl
# Copyright (c) 2000 Josiah Bryan USA
#
# See AUTHOR section in pod text below for usage and distribution rights.
#
BEGIN {
$AI::NeuralNet::Mesh::VERSION = "0.44";
$AI::NeuralNet::Mesh::ID =
bless $self, $type;
# If $layers is a string, then it will be numerically equal to 0, so
# try to load it as a network file.
if($layers == 0) {
# We use a "1" flag as the second argument to indicate that we
# want load() to call the new constructor to make a network the
# same size as in the file and return a reference to the network,
# instead of just creating the network from a pre-existing reference
return $self->load($layers,1);
}
#
# Note: when using a range() activator, train the
# net TWICE on the data set, because the first time
# the range() function searches for the top value in
# the inputs, and therefore results could fluctuate.
# The second learning cycle guarantees more accuracy.
#
sub range {
my @r=@_;
sub{$_[1]->{t}=$_[0]if($_[0]>$_[1]->{t});$r[intr($_[0]/$_[1]->{t}*$#r)]}
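# The closure records the largest input seen so far in $_[1]->{t}, then
# maps the node's summed input onto one of the discrete values in @r.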
}
#
# Note: when using a ramp() activator, train the
# net at least TWICE on the data set, because the first
# time the ramp() function searches for the top value in
# the inputs, and therefore results could fluctuate.
# The second learning cycle guarantees more accuracy.
#
sub ramp {
my $r=shift||1;my $t=($r<2)?0:-1;
sub{$_[1]->{t}=$_[0]if($_[0]>$_[1]->{t});$_[0]/$_[1]->{t}*$r+$t} # offset by $t: 0 for a 0..$r range, -1 for -1..1
}
The code ref is called with this syntax:
$output = &$code_ref($sum_of_inputs, $self);
The code ref is expected to return a value to be used as the output of the node.
The code ref also has access to all the data of that node through the second argument,
a blessed hash reference to that node.
See CUSTOM ACTIVATION FUNCTIONS for information on several included activation functions
other than the ones listed above.
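For example, a hypothetical custom activation that clips the summed input to the range 0..1:

    my $clip = sub {
        my ($sum, $node) = @_;   # $node is the blessed hash ref for this node
        return $sum < 0 ? 0 : $sum > 1 ? 1 : $sum;
    };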
Note: when using a range() activator, train the
net TWICE on the data set, because the first time
the range() function searches for the top value in
the inputs, and therefore results could fluctuate.
The second learning cycle guarantees more accuracy.
The actual code that implements the range closure is
a bit convoluted, so I will expand on it here as a simple
tutorial for custom activation functions.
Note: when using a ramp() activator, train the
net at least TWICE on the data set, because the first
time the ramp() function searches for the top value in
the inputs, and therefore results could fluctuate.
The second learning cycle guarantees more accuracy.
No code to show here, as it is almost exactly the same as range().
=item and_gate($threshold);
The network is stored internally as one long array of node objects. The goal here
is to connect one range of nodes in that array to another range of nodes. The calling
function has already calculated the indices into the array and passed them to you
as the four arguments after the $self reference. The first two arguments we will call
$r1a and $r1b. These define the start and end indices of the first range, or "layer." Likewise,
the next two arguments, $r2a and $r2b, define the start and end indices of the second
layer. We also grab a reference to the mesh array so we don't have to type the $self
reference over and over.
The loop that follows the arguments in the above example is very simple. It opens
a for() loop over the range of numbers, calculating the size instead of just going
= line 11 = }
= line 12 = }
It's that easy! The simplest connector (well, almost). It just connects each
node in the first layer, defined by ($r1a..$r1b), to every node in the second layer, as
defined by ($r2a..$r2b).
Those of you that are still reading, if you do come up with any new connection functions,
PLEASE SEND THEM TO ME. I would love to see what others are doing, as well as get new
network ideas. I will probably include any connectors you send over in future releases (with