AI-NNFlex
Removed feedforward_pdl.pm from the distribution - it shouldn't have
been there in the first place!
Fixed the lesion subs so they expect parameter=>value pairs instead
of an anonymous hash (left over from when I didn't know you could do
that).
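For example, a lesion call now looks something like this (a sketch; the 'nodes'/'connections' parameter names and the probability values here are illustrative):

    # Damage the network: each value is the probability that a given
    # node or connection is deactivated.
    $network->lesion(nodes=>0.1, connections=>0.1);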
Fixed an error - random weights were bounded by 1 rather than by the
parameter 'randomweights'. They are now positive only. Some benchmarking
is needed, as it appears that positive-only random starting weights,
rather than a mix of positive and negative ones, make the network
quicker to converge, at least with momentum.
weights now default to rand(1) instead of 0 - at least for backprop-type
nets, a default weight of 0 will never work. For other types of net,
the random weights can be overridden with the 'fixedweights' parameter
(see the sketch below).
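In other words, initialisation now behaves roughly like this (a sketch of the rule only, not the distribution's actual code; the hash keys mirror the constructor parameters):

    # 'fixedweights' overrides the default; otherwise draw a positive
    # random starting weight bounded by 'randomweights' (or 1 if unset).
    my $max    = $network->{'randomweights'} || 1;
    my $weight = defined $network->{'fixedweights'}
               ? $network->{'fixedweights'}
               : rand($max);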
Fixed load_state to correctly read weights from the bias node
Implemented a rounding property in the output method
Fixed a bug that allowed activation to flow through a node
even if it was inactive
Altered the syntax for output to be param=>value instead of
an anonymous hash (see the example below)
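Side by side, the old and new call styles (the layer number and rounding flag are illustrative; compare the synopsis further down):

    # Old style, no longer supported:
    # my $outputsRef = $network->output({'layer'=>2});

    # New style, with the optional rounding property:
    my $outputsRef = $network->output(layer=>2, round=>1);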
As per Scott Fahlman's comments about neural net benchmarking
(Fahlman, S.E. (1988), 'An Empirical Study of Learning Speed in Back-Propagation Networks', Tech. Rep. CMU-CS-88-162, Carnegie Mellon University, Pittsburgh, PA), I've started using a more realistic benchmark than XOR.
The 'cars' subfolder in examples contains the learning code
for this, drawn from
ftp://ftp.ics.uci.edu/pub/machine-learning-databases/car/
#############################################################
lib/AI/NNFlex.pm
$network->dump_state (filename=>'badgers.wts');
$network->load_state (filename=>'badgers.wts');
my $outputsRef = $network->output(layer=>2,round=>1);
=head1 DESCRIPTION
AI::NNFlex is a base class for constructing your own neural network modules. To implement a neural network, start with the documentation for AI::NNFlex::Backprop, included in this distribution.
=head1 CONSTRUCTOR
=head2 AI::NNFlex->new ( parameter => value );
randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT
fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS
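A minimal constructor call might look like this (a sketch; the learningrate value and the add_layer/init calls follow typical usage of this distribution's Backprop subclass and are illustrative, not verbatim documentation):

    use AI::NNFlex::Backprop;

    my $network = AI::NNFlex::Backprop->new(
        learningrate  => 0.1,
        randomweights => 1,    # positive initial weights bounded by 1
    );

    $network->add_layer(nodes=>2, activationfunction=>"tanh");
    $network->add_layer(nodes=>1, activationfunction=>"tanh");
    $network->init();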
lib/AI/NNFlex/Feedforward.pm
            # Apply each value of the input pattern to the
            # corresponding input layer node
            $_->{'activation'} = $inputPattern[$counter];
            if (scalar @debug > 0)
            {
                $network->dbug("Applying ".$inputPattern[$counter]." to $_", 3);
            }
        }
    }
    $counter++;
}
# Now flow activation through the network, starting with the second layer
foreach my $layer (@{$network->{'layers'}})
{
    if ($layer eq $network->{'layers'}->[0]) {next}

    foreach my $node (@{$layer->{'nodes'}})
    {
        my $totalActivation;

        # Set the node to 0 if not persistent
        if (!($node->{'persistentactivation'}))
        {
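            # [Sketch] The excerpt is cut off here. A typical continuation
            # zeroes the activation, sums weighted input from the western
            # (previous-layer) connections, and applies the node's
            # activation function. Names not visible above (e.g. the
            # 'activationfunction' key) are assumptions, not verbatim code.
            $node->{'activation'} = 0;
        }

        my $nodeCounter = 0;
        foreach my $westNode (@{$node->{'connectedNodesWest'}->{'nodes'}})
        {
            $totalActivation += $westNode->{'activation'}
                * $node->{'connectedNodesWest'}->{'weights'}->[$nodeCounter];
            $nodeCounter++;
        }

        my $function = $node->{'activationfunction'};
        $node->{'activation'} = $network->$function($totalActivation);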
lib/AI/NNFlex/Reinforce.pm
sub learn
{
    my $network = shift;

    my @layers = @{$network->{'layers'}};

    # no connections westwards from input, so no weights to adjust
    shift @layers;

    # reverse to start with the last layer first
    foreach my $layer (reverse @layers)
    {
        my @nodes = @{$layer->{'nodes'}};

        foreach my $node (@nodes)
        {
            my @westNodes   = @{$node->{'connectedNodesWest'}->{'nodes'}};
            my @westWeights = @{$node->{'connectedNodesWest'}->{'weights'}};

            my $connectedNodeCounter = 0;
            foreach my $westNode (@westNodes)
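            # [Sketch] The excerpt ends mid-loop. The body below is a
            # hypothetical reconstruction of a simple reinforcement rule
            # (weights strengthened in proportion to activation); the
            # 'learningrate' key and the exact formula are assumptions.
            {
                my $dW = $westNode->{'activation'}
                       * $westWeights[$connectedNodeCounter]
                       * $network->{'learningrate'};
                $node->{'connectedNodesWest'}->{'weights'}->[$connectedNodeCounter] += $dW;
                $connectedNodeCounter++;
            }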