AI-NeuralNet-Simple
0.10 December 29, 2005
The following changes are all courtesy of Raphael Manfredi
<Raphael_Manfredi [at] pobox.com>.
Added tanh (bipolar) activation function.
train_set() can now accept an error target to avoid over-training.
Multiple network support.
Persistence via Storable.
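(As a hedged sketch, given a network in $net, the error-target form of
train_set() described above might be used like this; the training pairs,
iteration cap, and 0.01 target are illustrative:

    # stop early once the mean squared error falls below 0.01,
    # instead of always running all 10,000 iterations
    $net->train_set( [
        [ 1, 1 ] => [ 0, 1 ],
        [ 0, 0 ] => [ 1, 0 ],
    ], 10_000, 0.01 );
)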
0.02 September 21, 2005
Added pod and pod coverage tests
Added Sub::Uplevel dependency to stop that annoying error failure :(
0.01 Sat Jan 31 12:19:00 2004
Applied patch from "Daniel L. Ashbrook" <anjiro [at] cc.gatech.edu>
to fix a small memory allocation bug in infer()
Added learn_rate() method to expose the network.learn_rate. This
should help programmers who wish to fine-tune the network training.
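(A hedged sketch of the learn_rate() accessor described above; the 0.1
value is illustrative:

    my $rate = $net->learn_rate;   # read the current learn rate
    $net->learn_rate( 0.1 );       # set it to fine-tune training
)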
0.01 Sun Oct 5 10:03:18 2003
================================
This is a simple backprop neural net intended for experimentation purposes.
INSTALLATION
To install this module type the following:
perl Makefile.PL
make
make test
make install
DEPENDENCIES
This module requires these other modules and libraries:
Inline::C
COPYRIGHT AND LICENCE
lib/AI/NeuralNet/Simple.pm
and then to the output layers. This is the "feed forward" part. We then
compare the output to the expected results and measure how far off we are. We
then adjust the weights on the "output to hidden" synapses, measure the error
on the hidden nodes and then adjust the weights on the "hidden to input"
synapses. This is what is referred to as "back error propagation".

We continue this process until the amount of error is small enough that we are
satisfied. In reality, we will rarely if ever get precise results from the
network, but we learn various strategies to interpret the results. In the
example above, whichever of the output nodes has the greatest value will be
the "winner", and thus the answer.
In the examples directory, you will find a program named "logical_or.pl" which
demonstrates the above process.
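For reference, here is a minimal sketch of that process, modeled on the
module's documented interface (the network shape and iteration count are
illustrative):

    use AI::NeuralNet::Simple;

    # 2 input, 1 hidden, and 2 output nodes
    my $net = AI::NeuralNet::Simple->new( 2, 1, 2 );

    # feed forward and back-propagate over the truth table for "or"
    for ( 1 .. 10_000 ) {
        $net->train( [ 1, 1 ], [ 0, 1 ] );
        $net->train( [ 1, 0 ], [ 0, 1 ] );
        $net->train( [ 0, 1 ], [ 0, 1 ] );
        $net->train( [ 0, 0 ], [ 1, 0 ] );
    }

    # winner() returns the index of the output node with the
    # greatest value; here index 1 means "true"
    printf "1 or 0 = %d\n", $net->winner( [ 1, 0 ] );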
=head2 Building a network
In creating a new neural network, there are three basic steps:
=over 4
lib/AI/NeuralNet/Simple.pm
=item 2 Training
This involves feeding the neural network data until its error rate is low
enough to be acceptable. Often we have a large data set and merely keep
iterating until the desired error rate is achieved (see the sketch after
this list).
=item 3 Measuring results
One frequent mistake made with neural networks is failing to test the network
with data different from the training data. It's quite possible for a
backpropagation network to settle into what is known as a "local minimum", a
set of weights that is not the true best solution. This will cause false
results. To check for this, after training we often feed in other known good
data for verification. If the results are not satisfactory, perhaps a
different number of neurons per layer should be tried or a different set of
training data should be supplied.
=back
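Putting steps 2 and 3 together, a minimal hedged sketch (the data sets, the
error target, and the acceptability check are all illustrative):

    # Step 2: bulk training; train_set() takes the training pairs,
    # an iteration cap, and an optional error target
    $net->train_set( \@training_data, 10_000, 0.01 );

    # Step 3: verify with known good data the network was not
    # trained on; failures here may indicate a local minimum
    for my $case (@verification_data) {
        my ( $input, $expected_index ) = @$case;
        warn "possible local minimum\n"
            unless $net->winner($input) == $expected_index;
    }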
=head1 Programming C<AI::NeuralNet::Simple>
t/10nn_simple.t
use Test::More 'no_plan';   # use_ok() below needs Test::More; plan style assumed
use Test::Exception;
use strict;

my $CLASS;

BEGIN {
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};
t/20nn_multi.t
use Test::More 'no_plan';   # use_ok() below needs Test::More; plan style assumed
use Test::Exception;
use strict;

my $CLASS;

BEGIN {
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};
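This test exercises the multiple-network support noted in the 0.10 changes.
A minimal sketch of the idea (the shapes and training pairs are illustrative):

    use AI::NeuralNet::Simple;

    # two independent networks; each object keeps its own weights
    my $or_net  = AI::NeuralNet::Simple->new( 2, 1, 2 );
    my $and_net = AI::NeuralNet::Simple->new( 2, 1, 2 );

    $or_net->train(  [ 1, 0 ], [ 0, 1 ] );   # "1 or 0" is true
    $and_net->train( [ 1, 0 ], [ 1, 0 ] );   # "1 and 0" is false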
t/30nn_storable.t
use Test::More 'no_plan';   # use_ok() below needs Test::More; plan style assumed
use Test::Exception;
use Storable;
use strict;

my $CLASS;

BEGIN {
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};
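This test covers the Storable persistence added in 0.10 (note the
STORABLE_freeze/STORABLE_thaw hooks in the POD-coverage ignore list below).
A minimal round-trip sketch, with a hypothetical filename:

    use Storable qw(store retrieve);

    store( $net, 'net.save' );            # 'net.save' is hypothetical
    my $copy = retrieve( 'net.save' );    # a restored, usable network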
t/pod-coverage.t
#!perl -T
use Test::More;

eval "use Test::Pod::Coverage 1.04";
plan $@
    ? ( skip_all => "Test::Pod::Coverage 1.04 required for testing POD coverage" )
    : ( tests => 1 );

my $ignore = join '|' => qw(
    STORABLE_freeze
    STORABLE_thaw
    build_axaref
    build_rv
    c_destroy_network
    c_export_network
    c_get_delta
    c_get_learn_rate
t/pod.t

#!perl -T
use Test::More;

eval "use Test::Pod 1.14";
plan skip_all => "Test::Pod 1.14 required for testing POD" if $@;

all_pod_files_ok();