AI-NeuralNet-Simple


Changes

0.10  December 29, 2005
      The following changes are all courtesy of Raphael Manfredi
      <Raphael_Manfredi [at] pobox.com>.
      Added tanh (bipolar) activation function.
      train_set() can now accept an error target to avoid over-training.
      Multiple network support.
      Persistence via Storable.
 
0.02  September 21, 2005
      Added pod and pod coverage tests
      Added Sub::Uplevel dependency to stop that annoying test failure :(
 
0.01  Sat Jan 31 12:19:00 2004
      Applied patch from "Daniel L. Ashbrook" <anjiro [at] cc.gatech.edu>
      to fix a small memory allocation bug in infer()
 
      Added learn_rate() method to expose the network.learn_rate.  This
      should help programmers who wish to fine-tune the network training.
 
0.01  Sun Oct  5 10:03:18 2003

README

AI::NeuralNet::Simple
================================
 
This is a simple backpropagation neural network, intended purely for experimentation purposes.
 
INSTALLATION
 
To install this module type the following:
 
   perl Makefile.PL
   make
   make test
   make install
 
DEPENDENCIES
 
This module requires these other modules and libraries:
 
  Inline::C
 
COPYRIGHT AND LICENCE

lib/AI/NeuralNet/Simple.pm

and then to the output layers.  This is the "feed forward" part.  We then
compare the output to the expected results and measure how far off we are.  We
then adjust the weights on the "output to hidden" synapses, measure the error
on the hidden nodes and then adjust the weights on the "hidden to input"
synapses.  This is what is referred to as "back error propagation".
 
We continue this process until the amount of error is small enough that we are
satisfied.  In reality, we will rarely if ever get precise results from the
network, but we learn various strategies for interpreting the results.  In the
example above, we use a "winner takes all" strategy: whichever output node
has the greatest value is declared the "winner", and thus the answer.
 
In the examples directory, you will find a program named "logical_or.pl" which
demonstrates the above process.
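
A condensed sketch along the lines of that program is shown below.  The
node counts and the iteration count here are illustrative assumptions,
not prescriptions:

  use AI::NeuralNet::Simple;

  # two inputs, one hidden node, two outputs ("false" and "true")
  my $net = AI::NeuralNet::Simple->new( 2, 1, 2 );

  # feed forward, measure the error, propagate it back -- many times
  for ( 1 .. 10_000 ) {
      $net->train( [ 1, 1 ], [ 0, 1 ] );
      $net->train( [ 1, 0 ], [ 0, 1 ] );
      $net->train( [ 0, 1 ], [ 0, 1 ] );
      $net->train( [ 0, 0 ], [ 1, 0 ] );
  }

  # "winner takes all": index of the output node with the greatest value
  print $net->winner( [ 1, 0 ] ), "\n";    # 1, i.e. "true"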
 
=head2 Building a network
 
In creating a new neural network, there are three basic steps:
 
=over 4

lib/AI/NeuralNet/Simple.pm

functions you wish to use and the "learn rate" of the neurons.
 
=item 2 Training
 
This involves feeding the neural network training data until the error rate
drops to an acceptable level.  Often we have a large data set and merely keep
iterating until the desired error rate is achieved.
 
=item 3 Measuring results
 
One frequent mistake made with neural networks is failing to test the network
with data different from the training data.  It's quite possible for a
backpropagation network to get stuck in what is known as a "local minimum", a
set of weights that minimizes the error only locally rather than globally.
This will produce misleading results.  To check for this, after training we
often feed in other known good data for verification.  If the results are not
satisfactory, perhaps a different number of neurons per layer should be tried
or a different set of training data should be supplied (a short end-to-end
sketch follows this list).
 
=back
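
Putting the three steps together, a minimal end-to-end sketch might look
like this.  The learn rate, iteration count, and the optional error
target passed to C<train_set> are illustrative assumptions, not
recommended values:

  use AI::NeuralNet::Simple;

  # 1. Designing: 2 input nodes, 1 hidden node, 2 output nodes
  my $net = AI::NeuralNet::Simple->new( 2, 1, 2 );
  $net->learn_rate(0.2);

  # 2. Training: stop after 10,000 iterations, or sooner if the
  #    error drops below the optional 0.01 target
  $net->train_set( [
      [ 1, 1 ] => [ 0, 1 ],
      [ 1, 0 ] => [ 0, 1 ],
      [ 0, 1 ] => [ 0, 1 ],
      [ 0, 0 ] => [ 1, 0 ],
  ], 10_000, 0.01 );

  # 3. Measuring results: verify the network against known good data
  print "looks sane\n" if $net->winner( [ 0, 1 ] ) == 1;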
 
=head1 Programming C<AI::NeuralNet::Simple>

t/10nn_simple.t

use Test::More tests => 22;
use strict;
 
my $CLASS;
BEGIN {
    # make the freshly built module visible whether the tests are run
    # from the distribution root or from within t/
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};

t/20nn_multi.t

use Test::More tests => 14;
use strict;
 
my $CLASS;
BEGIN {
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};

t/30nn_storable.t

use Test::More tests => 15;
use strict;
 
my $CLASS;
BEGIN {
    unshift @INC => 'blib/lib/', '../blib/lib/';
    $CLASS = 'AI::NeuralNet::Simple';
    use_ok($CLASS) || die;
};

t/pod-coverage.t

#!perl -T
use Test::More;

eval "use Test::Pod::Coverage 1.04";
plan $@
  ? (skip_all => "Test::Pod::Coverage 1.04 required for testing POD coverage")
  : ( tests => 1 );
 
my $ignore = join '|' => qw(
    STORABLE_freeze
    STORABLE_thaw
    build_axaref
    build_rv
    c_destroy_network
    c_export_network
    c_get_delta
    c_get_learn_rate

t/pod.t

#!perl -T
use Test::More;

eval "use Test::Pod 1.14";
plan skip_all => "Test::Pod 1.14 required for testing POD" if $@;
all_pod_files_ok();


