AI-NNFlex


CHANGES

Cleaned up the perldoc some more. Commented out the individual method
perldocs, so there is just a single block defining the
distribution's documentation, as advocated on PerlMonks. Method
perldocs in importable modules have not been commented out.
 
Removed the weight bounding in backprop & momentum. If the network
is entering an unstable state, weight bounding won't help, and it
causes errors under perl -w.
 
Implemented tests (apologies to the CPAN testers for not having
done so before!).
 
 
#################################################################
 
0.13
20050121
 
New plugin training algorithm - momentum.pm
Improvement in speed using momentum on xor as follows (epochs)
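The momentum plugin is enabled by passing a momentum parameter to the
constructor; this sketch uses the values from the bundled tests (the
layer setup is omitted for brevity):

 use AI::NNFlex::Backprop;

 my $network = AI::NNFlex::Backprop->new(
                 learningrate => .1,
                 bias         => 1,
                 momentum     => 0.6);  # omit momentum for plain backprop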

lib/AI/NNFlex.pm

Damages the network.
 
B<PROBABILITY>
 
A value between 0 and 1, denoting the probability of a given node or connection being damaged.
 
Note: this method may be called on a per-network, per-node or per-layer basis using the appropriate object.
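For illustration only (the damage method's name is not shown in this
excerpt; C<lesion> is assumed here):

 # damage 20% of nodes and 10% of connections, network-wide
 $network->lesion(nodes=>0.2, connections=>0.1);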
 
=head1 EXAMPLES
 
See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.
 
 
=head1 PREREQs
 
None. NNFlex should run OK on any version of Perl 5 or later.
 
 
=head1 ACKNOWLEDGEMENTS
 
Phil Brierley, for his excellent free Java code, which solved my backprop problem.

lib/AI/NNFlex/Backprop.pm

'Teaches' the network the dataset using the network's defined learning algorithm. Returns the square error (sqrError).
 
=head2 run
 
 $dataset->run($network)
 
Runs the dataset through the network and returns a reference to an array of output patterns.
 
=head1 EXAMPLES
 
See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.
 
 
=head1 PREREQs
 
None. NNFlex::Backprop should run OK on any version of Perl 5 or later.
 
 
=head1 ACKNOWLEDGEMENTS
 
Phil Brierley, for his excellent free Java code, which solved my backprop problem.

lib/AI/NNFlex/Dataset.pm

=head1 SYNOPSIS
 
 use AI::NNFlex::Dataset;
 
 my $dataset = AI::NNFlex::Dataset->new([[0,1,1,0],[0,0,1,1]]);
 
 $dataset->add([[0,1,0,1],[1,1,0,0]]);
 
 $dataset->add([0,1,0,0]);
 
 $dataset->save(filename=>'test.pat');
 
 $dataset->load(filename=>'test.pat');
 
=head1 DESCRIPTION
 
This module allows you to construct, load, save and maintain datasets for use with neural nets implemented using the AI::NNFlex classes. The dataset consists of an array of references to arrays of data. Items may be added in pairs (useful for feedfor...
 
=head1 CONSTRUCTOR
 
=head2 AI::NNFlex::Dataset->new([[INPUT],[TARGET]]);
 
Parameters:

lib/AI/NNFlex/Hopfield.pm

=head1 ACKNOWLEDGEMENTS
 
=head1 SEE ALSO
 
 AI::NNFlex
 AI::NNFlex::Backprop
 
 
=head1 TODO
 
More detailed documentation. Better tests. More examples.
 
=head1 CHANGES
 
v0.1 - new module
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT

lib/AI/NNFlex/Reinforce.pm

'Teaches' the network the dataset using the network's defined learning algorithm. Returns the square error (sqrError).
 
=head2 run
 
 $dataset->run($network)
 
Runs the dataset through the network and returns a reference to an array of output patterns.
 
=head1 EXAMPLES
 
See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.
 
 
=head1 PREREQs
 
None. NNFlex::Reinforce should run OK on any version of Perl 5 or later.
 
 
=head1 ACKNOWLEDGEMENTS
 
Phil Brierley, for his excellent free Java code, which solved my backprop problem.

t/Backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;
 
BEGIN{
        plan tests=>10}
 
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# add an extra layer to test out connect
$result = $network->add_layer(       nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"sigmoid",
                        randomweights=>1);
 
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
 
# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); #test 4
 
# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result); #test 5
 
 
 
 
 
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 6
##
 
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 7
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##
 
# test saving weights
$result = $network->dump_state(filename=>'state.wts',activations=>1);
ok($result); #test 9
 
# test loading weights
$result = $network->load_state(filename=>'state.wts');
ok($result); #test 10

t/Dataset.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;
 
BEGIN{
        plan tests=>12}
 
 
 
 
# we need a basic network  in place to test the dataset functionality against
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 4
##
 
# test adding an entry
$result = $dataset->add([[1,1],[0,1]]);
ok($result); #test 5
 
# test save
$result = $dataset->save(filename=>'test.pat');
ok ($result); #test 6
 
# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset2); #test 7
 
# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result); #test 8
 
#  compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison); #test 9
 
# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result); #test 10
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 11
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 12
##

t/Hopfield.t

# example script to build a hopfield net
use strict;
use Test;
use AI::NNFlex::Hopfield;
 
 
BEGIN{plan tests=>4}
my $matrixpresent = eval { require Math::Matrix; 1 };
my $matrixabsent  = !$matrixpresent;
 
my $network = AI::NNFlex::Hopfield->new();
 
skip($matrixabsent,$network);
 
 
$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);

t/backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;
 
BEGIN{
        plan tests=>8}
 
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# add an extra layer to test out connect
$result = $network->add_layer(       nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
 
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
 
# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); #test 4
 
# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result); #test 5
 
 
 
 
 
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 6
##
 
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 7
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t

use strict;
use Test;
use AI::NNFlex::Reinforce;
use AI::NNFlex::Dataset;
 
BEGIN{
        plan tests=>5}
 
# test create network
my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 4
##
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##


