AI-NNFlex

CHANGES

0.20
20050308
 
v0.17 was never released: I rejigged the whole lot for
object inheritance before I got around to uploading it to CPAN.
Why, I hear you ask, when it worked OK already?
1) It's faster - a lot faster.
2) Feedforward isn't the only kind of network, and I wanted to
be free to overload some of the methods (especially init) to
simplify writing a Hopfield module (in progress).
3) It's more theoretically correct.
 
So now AI::NNFlex is the base class for the other types of
network, and you should never need to call the AI::NNFlex class
directly - instead, call the constructor of the appropriate
subclass, such as:
my $network = AI::NNFlex::momentum->new(params);
 
The upshot of that is that the network type and learning algorithm
parameters are now obsolete.
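 
For instance (the parameter values here are illustrative, following
the pattern used in the distribution's examples and tests):
 
my $network = AI::NNFlex::momentum->new(randomweights=>1,
                                learningrate=>.1,
                                debug=>[],
                                bias=>1,
                                momentum=>0.6);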

CHANGES

Added PNG support to AI::NNFlex::draw
 
Added AI::NNFlex::Dataset
This creates a dataset object that can be run against a
network.
 
Added AI::NNFlex::lesion
Damages a network with a given probability of losing a node
or a connection. See the perldoc.
 
Cleaned up the POD docs a bit, although there's a lot still
to do.
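 
A sketch of how these fit together (check the perldoc for the exact
lesion signature; the probabilities here are illustrative):
 
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[0],
                        [0,1],[1]]);
 
$dataset->learn($network);     # train the network against the dataset
$dataset->run($network);       # run the dataset through the network
 
$network->lesion(nodes=>0.1,connections=>0.1);   # randomly damage the net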
 
################################################################

INSTALL

Note: the dependency upon Math::Matrix is for the
Hopfield module only. If you want to use Backprop
you can safely leave it unresolved.
 
If you want to perform a standard install, placing
the modules in the standard Perl library locations,
run:
perl Makefile.PL
 
followed by:
make
make test
make install
 
++++++++++++++++++++++++++++++++++++++++++++++++++

TODO

Put in some more error checking, particularly for attempts to create
connections between layers/nodes that don't exist.
 
Write a simple net simulator with syntax loosely based on xerion. At
present this lot is API driven; it should be straightforward to write
a basic simulator that calls the API in the backend.
 
Read & write methods for both networks and datasets, modelled on the SNNS
format (for use with the frontend script). Data should be SNNS format; the
network definition file will probably have to differ.
 
Implement an error method in addition to dbug, and clean up the dbug & error calls.

examples/bp.pl

        #display the overall network error
        #after each epoch
        calcOverallError();
 
        print "epoch = ".$j."  RMS Error = ".$RMSerror."\n";
 
    }
 
    #training has finished
    #display the results
    displayResults();
 
 }
 
#============================================================
#********** END OF THE MAIN PROGRAM **************************
#=============================================================

examples/bp.pl

}
 
 
#************************************
 sub initData()
 {
 
    print "initialising data\n";
 
    # the data here is the XOR data
    # it has been rescaled to the range
    # [-1,1]
    # an extra input valued 1 is also added
    # to act as the bias
 
    $trainInputs[0][0]  = 1;
    $trainInputs[0][1]  = -1;
    $trainInputs[0][2]  = 1;    #bias
    $trainOutput[0] = 1;
 
    $trainInputs[1][0]  = -1;

examples/lesion.pl

                                debug=>[],bias=>1,
                                momentum=>0.6,
                                round=>1);
 
 
 
$network->add_layer( nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
 
 
$network->add_layer( nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
 
$network->add_layer( nodes=>1,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"linear",
                        randomweights=>1);
 
 
$network->init();
 
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[0],
                        [0,1],[1],
                        [1,0],[1],

examples/reinforceTest.pl

# this is /really/ experimental - see perldoc AI::NNFlex::Reinforce
 
my $object = AI::NNFlex->new([{"nodes"=>2,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"tanh","random weights"=>1},
                        {"nodes"=>2,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"tanh","random weights"=>1},
                       {"nodes"=>1,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"linear","random weights"=>1}],{'random connections'=>0,'networktype'=>'feedforward', 'random weights'=>1,'learn...
 
 
$object->run([1,0]);
$output = $object->output();
foreach (@$output)
{
        print "1,0 - $_ ";
}
print "\n";

lib/AI/NNFlex.pm

# fromnode=>[LAYER,NODE],tonode=>[LAYER,NODE]
#
# returns success or failure
#
#
#########################################################################
sub connect
{
        my $network = shift;
        my %params = @_;
        my $result = 0;
 
        if ($params{'fromnode'})
        {
                $result = $network->connectnodes(%params);
        }
        elsif ($params{'fromlayer'})
        {
                $result = $network->connectlayers(%params);
        }
        return $result;
 
}
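 
# Example usage (as exercised in the distribution's test scripts):
#
#   $network->connect(fromlayer=>1,tolayer=>1);       # layer to layer (here recurrent)
#   $network->connect(fromnode=>'1,0',tonode=>'1,1'); # node 0 to node 1 within layer 1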
 
########################################################################
# AI::NNFlex::connectlayers
########################################################################
sub connectlayers
{
        my $network=shift;
        my %params = @_;

lib/AI/NNFlex.pm

=head2 AI::NNFlex->new ( parameter => value );
         
 
randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT
 
fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS
 
debug=>[LIST OF CODES FOR MODULES TO DEBUG]
 
round=>0 or 1, a true value sets the network to round output values to the nearest of 1, -1 or 0
 
 
The constructor implements a fairly generalised network object with a number of parameters.
 
 
The following parameters are optional:
 randomweights
 fixedweights
 debug
 round
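 
For example (using the Backprop subclass, since AI::NNFlex itself is normally used only as a base class; the values mirror the distribution's examples):
 
 my $network = AI::NNFlex::Backprop->new(randomweights=>1,
                                 debug=>[],
                                 round=>1);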

lib/AI/NNFlex.pm

=head2 AI::NNFlex
 
=head3 add_layer
 
 Syntax:
 
 $network->add_layer(        nodes=>NUMBER OF NODES IN LAYER,
                        persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                        decay=>RATE OF ACTIVATION DECAY PER PASS,
                        randomactivation=>MAXIMUM STARTING ACTIVATION,
                        threshold=>NYI,
                        activationfunction=>"ACTIVATION FUNCTION",
                        randomweights=>MAX VALUE OF STARTING WEIGHTS);
 
Add layer adds whatever parameters you specify as attributes of the layer, so if you want to implement additional parameters simply use them in your calling code.
 
Add layer returns success or failure, and if successful adds a layer object to the $network->{'layers'} array. This layer object contains an attribute $layer->{'nodes'}, which is an array of nodes in the layer.
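 
For example, a two node tanh layer can be added like this (mirroring the calls in the distribution's test scripts):
 
 $network->add_layer( nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);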
 
=head3 init
 
 Syntax:

lib/AI/NNFlex.pm

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.
 
=head1 SEE ALSO
 
 AI::NNFlex::Backprop
 AI::NNFlex::Feedforward
 AI::NNFlex::Mathlib
 AI::NNFlex::Dataset
 
 AI::NNEasy - Developed by Graciliano M.Passos
 (Shares some common code with NNFlex)
  
 
=head1 TODO
 
 Lots of things:
 
 clean up the perldocs some more
 write gamma modules
 write BPTT modules
 write a perceptron learning module

lib/AI/NNFlex.pm

v0.11 introduces the lesion method, png support in the draw module and datasets.
 
v0.12 fixes a bug in reinforce.pm & adds a reflector in feedforward->run to make $network->run($dataset) work.
 
v0.13 introduces the momentum learning algorithm and fixes a bug that allowed training to proceed even if the node activation function module can't be loaded
 
v0.14 fixes momentum and backprop so they are no longer nailed to tanh hidden units only.
 
v0.15 fixes a bug in feedforward, and reduces the debug overhead
 
v0.16 changes some underlying addressing of weights, to simplify and speed up the code
 
v0.17 is a bugfix release, plus some cleaning of UI
 
v0.20 changes AI::NNFlex to be a base class, and ships three different network types (i.e. training algorithms). Backprop & momentum are both networks of the feedforward class, and inherit their 'run' method from feedforward.pm. 0.20 also fixes a who...
 
v0.21 cleans up the perldocs more, and makes nnflex more distinctly a base module. There are quite a number of changes in Backprop in the v0.21 distribution.
 
v0.22 introduces the ::connect method, to allow creation of recurrent connections, and manual control over connections between nodes/layers.
 
v0.23 includes a Hopfield module in the distribution.
 
v0.24 fixes a bug in the bias weight calculations
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
=cut

lib/AI/NNFlex/Backprop.pm

        fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS
 
        debug=>[LIST OF CODES FOR MODULES TO DEBUG]
 
        learningrate=>the learning rate of the network
 
        momentum=>the momentum value (momentum learning only)
 
        round=>0 or 1 - a true value sets the network to round output
                values to the nearest of 1, -1 or 0
 
        fahlmanconstant=>0.1
                 
 
 
The following parameters are optional:
 
 randomweights
 
 fixedweights

lib/AI/NNFlex/Backprop.pm

=head2 AI::NNFlex::Backprop
 
=head2 add_layer
 
 Syntax:
 
 $network->add_layer(        nodes=>NUMBER OF NODES IN LAYER,
                        persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                        decay=>RATE OF ACTIVATION DECAY PER PASS,
                        randomactivation=>MAXIMUM STARTING ACTIVATION,
                        threshold=>NYI,
                        activationfunction=>"ACTIVATION FUNCTION",
                        errorfunction=>'ERROR TRANSFORMATION FUNCTION',
                        randomweights=>MAX VALUE OF STARTING WEIGHTS);
 
 
The activation function must be defined in AI::NNFlex::Mathlib. Valid predefined activation functions include tanh, sigmoid & linear.
 
The error transformation function defines a transform that is applied to the error value. It must be a valid function in AI::NNFlex::Mathlib. Using a non-linear transformation function on the error value can sometimes speed up training.
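 
For example, a layer using the predefined atanh error transform might be declared like this (atanh is the error function shipped in AI::NNFlex::Mathlib; the other values are illustrative):
 
 $network->add_layer( nodes=>2,
                        activationfunction=>"tanh",
                        errorfunction=>'atanh',
                        randomweights=>1);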
 
The following parameters are optional:
 
 persistentactivation
 
 decay
 
 randomactivation
 
 threshold
 
 errorfunction
 
 randomweights
 
 
 
=head2 init
 
 Syntax:

lib/AI/NNFlex/Backprop.pm

Graciliano M.Passos for suggestions & improved code (see SEE ALSO).
 
Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.
 
=head1 SEE ALSO
 
 AI::NNFlex
 
 AI::NNEasy - Developed by Graciliano M.Passos
 Shares some common code with NNFlex.
  
 
=head1 TODO
 
 
 
=head1 CHANGES
 
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
 
 
=cut

lib/AI/NNFlex/Dataset.pm

Method to delete existing dataset entries by index
 
Method to validate linear separability of a dataset.
 
=head1 CHANGES
 
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify
it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
 
 
=cut

lib/AI/NNFlex/Feedforward.pm

 AI::NNFlex::Backprop
 AI::NNFlex::Dataset
 
 
=head1 CHANGES
 
 
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
=cut

lib/AI/NNFlex/Hopfield.pm

# Get a list of all the nodes in the network
foreach my $layer (@{$network->{'layers'}})
{
        foreach my $node (@{$layer->{'nodes'}})
        {
                # cover the assumption that some inherited code
                # will require an activation function
                if (!$node->{'activationfunction'})
                {
                        $node->{'activationfunction'}= 'hopfield_threshold';
                        $node->{'activation'} =0;
                        $node->{'lastactivation'} = 0;
                }
                push @nodes,$node;
        }
}
 
# we'll probably need this later
$network->{'nodes'} = \@nodes;

lib/AI/NNFlex/Hopfield.pm

        my $product = $inversepattern->multiply($patternmatrix);
 
        my $weights = $product->subtract($minus);
 
        # build a 1 x N row vector of ones, one entry per stored pattern
        my @truearray;
        for (1..scalar @{$dataset->{'data'}}){push @truearray,"1"}
         
        my $truematrix = Math::Matrix->new(\@truearray);
 
        # summing over the stored patterns gives a threshold value per node
        my $thresholds = $truematrix->multiply($patternmatrix);
        #$thresholds = $thresholds->transpose();
 
        # attach each node's weight row, plus its threshold, to the node
        my $counter=0;
        foreach (@{$network->{'nodes'}})
        {
                my @slice;
                foreach (@{$weights->slice($counter)})
                {
                        push @slice,$$_[0];
                }
 
                push @slice,${$thresholds->slice($counter)}[0][0];
 
                $_->{'connectednodes'}->{'weights'} = \@slice;
                $counter++;
        }
 
        return 1;
 
}

lib/AI/NNFlex/Hopfield.pm

=head1 TODO
 
More detailed documentation. Better tests. More examples.
 
=head1 CHANGES
 
v0.1 - new module
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
 
 
=cut

lib/AI/NNFlex/Mathlib.pm

#######################################################
#
# Version history
# ===============
#
# 1.0   CColbourn       20050315        Compiled into a
#                                       single module
#
# 1.1   CColbourn       20050321        added in sigmoid_slope
#
# 1.2   CColbourn       20050330        Added in hopfield_threshold
#
# 1.3   CColbourn       20050407        Changed sigmoid function to
#                                       a standard sigmoid. sigmoid2
#                                       now contains old sigmoid,
#                                       which is more used in BPTT
#                                       and I think needs cross
#                                       entropy calc to work.
#
#######################################################
#Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
use strict;
 
#######################################################
# tanh activation function
#######################################################
sub tanh
{

lib/AI/NNFlex/Mathlib.pm

 
        my $return = $value * (1-$value);
        if (scalar @debug > 0)
        {$network->dbug("sigmoid_slope returning $return",5);}
 
        return $return;
}
 
############################################################
# hopfield_threshold
# standard hopfield threshold activation - doesn't need a
# slope (because hopfield networks don't use them!)
############################################################
sub hopfield_threshold
{
        my $network = shift;
        my $value = shift;
 
        if ($value <0){return -1}
        if ($value >0){return 1}
        return $value;
}
 
############################################################

lib/AI/NNFlex/Mathlib.pm

1;
 
=pod
 
=head1 NAME
 
AI::NNFlex::Mathlib - miscellaneous mathematical functions for the AI::NNFlex NN package
 
=head1 DESCRIPTION
 
The AI::NNFlex::Mathlib package contains activation and error functions. At present there are the following:
 
Activation functions
 
=over
 
=item *
tanh
 
=item *
linear
 
=item *
hopfield_threshold
 
=back
 
Error functions
 
=over
 
=item *
atanh
 
=back
 
If you want to implement your own activation/error functions, you can add them to this module. All activation functions to be used by certain types of net (like Backprop) require an additional function <function name>_slope, which returns the 1st ord...
 
This rule doesn't apply to all network types. Hopfield for example requires no slope calculation.
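 
If you do add your own, follow the calling convention of the existing functions: each is invoked as a method, with the value to transform as the argument. A minimal sketch (the name 'mylinear' is purely illustrative):
 
 sub mylinear
 {
         my $network = shift;
         my $value = shift;
         return $value;          # identity transfer function
 }
 
 sub mylinear_slope
 {
         my $network = shift;
         my $value = shift;
         return 1;               # first derivative of the identity
 }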
 
=head1 CHANGES
 
v1.2 includes hopfield_threshold
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
 
 
=cut

lib/AI/NNFlex/Reinforce.pm

#
##########################################################
# Versions
# ========
#
# 1.0   20041125        CColbourn       New module
# 1.1   20050116        CColbourn       Fixed reverse @layers
#                                       bug reported by GM Passos
#
# 1.2   20050218        CColbourn       Mod'd to change weight
#                                       addressing from hash to
#                                       array for nnf0.16
#
# 1.3   20050307        CColbourn       repackaged as a subclass
#                                       of nnflex
#
##########################################################
# ToDo
# ----
#
#

lib/AI/NNFlex/Reinforce.pm

         
        randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT
 
        fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS
 
        debug=>[LIST OF CODES FOR MODULES TO DEBUG]
 
        learningrate=>the learning rate of the network
 
        round=>0 or 1 - a true value sets the network to round output
                values to the nearest of 1, -1 or 0
 
 
The following parameters are optional:
 randomweights
 fixedweights
 debug
 round
 
(Note: if randomweights is not specified, the network will default to random initial weights between 0 and 1.)
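 
For example (mirroring t/reinforce.t):
 
 my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
                                 randomweights=>1,
                                 learningrate=>.1,
                                 debug=>[],bias=>1);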

lib/AI/NNFlex/Reinforce.pm

=head2 AI::NNFlex
 
=head3 add_layer
 
 Syntax:
 
 $network->add_layer(        nodes=>NUMBER OF NODES IN LAYER,
                        persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                        decay=>RATE OF ACTIVATION DECAY PER PASS,
                        randomactivation=>MAXIMUM STARTING ACTIVATION,
                        threshold=>NYI,
                        activationfunction=>"ACTIVATION FUNCTION",
                        randomweights=>MAX VALUE OF STARTING WEIGHTS);
 
=head3 init
 
 Syntax:
 
 $network->init();
 
Initialises connections between nodes, sets initial weights and loads external components. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers...

lib/AI/NNFlex/Reinforce.pm

=head1 SEE ALSO
 
 AI::NNFlex
 AI::NNFlex::Backprop
 AI::NNFlex::Dataset
 
 
=head1 COPYRIGHT
 
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
 
=head1 CONTACT
 
 charlesc@nnflex.g0n.net
 
 
 
=cut

t/Backprop.t

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# add an extra layer to test out connect
$result = $network->add_layer(       nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"sigmoid",
                        randomweights=>1);
 
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
 
# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result);
 
# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);
 
 
 
 
 
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],

t/Backprop.t

##
 
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##
 
# test saving weights
$result = $network->dump_state(filename=>'state.wts',activations=>1);
ok($result);
 
# test loading weights
$result = $network->load_state(filename=>'state.wts');
ok($result);

t/Dataset.t

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 4
##
 
# test adding an entry
$result = $dataset->add([[1,1],[0,1]]);
ok($result);
 
# test save
$result = $dataset->save(filename=>'test.pat');
ok ($result);
 
# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset);
 
# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result);
 
#  compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison);
 
# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result);
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/Hopfield.t

# example script to build a hopfield net
use strict;
use Test;
use AI::NNFlex::Dataset;
 
BEGIN{plan tests=>4}
 
# Math::Matrix is only needed by the Hopfield module, so load both inside
# an eval and skip the tests if either fails to load
my $matrixpresent = eval("require Math::Matrix; require AI::NNFlex::Hopfield; 1");
my $matrixabsent = !$matrixpresent;
 
my $network = AI::NNFlex::Hopfield->new();
 
skip($matrixabsent,$network);
 
 
$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);
 
my $result = $network->init();
skip($matrixabsent,$result);
 
my $dataset = AI::NNFlex::Dataset->new();
 
$dataset->add([-1, 1, -1, 1]);
$dataset->add([-1, -1, 1, 1]);
 
skip($matrixabsent,$dataset);
 
$network->learn($dataset);

t/backprop.t

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1,
                                momentum=>0.6);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# add an extra layer to test out connect
$result = $network->add_layer(       nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
 
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
 
# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result);
 
# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);
 
 
 
 
 
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],

t/backprop.t

##
 
 
# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##
 
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t

# test create network
my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
                                randomweights=>1,
                                learningrate=>.1,
                                debug=>[],bias=>1);
 
ok($network); #test 1
##
 
# test add layer
my $result = $network->add_layer(    nodes=>2,
                        persistentactivation=>0,
                        decay=>0.0,
                        randomactivation=>0,
                        threshold=>0.0,
                        activationfunction=>"tanh",
                        randomweights=>1);
ok($result); #test 2
##
 
# Test initialise network
$result = $network->init();
ok($result); #test 3
##
 
# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[1,1],
                        [0,1],[1,0],
                        [1,0],[0,1],
                        [1,1],[0,0]]);
ok ($dataset); #test 4
##
 
# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##


