AI-NNFlex


CHANGES


Cleaned up the perldoc some more. Commented out all the method
perldocs, so there is just the single block defining the
distribution's documentation, as advocated on PerlMonks. Method
perldocs in importable modules have not been commented out.

Removed the weight bounding in backprop & momentum. If the network
is going into an unstable state, the weight bounding won't help,
and it causes errors under perl -w.

Implemented tests (apologies to the CPAN testers for not having
done so before!).


#################################################################

0.13
20050121

New plugin training algorithm - momentum.pm
Improvement in speed using momentum on xor as follows (epochs)

lib/AI/NNFlex.pm

 Damages the network.

B<PROBABILITY>

A value between 0 and 1, denoting the probability of a given node or connection being damaged.

Note: this method may be called on a per-network, per-node or per-layer basis using the appropriate object.
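A minimal sketch of damaging a trained network, assuming the method described above is AI::NNFlex's C<lesion>, with C<nodes> and C<connections> probability parameters (the network topology here is arbitrary):

```perl
use strict;
use warnings;
use AI::NNFlex::Backprop;

# Build and initialise a small network to damage
my $network = AI::NNFlex::Backprop->new(learningrate=>.1, bias=>1, randomweights=>1);
$network->add_layer(nodes=>2, activationfunction=>"tanh");
$network->add_layer(nodes=>1, activationfunction=>"tanh");
$network->init();

# Damage 10% of nodes and 20% of connections across the whole network
$network->lesion(nodes=>0.1, connections=>0.2);
```

The same call on a layer or node object would restrict the damage to that layer or node.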

=head1 EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.


=head1 PREREQs

None. NNFlex should run OK on any version of Perl 5 or later.


=head1 ACKNOWLEDGEMENTS

Phil Brierley, for his excellent free Java code, which solved my backprop problem

lib/AI/NNFlex/Backprop.pm

'Teaches' the network the dataset using the network's defined learning algorithm. Returns sqrError, the network error for the pass.

=head2 run

 $dataset->run($network)

Runs the dataset through the network and returns a reference to an array of output patterns.
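The learn/run methods above can be combined into a simple training loop. This is a sketch modelled on the distribution's own XOR examples; the topology, learning rate, and the 0.01 error cutoff with a 2000-epoch cap are arbitrary choices, not requirements:

```perl
use strict;
use warnings;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# 2-2-1 backprop network
my $network = AI::NNFlex::Backprop->new(learningrate=>.2, bias=>1,
                                        momentum=>0.6, randomweights=>1);
$network->add_layer(nodes=>2, activationfunction=>"tanh");
$network->add_layer(nodes=>2, activationfunction=>"tanh");
$network->add_layer(nodes=>1, activationfunction=>"tanh");
$network->init();

# XOR dataset as input/target pairs
my $dataset = AI::NNFlex::Dataset->new([
        [0,0],[0],
        [0,1],[1],
        [1,0],[1],
        [1,1],[0]]);

# Train until the error is small enough (or we give up)
my $err = 10;
my $epoch = 0;
while ($err > 0.01 && $epoch++ < 2000) {
        $err = $dataset->learn($network);
}

# Run the trained network; returns a ref to an array of output patterns
my $outputs = $dataset->run($network);
```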

=head1 EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.


=head1 PREREQs

None. NNFlex::Backprop should run OK on any version of Perl 5 or later.


=head1 ACKNOWLEDGEMENTS

Phil Brierley, for his excellent free Java code, which solved my backprop problem

lib/AI/NNFlex/Dataset.pm

=head1 SYNOPSIS

 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([[0,1,1,0],[0,0,1,1]]);

 $dataset->add([[0,1,0,1],[1,1,0,0]]);

 $dataset->add([0,1,0,0]);

 $dataset->save(filename=>'test.pat');

 $dataset->load(filename=>'test.pat');

=head1 DESCRIPTION

This module allows you to construct, load, save and maintain datasets for use with neural nets implemented using the AI::NNFlex classes. The dataset consists of an array of references to arrays of data. Items may be added in pairs (useful for feedfor...

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Dataset->new([[INPUT],[TARGET]]);

Parameters:

lib/AI/NNFlex/Hopfield.pm

=head1 ACKNOWLEDGEMENTS

=head1 SEE ALSO

 AI::NNFlex
 AI::NNFlex::Backprop


=head1 TODO

More detailed documentation. Better tests. More examples.

=head1 CHANGES

v0.1 - new module

=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

lib/AI/NNFlex/Reinforce.pm

'Teaches' the network the dataset using the network's defined learning algorithm. Returns sqrError, the network error for the pass.

=head2 run

 $dataset->run($network)

Runs the dataset through the network and returns a reference to an array of output patterns.
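As a sketch of the run method above with a Reinforce network (the topology and dataset here are arbitrary; C<run> returns one output pattern per input/target pair):

```perl
use strict;
use warnings;
use AI::NNFlex::Reinforce;
use AI::NNFlex::Dataset;

# Small 2-1 Reinforce network
my $network = AI::NNFlex::Reinforce->new(learningrate=>.1, bias=>1,
                                         randomweights=>1);
$network->add_layer(nodes=>2, activationfunction=>"tanh");
$network->add_layer(nodes=>1, activationfunction=>"tanh");
$network->init();

# Two input/target pairs
my $dataset = AI::NNFlex::Dataset->new([[0,0],[0],[1,1],[1]]);

# Propagate the dataset; returns a ref to an array of output patterns
my $outputs = $dataset->run($network);
```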

=head1 EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.


=head1 PREREQs

None. NNFlex::Reinforce should run OK on any version of Perl 5 or later.


=head1 ACKNOWLEDGEMENTS

Phil Brierley, for his excellent free Java code, which solved my backprop problem

t/Backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>10}

# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# add an extra layer to test out connect
$result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"sigmoid",
			randomweights=>1);


# Test initialise network
$result = $network->init();
ok($result); #test 3
##


# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); #test 4

# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result); #test 5






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 6
##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 7
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

# test saving weights
$result = $network->dump_state(filename=>'state.wts',activations=>1);
ok($result); #test 9

# test loading weights
$result = $network->load_state(filename=>'state.wts');
ok($result); #test 10

t/Dataset.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>12}




# we need a basic network  in place to test the dataset functionality against
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# test adding an entry
$result = $dataset->add([[1,1],[0,1]]);
ok($result);

# test save
$result = $dataset->save(filename=>'test.pat');
ok ($result);

# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset2);

# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result);

#  compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison);

# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result);

# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 11
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 12
##

t/Hopfield.t

# example script to build a hopfield net
use strict;
use AI::NNFlex::Hopfield;
use AI::NNFlex::Dataset;
use Test;


BEGIN{plan tests=>4}
my $matrixpresent = eval { require Math::Matrix };
my $matrixabsent = !$matrixpresent;

my $network = AI::NNFlex::Hopfield->new();

skip($matrixabsent,$network);


$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);

t/backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>8}

# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# add an extra layer to test out connect
$result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);


# Test initialise network
$result = $network->init();
ok($result); #test 3
##


# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); #test 4

# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result); #test 5






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 6
##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 7
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t

use strict;
use Test;
use AI::NNFlex::Reinforce;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>5}

# test create network
my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##


