AI-NNFlex


CHANGES

Implemented node & layer connect methods, to allow recurrent
connections.
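
For example, a node-to-node connection (the same call exercised
in t/backprop.t below):

	$network->connect(fromnode=>'1,0',tonode=>'1,1');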

Put the sigmoid_slope function in mathlib, courtesy of frodo72
@ perlmonks.

Implemented functions to save & load SNNS .pat files in Dataset.
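
For example (as used in examples/xorminus.pl):

	$dataset->save(filename=>'xor.pat');
	$dataset->load(filename=>'xor.pat');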

Altered the Dataset constructor to allow an empty param set - you
can now construct a null Dataset & add items to it using the
$dataset->add([[0,1],[1]]) method (also implemented in this
version).
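
For example (constructing a null Dataset and adding a pattern):

	my $dataset = AI::NNFlex::Dataset->new();
	$dataset->add([[0,1],[1]]);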

Altered the feedforward run method to return the output pattern -
more intuitive that way.
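
For example (run now returns a reference to the output pattern):

	my $outputsRef = $network->run([0,1]);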

Implemented a Hopfield module. This is very much a first cut,
since I've never really used Hopfield nets before, and I haven't
put any debug calls in yet, pending a rethink of the whole
approach to debug in this set of code.

Implemented dataset->delete method.

Put the pod documentation back in Dataset.pm :-)


###############################################################
0.21
20050313

Rewrote all the pod. It's probably a bit sparse now, but it's
much more accurate.

README.txt


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

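# XOR truth table: each input pair is followed by its target output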
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}



TODO

Put in some more error checking, particularly for attempts to create
connections between layers/nodes that don't exist.

Write a simple net simulator with syntax loosely based on xerion. At
present this lot is API driven; it should be straightforward to write
a basic simulator that calls the API in the backend.

Read & write methods for both networks and datasets, modelled on the SNNS
format (for use with the frontend script). Data should be SNNS format; the
network definition file will probably have to differ.

Implement an error method in addition to dbug, and clean up the dbug &
error calls.


examples/add.pl

$network->add_layer(    nodes=>2,
            activationfunction=>"linear");

$network->add_layer(    nodes=>1,
            activationfunction=>"linear");


$network->init();

# Taken from Mesh ex_add.pl
my $dataset = AI::NNFlex::Dataset->new([
[ 1,   1   ], [ 2    ],
[ 1,   2   ], [ 3    ],
[ 2,   2   ], [ 4    ],
[ 20,  20  ], [ 40   ],
[ 10,  10  ], [ 20   ],
[ 15,  15  ], [ 30   ],
[ 12,  8   ], [ 20   ],

]);

my $err = 10;
# Stop after 4096 epochs -- don't want to wait more than that
for ( my $i = 0; ($err > 0.0001) && ($i < 4096); $i++ ) {
    $err = $dataset->learn($network);
    print "Epoch = $i error = $err\n";
}

foreach (@{$dataset->run($network)})
{
    foreach (@$_){print $_}
    print "\n";    
}

print "this should be 4000 - ";
$network->run([2000,2000]);
foreach ( @{$network->output}){print $_."\n";}

 foreach my $a ( 1..10 ) {

examples/bp.pl

#==============================================================
#********** THIS IS THE MAIN PROGRAM **************************
#==============================================================

sub main
 {

 # initialise the weights
  initWeights();

 # load in the data
  initData();

 # train the network
    for(my $j = 0;$j <= $numEpochs;$j++)
    {

        for(my $i = 0;$i<$numPatterns;$i++)
        {

            #select a pattern at random

examples/lesion.pl

			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"linear",
			randomweights=>1);


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);

	print "Epoch $counter: Error = $err\n";
	$counter++;
}

$network->lesion(nodes=>0.5,connections=>0.5);

$network->dump_state(filename=>"weights-learned.wts",activations=>1);

foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}


examples/test.pl



# create the numbers
my %numbers;
for (0..255)
{	
	my @array = split //,sprintf("%08b",$_);
	$numbers{$_} = \@array;
}

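# training pairs: each even number 0..12, as 8 binary digits,
# paired with the binary digits of its square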
my @data;
for (my $counter=0;$counter < 14;$counter+=2)
{
	push @data,$numbers{$counter};

	push @data,$numbers{$counter*$counter};

}


# Create the network 

my $network = AI::NNFlex::Backprop->new(
				learningrate=>.05,
				bias=>1,
				fahlmanconstant=>0.1,

examples/xor_minimal.pl


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);

	print "Epoch $counter: Error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}



examples/xorminus.pl


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[-1,-1],[-1],
			[-1,1],[1],
			[1,-1],[1],
			[1,1],[-1]]);

$dataset->save(filename=>'xor.pat');
$dataset->load(filename=>'xor.pat');


my $counter=0;
my $err = 10;
while ($err >.001)
#for (1..1500)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}

# join the output list explicitly -- concatenating @{...} directly
# into a string puts it in scalar context and prints the element
# count rather than the network output
print "this should be 1 - ".join(' ',@{$network->run([-1,1])})."\n";

lib/AI/NNFlex.pm

# to create meshes, apply input, and read output ONLY!
#
# Separate modules are to be written to perform feedback adjustments,
# various activation functions, text/gui front ends etc
#
###############################################################################
# Version Control
# ===============
#
# 0.1 20040905		CColbourn	New module
#					added NNFlex::datasets
#
# 0.11 20050113		CColbourn	Added NNFlex::lesion
#					Improved Draw
#					added NNFlex::datasets
#
# 0.12 20050116		CColbourn	Fixed reinforce.pm bug
# 					Added call into datasets
#					in ::run to offer alternative
#					syntax
#
# 0.13 20050121		CColbourn	Created momentum learning module
#
# 0.14 20050201		CColbourn	Abstracted derivative of activation
#					function into a separate function call
#					instead of hardcoded 1-y*y in backprop
#					tanh, linear & momentum
#

lib/AI/NNFlex/Dataset.pm

##########################################################
# AI::NNFlex::Dataset
##########################################################
# Dataset methods for AI::NNFlex - perform learning etc
# on groups of data
#
##########################################################
# Versions
# ========
#
# 1.0	20050115	CColbourn	New module
#
# 1.1	20050324	CColbourn	Added load support
#
##########################################################

lib/AI/NNFlex/Reinforce.pm

 my $network = AI::NNFlex::Reinforce->new(config parameter=>value);

 $network->add_layer(nodes=>x,activationfunction=>'function');

 $network->init(); 



 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([
			[INPUTARRAY],[TARGETOUTPUT],
			[INPUTARRAY],[TARGETOUTPUT]]);

 my $sqrError = 10;

 for (1..100)

 {

	 $dataset->learn($network);

 }

 $network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY});

 $network->dump_state(filename=>'badgers.wts');

 $network->load_state(filename=>'badgers.wts');

 my $outputsRef = $dataset->run($network);

 my $outputsRef = $network->output(layer=>2,round=>1);

=head1 DESCRIPTION

Reinforce is a very simple NN module. It's mainly included in this distribution to provide an example of how to subclass AI::NNFlex to write your own NN modules. The training method strengthens any connections that are active during the run pass.
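
The idea can be pictured with a minimal standalone sketch (illustrative
only, not the module's actual internals; the function name, data layout
and learning rate here are assumptions):

 # Assumed rule, for illustration: strengthen each connection in
 # proportion to how active it was on the last run pass.
 sub reinforce_sketch {
     my ($weights, $activations, $lr) = @_;
     for my $i (0 .. $#{$weights}) {
         $weights->[$i] += $lr * $activations->[$i];
     }
 }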

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Reinforce

t/Dataset.t

use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>12}




# we need a basic network in place to test the dataset functionality against
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

t/backprop.t


# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t

			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##
