AI-NNFlex

CHANGES

###########################################################
0.22
XXXXXXXXX

Implemented node & layer connect methods, to allow recurrent
connections.
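
For example (a minimal sketch; the layer and node identifiers
are illustrative and must exist in your network):

 $network->connect(fromlayer=>2,tolayer=>2);
 $network->connect(fromnode=>'1,0',tonode=>'1,1');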

Put the sigmoid_slope function in Mathlib, courtesy of frodo72
@ perlmonks.

Implemented functions to save & load snns .pat files in Dataset
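
E.g.:

 $dataset->save(filename=>'xor.pat');
 $dataset->load(filename=>'xor.pat');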

Altered Dataset constructor to allow an empty param set - you
can now construct a null Dataset & add items to it using the
$dataset->add([[0,1],[1]]) method (also implemented in this
version).
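
That is:

 my $dataset = AI::NNFlex::Dataset->new();
 $dataset->add([[0,1],[1]]);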

Altered feedforward run method to return output pattern - more
intuitive that way.
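
So, for example:

 my $outputRef = $network->run([0,1]);
 print "@$outputRef\n";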

Implemented a Hopfield module. This is very much the first cut
at this, since I've never really used Hopfield nets before, and
I haven't put any debug in yet, pending a rethink of the whole
approach to debug in this set of code.

Implemented dataset->delete method.

Put the pod documentation back in Dataset.pm :-)


###############################################################
0.21
20050313

Rewrote all the pod. It's probably a bit sparse now, but it's
much more accurate.

Removed the eval calls from feedforward, backprop & momentum
for speed.

Implemented the Fahlman constant. This eliminates the 'flat spot'
problem, and gets the network to converge more reliably. XOR
seems never to get stuck with this set to 0.1. As a bonus, it's
also about 50% faster (do you sense a theme developing here?).

Removed the momentum module (well, removed the backprop module and
renamed the momentum module, in fact). There were few code differences
and it's easier to maintain this way. The default option is vanilla
backprop; momentum & Fahlman adjustments are only implemented if
specified in the network config.
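
E.g. to enable both adjustments:

 my $network = AI::NNFlex::Backprop->new(
 			learningrate=>.2,
 			fahlmanconstant=>0.1,
 			momentum=>0.6);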

Bundled all the maths functions into AI::NNFlex::Mathlib

CHANGES


Fixed a bug that allowed training to proceed even if the activation
function(s) can't be loaded.

################################################################

0.12
20050116

Fixed a bug in reinforce.pm reported by G M Passos.
Inserted a catch in feedforward->run to call datasets if the syntax

$network->run($dataset) is called.

Strictly speaking this doesn't fit with the design, but it's likely to
be used quite a bit so it's worth catching.


###############################################################

0.11
20050115
Added PNG support to AI::NNFlex::draw

Added AI::NNFlex::Dataset
This creates a dataset object that can be run against a
network

Added AI::NNFlex::lesion
Damages a network with a probability of losing a node
or a connection. See the perldoc

Cleaned up the POD docs a bit, although there's a lot still
to do.

################################################################

MANIFEST

lib/AI/NNFlex/Reinforce.pm
lib/AI/NNFlex/Feedforward.pm
lib/AI/NNFlex/Backprop.pm
lib/AI/NNFlex/Hopfield.pm
lib/AI/NNFlex/Dataset.pm
lib/AI/NNFlex/Mathlib.pm
lib/AI/NNFlex.pm

README.txt

Charles Colbourn April 2005

###################################################################
Example XOR neural net (from examples/xor.pl)


# Example demonstrating XOR with momentum backprop learning

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# Create the network 

my $network = AI::NNFlex::Backprop->new(
				learningrate=>.2,
				bias=>1,
				fahlmanconstant=>0.1,
				momentum=>0.6,
				round=>1);

README.txt


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}



TODO

Put in some more error checking, particularly trying to create connections
between layers/nodes that don't exist.

Write a simple net simulator with syntax loosely based on xerion. At
present this lot is API driven, it should be straightforward to write
a basic simulator that calls the API in the backend.

Read & write methods for both networks and datasets, modelled on the SNNS format (for use with the frontend script). Data should be SNNS format; the network definition file will probably have to differ.

Implement an error method in addition to dbug, and clean up the dbug & error calls.


examples/add.pl

use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;


# train the network to do addition. Adapted from code posted to perlmonks
# by tlpriest on 13/05/05




my $network = AI::NNFlex::Backprop->new(
                learningrate=>.00001,

examples/add.pl

$network->add_layer(    nodes=>2,
            activationfunction=>"linear");

$network->add_layer(    nodes=>1,
            activationfunction=>"linear");


$network->init();

# Taken from Mesh ex_add.pl
my $dataset = AI::NNFlex::Dataset->new([
[ 1,   1   ], [ 2    ],
[ 1,   2   ], [ 3    ],
[ 2,   2   ], [ 4    ],
[ 20,  20  ], [ 40   ],
[ 10,  10  ], [ 20   ],
[ 15,  15  ], [ 30   ],
[ 12,  8   ], [ 20   ],

]);

my $err = 10;
# Stop after 4096 epochs -- don't want to wait more than that
for ( my $i = 0; ($err > 0.0001) && ($i < 4096); $i++ ) {
    $err = $dataset->learn($network);
    print "Epoch = $i error = $err\n";
}

foreach (@{$dataset->run($network)})
{
    foreach (@$_){print $_}
    print "\n";    
}

print "this should be 4000 - ";
$network->run([2000,2000]);
foreach ( @{$network->output}){print $_."\n";}

 foreach my $a ( 1..10 ) {

examples/cars/cars.pl

}

close CARS;


######################################################################
# data now constructed, we can do the NN thing
######################################################################

use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

my $dataset = AI::NNFlex::Dataset->new(\@dataArray);


my $network = AI::NNFlex::Backprop->new( learningrate=>.1,
				fahlmanconstant=>0.1,
				bias=>1,
				momentum=>0.6);



$network->add_layer(	nodes=>12,

examples/cars/cars.pl



$network->init();

$network->connect(fromlayer=>2,tolayer=>2);

my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);

	print "Epoch $counter: Error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}



examples/hopfield.pl

# example script to build a hopfield net
use strict;
use AI::NNFlex::Hopfield;
use AI::NNFlex::Dataset;

my $network = AI::NNFlex::Hopfield->new();

$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);

$network->init();

my $dataset = AI::NNFlex::Dataset->new();

$dataset->add([-1, 1, -1, 1]);
$dataset->add([-1, -1, 1, 1]);

$network->learn($dataset);

#my $outputref = $network->run([-1,1,-1,1]);

# run the corrupted pattern repeatedly so the network state can settle
my $outputref = $network->run([1,-1,1,1]);
$outputref = $network->run([1,-1,1,1]);
$outputref = $network->run([1,-1,1,1]);

print @$outputref;

examples/lesion.pl

# Example demonstrating XOR with momentum backprop learning
# and node lesioning

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# Create the network 

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6,
				round=>1);

examples/lesion.pl

			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"linear",
			randomweights=>1);


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);

	print "Epoch $counter: Error = $err\n";
	$counter++;
}

$network->lesion(nodes=>0.5,connections=>0.5);

$network->dump_state(filename=>"weights-learned.wts",activations=>1);

foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}


examples/test.pl

# Example demonstrating XOR with momentum backprop learning

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;


# create the numbers
my %numbers;
for (0..255)
{	
	my @array = split //,sprintf("%08b",$_);
	$numbers{$_} = \@array;
}

examples/test.pl

$network->add_layer(	nodes=>8,
			errorfunction=>'atanh',
			activationfunction=>"tanh");

$network->add_layer(	nodes=>8,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new(\@data);



my $counter=0;
my $err = 10;
while ($err >.01)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}

$network->run([0,0,0,0,0,1,0,1]);
my $output = $network->output();

foreach (@$output){print $_}
print "\n";

examples/xor.pl

# Example demonstrating XOR with momentum backprop learning

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# Create the network 

my $network = AI::NNFlex::Backprop->new(
				learningrate=>.2,
				bias=>1,
				fahlmanconstant=>0.1,
				momentum=>0.6,
				round=>1);

examples/xor.pl


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);

$dataset->save(filename=>'xor.pat');
$dataset->load(filename=>'xor.pat');


my $counter=0;
my $err = 10;
while ($err >.001)
#for (1..1500)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}

print "this should be 1 - @{$network->run([0,1])}\n";

examples/xor_minimal.pl

# Example demonstrating XOR with momentum backprop learning
# and minimal set of parameters (using default values)

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# Create the network 

my $network = AI::NNFlex::Backprop->new( learningrate=>.1,
				bias=>1,
				momentum=>0.6,
				fahlmanconstant=>0.1,
				round=>1);


examples/xor_minimal.pl


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],
			[1,1],[0]]);



my $counter=0;
my $err = 10;
while ($err >.001)
{
	$err = $dataset->learn($network);

	print "Epoch $counter: Error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}



examples/xorminus.pl

# Example demonstrating XOR with momentum backprop learning

use strict;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

# Create the network 

my $network = AI::NNFlex::Backprop->new(
				learningrate=>.2,
				bias=>1,
				fahlmanconstant=>0.1,
				momentum=>0.6,
				round=>1);

examples/xorminus.pl


$network->add_layer(	nodes=>2,
			activationfunction=>"tanh");

$network->add_layer(	nodes=>1,
			activationfunction=>"linear");


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[-1,-1],[-1],
			[-1,1],[1],
			[1,-1],[1],
			[1,1],[-1]]);

$dataset->save(filename=>'xor.pat');
$dataset->load(filename=>'xor.pat');


my $counter=0;
my $err = 10;
while ($err >.001)
#for (1..1500)
{
	$err = $dataset->learn($network);
	print "Epoch = $counter error = $err\n";
	$counter++;
}


foreach (@{$dataset->run($network)})
{
	foreach (@$_){print $_}
	print "\n";	
}

print "this should be 1 - @{$network->run([-1,1])}\n";

lib/AI/NNFlex.pm

use vars qw ($VERSION);
#use warnings;
###############################################################################
# NNFlex - Neural Network (flexible) - a heavily custom NN simulator
# 
# Sept 2004 - CW Colbourn
#
# This was developed from the abortive nnseq package originally intended
# for real time neural networks.
# The basis of the approach for this version is a very flexible, modular
# set of packages. This package constitutes the base, allowing the modeller
# to create meshes, apply input, and read output ONLY!
#
# Separate modules are to be written to perform feedback adjustments,
# various activation functions, text/gui front ends etc
#
###############################################################################
# Version Control
# ===============
#
# 0.1 20040905		CColbourn	New module
#					added NNFlex::datasets
#
# 0.11 20050113		CColbourn	Added NNFlex::lesion
#					Improved Draw
#					added NNFlex::datasets
#
# 0.12 20050116		CColbourn	Fixed reinforce.pm bug
# 					Added call into datasets
#					in ::run to offer alternative
#					syntax
#
# 0.13 20050121		CColbourn	Created momentum learning module
#
# 0.14 20050201		CColbourn	Abstracted derivative of activation
#					function into a separate function call
#					instead of hardcoded 1-y*y in backprop
#					tanh, linear & momentum
#

lib/AI/NNFlex.pm

# sub init
################################################################################
sub init
{

	#Revised version of init for NNFlex

	my $network = shift;
	my @layers = @{$network->{'layers'}};

	# if network debug state not set, set it to null
	if (!$network->{'debug'})
	{
		$network->{'debug'} = [];
	}
	my @debug = @{$network->{'debug'}};
	

	# implement the bias node
	if ($network->{'bias'})
	{

lib/AI/NNFlex.pm

	return 1;
}



##############################################################
# AI::NNFlex::calcweight
##############################################################
#
# calculate an initial weight appropriate for the network
# settings.
# takes no parameters, returns weight
##############################################################
sub calcweight
{
	my $network= shift;
	my $weight;
	if ($network->{'fixedweights'})
	{
		$weight = $network->{'fixedweights'};
	}

lib/AI/NNFlex.pm


=head2 AI::NNFlex->new ( parameter => value );
	

randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

debug=>[LIST OF CODES FOR MODULES TO DEBUG]

round=>0 or 1, a true value sets the network to round output values to nearest of 1, -1 or 0


The constructor implements a fairly generalised network object with a number of parameters.


The following parameters are optional:
 randomweights
 fixedweights
 debug
 round
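
For example, a minimal sketch using only optional parameters:

 my $network = AI::NNFlex->new(randomweights=>1,
 			round=>1);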

lib/AI/NNFlex.pm

Add layer adds whatever parameters you specify as attributes of the layer, so if you want to implement additional parameters simply use them in your calling code.

Add layer returns success or failure, and if successful adds a layer object to the $network->{'layers'} array. This layer object contains an attribute $layer->{'nodes'}, which is an array of nodes in the layer.
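
For example ("decay" is a layer attribute used elsewhere in this
distribution; "customattribute" is a hypothetical extra, stored
as-is on the layer):

 $network->add_layer(nodes=>2,
 			activationfunction=>"tanh",
 			decay=>0.0,
 			customattribute=>1);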

=head3 init

 Syntax:

 $network->init();

Initialises connections between nodes, sets initial weights. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers.

init adds the following attributes to each node:

=over

=item *
{'connectedNodesWest'}->{'nodes'} - an array of node objects connected to this node on the west/left

=item *
{'connectedNodesWest'}->{'weights'} - an array of scalar numeric weights for the connections to these nodes

lib/AI/NNFlex.pm


Graciliano M.Passos for suggestions & improved code (see SEE ALSO).

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.

=head1 SEE ALSO

 AI::NNFlex::Backprop
 AI::NNFlex::Feedforward
 AI::NNFlex::Mathlib
 AI::NNFlex::Dataset

 AI::NNEasy - Developed by Graciliano M.Passos 
 (Shares some common code with NNFlex)
 

=head1 TODO

 Lots of things:

 clean up the perldocs some more
 write gamma modules
 write BPTT modules
 write a perceptron learning module
 speed it up
 write a tk gui

=head1 CHANGES

v0.11 introduces the lesion method, png support in the draw module and datasets.

v0.12 fixes a bug in reinforce.pm & adds a reflector in feedforward->run to make $network->run($dataset) work.

v0.13 introduces the momentum learning algorithm and fixes a bug that allowed training to proceed even if the node activation function module can't be loaded

v0.14 fixes momentum and backprop so they are no longer nailed to tanh hidden units only.

v0.15 fixes a bug in feedforward, and reduces the debug overhead

v0.16 changes some underlying addressing of weights, to simplify and speed things up.

v0.17 is a bugfix release, plus some cleaning of UI

lib/AI/NNFlex/Backprop.pm

########################################################
# AI::NNFlex::Backprop::learn
########################################################
sub learn
{

	my $network = shift;

	my $outputPatternRef = shift;

	# if this is an incorrect dataset call translate it
	if ($outputPatternRef =~/Dataset/)
	{
		return ($outputPatternRef->learn($network))
	}


	# Set a default value on the Fahlman constant
	if (!$network->{'fahlmanconstant'})
	{
		$network->{'fahlmanconstant'} = 0.1;
	}

lib/AI/NNFlex/Backprop.pm

 use AI::NNFlex::Backprop;

 my $network = AI::NNFlex::Backprop->new(config parameter=>value);

 $network->add_layer(nodes=>x,activationfunction=>'function');

 $network->init(); 



 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([
			[INPUTARRAY],[TARGETOUTPUT],
			[INPUTARRAY],[TARGETOUTPUT]]);

 my $sqrError = 10;

 while ($sqrError >0.01)

 {

	$sqrError = $dataset->learn($network);

 }

 $network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY});

 $network->dump_state(filename=>'badgers.wts');

 $network->load_state(filename=>'badgers.wts');

 my $outputsRef = $dataset->run($network);

 my $outputsRef = $network->output(layer=>2,round=>1);

=head1 DESCRIPTION

AI::NNFlex::Backprop is a class to generate feedforward, backpropagation neural nets. It inherits various constructs from AI::NNFlex & AI::NNFlex::Feedforward, but is documented here as a standalone.

The code should be simple enough to use for teaching purposes, but a simpler implementation of a simple backprop network is included in the example file bp.pl. This is derived from Phil Brierley's freely available Java code at www.philbrierley.com.

AI::NNFlex::Backprop leans towards teaching NN and cognitive modelling applications. Future modules are likely to include more biologically plausible nets like DeVries & Principe's Gamma model.

Full documentation for AI::NNFlex::Dataset can be found in the module's own perldoc. It's documented here for convenience only.

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Backprop->new( parameter => value );

Parameters:

	
	randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

	fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

	debug=>[LIST OF CODES FOR MODULES TO DEBUG]

	learningrate=>the learning rate of the network

	momentum=>the momentum value (momentum learning only)

	round=>0 or 1 - 1 sets the network to round output values to
		nearest of 1, -1 or 0

	fahlmanconstant=>0.1
		


The following parameters are optional:

 randomweights

lib/AI/NNFlex/Backprop.pm


 momentum

 fahlmanconstant


If randomweights is not specified the network will default to a random value from 0 to 1.

If momentum is not specified the network will default to vanilla (non momentum) backprop.

The Fahlman constant modifies the slope of the error curve. 0.1 is the standard value for everything, and speeds the network up immensely. If no Fahlman constant is set, the network will default to 0.1

=head2 AI::NNFlex::Dataset

 new (	[[INPUT VALUES],[OUTPUT VALUES],
	[INPUT VALUES],[OUTPUT VALUES],..])

=head2 INPUT VALUES

These should be comma separated values. They can be applied to the network with ::run or ::learn

=head2 OUTPUT VALUES
	

lib/AI/NNFlex/Backprop.pm

 randomweights



=head2 init

 Syntax:

 $network->init();

Initialises connections between nodes, sets initial weights and loads external components. Implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers, and initialises weight values ...

=head2 lesion

 $network->lesion ({'nodes'=>PROBABILITY,'connections'=>PROBABILITY})

 Damages the network.

B<PROBABILITY>

A value between 0 and 1, denoting the probability of a given node or connection being damaged.

Note: this method may be called on a per network, per node or per layer basis using the appropriate object.

=head2 AI::NNFlex::Dataset

=head2 learn

 $dataset->learn($network)

'Teaches' the network the dataset using the network's defined learning algorithm. Returns sqrError.

=head2 run

 $dataset->run($network)

Runs the dataset through the network and returns a reference to an array of output patterns.

=head1 EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.


=head1 PREREQs

None. NNFlex::Backprop should run OK on any version of Perl 5 or above.

lib/AI/NNFlex/Dataset.pm

##########################################################
# AI::NNFlex::Dataset
##########################################################
# Dataset methods for AI::NNFlex - perform learning etc
# on groups of data
#
##########################################################
# Versions
# ========
#
# 1.0	20050115	CColbourn	New module
#
# 1.1	20050324	CColbourn	Added load support
#
##########################################################
# ToDo
# ----
#
#
###########################################################
#
use strict;
package AI::NNFlex::Dataset;


###########################################################
# AI::NNFlex::Dataset::new
###########################################################
sub new
{
	my $class = shift;
	my $params = shift;
	my $dataset;
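	# if called on an existing dataset object (e.g. via load),
	# replace its data in place instead of constructing a new object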
	if ($class =~ /HASH/)
	{
		$dataset = $class;
		$dataset->{'data'} = $params;
		return 1;
	}

	my %attributes;
	$attributes{'data'} = $params;

	$dataset = \%attributes;
	bless $dataset,$class;
	return $dataset;
}


###########################################################
# AI::NNFlex::Dataset::run
###########################################################
sub run
{
	my $self = shift;
	my $network = shift;
	my @outputs;
	my $counter=0;

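	# data is stored flat as (input, output, input, output, ...),
	# so step through it two items at a time, running only the inputs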
	for (my $itemCounter=0;$itemCounter<(scalar @{$self->{'data'}});$itemCounter +=2)
	{
		$network->run(@{$self->{'data'}}[$itemCounter]);
		$outputs[$counter] = $network->output();
		$counter++;
	}

	return \@outputs;

}

###############################################################
# AI::NNFlex::Dataset::learn
###############################################################
sub learn
{
	my $self = shift;
	my $network = shift;
	my $error;

	for (my $itemCounter=0;$itemCounter<(scalar @{$self->{'data'}});$itemCounter +=2)
	{
		$network->run(@{$self->{'data'}}[$itemCounter]);
		$error += $network->learn(@{$self->{'data'}}[$itemCounter+1]);
	}

	$error = $error*$error;

	return $error;
}

#################################################################
# AI::NNFlex::Dataset::save
#################################################################
# save a dataset in an snns .pat file
#################################################################
sub save
{
	my $dataset = shift;
	my %config = @_;

	open (OFILE,">".$config{'filename'}) or return "File error $!";

	print OFILE "No. of patterns : ".((scalar @{$dataset->{'data'}})/2)."\n";
	print OFILE "No. of input units : ".(scalar @{$dataset->{'data'}->[0]})."\n";
	print OFILE "No. of output units : ".(scalar @{$dataset->{'data'}->[1]})."\n\n";

	my $counter = 1;
	my @values = @{$dataset->{'data'}};
	while (@values)
	{
		print OFILE "# Input pattern $counter:\n";
		my $input = shift (@values); 
		my @array = join " ",@$input;
		print OFILE @array;
		print OFILE "\n";

		print OFILE "# Output pattern $counter:\n";
		my $output = shift(@values); 

lib/AI/NNFlex/Dataset.pm


		$counter++;
	}

	close OFILE;
	return 1;
}
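
# For the XOR dataset used in the examples, the file written by save()
# would begin like this (illustrative sketch, reconstructed from the
# print statements above; the output-pattern values are written by code
# not shown in this excerpt):
#
#	No. of patterns : 4
#	No. of input units : 2
#	No. of output units : 1
#
#	# Input pattern 1:
#	0 0
#	# Output pattern 1:
#	0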


#############################################################
# AI::NNFlex::Dataset::load
#############################################################
sub load
{
	my $dataset = shift;
	my %params = @_;

	my @data;

	my $filename = $params{'filename'};
	if (!$filename)
	{
		return "No filename specified";
	}

lib/AI/NNFlex/Dataset.pm


	my $filecontent;
	while (<IFILE>)
	{
		if($_ =~ /^#/ || $_ =~ /^\n/){next}
		$filecontent .= $_;
	}

	my @individualvals = split /\s+/s,$filecontent;

	for (my $offset=0;$offset<(scalar @individualvals);$offset+=($config{'no.ofinputunits'} + $config{'no.ofoutputunits'}))
	{
		my @input=@individualvals[$offset..($offset+$config{'no.ofinputunits'}-1)];
		push @data,\@input;
		if ($config{'no.ofoutputunits'} > 0)
		{
			my @output=@individualvals[($offset+$config{'no.ofinputunits'})..($offset+$config{'no.ofinputunits'}+$config{'no.ofoutputunits'}-1)];
			push @data,\@output;
		}
	}

		
	$dataset->new(\@data);

	return 1;
}
	
##########################################################
# AI::NNFlex::Dataset::add
##########################################################
# add an input/output pair to the dataset
##########################################################
sub add
{
	my $dataset= shift;
	my $params = shift;

	if (!$params){return "Nothing to add"};
	if ($params !~/ARRAY/){return "Need a reference to an array"}

	# support adding single patterns (for Hopfield type nets)
	if ($$params[0] !~ /ARRAY/)
	{
		push @{$dataset->{'data'}},$params;
	}
	else
	{
		push @{$dataset->{'data'}},$$params[0];
		push @{$dataset->{'data'}},$$params[1];
	}

	return 1;
}

##################################################################
# AI::NNFlex::Dataset::delete
##################################################################
# delete an item from the dataset by index
##################################################################
sub delete
{
	my $dataset = shift;
	my $index = shift;
	my @indexarray;

	if (!$index){return 0}

	if ($index =~ /ARRAY/)
	{
		@indexarray = @$index;
	}
	else
	{
		$indexarray[0] = $index;
	}

	# keep every item whose index is not in the delete list
	my @newarray;
	my $counter=0;
	foreach my $item (@{$dataset->{'data'}})
	{
		unless (grep {$_ == $counter} @indexarray)
		{
			push @newarray,$item;
		}
		$counter++;
	}

	$dataset->{'data'} = \@newarray;

	return 1;
}



1;
=pod

=head1 NAME

AI::NNFlex::Dataset - support for creating/loading/saving datasets for NNFlex nets

=head1 SYNOPSIS

 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([[0,1,1,0],[0,0,1,1]]);

 $dataset->add([[0,1,0,1],[1,1,0,0]]);

 $dataset->add([0,1,0,0]);

 $dataset->save(filename=>'test.pat');

 $dataset->load(filename=>'test.pat');

=head1 DESCRIPTION

This module allows you to construct, load, save and maintain datasets for use with neural nets implemented using the AI::NNFlex classes. The dataset consists of an array of references to arrays of data. Items may be added in pairs (useful for feedfor...

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Dataset->new([[INPUT],[TARGET]]);

Parameters:

The constructor takes an (optional) reference to an array of one or more arrays. For convenience you can specify two values at a time (for INPUT and OUTPUT values) or a single value at a time. You can also leave the parameters blank, in which case th...

The return value is an AI::NNFlex::Dataset object.

=head1 METHODS

This is a short list of the main methods implemented in AI::NNFlex::Dataset


=head2 add

 Syntax:

 $dataset->add([[INPUT],[OUTPUT]]);

or

 $dataset->add([VALUE]);

This method adds new values to the end of the dataset. You can specify the values as pairs or individually.

=head2 load

 Syntax:

 $dataset->load(filename=>'filename.pat');

Loads an SNNS type .pat file into a blank dataset. If called on an existing dataset IT WILL OVERWRITE IT!

=head2 save

 $dataset->save(filename=>'filename.pat');

Save the existing dataset as an SNNS .pat file. If the file already exists it will be overwritten.

=head2 delete

 $dataset->delete(INDEX);

or

 $dataset->delete([ARRAY OF INDICES]);

Deletes 1 or more items from the dataset by their index (counting from 0). Note that if you are using pairs of values (in a backprop net for example) you MUST delete in pairs - otherwise you will delete only the input/target, and the indices will be ...

=head1 EXAMPLES

See the code in ./examples.


=head1 PREREQs

None.

=head1 SEE ALSO

 AI::NNFlex


=head1 TODO

Method to delete existing dataset entries by index

Method to validate linear separability of a dataset.

=head1 CHANGES


=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify 
it under the same terms as Perl itself.

=head1 CONTACT

lib/AI/NNFlex/Feedforward.pm

##########################################################
# This is the first propagation module for NNFlex
#
##########################################################
# Versions
# ========
#
# 1.0	20040910	CColbourn	New module
#
# 1.1	20050116	CColbourn	Added call to 
#					datasets where run
#					is erroneously called
#					with a dataset
#
# 1.2	20050206	CColbourn	Fixed a bug where
#					transfer function
#					was called on every
#					input to a node
#					instead of total
#
# 1.3	20050218	CColbourn	Changed to reflect
#					new weight indexing
#					(arrays) in nnflex 0.16

lib/AI/NNFlex/Feedforward.pm

# $network->run([0,1,1,1,0,1,1]);
#
#
###########################################################
sub run
{
	my $network = shift;

	my $inputPatternRef = shift;
	
	# if this is an incorrect dataset call translate it
	if ($inputPatternRef =~/Dataset/)
	{
		return ($inputPatternRef->run($network))
	}


	my @inputPattern = @$inputPatternRef;

	my @debug = @{$network->{'debug'}};
	if (scalar @debug> 0)
	{$network->dbug ("Input pattern @inputPattern received by Feedforward",3);}

lib/AI/NNFlex/Feedforward.pm


		foreach my $node (@{$layer->{'nodes'}})
		{
			my $totalActivation;
			# Set the node to 0 if not persistent
			if (!($node->{'persistentactivation'}))
			{
				$node->{'activation'} =0;
			}

			# Decay the node (note that if decay is not set this
			# will have no effect, hence no if).
			$node->{'activation'} -= $node->{'decay'};
			my $nodeCounter=0;
			foreach my $connectedNode (@{$node->{'connectedNodesWest'}->{'nodes'}})
			{
				if (scalar @debug> 0)
				{$network->dbug("Flowing from ".$connectedNode->{'nodeid'}." to ".$node->{'nodeid'},3);}
	
				my $weight = ${$node->{'connectedNodesWest'}->{'weights'}}[$nodeCounter];
				my $activation = $connectedNode->{'activation'};		

lib/AI/NNFlex/Feedforward.pm


=head1 AI::NNFlex::Feedforward::run

Takes a reference to an array of inputs for the network. As of v0.22 it returns the network's output pattern.

=head1 SEE ALSO

 
 AI::NNFlex
 AI::NNFlex::Backprop
 AI::NNFlex::Dataset 


=head1 CHANGES



=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

lib/AI/NNFlex/Hopfield.pm

	return \@array;
}

########################################################
# AI::NNFlex::Hopfield::learn
########################################################
sub learn
{
	my $network = shift;

	my $dataset = shift;

	# calculate the weights
	# turn the dataset into a matrix
	my @matrix;
	foreach (@{$dataset->{'data'}})
	{
		push @matrix,$_;
	}
	my $patternmatrix = Math::Matrix->new(@matrix);

	my $inversepattern = $patternmatrix->transpose;
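
	# Hebbian learning in matrix form: with the patterns as the rows
	# of X, the weight matrix is W = X'X - p*I, where p is the number
	# of patterns; subtracting p*I zeroes the self-connections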

	my @minusmatrix;

	for (my $rows=0;$rows <(scalar @{$network->{'nodes'}});$rows++)
	{
		my @temparray;
		for (my $cols=0;$cols <(scalar	@{$network->{'nodes'}});$cols++)
		{
			if ($rows == $cols)
			{
				my $numpats = scalar @{$dataset->{'data'}};
				push @temparray,$numpats;	
			}
			else
			{
				push @temparray,0;
			}
		}
		push @minusmatrix,\@temparray;
	}

	my $minus = Math::Matrix->new(@minusmatrix);

	my $product = $inversepattern->multiply($patternmatrix);

	my $weights = $product->subtract($minus);

	my @element = ('1');
	my @truearray;
	for (1..scalar @{$dataset->{'data'}}){push @truearray,"1"}
	
	my $truematrix = Math::Matrix->new(\@truearray);

	my $thresholds = $truematrix->multiply($patternmatrix);
	#$thresholds = $thresholds->transpose();

	my $counter=0;
	foreach (@{$network->{'nodes'}})
	{
		my @slice;

lib/AI/NNFlex/Hopfield.pm

 use AI::NNFlex::Hopfield;

 my $network = AI::NNFlex::Hopfield->new(config parameter=>value);

 $network->add_layer(nodes=>x);

 $network->init(); 



 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([
			[INPUTARRAY],
			[INPUTARRAY]]);

 $network->learn($dataset);

 my $outputsRef = $dataset->run($network);

 my $outputsRef = $network->output();

=head1 DESCRIPTION

AI::NNFlex::Hopfield is a Hopfield network simulator derived from the AI::NNFlex class. THIS IS THE FIRST ALPHA CUT OF THIS MODULE! Any problems, let me know and I'll fix them.

Hopfield networks differ from feedforward networks in that they are effectively a single layer, with all nodes connected to all other nodes (except themselves), and are trained in a single operation. They are particularly useful for recognising corru...

Full documentation for AI::NNFlex::Dataset can be found in the module's own perldoc. It's documented here for convenience only.

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Hopfield->new();

=head2 AI::NNFlex::Dataset

 new (	[[INPUT VALUES],[INPUT VALUES],
	[INPUT VALUES],[INPUT VALUES],..])

=head2 INPUT VALUES

These should be comma separated values. They can be applied to the network with ::run or ::learn

=head2 OUTPUT VALUES
	

lib/AI/NNFlex/Hopfield.pm

=head2 init

 Syntax:

 $network->init();

Initialises connections between nodes.

=head2 run

 $network->run($dataset)

Runs the dataset through the network and returns a reference to an array of output patterns.

=head1 EXAMPLES

See the code in ./examples.


=head1 PREREQs

Math::Matrix

lib/AI/NNFlex/Reinforce.pm

 use AI::NNFlex::Reinforce;

 my $network = AI::NNFlex::Reinforce->new(config parameter=>value);

 $network->add_layer(nodes=>x,activationfunction=>'function');

 $network->init(); 



 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([
			[INPUTARRAY],[TARGETOUTPUT],
			[INPUTARRAY],[TARGETOUTPUT]]);

 my $sqrError = 10;

 for (1..100)

 {

	 $dataset->learn($network);

 }

 $network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY});

 $network->dump_state(filename=>'badgers.wts');

 $network->load_state(filename=>'badgers.wts');

 my $outputsRef = $dataset->run($network);

 my $outputsRef = $network->output(layer=>2,round=>1);

=head1 DESCRIPTION

Reinforce is a very simple NN module. It's mainly included in this distribution to provide an example of how to subclass AI::NNFlex to write your own NN modules. The training method strengthens any connections that are active during the run pass.
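
In outline, the weight update is something like the following (an
illustrative sketch of the idea, not the module's exact code):

 # for each node, strengthen each incoming connection in
 # proportion to the source node's activation:
 # $weight += $network->{'learningrate'} * $activation;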

=head1 CONSTRUCTOR 

=head2 AI::NNFlex::Reinforce

lib/AI/NNFlex/Reinforce.pm

 new ( parameter => value );
	
	randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

	fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

	debug=>[LIST OF CODES FOR MODULES TO DEBUG]

	learningrate=>the learning rate of the network

	round=>0 or 1 - 1 sets the network to round output values to
		nearest of 1, -1 or 0


The following parameters are optional:
 randomweights
 fixedweights
 debug
 round

(Note: if randomweights is not specified, the network will default to a random value from 0 to 1.)

lib/AI/NNFlex/Reinforce.pm

			threshold=>NYI,
			activationfunction=>"ACTIVATION FUNCTION",
			randomweights=>MAX VALUE OF STARTING WEIGHTS);

=head3 init

 Syntax:

 $network->init();

Initialises connections between nodes, sets initial weights and loads external components. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers...

=head3 lesion

 $network->lesion ({'nodes'=>PROBABILITY,'connections'=>PROBABILITY})

 Damages the network.

B<PROBABILITY>

A value between 0 and 1, denoting the probability of a given node or connection being damaged.

Note: this method may be called on a per network, per node or per layer basis using the appropriate object.

=head2 AI::NNFlex::Dataset

=head3 learn

 $dataset->learn($network)

'Teaches' the network the dataset using the network's defined learning algorithm. Returns sqrError.

=head3 run

 $dataset->run($network)

Runs the dataset through the network and returns a reference to an array of output patterns.

=head1 EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.


=head1 PREREQs

None. NNFlex::Reinforce should run OK on any version of Perl 5 or above.

lib/AI/NNFlex/Reinforce.pm

Dr David Plaut, for help with the project that this code was originally intended for.

Graciliano M.Passos for suggestions & improved code (see SEE ALSO).

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.

=head1 SEE ALSO

 AI::NNFlex
 AI::NNFlex::Backprop
 AI::NNFlex::Dataset


=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net

t/Backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>10}

# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

t/Backprop.t


# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

# test saving weights
$result = $network->dump_state(filename=>'state.wts',activations=>1);
ok($result);

# test loading weights
$result = $network->load_state(filename=>'state.wts');
ok($result);

t/Dataset.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>12}




# we need a basic network  in place to test the dataset functionality against
# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

t/Dataset.t

			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# test adding an entry
$result = $dataset->add([[1,1],[0,1]]);
ok($result);

# test save
$result = $dataset->save(filename=>'test.pat');
ok ($result);

# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset2);

# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result);

#  compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison);

# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result);

# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/Hopfield.t

# example script to build a hopfield net
use strict;
use AI::NNFlex::Hopfield;
use AI::NNFlex::Dataset;
use Test;


BEGIN{plan tests=>4}
my $matrixpresent = eval("require(Math::Matrix)");
my $matrixabsent = !$matrixpresent;

my $network = AI::NNFlex::Hopfield->new();

skip($matrixabsent,$network);


$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);

my $result = $network->init();
skip($matrixabsent,$result);

my $dataset = AI::NNFlex::Dataset->new();

$dataset->add([-1, 1, -1, 1]);
$dataset->add([-1, -1, 1, 1]);

skip($matrixabsent,$dataset);

$network->learn($dataset);

my $outputref = $network->run([1,-1,1,1]);

skip($matrixabsent,$outputref);

t/backprop.t

use strict;
use Test;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>8}

# test create network
my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

t/backprop.t


# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t

use strict;
use Test;
use AI::NNFlex::Reinforce;
use AI::NNFlex::Dataset;

BEGIN{
	plan tests=>5}

# test create network
my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1);

t/reinforce.t

			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##


