AI-NeuralNet-Mesh
examples/ex_aln.pl
This is a simple example of a _basic_ ALN implementation in
under 210 lines of code. In this demo we make use of the
custom node connector as described in the POD. We also
override the node's internal adjust_weight()
method with our own to make ALN learning a bit easier. This demo also
adds a temporary method to the network, print_aln(), which prints
the logical type of each node.
print_aln() prints a simple diagram of the
network similar to this (this is for a $net=Tree(8,1) with
$net->learn([1,1,0,1,0,1,1,1],[0]), and each line represents
a layer):
L R L L L L L L
OR OR OR OR
OR OR
AND
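The diagram above uses four gate types: L passes its left input straight through, R passes its right input, and AND/OR combine the two. A minimal sketch of those gate semantics (the hash and its names are my own illustration, not code from AI::NeuralNet::Mesh):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch (assumptions mine) of the four ALN gate types shown in the
# print_aln() diagram. Each gate takes two binary inputs and returns
# a binary output; the real connector code lives further down in
# this file.
my %gate = (
    L   => sub { $_[0] },                    # pass left input through
    R   => sub { $_[1] },                    # pass right input through
    AND => sub { $_[0] && $_[1] ? 1 : 0 },   # both inputs must be 1
    OR  => sub { $_[0] || $_[1] ? 1 : 0 },   # either input may be 1
);

print $gate{AND}->(1, 0), "\n";   # 0
print $gate{OR}->(1, 0), "\n";    # 1
```

ALN learning works by adapting which of these gate types each node uses, which is what the overridden adjust_weight() below is for.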
All the standard methods that work on AI::NeuralNet::Mesh work
on the object returned by Tree(). load() and save() will correctly
preserve the gate structure and types of your network. learn_set()
and everything else works pretty much as expected. The only thing
that is useless is the crunch() method, as this network only takes binary
inputs. But...for those of you who couldn't live without integers
in your network...I'm going to create a small package in the next
week, AI::NeuralNet::ALNTree, from this code. It will include
an integer-vectorizer (to convert your integers into bit vectors), a bit
vector class to play with, as well as support for concatenating and
learning bit vectors. But, for now, enjoy this!
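Since ALNTree does not exist yet, here is a rough sketch of what such an integer-vectorizer could look like: packing an integer into a fixed-width bit vector of the kind this ALN demo consumes, and unpacking it again. The names int2bits/bits2int are hypothetical, not part of AI::NeuralNet::Mesh or the planned package.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical integer-vectorizer: convert an integer into a
# big-endian arrayref of bits, suitable as binary network input.
sub int2bits {
    my ($n, $width) = @_;
    return [ reverse map { ($n >> $_) & 1 } 0 .. $width - 1 ];
}

# Inverse: fold a bit vector back into an integer.
sub bits2int {
    my ($bits) = @_;
    my $n = 0;
    $n = ($n << 1) | $_ for @$bits;
    return $n;
}

my $vec = int2bits(181, 8);     # 181 == 0b10110101
print "@$vec\n";                # 1 0 1 1 0 1 0 1
print bits2int($vec), "\n";     # 181
```

A vector built this way could be passed straight to $net->learn() or $net->run() in place of a hand-written bit list.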
This file contains just a simple, functional, ALN implementation.
Enjoy!
=cut
# Import all the little functions.
use AI::NeuralNet::Mesh ':all';
# Create a new ALN tree with 8 leaves and 1 root node.
# Note: Our ALN trees can have more than one root node! Yippee! :-)
# Just a little benefit of deriving our ALNs from
# AI::NeuralNet::Mesh.
#
my $net = Tree(8,1);
# Use our nifty dot verbosity.
$net->v(12);
# Learn a pattern and print stats.
if(!$net->load('aln.mesh')) {
    print "Learning";
    print "Done!\nLearning took ",$net->learn([1,1,0,1,0,1,1,1],[0]),"\n";
    $net->save('aln.mesh');
}
# Print logic gate types
$net->print_aln();
# Test it out
print "\nPattern: [1,1,0,1,0,1,1,1]".
      "\nResult: ",$net->run([1,1,0,1,0,1,1,1])->[0],"\n";
######################################################################
#-################ ALN Implementation Code ########################-#
######################################################################
# Build a basic ALN tree network (_very_ basic: only the node
# types are implemented, and only two learning benefits from ALN
# theory are realized). Also adds a method to the network, print_aln().
sub Tree {
    # Grab our leaves and roots
    my $leaves = shift;
    my $roots  = shift || $leaves;
    # Replace the load function with a new one that preserves the
    # loaded activations. We have to add this up here because the next
    # thing we do is check if they passed a file name as $leaves,
    # and we need to have our new load sub already in place before
    # we try to load anything in $leaves.
    *AI::NeuralNet::Mesh::load = sub {
        my $self      = shift;
        my $file      = shift;
        my $load_flag = shift;
        if(!(-f $file)) {
            $self->{error} = "File \"$file\" does not exist.";
            return undef;
        }
        open(FILE, '<', $file) or do {
            $self->{error} = "Could not open \"$file\": $!";
            return undef;
        };
        my @lines = <FILE>;
        close(FILE);
        my %db;
        for my $line (@lines) {
            chomp($line);
            my ($a,$b) = split /=/, $line;
            $db{$a} = $b;
        }
        if(!$db{"header"}) {
            $self->{error} = "Invalid format.";
            return undef;
        }
        return $self->load_old($file) if($self->version($db{"header"}) < 0.21);
        if($load_flag) {
            undef $self;
            $self = Tree($db{inputs},$db{outputs});
        } else {
            $self->{inputs}       = $db{inputs};
            $self->{nodes}        = $db{nodes};
            $self->{outputs}      = $db{outputs};
            $self->{layers}       = [split(',',$db{layers})];
            $self->{total_layers} = $db{total_layers};