AI-NeuralNet-BackProp
BackProp.pm
# Make a data set from the array refs
my @phrases = (
$phrase1, $phrase2,
$phrase3, $phrase4
);
# Learn the data set
$net->learn_set(\@phrases);
# Run a test phrase through the network
my $test_phrase = $net->crunch("I love neural networking!");
my $result = $net->run($test_phrase);
# Get this, it prints "Jay Leno is networking!" ... LOL!
print $net->uncrunch($result),"\n";
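For the snippet above to run, the network and the four phrases must already exist. A minimal sketch of that setup follows; the layer sizes and phrase text are illustrative assumptions, not copied from the distribution:

use AI::NeuralNet::BackProp;

# Assumed sizes: 1 hidden layer, 20 inputs, 20 outputs
my $net = new AI::NeuralNet::BackProp(1, 20, 20);

# crunch() converts each string to an array ref of word indices;
# learn_set() then presumably treats @phrases as input/output pairs
# ($phrase1 => $phrase2, $phrase3 => $phrase4).
my $phrase1 = $net->crunch("I love neural networks!");
my $phrase2 = $net->crunch("Jay Leno is one funny guy!");
my $phrase3 = $net->crunch("The rain in Spain...");
my $phrase4 = $net->crunch("Tired of word crunching yet?");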
=head1 UPDATES
This is version 0.89. In this version I have included a new feature, output range limits, as
well as automatic crunching of run() and learn*() inputs. Included in the examples directory
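For instance, the two new features might be combined like this (a minimal sketch; the range() call is modeled on a commented-out call in examples/ex_alpha.pl, and the automatic crunching on the description above):

# Constrain every output node to the given range of values
$net->range(0..29);

# run() now crunches plain strings itself, so this is equivalent to
# $net->run($net->crunch("I love neural networking!"));
my $result = $net->run("I love neural networking!");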
BackProp.pm
Below is a list of people that have helped, made suggestions, sent patches, etc., in no particular order:
Tobias Bronx, tobiasb@odin.funcom.com
Pat Trainor, ptrainor@title14.com
Steve Purkis, spurkis@epn.nu
Rodin Porrata, rodin@ursa.llnl.gov
Daniel Macks, dmacks@sas.upenn.edu
Tobias was a great help with the initial releases, helped with the learning options, and made
a great many helpful suggestions. Rodin gave me some great ideas for the new internals, as well
as the idea of disabling Storable. Steve is the author of AI::Perceptron and gave some good
suggestions for weighting the neurons. Daniel was a great help with early beta testing of the
module and related ideas. Pat has been a great help in putting the module through its paces.
Pat is the author of the new Inter game, an in-depth strategy game; he uses a group of neural
networks internally, which provides a good test bed for new ideas for the network. Thank you
for all of your help, everybody.
=head1 DOWNLOAD
You can always download the latest copy of AI::NeuralNet::BackProp
from http://www.josiah.countystart.com/modules/AI/cgi-bin/rec.pl
=head1 MAILING LIST
A mailing list has been set up for AI::NeuralNet::BackProp for discussion of AI and
neural net related topics as they pertain to AI::NeuralNet::BackProp. I will also
announce in the group each time a new release of AI::NeuralNet::BackProp is available.
The list address is: ai-neuralnet-backprop@egroups.com
To subscribe, send a blank email to: ai-neuralnet-backprop-subscribe@egroups.com
BackProp.pm
Changes
Makefile.PL
MANIFEST
test.pl
docs.htm
README
examples/ex_add.pl
examples/ex_add2.pl
examples/ex_sub.pl
examples/ex_bmp.pl
examples/ex_bmp2.pl
examples/ex_pcx.pl
examples/ex_pcxl.pl
examples/ex_alpha.pl
README
rule and Hopfield theory, as I understand them. So, don't expect
a classicist view of neural networking here. I simply wrote
from operating theory, not math theory. Any die-hard neural
networking gurus out there? Let me know how far off I am with
this code! :-)
Regards,
~ Josiah Bryan, <jdb@wcoil.com>
Latest Version:
http://www.josiah.countystart.com/modules/AI/cgi-bin/rec.pl?README
examples/ex_add2.pl
=begin
File: examples/ex_add2.pl
Author: Rodin Porrata, <rodin@ursa.llnl.gov>
Desc:
This script runs a complete test of the network's ability to add and remember
data sets, as well as testing the optimum "inc" to learn and the optimum
number of layers for a network.
=cut
use AI::NeuralNet::BackProp;
use Benchmark;
use English;
my $ofile = "addnet_data.txt";
examples/ex_add2.pl
$runtime = timediff($t2,$t1);
print "run took ",timestr($runtime),"\n";
my @input = ( [ 2222, 3333, 3200 ],
[ 1111, 1222, 3211 ],
[ 2345, 2543, 3000 ],
[ 2654, 2234, 2534 ] );
test_net( $net, @input );
}
#.....................................................................
sub test_net {
my @set;
my $fb;
my $net = shift;
my @data = @_;
undef @percent_diff;
for my $i (0 .. $#data) {
@set = @{ $data[$i] };
$fb = $net->run(\@set)->[0];
# Print output
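# The excerpt ends here. A plausible completion of the loop body, assuming
# the network was trained to output the sum of its inputs (an assumption,
# not the original code):
#   my $expected = $set[0] + $set[1] + $set[2];
#   push @percent_diff, abs($expected - $fb) / $expected * 100;
#   print "Got $fb, expected $expected\n";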
examples/ex_alpha.pl
2,2,1,2,2
]
];
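# Train once and cache: load a previously saved network from disk if one
# exists; otherwise learn the letter set and save the result for next time.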
if(!$net->load("letters.dat")) {
#$net->range(0..29);
$net->learn_set($letters);
$net->save("letters.dat");
}
# Build a test map
my $tmp = [2,1,1,1,2,
1,2,2,2,1,
1,2,2,2,1,
1,1,1,1,1,
1,2,2,2,1,
1,2,2,2,1,
1,2,2,2,1];
# Display test map
print "\nTest map:\n";
$net->join_cols($tmp,5);
# Display network results
print "Letter index matched: ",$net->run($tmp)->[0],"\n";
examples/ex_bmp2.pl
1,0,1,0,0,
1,1,1,0,0);
print "\nLearning started...\n";
print $net->learn(\@map,'J');
print "Learning done.\n";
# Build a test map
my @tmp = (0,0,1,1,1,
1,1,1,0,0,
0,0,0,1,0,
0,0,0,1,0,
0,0,0,1,0,
0,0,0,0,0,
0,1,1,0,0);
# Display test map
print "\nTest map:\n";
$net->join_cols(\@tmp,5,'');
print "Running test...\n";
# Run the actual test and get network output
print "Result: ",$net->run_uc(\@tmp),"\n";
print "Test run complete.\n";
test.pl
# Before `make install' is performed this script should be runnable with
# `make test'. After `make install' it should work as `perl test.pl'
BEGIN { $| = 1; print "1..13\n"; }
END {print "not ok 1\n" unless $loaded;}
sub t { my $f = shift; $t++; my $str = $f ? "ok $t" : "not ok $t"; print $str, "\n"; }
use AI::NeuralNet::BackProp;
$loaded = 1;
t 1;
my $net = new AI::NeuralNet::BackProp(2,2,1);
t $net;
t ($net->intr(0.51) == 1);
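The t() helper above prints one "ok N"/"not ok N" line per call, matching the "1..13" plan. More checks in the same style would look like this (illustrative only; adding them for real would require bumping the plan count):

t ($net->intr(0.49) == 0);  # intr() appears to round to the nearest integer
t ($net->intr(2) == 2);     # integers pass through unchanged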