AI-NeuralNet-BackProp


BackProp.pm

	my $test_phrase = $net->crunch("I love neural networking!");
	my $result = $net->run($test_phrase);
	
	# Get this, it prints "Jay Leno is  networking!" ...  LOL!
	print $net->uncrunch($result),"\n";
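
The snippet above assumes $net has already been created and trained on a set of crunched
phrases. A minimal setup sketch follows; the constructor arguments and the learn_set() data
format follow the usage documented elsewhere in this POD, but the specific sizes and phrases
here are illustrative assumptions only.

	use AI::NeuralNet::BackProp;

	# Illustrative sizes: 1 layer, 5 inputs, 5 outputs.
	my $net = new AI::NeuralNet::BackProp(1,5,5);

	# Crunch phrases into the network's internal word maps.
	# learn_set() is assumed here to take alternating
	# input/output pairs in one flat array.
	my @phrases = (
	    $net->crunch("I love neural networks!"),   # input
	    $net->crunch("Jay Leno is funny."),        # desired output
	);
	$net->learn_set(\@phrases);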



=head1 UPDATES

This is version 0.89. In this version I have included a new feature, output range limits, as
well as automatic crunching of run() and learn*() inputs. Included in the examples directory
are seven new practical-use example scripts. Also implemented in this version is a much cleaner
learning function for individual neurons that is more accurate than in previous versions and is
based on the LMS rule. See range() for information on output range limits. I have also updated
the load() and save() methods so that they no longer depend on Storable. In this version you
also have the choice between three network topologies: two are less stable, and the third is
the default, which has been in use for the previous four versions.
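
As a quick illustration of the features mentioned above, the calls below sketch how output
range limits, automatic crunching, and the Storable-free save() and load() might be used. The
exact calling forms are described under range(), learn(), save() and load() later in this
document; the argument values here are assumptions for the sake of the example.

	# Limit outputs to a fixed set of values (output range limits).
	$net->range(0..10);

	# run() and learn*() now crunch plain strings automatically, so
	# these two calls are assumed to be equivalent:
	$net->learn("I love chips.",              "That's junk food!");
	$net->learn($net->crunch("I love chips."),
	            $net->crunch("That's junk food!"));

	# save() and load() no longer depend on Storable.
	$net->save("food.net");
	$net->load("food.net");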


=head1 DESCRIPTION

README


AI::NeuralNet::BackProp is a simple back-propagation,
feed-forward neural network designed to learn using
a generalization of the Delta rule and a bit of Hopfield
theory.
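
As a rough sketch of the learning rule involved (illustrative code only, not the
module's internals), a single delta-rule / LMS update for one neuron adjusts each
weight in proportion to the output error:

    # w_i := w_i + alpha * (target - output) * input_i
    my @weights = (0.2, -0.1, 0.4);
    my @input   = (1, 0, 1);
    my $target  = 1;
    my $alpha   = 0.5;    # learning rate (assumed value)

    # Weighted sum of the inputs.
    my $output = 0;
    $output += $weights[$_] * $input[$_] for 0 .. $#input;

    # Nudge each weight toward the desired output.
    my $error = $target - $output;
    $weights[$_] += $alpha * $error * $input[$_] for 0 .. $#input;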

** What's new?

From the POD:
This is version 0.89. In this version I have included a
new feature, output range limits, as well as automatic
crunching of run() and learn*() inputs. Included in the
examples directory are seven new practical-use example
scripts. Also implemented in this version is a much cleaner
learning function for individual neurons that is more
accurate than in previous versions and is based on the LMS
rule. See range() for information on output range limits.
I have also updated the load() and save() methods so that
they no longer depend on Storable. In this version you also
have the choice between three network topologies: two are
less stable, and the third is the default, which has been
in use for the previous four versions.
