AI-NNFlex
<activation function>_slope implemented, which returns the
slope. I haven't implemented a slope function for sigmoid yet,
for the simple reason that I still need to differentiate the
function to work out what the slope is!
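For reference, the slope in question is the first derivative of the activation function. For the standard logistic sigmoid, sigma(x) = 1/(1 + e^-x), the derivative works out to sigma'(x) = sigma(x) * (1 - sigma(x)). A minimal sketch in Perl; the function names here are illustrative, not the module's actual API:

```perl
use strict;
use warnings;

# Standard logistic sigmoid: sigma(x) = 1 / (1 + e^-x)
sub sigmoid {
    my $x = shift;
    return 1 / (1 + exp(-$x));
}

# Its slope (first derivative): sigma'(x) = sigma(x) * (1 - sigma(x))
sub sigmoid_slope {
    my $x = shift;
    my $s = sigmoid($x);
    return $s * (1 - $s);
}

print sigmoid_slope(0), "\n";    # 0.25 at x = 0, the sigmoid's steepest point
```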
This version introduces a new, friendlier constructor. Instead
of the pair of anonymous hashes previously required, the
constructor now takes fairly normal-looking parameters. Layers
are added to a network using the add_layer method. This combines
a more intuitive approach with greater flexibility.
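A sketch of what the new-style constructor and add_layer calls might look like. Parameter names follow the no-spaces convention recommended below; the specific parameters and values shown are illustrative, not a definitive reference:

```perl
use strict;
use warnings;
use AI::NNFlex;

# New-style constructor: plain key => value pairs instead of
# the old pair of anonymous hashes
my $network = AI::NNFlex->new(
    learningrate => 0.1,
    bias         => 1,
);

# Layers are added one at a time with add_layer
$network->add_layer(
    nodes              => 2,
    activationfunction => "tanh",
);
$network->add_layer(
    nodes              => 1,
    activationfunction => "linear",
);

# Connect the layers up before use
$network->init();
```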
The original constructor is not deprecated. Parameter names with
spaces in them ('activation function'), on the other hand, are. They
should be replaced by the same parameter name without any spaces.
The new constructor will convert old-format parameter names, but
it does so at the user's risk.
See xor.pl or the perldoc & readme for examples.
Cleaned up the perldoc some more. Commented out all the method
perldocs, so there is just a single block defining the
distribution's documentation, as advocated on PerlMonks. Method
perldocs in importable modules have not been commented out.
lib/AI/NNFlex.pm
use strict;
use vars qw ($VERSION);
#use warnings;
###############################################################################
# NNFlex - Neural Network (flexible) - a heavily custom NN simulator
#
# Sept 2004 - CW Colbourn
#
# This was developed from the abortive nnseq package originally intended
# for real time neural networks.
# The basis of the approach for this version is a very flexible, modular
# set of packages. This package constitutes the base, allowing the modeller
# to create meshes, apply input, and read output ONLY!
#
# Separate modules are to be written to perform feedback adjustments,
# various activation functions, text/gui front ends etc
#
###############################################################################
# Version Control
None. NNFlex should run OK on any version of Perl 5 or above.
=head1 ACKNOWLEDGEMENTS
Phil Brierley, for his excellent free Java code, which solved my backprop problem
Dr Martin Le Voi, for help with concepts of NN in the early stages
Dr David Plaut, for help with the project that this code was originally intended for.
Graciliano M. Passos, for suggestions & improved code (see SEE ALSO).
Dr Scott Fahlman, whose very readable paper 'An Empirical Study of Learning Speed in Backpropagation Networks' (1988) has driven many of the improvements made so far.
=head1 SEE ALSO
AI::NNFlex::Backprop
AI::NNFlex::Feedforward
AI::NNFlex::Mathlib
lib/AI/NNFlex/Backprop.pm
None. NNFlex::Backprop should run OK on any version of Perl 5 or above.
=head1 ACKNOWLEDGEMENTS
Phil Brierley, for his excellent free Java code, which solved my backprop problem
Dr Martin Le Voi, for help with concepts of NN in the early stages
Dr David Plaut, for help with the project that this code was originally intended for.
Graciliano M. Passos, for suggestions & improved code (see SEE ALSO).
Dr Scott Fahlman, whose very readable paper 'An Empirical Study of Learning Speed in Backpropagation Networks' (1988) has driven many of the improvements made so far.
=head1 SEE ALSO
AI::NNFlex
AI::NNEasy - Developed by Graciliano M.Passos
lib/AI/NNFlex/Reinforce.pm
None. NNFlex::Reinforce should run OK on any version of Perl 5 or above.
=head1 ACKNOWLEDGEMENTS
Phil Brierley, for his excellent free Java code, which solved my backprop problem
Dr Martin Le Voi, for help with concepts of NN in the early stages
Dr David Plaut, for help with the project that this code was originally intended for.
Graciliano M. Passos, for suggestions & improved code (see SEE ALSO).
Dr Scott Fahlman, whose very readable paper 'An Empirical Study of Learning Speed in Backpropagation Networks' (1988) has driven many of the improvements made so far.
=head1 SEE ALSO
AI::NNFlex
AI::NNFlex::Backprop
AI::NNFlex::Dataset
t/Dataset.t
ok ($result);
# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset2);
# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result);
# compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison);
# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result);
# Test a learning pass
my $err = $dataset->learn($network);
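The fragment above assumes a dataset built earlier in the script. A minimal sketch of how such a dataset is typically constructed with AI::NNFlex::Dataset, as a flat list of input/target array-ref pairs (the XOR data and the save call are illustrative assumptions, not lifted from this test file):

```perl
use strict;
use warnings;
use AI::NNFlex::Dataset;

# Each input array ref is immediately followed by its target array ref
# (here, the four XOR patterns)
my $dataset = AI::NNFlex::Dataset->new([
    [0, 0], [0],
    [0, 1], [1],
    [1, 0], [1],
    [1, 1], [0],
]);

# Write the patterns to disk, matching the 'test.pat' file
# that the load() test above reads back in
$dataset->save(filename => 'test.pat');
```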