AI-NNEasy


MANIFEST

lib/AI/NNEasy/NN/backprop.hploo
lib/AI/NNEasy/NN/backprop.pm
lib/AI/NNEasy/NN/feedforward.hploo
lib/AI/NNEasy/NN/feedforward.pm
lib/AI/NNEasy/NN/layer.hploo
lib/AI/NNEasy/NN/layer.pm
lib/AI/NNEasy/NN/node.hploo
lib/AI/NNEasy/NN/node.pm
lib/AI/NNEasy/NN/reinforce.hploo
lib/AI/NNEasy/NN/reinforce.pm
samples/test-nn-nonbool.pl
samples/test-nn-xor.pl
test.pl

README

See POD for more...

################
# INSTALLATION #
################

To install this module type the following:

   perl Makefile.PL
   make
   make test
   make install

##########
# AUTHOR #
##########

Graciliano M. P. <gmpassos@cpan.org>

I will appreciate any type of feedback (including your opinions and/or suggestions). ;-P

lib/AI/NNEasy.hploo

=> DESCRIPTION

The main purpose of this module is to make it easy to create Neural Networks in Perl.

The module was designed so that it can be extended with multiple network types, learning algorithms and activation functions.
This architecture was first based on the module L<AI::NNFlex>, which I then rewrote to fix some
serialization bugs, optimize the code, and add some XS functions to speed up
the learning process. Finally I added an intuitive interface to create and use the NN,
and a winner algorithm for the output.

I wrote this module because, after testing different NN modules on Perl, I couldn't find
one that is portable across Linux and Windows, easy to use and, most important,
one that really works on a real problem.

With this module you don't need to learn much about NNs to be able to build one: you just
define the construction of the NN, train it on your set of inputs, and use it.

=> USAGE

Here's an example of a NN to compute XOR:
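The example itself was cut off by the page extraction. The following is a minimal sketch of an XOR network, assembled from the samples/test-nn-xor.pl fragment further down and the module's constructor; the exact argument order and the method names C<learn_set> and C<run_get_winner> should be checked against the installed module's POD:

  use strict ;
  use warnings ;

  use AI::NNEasy ;

  # Create (or reload from file) a NN with 2 inputs, 1 output,
  # possible output values 0 and 1, and a maximal error of 0.1:
  my $nn = AI::NNEasy->new( 'xor.nne' , [0,1] , 0.1 , 2 , 1 ) ;

  # The XOR truth table as input,output pairs:
  my @set = (
      [0,0] , [0] ,
      [0,1] , [1] ,
      [1,0] , [1] ,
      [1,1] , [0] ,
  ) ;

  # Train until the error over the set is below the limit:
  $nn->learn_set( \@set ) ;

  # Run an input and pick the winner output
  # (rounded to the nearest valid output value):
  my $out = $nn->run_get_winner( [1,0] ) ;
  print "1 xor 0 => @$out\n" ;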

lib/AI/NNEasy.pm

(The POD in lib/AI/NNEasy.pm carries the same DESCRIPTION and USAGE text as lib/AI/NNEasy.hploo above.)

samples/test-nn-nonbool.pl

  use AI::NNEasy ;

  my $NN_FILE = 'test-nonbool.nne' ; 

  unlink($NN_FILE) ;

  my $nn = AI::NNEasy->new(
      $NN_FILE ,                     # file where the NN is saved/loaded
      [qw(0 0.2 0.4 0.6 0.8 1)] ,   # the possible output values
      0.01 ,                        # maximal error for an output to be OK
      2 ,                           # number of inputs
      1 ,                           # number of outputs
      [3] ,                         # hidden layers: 1 layer with 3 nodes

samples/test-nn-xor.pl

  use AI::NNEasy ;

  my $NN_FILE = 'test-xor.nne' ; 

  unlink($NN_FILE) ;

  my $nn = AI::NNEasy->new(
      $NN_FILE ,     # file where the NN is saved/loaded
      [qw(0 1)] ,    # the possible output values
      0 ,            # maximal error for an output to be OK
      2 ,            # number of inputs
      1 ,            # number of outputs
      [3] ,          # hidden layers: 1 layer with 3 nodes

test.pl

#########################

###use Data::Dumper ; print Dumper(  ) ;

use strict ;
use warnings qw'all' ;

use Test;
BEGIN { plan tests => 10 } ;

use AI::NNEasy ;

#########################
{
  ok(1) ;
}
#########################
{
  my $file = 'test.nne' ;

  unlink($file) ;

  my $ERR_OK = 0.1 ;

  my $nn = AI::NNEasy->new($file , [0,1] , $ERR_OK , 2 , 1 ) ;
  
  my @set = (
  [0,0],[0],
  [0,1],[1],


