AI-NeuralNet-Simple

Changes

Revision history for Perl extension AI::NeuralNet::Simple.

0.11  November 18, 2006
      Converted from Inline::C to XS
      No longer requires 5.008; 5.005 and above should be fine.

0.10  December 29, 2005
      The following changes are all courtesy of Raphael Manfredi
      <Raphael_Manfredi [at] pobox.com>.
      Added tanh (bipolar) activation function.
      train_set() can now accept an error target to avoid over-training.
      Multiple network support.
      Persistence via Storable.

0.02  September 21, 2005
      Added pod and pod coverage tests
      Added Sub::Uplevel dependency to stop that annoying error failure :(

Simple.xs

#define EXPORTED_ITEMS    9

/*
 * Exports the C data structures to the Perl world for serialization
 * by Storable.  We don't want to duplicate the logic of Storable here
 * even though we have to do some low-level Perl object construction.
 *
 * The structure we return is an array reference, which contains the
 * following items:
 *
 *  0    the export version number, in case format changes later
 *  1    the number of neurons in the input layer
 *  2    the number of neurons in the hidden layer
 *  3    the number of neurons in the output layer
 *  4    the learning rate
 *  5    the sigmoid delta
 *  6    whether to use a bipolar (tanh) routine instead of the sigmoid
 *  7    [[weight.input_to_hidden[0]], [weight.input_to_hidden[1]], ...]
 *  8    [[weight.hidden_to_output[0]], [weight.hidden_to_output[1]], ...]
 */
SV *c_export_network(int handle)
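
For illustration, here is a hedged Perl sketch of the arrayref this function
returns, for a hypothetical 2-input, 2-hidden, 1-output network (all values
below are made up; only the layout follows the comment above):

  my $exported = [
      1,        # 0: export format version
      2,        # 1: number of neurons in the input layer
      2,        # 2: number of neurons in the hidden layer
      1,        # 3: number of neurons in the output layer
      0.20,     # 4: learning rate
      1.0,      # 5: sigmoid delta
      0,        # 6: 1 for bipolar (tanh), 0 for sigmoid
      [ [ 0.12, -0.30 ], [ 0.45, 0.08 ] ],  # 7: input-to-hidden weights
      [ [ -0.27 ], [ 0.91 ] ],              # 8: hidden-to-output weights
  ];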

lib/AI/NeuralNet/Simple.pm

the input, hidden, and output layers, respectively.  To create the "logical or"
network described earlier:

  my $net = AI::NeuralNet::Simple->new(2,1,2);

By default, the activation function for the neurons is the sigmoid function
S() with delta = 1:

	S(x) = 1 / (1 + exp(-delta * x))

but you can change the delta after creation.  You can also use a bipolar
activation function T(), using the hyperbolic tangent:

	T(x) = tanh(delta * x)
	tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

which allows neurons to influence the downstream weights negatively, since
T() is a signed function ranging over (-1,+1) whereas S() only falls
within (0,1).
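
For reference, a minimal pure-Perl sketch of both activation functions (the
module computes them in C via XS; this only illustrates the formulas above):

  # Sigmoid activation: maps any input into (0,1).
  sub sigmoid {
      my ($x, $delta) = @_;
      return 1 / (1 + exp(-$delta * $x));
  }

  # Bipolar (tanh) activation: maps any input into (-1,+1).
  sub bipolar {
      my ($x, $delta) = @_;
      my $y = $delta * $x;
      return (exp($y) - exp(-$y)) / (exp($y) + exp(-$y));  # tanh($y)
  }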

=head2 C<delta($delta)>

Fetches the current I<delta> used in activation functions to scale the
signal, or sets the new I<delta>. The higher the delta, the steeper the
activation function will be.  The argument must be strictly positive.

You should not change I<delta> during training.
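
For example (a usage sketch; the value 2 is arbitrary):

  my $delta = $net->delta;  # fetch the current delta
  $net->delta(2);           # steeper activation function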

=head2 C<use_bipolar($boolean)>

Returns whether the network currently uses a bipolar activation function.
If an argument is supplied, instruct the network to use a bipolar activation
function or not.

You should not change the activation function during training.
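
For example (a usage sketch):

  my $bipolar = $net->use_bipolar;  # query the current setting
  $net->use_bipolar(1);             # switch to the tanh activation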

=head2 C<train(\@input, \@output)>

This method trains the network to associate the input data set with the output
data set.  Teaching it the "logical or" function goes as follows:

  $net->train([1,1] => [0,1]);
  $net->train([1,0] => [0,1]);
  $net->train([0,1] => [0,1]);
  $net->train([0,0] => [1,0]);
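
Calling C<train()> repeatedly converges slowly; as of 0.10, C<train_set()>
trains over a whole data set and can stop at an error target to avoid
over-training.  A hedged sketch (the iteration-count and error-target
arguments shown here are assumptions, not the documented signature):

  # Train the whole "logical or" set at once; 10_000 iterations and
  # the 0.01 error target are illustrative assumptions.
  $net->train_set( [
      [1,1] => [0,1],
      [1,0] => [0,1],
      [0,1] => [0,1],
      [0,0] => [1,0],
  ], 10_000, 0.01 );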


