AI-NeuralNet-SOM
lib/AI/NeuralNet/SOM.pm view on Meta::CPAN
=head2 Scenario
The basic idea is that the neural network consists of a 2-dimensional
array of N-dimensional vectors. When the training is started these
vectors may be completely random, but over time the network learns
from the sample data, which is a set of N-dimensional vectors.
Slowly, the vectors in the network will approximate the sample
vectors fed in. If the sample vectors contain clusters, then these
clusters will show up as neighbourhoods within the rectangle (or
whatever topology you are using).
Technically, you have reduced the dimensionality from N to 2.
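To make the idea concrete, here is a tiny, self-contained sketch (it does not use the module itself, and the vectors are made up): a 2x2 grid of 3-dimensional weight vectors, and a sample that gets mapped to the 2-dimensional coordinates of its closest grid vector.

```perl
use strict;
use warnings;

# A 2x2 grid of 3-dimensional weight vectors (the "map").
my @grid = (
    [ [ 1, 0, 0 ], [ 0, 1, 0 ] ],
    [ [ 0, 0, 1 ], [ 1, 1, 1 ] ],
);
my @sample = (0.9, 0.1, 0.0);                  # an N-dimensional sample vector

# Find the grid position with the smallest squared distance to the sample.
my ($best_x, $best_y, $best_d);
for my $x (0 .. $#grid) {
    for my $y (0 .. $#{ $grid[$x] }) {
        my $d = 0;
        $d += ($grid[$x][$y][$_] - $sample[$_]) ** 2 for 0 .. $#sample;
        ($best_x, $best_y, $best_d) = ($x, $y, $d)
            if !defined $best_d or $d < $best_d;
    }
}
print "sample maps to ($best_x, $best_y)\n";   # prints "sample maps to (0, 0)"
```

During training the network additionally pulls the winning vector (and its neighbours) towards the sample; the lookup above is only the mapping step.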
=head1 INTERFACE
=head2 Constructor
The constructor takes the following arguments:
=over
movement can mean faster learning if the clusters are very pronounced. If not, then the movement is
like noise and the convergence is not good. To mitigate that effect, the learning rate is reduced
over the iterations.
=item C<sigma0>: (optional, defaults to radius)
A non-negative number representing the start value for the learning radius. Practically, the value
should be chosen so that it covers a larger part of the map. During the learning process this
value will be narrowed down, so that the learning radius affects fewer and fewer neurons.
B<NOTE>: Do not choose C<1>, as C<log> is applied to this value and C<log(1)> is C<0>, which would
lead to a division by zero when the decay constant is computed.
=back
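The effect of C<sigma0> on the decay can be sketched numerically; the following stands alone, the concrete numbers are only for illustration, and the formulas mirror those used in C<train>:

```perl
use strict;
use warnings;

my $sigma0 = 3;                            # initial learning radius (illustrative)
my $l0     = 0.1;                          # initial learning rate   (illustrative)
my $epochs = 5;

# This is why sigma0 == 1 is forbidden: log(1) == 0 would divide by zero.
my $lambda = $epochs / log ($sigma0);

for my $t (1 .. $epochs) {
    my $sigma = $sigma0 * exp (- $t / $lambda);   # radius shrinks every epoch
    my $l     = $l0     * exp (- $t / $epochs);   # so does the learning rate
    printf "epoch %d: sigma = %.3f, rate = %.3f\n", $t, $sigma, $l;
}
# In the last epoch sigma has decayed from 3 down to (almost exactly) 1.
```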
Subclasses will (re)define some of these parameters and add others:
Example:
  my $nn = new AI::NeuralNet::SOM::Rect (output_dim => "5x6",
                                         input_dim  => 3);
[ -1, -1, -1 ],
[ 0, 4, -3]);
=cut
sub train {
    my $self   = shift;
    my $epochs = shift || 1;
    die "no data to learn" unless @_;

    $self->{LAMBDA} = $epochs / log ($self->{_Sigma0});                       # decay constant for the radius
    my @mes = ();                                                             # collects the errors over the epochs
    for my $epoch (1 .. $epochs) {
        $self->{T} = $epoch;
        my $sigma = $self->{_Sigma0} * exp (- $self->{T} / $self->{LAMBDA});  # current learning radius
        my $l     = $self->{_L0}     * exp (- $self->{T} / $epochs);          # current learning rate
        my @veggies = @_;                                                     # local copy, destroyed in the loop below
        while (@veggies) {
            my $sample = splice @veggies, int (rand (scalar @veggies)), 1;    # pick (and remove) a random sample
sub output_dim {
    my $self = shift;
    return $self->{output_dim};
}
=pod
=item I<radius> (read-only)
I<$radius> = I<$nn>->radius
Returns the I<radius> of the map. Different topologies interpret this differently.
=item I<map>
I<$m> = I<$nn>->map
This method returns a reference to the map data. See the appropriate subclass for the concrete
data representation.
=cut
=over
=item maybe implement the SOM on top of PDL?
=item provide a ::SOM::Compat layer for compatibility with the original AI::NeuralNet::SOM?
=item implement different window forms (bubble/gaussian), linear/random
=item implement the format mentioned in the original AI::NeuralNet::SOM
=item add C<as_html> methods to the individual topologies
=item add iterators over vector lists for I<initialize> and I<train>
=back
=head1 SUPPORT
Bugs should always be submitted via the CPAN bug tracker
L<https://rt.cpan.org/Dist/Display.html?Status=Active&Queue=AI-NeuralNet-SOM>
lib/AI/NeuralNet/SOM/Hexa.pm view on Meta::CPAN
use AI::NeuralNet::SOM;
use Data::Dumper;
use base qw(AI::NeuralNet::SOM);
use AI::NeuralNet::SOM::Utils;
=pod
=head1 NAME
AI::NeuralNet::SOM::Hexa - Perl extension for Kohonen Maps (hexagonal topology)
=head1 SYNOPSIS
use AI::NeuralNet::SOM::Hexa;
  my $nn = new AI::NeuralNet::SOM::Hexa (output_dim => 6,
                                         input_dim  => 3);
# ... see also base class AI::NeuralNet::SOM
=head1 INTERFACE
lib/AI/NeuralNet/SOM/Rect.pm view on Meta::CPAN
use warnings;
use Data::Dumper;
use base qw(AI::NeuralNet::SOM);
use AI::NeuralNet::SOM::Utils;
=pod
=head1 NAME
AI::NeuralNet::SOM::Rect - Perl extension for Kohonen Maps (rectangular topology)
=head1 SYNOPSIS
use AI::NeuralNet::SOM::Rect;
  my $nn = new AI::NeuralNet::SOM::Rect (output_dim => "5x6",
                                         input_dim  => 3);
  $nn->initialize;
  $nn->train (30,
              [ 3, 2, 4 ],
              [ -1, -1, -1 ],
lib/AI/NeuralNet/SOM/Torus.pm view on Meta::CPAN
use warnings;
use Data::Dumper;
use base qw(AI::NeuralNet::SOM::Rect);
use AI::NeuralNet::SOM::Utils;
=pod
=head1 NAME
AI::NeuralNet::SOM::Torus - Perl extension for Kohonen Maps (torus topology)
=head1 SYNOPSIS
use AI::NeuralNet::SOM::Torus;
  my $nn = new AI::NeuralNet::SOM::Torus (output_dim => "5x6",
                                          input_dim  => 3);
  $nn->initialize;
  $nn->train (30,
              [ 3, 2, 4 ],
              [ -1, -1, -1 ],
              [ 0, 4, -3 ]);
  print $nn->as_data;
=head1 DESCRIPTION
This SOM is very similar to the one with a rectangular topology, except that the rectangle's top
and bottom edges are connected to form a cylinder, and the cylinder's two open ends (the
rectangle's left and right borders) are then connected to form a torus
(L<http://en.wikipedia.org/wiki/Torus>).
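The wrap-around along one axis can be sketched with a helper like the following (a self-contained illustration with a hypothetical C<torus_delta> function, not the module's actual C<neighbors> code):

```perl
use strict;
use warnings;

# On a torus the distance between two columns (or rows) may be shorter
# when measured across the connected border.
sub torus_delta {
    my ($p, $q, $size) = @_;
    my $d = abs ($p - $q);
    return $d < $size - $d ? $d : $size - $d;   # take the shorter way around
}

print torus_delta (0, 4, 5), "\n";   # prints 1: columns 0 and 4 wrap around
print torus_delta (1, 3, 5), "\n";   # prints 2: the direct way is shorter
```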
=head1 INTERFACE
It exposes the same interface as the base class.
=cut
sub neighbors { # http://www.ai-junkie.com/ann/som/som3.html