AI-NeuralNet-SOM
0.04 17. Jun CEST 2007
- added labels get/set
- added mean_error function
0.03 Do 14. Jun 21:07:54 CEST 2007
- added output_dim method
- added ::Torus subclass of ::Rect
0.02 Sa 9. Jun 17:55:23 CEST 2007
- split ::SOM.pm into ::SOM::Rect and ::SOM::Hexa
- added more features for initialization
- factored out vector computation into ::SOM::Utils
0.01 Wed Jun 6 01:08:34 2007
- original version; created by h2xs 1.23 with options
-n AI::NeuralNet::SOM -X --use-new-tests
- first stab on things
# http://module-build.sourceforge.net/META-spec.html
#XXXXXXX This is a prototype!!! It will change in the future!!! XXXXX#
name: AI-NeuralNet-SOM
version: 0.07
version_from: lib/AI/NeuralNet/SOM.pm
installdirs: site
requires:
distribution_type: module
generated_by: ExtUtils::MakeMaker version 6.30_01
examples/eigenvector_initialization.pl
sub _find_num {
my $v = shift;
my $l = shift;
for my $i (0..$#$l) {
return $i if $v == $l->[$i];
}
return undef;
}
for (@es_idx) { # from the highest values downwards, take the index
push @training_vectors, [ list $E->dice($_) ] ; # get the corresponding vector
}
}
$nn->initialize (@training_vectors[0..0]); # take only the biggest ones (the eigenvalues are big, actually)
#warn $nn->as_string;
my @mes = $nn->train ($epochs, @vs);
warn "eigen: length until error is < $epsilon ". scalar (grep { $_ >= $epsilon } @mes);
}
__END__
lib/AI/NeuralNet/SOM.pm
=item C<learning_rate>: (optional, default C<0.1>)
This is a magic number which controls how strongly the vectors in the grid can be influenced. Stronger
movement can mean faster learning if the clusters are very pronounced. If not, then the movement is
more like noise and convergence suffers. To mitigate that effect, the learning rate is reduced
over the iterations.
=item C<sigma0>: (optional, defaults to radius)
A non-negative number representing the start value for the learning radius. Practically, the value
should be chosen so that it covers a larger part of the map. During the learning process this
value is narrowed down, so that the learning radius affects fewer and fewer neurons.
B<NOTE>: Do not choose C<1>, as the C<log> function is applied to this value.
=back
Subclasses will (re)define some of these parameters and add others:
Example:
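A constructor call might look like this (a sketch: the C<output_dim>/C<input_dim> keys follow the subclass documentation, and the parameter values shown are merely illustrative):

```perl
use AI::NeuralNet::SOM::Rect;

# create a 5x6 rectangular map over 3-dimensional input vectors;
# learning_rate and sigma0 are the optional parameters described above
my $nn = AI::NeuralNet::SOM::Rect->new (
    output_dim    => "5x6",
    input_dim     => 3,
    learning_rate => 0.1,   # the default
    sigma0        => 3,     # start radius; do not use 1 (log is taken)
);
```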
lib/AI/NeuralNet/SOM.pm
I<$radius> = I<$nn>->radius
Returns the I<radius> of the map. Different topologies interpret this differently.
=item I<map>
I<$m> = I<$nn>->map
This method returns a reference to the map data. See the appropriate subclass for the data
representation.
=cut
sub map {
my $self = shift;
return $self->{map};
}
=pod
=item I<value>
I<$val> = I<$nn>->value (I<$x>, I<$y>)
I<$nn>->value (I<$x>, I<$y>, I<$val>)
Set or get the current vector value for a particular neuron. The neuron is addressed via its
coordinates.
=cut
sub value {
my $self = shift;
my ($x, $y) = (shift, shift);
my $v = shift;
return defined $v ? $self->{map}->[$x]->[$y] = $v : $self->{map}->[$x]->[$y];
}
=pod
=item I<label>
I<$label> = I<$nn>->label (I<$x>, I<$y>)
I<$nn>->label (I<$x>, I<$y>, I<$label>)
Set or get the label for a particular neuron. The neuron is addressed via its coordinates.
The label can be anything; it is simply attached to that position.
=cut
sub label {
my $self = shift;
my ($x, $y) = (shift, shift);
my $l = shift;
return defined $l ? $self->{labels}->[$x]->[$y] = $l : $self->{labels}->[$x]->[$y];
}
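Taken together, I<value> and I<label> can be used like this (a sketch, assuming a map object C<$nn> has already been created and the coordinates lie inside the grid):

```perl
# set the vector and a label for the neuron at (2, 3)
$nn->value (2, 3, [ 0.5, 0.6, 0.7 ]);
$nn->label (2, 3, 'cluster A');

# read them back
my $v = $nn->value (2, 3);   # array reference [ 0.5, 0.6, 0.7 ]
my $l = $nn->label (2, 3);   # 'cluster A'
```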
lib/AI/NeuralNet/SOM.pm
=back
=head1 HOWTOs
=over
=item I<using Eigenvectors to initialize the SOM>
See the example script in the directory C<examples> provided in the
distribution. It uses L<PDL> (for speed and scalability, but the
results are not as good as I had thought).
=item I<loading and saving a SOM>
See the example script in the directory C<examples>. It uses
C<Storable> to directly dump the data structure onto disk. Storage and
retrieval is quite fast.
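A minimal save/load round trip with C<Storable> might look like this (a sketch: the dump file name is made up, C<store> and C<retrieve> are the standard Storable functions, and C<$nn> is a trained map object):

```perl
use Storable qw(store retrieve);

# dump the trained map onto disk ...
store $nn, '/tmp/som.bin';

# ... and restore it later as a blessed object of the same class
my $nn2 = retrieve '/tmp/som.bin';
```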
=back
=head1 FAQs
lib/AI/NeuralNet/SOM/Hexa.pm
Example:
my $m = $nn->map;
for my $x (0 .. $nn->diameter -1) {
for my $y (0 .. $nn->diameter -1){
warn "vector at $x, $y: ". Dumper $m->[$x]->[$y];
}
}
This array represents a hexagon like this (ASCII drawing is so cool):
<0,0>
<0,1> <1,0>
<0,2> <1,1> <2,0>
<0,3> <1,2> <2,1> <3,0>
...............................
=item I<as_string>
lib/AI/NeuralNet/SOM/Rect.pm
=pod
=over
=item I<map>
I<$m> = I<$nn>->map
This method returns the 2-dimensional array of vectors in the grid (as a reference to an array of
references to arrays of vectors). The representation of the 2-dimensional array is straightforward.
Example:
my $m = $nn->map;
for my $x (0 .. 5) {
for my $y (0 .. 4){
warn "vector at $x, $y: ". Dumper $m->[$x]->[$y];
}
}