AI/Gene/Sequence.pm
}
return $rt;
}
# These are intended to be overridden; simple versions are
# provided for the sake of testing.
# Generates the tokens which make up genes.
# Can be called with a token type to produce, or with none;
# if called with a token type, it will also be passed the original
# token as the second argument.
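# A hedged override sketch (the method name generate_token and its
# (type, token) return value are assumed from the comments above and the
# module's docs; new_token_of() is a hypothetical helper):
sub generate_token {
  my ($self, $type, $token) = @_;
  my @types = qw(a c r t m);
  $type = $types[rand @types] unless defined $type;
  return ($type, new_token_of($type));
}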
AI/Gene/Sequence.pm
$return .= $self->[0] . "\n";
$return .= (join ',', @{$self->[1]}). "\n";
return $return;
}
# used for testing
sub _test_dump {
my $self = shift;
my @rt = ($self->[0], join('',@{$self->[1]}));
return @rt;
}
1;
AI/Gene/Sequence.pm
This is a class which provides generic methods for the
creation and mutation of genetic sequences. Various mutations
are provided, as is a way to ensure that genes created by
mutation remain useful (for instance, if a gene gives rise to
code, it can be tested for correct syntax).
If you do not need to keep track of what sort of thing
currently occupies a slot in the gene, you would be better
off using the AI::Gene::Simple class instead, as this
will be somewhat faster. The interface to the mutations is
AI/Gene/Sequence.pm
For instance, a regular expression could be encoded as:
$self = ['ccartm',['a', 'b', '|', '[A-Z]', '\W', '*?'] ]
Using a string to indicate the sort of thing held at the
corresponding part of the gene allows for a simple test
of the validity of a proposed gene by using a regular
expression.
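A minimal sketch of such a check (the overridable valid_gene method comes from
this module's interface; the actual pattern below is illustrative only):
sub valid_gene {
  my ($self, $types) = @_;     # the string of token types, e.g. 'ccartm'
  return 0 if $types =~ /mm/;  # illustrative rule: no two modifiers in a row
  return 1;
}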
=head2 Using the module
# Before `make install' is performed this script should be runnable with
# `make test'. After `make install' it should work as `perl test.pl'
#########################
# change 'tests => 1' to 'tests => last_test_to_print';
use Test;
BEGIN { plan tests => 1 };
use AI::General;
ok(1); # If we made it this far, we're ok.
#########################
# Insert your test code below, the Test module is use()ed here so read
# its man page ( perldoc Test ) for help writing this test script.
lib/AI/Genetic/Pro.pm
$self->generation($self->generation + 1);
# update history -----------------------------------------------
$self->_save_history;
#---------------------------------------------------------------
# preservation of N unique chromosomes
@preserved = map { clone($_) } @{ $self->getFittest_as_arrayref($self->preserve - 1, 1) };
# selection ----------------------------------------------------
$self->_select_parents();
# crossover ----------------------------------------------------
$self->_crossover();
# mutation -----------------------------------------------------
lib/AI/Genetic/Pro.pm
#=======================================================================
sub intType { shift->type() }
#=======================================================================
# STATS ################################################################
#=======================================================================
sub getFittest_as_arrayref {
my ($self, $n, $uniq) = @_;
$n ||= 1;
$self->_calculate_fitness_all() unless scalar %{ $self->_fitness };
my @keys = sort { $self->_fitness->{$a} <=> $self->_fitness->{$b} } 0..$#{$self->chromosomes};
lib/AI/Genetic/Pro.pm
$n = scalar @keys if $n > scalar @keys;
return [ reverse @{$self->chromosomes}[ splice @keys, $#keys - $n + 1, $n ] ];
}
#=======================================================================
sub getFittest { return wantarray ? @{ shift->getFittest_as_arrayref(@_) } : shift @{ shift->getFittest_as_arrayref(@_) }; }
#=======================================================================
sub getAvgFitness {
my ($self) = @_;
my @minmax = minmax values %{$self->_fitness};
lib/AI/Genetic/Pro.pm
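# as_string() yields the chromosome's bit string; oct('0b'...) decodes it
# into an unsigned integer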
return oct('0b' . $ga->as_string($chromosome));
}
sub terminate {
my ($ga) = @_;
my $result = oct('0b' . $ga->as_string($ga->getFittest));
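# 4294967295 == 2**32 - 1: terminate once every bit of the fittest
# 32-bit chromosome is set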
return $result == 4294967295 ? 1 : 0;
}
my $ga = AI::Genetic::Pro->new(
-fitness => \&fitness, # fitness function
lib/AI/Genetic/Pro.pm
# evolve 10 generations
$ga->evolve(10);
# best score
print "SCORE: ", $ga->as_value($ga->getFittest), ".\n";
# save evolution path as a chart
$ga->chart(-filename => 'evolution.png');
# save state of GA
lib/AI/Genetic/Pro.pm
=over 4
=item Speed
To increase speed, XS code is used, with portability kept in
mind. This distribution was tested on Windows and Linux platforms
(and should work on any other).
Multicore support is available through the Many-Core Engine (C<MCE>).
You gain the most speedup for big populations or time-/CPU-consuming
fitness functions; however, for small populations and/or simple fitness
lib/AI/Genetic/Pro.pm
=item -strict
This defines whether the check for modification of chromosomes inside a
user-defined fitness function is active. Directly modifying chromosomes is not
allowed and is a highway to big trouble. This mode should be used only for
testing, because it is B<slow>; see the sketch after this list.
=back
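For illustration, a constructor call enabling the check might look like this
(a minimal sketch; all other arguments are elided):
my $ga = AI::Genetic::Pro->new(
    -fitness => \&fitness,
    -strict  => 1,  # catch direct modification of chromosomes (testing only)
);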
=item I<$ga>-E<gt>B<inject>($chromosomes)
lib/AI/Genetic/Pro.pm
Get the I<max>, I<mean> and I<min> score of the current generation. For example:
my ($max, $mean, $min) = $ga->getAvgFitness();
=item I<$ga>-E<gt>B<getFittest>($n, $unique)
This function returns a list of the fittest chromosomes from the current
population. You can specify how many chromosomes should be returned and whether
the returned chromosomes should be unique. See the examples below.
# only one - the best
my ($best) = $ga->getFittest;
# or the 5 best chromosomes, NOT unique
my @bests = $ga->getFittest(5);
# or the 7 best, UNIQUE chromosomes
my @bests = $ga->getFittest(7, 1);
If you want to get a large number of chromosomes, use the
C<getFittest_as_arrayref> function instead (for efficiency).
=item I<$ga>-E<gt>B<getFittest_as_arrayref>($n, $unique)
This function is very similar to C<getFittest>, but it returns a reference
to an array instead of a list.
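For instance, a minimal sketch:
# the 100 best, unique chromosomes, returned as one array reference
my $best = $ga->getFittest_as_arrayref(100, 1);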
=item I<$ga>-E<gt>B<generation>()
Get the number of the current generation.
lib/AI/Genetic/Pro.pm
=over 4
=item Examples.
=item More tests.
=item More warnings about incorrect parameters.
=back
lib/AI/Genetic/Pro.pm
Randal L. Schwartz for reporting a bug in this documentation.
Maciej Misiak for reporting problems with C<combination> (and a bug in a PMX strategy).
LEONID ZAMDBORG for recommending the addition of variable-length chromosomes, as well as supplying relevant code samples, for testing and, in the end, reporting some bugs.
Christoph Meissner for reporting a bug.
Alec Chen for reporting some bugs.
$self->{GENERATION}++;
$self->{SORTED} = 0;
last if $self->{TERM}->($self);
# my @f = $self->getFittest(10);
# for my $f (@f) {
# print STDERR " Fitness = ", $f->score, "..\n";
# print STDERR " Genes are: @{$f->genes}.\n";
# }
}
$self->{PEOPLE} = $self->sortIndividuals($self->{PEOPLE});
$self->{SORTED} = 1;
}
# sub getFittest():
# This method returns the fittest individuals.
sub getFittest {
my ($self, $N) = @_;
$N ||= 1;
$N = 1 if $N < 1;
-terminate => \&terminateFunc,
);
$ga->init(10);
$ga->evolve('rouletteTwoPoint', 100);
print "Best score = ", $ga->getFittest->score, ".\n";
sub fitnessFunc {
my $genes = shift;
my $fitness;
# ... compute $fitness from $genes here (body elided in this extract) ...
return $fitness;
}
sub terminateFunc {
my $ga = shift;
# terminate if reached some threshold.
return 1 if $ga->getFittest->score > $THRESHOLD;
return 0;
}
=head1 DESCRIPTION
If a termination subroutine is given, it is executed and the return value is
checked. Evolution terminates if this sub returns a true value.
=back
=item I<$ga>-E<gt>B<getFittest>(?I<N>?)
This returns the I<N> fittest individuals. If not specified,
I<N> defaults to 1. As a side effect, it sorts the population by
fitness score. The actual AI::Genetic::Individual objects are returned.
You can use the C<genes()> and C<score()> methods to get the genes and the
scores of the individuals. Please check L<AI::Genetic::Individual> for details.
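For instance, a minimal sketch using those accessors:
my ($best) = $ga->getFittest;        # sorts the population as a side effect
print "Genes: @{$best->genes}\n";    # genes() returns an array reference
print "Score: ", $best->score, "\n";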
few clock cycles here and there, then it will be greatly magnified in
the total run time.
=head1 BUGS
I have tested this module quite a bit, and even used it to solve a
work-related problem successfully. But if you think you have found a bug,
please let me know, and I promise to look at it.
Also, if you have any requests, comments or suggestions, then feel
free to email me.
lib/AI/Image.pm
The size for the generated image (default: '512x512').
=item debug
Used for testing. If set to any true value, the image method
will return details of the error encountered instead of C<undef>.
=back
=head2 image
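A minimal usage sketch (C<size>, C<debug> and the C<image> method come from
the surrounding docs; the C<key> argument is an assumption for illustration):
my $ai = AI::Image->new(
    key   => $api_key,  # assumed parameter: your API key
    size  => '512x512',
    debug => 1,         # on error, return details instead of undef
);
my $image = $ai->image('A watercolour painting of a fox');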
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
MV = mv
NOOP = $(TRUE)
NOECHO = @
RM_F = rm -f
RM_RF = rm -rf
TEST_F = test -f
TOUCH = touch
UMASK_NULL = umask 0
DEV_NULL = > /dev/null 2>&1
MKPATH = $(ABSPERLRUN) -MExtUtils::Command -e 'mkpath' --
EQUALIZE_TIMESTAMP = $(ABSPERLRUN) -MExtUtils::Command -e 'eqtime' --
# --- MakeMaker special_targets section:
.SUFFIXES : .xs .c .C .cpp .i .s .cxx .cc $(OBJ_EXT)
.PHONY: all config static dynamic test linkext manifest blibdirs clean realclean disttest distdir
# --- MakeMaker c_o section:
distdir : create_distdir distmeta
$(NOECHO) $(NOOP)
# --- MakeMaker dist_test section:
disttest : distdir
cd $(DISTVNAME) && $(ABSPERLRUN) Makefile.PL
cd $(DISTVNAME) && $(MAKE) $(PASTHRU)
cd $(DISTVNAME) && $(MAKE) test $(PASTHRU)
# --- MakeMaker dist_ci section:
Makefile.PL DIR= \
MAKEFILE=$(MAKE_APERL_FILE) LINKTYPE=static \
MAKEAPERL=1 NORECURS=1 CCCDLFLAGS=
# --- MakeMaker test section:
TEST_VERBOSE=0
TEST_TYPE=test_$(LINKTYPE)
TEST_FILE = test.pl
TEST_FILES = t/*.t
TESTDB_SW = -d
testdb :: testdb_$(LINKTYPE)
test :: $(TEST_TYPE) subdirs-test
subdirs-test ::
$(NOECHO) $(NOOP)
test_dynamic :: pure_all
PERL_DL_NONLAZY=1 $(FULLPERLRUN) "-MExtUtils::Command::MM" "-e" "test_harness($(TEST_VERBOSE), '$(INST_LIB)', '$(INST_ARCHLIB)')" $(TEST_FILES)
testdb_dynamic :: pure_all
PERL_DL_NONLAZY=1 $(FULLPERLRUN) $(TESTDB_SW) "-I$(INST_LIB)" "-I$(INST_ARCHLIB)" $(TEST_FILE)
test_ : test_dynamic
test_static :: test_dynamic
testdb_static :: testdb_dynamic
# --- MakeMaker ppd section:
# Creates a PPD (Perl Package Description) for a binary distribution.
ppd :
scripts/mnist.pl
use AI::ML::NeuralNetwork;
my %opt = (
"train-images" => "train-images-idx3-ubyte",
"train-labels" => "train-labels-idx1-ubyte",
"test-images" => "t10k-images-idx3-ubyte",
"test-labels" => "t10k-labels-idx1-ubyte"
);
_load_data();
"runtime" : {
"requires" : {
"AI::MXNet" : "1.33"
}
},
"test" : {
"requires" : {}
}
},
"release_status" : "stable",
"version" : "1.33"
MANIFEST
META.json
META.yml
README
t/AI-MXNet-Gluon-ModelZoo.t
t/test_gluon_model_zoo.t
examples/cudnn_lstm_bucketing.pl
use AI::MXNet::Function::Parameters;
use AI::MXNet::Base;
use Getopt::Long qw(HelpMessage);
GetOptions(
'test' => \(my $do_test ),
'num-layers=i' => \(my $num_layers = 2 ),
'num-hidden=i' => \(my $num_hidden = 256 ),
'num-embed=i' => \(my $num_embed = 256 ),
'num-seq=i' => \(my $seq_size = 32 ),
'gpus=s' => \(my $gpus ),
examples/cudnn_lstm_bucketing.pl
cudnn_lstm_bucketing.pl - Example of training a character LSTM RNN on tiny shakespeare using the high-level RNN interface
=head1 SYNOPSIS
--test Whether to test or train (default 0)
--num-layers number of stacked RNN layers, default=2
--num-hidden hidden layer size, default=256
--num-seq sequence size, default=32
--gpus list of gpus to run, e.g. 0 or 0,2,5. Empty means use the CPU.
Increase the batch size when using multiple gpus for best performance.
examples/cudnn_lstm_bucketing.pl
my ($train_sentences, $vocabulary) = tokenize_text(
'./data/ptb.train.txt', start_label => $start_label,
invalid_label => $invalid_label
);
my ($validation_sentences) = tokenize_text(
'./data/ptb.test.txt', vocab => $vocabulary,
start_label => $start_label, invalid_label => $invalid_label
);
my $data_train = mx->rnn->BucketSentenceIter(
$train_sentences, $batch_size, buckets => $buckets,
invalid_label => $invalid_label,
examples/cudnn_lstm_bucketing.pl
batch_end_callback => mx->callback->Speedometer($batch_size, $disp_batches),
($model_prefix ? (epoch_end_callback => mx->rnn->do_rnn_checkpoint($cell, $model_prefix, 1)) : ())
);
};
my $test = sub {
assert($model_prefix, "Must specify path to load from");
my (undef, $data_val, $vocab) = get_data('NT');
my $stack;
if($stack_rnn)
{
examples/cudnn_lstm_bucketing.pl
if($num_layers >= 4 and split(/,/,$gpus) >= 4 and not $stack_rnn)
{
print("WARNING: stack-rnn is recommended to train complex model on multiple GPUs\n");
}
if($do_test)
{
# Demonstrates how to load a model trained with CuDNN RNN and predict
# with non-fused MXNet symbol
$test->();
}
else
{
$train->();
}
Makefile.PL
INC => '-I../../include/mxnet',
OBJECT => 'mxnet_wrap.o',
LDDLFLAGS => join(' ', @lddlflags),
PREREQ_PM => {
# prereqs
# build/test prereqs
'Test::More' => 0,
},
PL_FILES => {},
);
inc/Module/AutoInstall.pm
# various lexical flags
my ( @Missing, @Existing, %DisabledTests, $UnderCPAN, $HasCPANPLUS );
my ( $Config, $CheckOnly, $SkipInstall, $AcceptDefault, $TestOnly );
my ( $PostambleActions, $PostambleUsed );
# See if it's a testing or non-interactive session
_accept_default( $ENV{AUTOMATED_TESTING} or ! -t STDIN );
_init();
sub _accept_default {
$AcceptDefault = shift;
inc/Module/AutoInstall.pm
$CheckOnly = 1;
}
elsif ( $arg =~ /^--skip(?:deps)?$/ ) {
$SkipInstall = 1;
}
elsif ( $arg =~ /^--test(?:only)?$/ ) {
$TestOnly = 1;
}
}
}
inc/Module/AutoInstall.pm
grep { /^[^\-]/ or /^-core$/i } keys %{ +{@args} }
)[0]
);
while ( my ( $feature, $modules ) = splice( @args, 0, 2 ) ) {
my ( @required, @tests, @skiptests );
my $default = 1;
my $conflict = 0;
if ( $feature =~ m/^-(\w+)$/ ) {
my $option = lc($1);
inc/Module/AutoInstall.pm
if ( $mod =~ m/^-(\w+)$/ ) {
my $option = lc($1);
$default = $arg if ( $option eq 'default' );
$conflict = $arg if ( $option eq 'conflict' );
@tests = @{$arg} if ( $option eq 'tests' );
@skiptests = @{$arg} if ( $option eq 'skiptests' );
next;
}
printf( "- %-${maxlen}s ...", $mod );
inc/Module/AutoInstall.pm
if (
defined( my $cur = _version_check( _load($mod), $arg ||= 0 ) ) )
{
print "loaded. ($cur" . ( $arg ? " >= $arg" : '' ) . ")\n";
push @Existing, $mod => $arg;
$DisabledTests{$_} = 1 for map { glob($_) } @skiptests;
}
else {
print "missing." . ( $arg ? " (would need $arg)" : '' ) . "\n";
push @required, $mod => $arg;
}
inc/Module/AutoInstall.pm
) =~ /^[Yy]/
)
)
{
push( @Missing, @required );
$DisabledTests{$_} = 1 for map { glob($_) } @skiptests;
}
elsif ( !$SkipInstall
and $default
and $mandatory
and
_prompt( qq{==> The module(s) are mandatory! Really skip?}, 'n', )
=~ /^[Nn]/ )
{
push( @Missing, @required );
$DisabledTests{$_} = 1 for map { glob($_) } @skiptests;
}
else {
$DisabledTests{$_} = 1 for map { glob($_) } @tests;
}
}
$UnderCPAN = _check_lock(); # check for $UnderCPAN
inc/Module/AutoInstall.pm
$args{EXE_FILES} =
[ grep { exists $manifest->{$_} } @{ $args{EXE_FILES} } ];
}
$args{test}{TESTS} ||= 't/*.t';
$args{test}{TESTS} = join( ' ',
grep { !exists( $DisabledTests{$_} ) }
map { glob($_) } split( /\s+/, $args{test}{TESTS} ) );
my $missing = join( ',', @Missing );
my $config =
join( ',', UNIVERSAL::isa( $Config, 'HASH' ) ? %{$Config} : @{$Config} )
if $Config;
0.06 = 0.05_01 Nov 26 2007
0.05_01 Mar 11 2007
- libmegahal.c/execute_command, free_words, etc:
change register int to unsigned int
- tests moved to t/test.t
- pod.t
0.05 = 0.04_02 Feb 08 2007
0.04_02 Jan 31 2007
- Removed inclusion of <getopt.h> in libmegahal.c
lib/AI/MicroStructure.pm
else {
push @micros, ["AI::MicroStructure::$structure"->new(),''];
}
}
my $all ={};
for my $test (@micros) {
my $micro = $test->[0];
my %items;
my $items = $micro->name(0);
$items{$_}++ for $micro->name(0);
my $key=sprintf("%s",$micro->structure);
$all->{$key}=[$test->[1],$micro->name($items)];
}
return $all;
}
sub save_cat {
my $self = shift;
lib/AI/NNEasy.pm
This architecture was first based on the module L<AI::NNFlex>; I then rewrote it to fix some
serialization bugs, optimized the code, and added some XS functions to speed up
the learning process. Finally, I added an intuitive interface to create and use the NN,
and added a winner algorithm to the output.
I wrote this module because, after testing different NN modules on Perl, I couldn't find
one that was portable across Linux and Windows, easy to use and, most importantly,
one that really works on a real problem.
With this module you don't need to learn much about NNs to be able to construct one: you just
define the construction of the NN, learn your set of inputs, and use it.
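A minimal sketch following that pattern (the constructor arguments and method
names follow the module's XOR synopsis as I recall it, so treat the details
as assumptions rather than a definitive example):
use AI::NNEasy;
my $nn = AI::NNEasy->new(
  'xor.nne',  # file where the NN is saved
  [0, 1],     # possible output values
  0.1,        # acceptable error
  2,          # number of inputs
  1,          # number of outputs
);
my @set = (
  [0,0] => [0],
  [0,1] => [1],
  [1,0] => [1],
  [1,1] => [0],
);
$nn->learn_set(\@set);
my $out = $nn->run_get_winner([1, 0]);  # expected: [1]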
lib/AI/NNFlex.pm
Note: this method may be called on a per-network, per-node or per-layer basis using the appropriate object.
=head1 EXAMPLES
See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.
=head1 PREREQs
None. NNFlex should run OK on any version of Perl 5 and above.
Makefile.PL
INC => '-I../../3rdparty/tvm/nnvm/include/nnvm',
OBJECT => 'nnvm_wrap.o',
LDDLFLAGS => join(' ', @lddlflags),
PREREQ_PM => {
# prereqs
# build/test prereqs
'Test::More' => 0,
},
PL_FILES => {},
);
As a last resort, you can manually install it. Download the tarball, untar it,
then build it:
% perl Makefile.PL
% make && make test
Then install it:
% make install
NaiveBayes1.pm view on Meta::CPAN
and comments (in no particular order):
Michael Stevens, Tom Dyson, Dan Von Kohorn, Craig Talbert,
Andrew Brian Clegg,
and CPAN-testers, including: Andreas Koenig, Alexandr Ciornii, jlatour,
Jost.Krieger, tvmaly, Matthew Musgrove, Michael Stevens, Nigel Horne,
Graham Crookham, David Cantrell (dcantrell).
=head1 AUTHOR
examples/digits/deep_digits.pl
my $prev_nerl = $nerl;
my $prev_cost = 10000;
my $passes=0;
for(1..3000){
my @test = ($images(9000:9999)->sever,$y(9000:9999)->sever);
my $n = int rand(8000);
my $m = $n+499;
my @train = ($images->slice("$n:$m")->copy, $y->slice("$n:$m")->copy);
$nerl->train(@train,passes=>10);
my ($cost, $nc) = $nerl->cost( @test );
print "cost:$cost\n,num correct: $nc / 1000\n";
# $nerl->network->show_neuron(1);
$passes++;
if ($cost < $prev_cost or $passes<10){
$prev_cost = $cost;
BackProp.pm
);
# Learn the data set
$net->learn_set(\@phrases);
# Run a test phrase through the network
my $test_phrase = $net->crunch("I love neural networking!");
my $result = $net->run($test_phrase);
# Get this, it prints "Jay Leno is networking!" ... LOL!
print $net->uncrunch($result),"\n";
BackProp.pm
Daniel Macks dmacks@sas.upenn.edu
Tobias was a great help with the initial releases, helped with learning options, and made a great
many helpful suggestions. Rodin gave me some great ideas for the new internals, as well
as disabling Storable. Steve is the author of AI::Perceptron, and gave some good suggestions for
weighting the neurons. Daniel was a great help with early beta testing of the module and related
ideas. Pat has been a great help in running the module through the works. Pat is the author of
the new Inter game, an in-depth strategy game. He is using a group of neural networks internally,
which provides a good test bed for coming up with new ideas for the network. Thank you for all of
your help, everybody.
=head1 DOWNLOAD
You can always download the latest copy of AI::NeuralNet::BackProp
from http://www.josiah.countystart.com/modules/AI/cgi-bin/rec.pl
=head1 MAILING LIST
0.19 Sat Dec 3 14:52:39 EST 2016
- fix some errant sprintf's
0.18 Sat Dec 3 14:36:03 EST 2016
- force all tests serially, not just test_dynamic
0.17 Sat Dec 3 02:43:38 EST 2016
- force test harness to test serially
- update copyright notice
- clean up tests
0.16 Sat Jan 3 05:53:12 EST 2015
- version bump - hasn't been tested in a while...
- added auto-README generation to Makefile.PL
- update copyright notice
0.15 Wed Jul 11 00:13:02 2012
- tidy up build a bit
- consolidated neighbors code
- general housekeeping
0.10 Fri Aug 7 09:11:39 2009
- no longer relying on sizeof(void)
- removed a bit of old test code
- one more PTR2INT conversion
- experimentally dropped perl require to 5.6.2
- hopefully fixed a few casting problems for some platforms
0.09 Wed Aug 5 20:26:17 2009
- clean up things a bit
- now using Atol() instead of atoi()
- now using Drand01() instead of rand()
- now using seedDrand01() instead of srand()
- fixed problem with not using all training vectors, or some twice
- removed non-core Data::Dumper from tests
- added tests for store/retrieve via Storable
- first public release
0.06 Wed Jul 22 12:07:25 2009
- removed AI::NN::FSOM::ARRAY, ::MAP, and ::VECTOR modules
- removed Inline::C code from remaining modules
- moved train() into C
- now parsing input_ and output_dim parameters (finally!)
0.05 Mon Jul 20 13:20:06 2009
- re-added support for labels, originally in AI::NN::SOM
- added original AI::NN::SOM test suite (and it works!)
0.04 Sat Jul 18 16:45:27 2009
- removed dependence on Inline::C
- minor refactor
To install this module, run the following commands:
perl Makefile.PL
make
make test
make install
SUPPORT AND DOCUMENTATION
After installing, you can find documentation for this module with the
package RGB_test;
use lib "../../../..";
use Test;
BEGIN { plan tests => 12 }
use AI::NeuralNet::Kohonen::Demo::RGB;
ok(1,1);
$_ = new AI::NeuralNet::Kohonen;
lib/AI/NeuralNet/Kohonen/Visual.pm
AI::NeuralNet::Kohonen::Visual - Tk-based Visualisation
=head1 SYNOPSIS
Try the test file in this distribution, or:
package YourClass;
use base "AI::NeuralNet::Kohonen::Visual";
sub get_colour_for {
    my ($self,$x,$y) = (shift,shift,shift);
    # ... return a Tk colour for the node at ($x,$y); body elided in this extract
}
lib/AI/NeuralNet/Kohonen.pm
package AI::NeuralNet::Kohonen;
use vars qw/$VERSION/;
$VERSION = 0.142; # 08 August 2006 test lost input file
=head1 NAME
AI::NeuralNet::Kohonen - Kohonen's Self-organising Maps
$net->learn([0,0],[0]);
$net->learn([0,1],[0]);
$net->learn([1,0],[0]);
$net->learn([1,1],[1]);
# Present it with two test cases
my $result_bit_1 = $net->run([0,1])->[0];
my $result_bit_2 = $net->run([1,1])->[0];
# Display the results
print "AND test with inputs (0,1): $result_bit_1\n";
print "AND test with inputs (1,1): $result_bit_2\n";
=head1 VERSION & UPDATES
This is version B<0.44>, an update release for version 0.43.
In previous module releases $degrade_increment_flag was not used, as increment degrading
was always on. In this release I have looked at several other network types as well
as several texts and decided that it would be better not to use increment degrading. The
option is still there for those who feel the inclination to use it. I have found some cases
that do need the degrade flag in order to learn at an acceptable speed; see test.pl for an
example. If the degrade flag weren't in test.pl, it would take a very long time to learn.
=item $net->learn_set(\@set, [ options ]);
Thanks to Randal and Michiel for spotting some documentation and makefile bugs in the last release.
Thanks to Rodin for continual suggestions and questions about the module and more.
=head1 DOWNLOAD
You can always download the latest copy of AI::NeuralNet::Mesh
from http://www.josiah.countystart.com/modules/get.pl?mesh:pod
=head1 MAILING LIST
- added more features for initialization
- factored out vector computation into ::SOM::Utils
0.01 Wed Jun 6 01:08:34 2007
- original version; created by h2xs 1.23 with options
-n AI::NeuralNet::SOM -X --use-new-tests
- first stab on things
lib/AI/NeuralNet/Simple.pm
We continue this process until the amount of error is small enough that we are
satisfied. In reality, we will rarely if ever get precise results from the
network, but we learn various strategies to interpret the results. In the
example above, we use a "winner takes all" strategy. Whichever of the output
nodes has the greatest value will be the "winner", and thus the answer.
In the examples directory, you will find a program named "logical_or.pl" which
demonstrates the above process.
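A minimal winner-takes-all sketch (the new/train/winner calls follow this
module's documented interface as I recall it, so treat the details as
assumptions; the topology mirrors logical_or.pl):
use AI::NeuralNet::Simple;
my $net = AI::NeuralNet::Simple->new(2, 1, 2); # 2 inputs, 1 hidden, 2 outputs
for (1 .. 10_000) {                            # output node 1 means "true"
    $net->train([0,0], [1,0]);
    $net->train([0,1], [0,1]);
    $net->train([1,0], [0,1]);
    $net->train([1,1], [0,1]);
}
print $net->winner([1,1]), "\n";               # index of the strongest output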
=head2 Building a network
lib/AI/NeuralNet/Simple.pm
low enough to be acceptable. Often we have a large data set and merely keep
iterating until the desired error rate is achieved.
=item 3 Measuring results
One frequent mistake made with neural networks is failing to test the network
with different data from the training data. It's quite possible for a
backpropagation network to hit what is known as a "local minimum", which is not
truly where it should be. This will cause false results. To check for this,
after training we often feed in other known good data for verification. If the
results are not satisfactory, perhaps a different number of neurons per layer