software--to make sure the software is free for all its users. The
General Public License applies to the Free Software Foundation's
software and to any other program whose authors commit to using it.
You can use it for your programs, too.
When we speak of free software, we are referring to freedom, not
price. Specifically, the General Public License is designed to make
sure that you have the freedom to give away or sell copies of free
software, that you receive source code or can get it if you want it,
that you can change the software or use pieces of it in new free
programs; and that you know you can do these things.
appropriately publish on each copy an appropriate copyright notice and
disclaimer of warranty; keep intact all the notices that refer to this
General Public License and to the absence of any warranty; and give any
other recipients of the Program a copy of this General Public License
along with the Program. You may charge a fee for the physical act of
transferring a copy.
2. You may modify your copy or copies of the Program or any portion of
it, and copy and distribute such modifications under the terms of Paragraph
1 above, provided that you also do the following:
that there is no warranty (or else, saying that you provide a
warranty) and that users may redistribute the program under these
conditions, and telling the user how to view a copy of this General
Public License.
d) You may charge a fee for the physical act of transferring a
copy, and you may at your option offer warranty protection in
exchange for a fee.
Mere aggregation of another independent work with the Program (or its
derivative) on a volume of a storage or distribution medium does not bring
c) accompany it with the information you received as to where the
corresponding source code may be obtained. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form alone.)
Source code for a work means the preferred form of the work for making
modifications to it. For an executable file, complete source code means
all the source code for all modules it contains; but, as a special
exception, it need not include source code for modules which are standard
libraries that accompany the operating system on which the executable
file runs, or for standard header files or definitions files that
key => $ENV{CLEVERBOT_API_KEY},
nick => $ENV{CLEVERBOT_NICK},
user => $ENV{CLEVERBOT_API_USER},
);
# The call to create() is mostly safe: you might get an error
# back even though the HTTP status is 200 OK. You can avoid this
# (and wasting one API call) if you know the nick is already
# active for these API credentials.
$cleverbot->create();
METHODS
BUILD_logger
Called automatically if "logger" is not set. By default, it returns
whatever "get_logger" in Log::Any provides, but you can easily override
this in a derived class.
BUILD_ua
Called automatically if "ua" is not set. By default, it returns a plain
nick => 'NickTheRobot',
}
If the current "nick" has already been used for creation, the API call
will partially fail: status 200 will be returned, but the status
field in the answer will contain an error saying that the nick
already exists (Error: reference name already exists). You can safely
ignore this error.
You can optionally pass a different other_nick. This will be set as
"nick" and used for creation (note that this overwrites whatever "nick"
previously contained).
lib/AI/DecisionTree.pm view on Meta::CPAN
# We use a minimum-description-length approach. We calculate the
# score of each node:
# n = number of nodes below
# r = number of results (categories) in the entire tree
# i = number of instances in the entire tree
# e = number of errors below this node
# Hypothesis description length (MML):
# describe tree: number of nodes + number of edges
# describe exceptions: num_exceptions * log2(total_num_instances) * log2(total_num_results)
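The cost model in the comments above can be sketched as a standalone calculation (a hypothetical helper for illustration, not code from AI::DecisionTree itself):

```perl
use strict;
use warnings;

# MDL-style score following the comment above: tree cost = nodes + edges,
# exception cost = num_exceptions * log2(total instances) * log2(total results).
sub log2 { log($_[0]) / log(2) }

sub mdl_score {
    my (%a) = @_;   # nodes, edges, exceptions, instances, results
    my $tree_cost      = $a{nodes} + $a{edges};
    my $exception_cost = $a{exceptions} * log2($a{instances}) * log2($a{results});
    return $tree_cost + $exception_cost;
}

printf "%.2f\n", mdl_score(nodes => 5, edges => 4,
                           exceptions => 2, instances => 100, results => 4);
# prints "35.58"
```

A smaller score means a more compact hypothesis; pruning keeps a subtree only when its score beats that of a leaf plus its exceptions.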
consistent with the training data.
The usual goal of a decision tree is to somehow encapsulate the
training data in the smallest possible tree. This is motivated by an
"Occam's Razor" philosophy, in which the simplest possible explanation
for a set of phenomena should be preferred over other explanations.
Also, small trees will make decisions faster than large trees, and
they are much easier for a human to look at and understand. One of
the biggest reasons for using a decision tree instead of many other
machine learning techniques is that a decision tree is a much more
scrutable decision maker than, say, a neural network.
lib/AI/Embedding.pm view on Meta::CPAN
# Create Embedding object
sub new {
my $class = shift;
my %attr = @_;
$attr{'error'} = '';
$attr{'api'} = 'OpenAI' unless $attr{'api'};
$attr{'error'} = 'Invalid API' unless $attr{'api'} eq 'OpenAI';
$attr{'error'} = 'API Key missing' unless $attr{'key'};
$attr{'model'} = 'text-embedding-ada-002' unless $attr{'model'};
return bless \%attr, $class;
}
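The constructor records problems in an C<error> field instead of dying. A minimal self-contained sketch of the same pattern (the C<My::Embedding> package is a hypothetical stand-in, not the module's real class):

```perl
use strict;
use warnings;

# Sketch of the validation pattern: defaults are filled in, and problems
# are recorded in $attr{error} rather than thrown as exceptions.
package My::Embedding;   # hypothetical stand-in for AI::Embedding

sub new {
    my ($class, %attr) = @_;
    $attr{error} = '';
    $attr{api}   = 'OpenAI' unless $attr{api};
    $attr{error} = 'Invalid API'     unless $attr{api} eq 'OpenAI';
    $attr{error} = 'API Key missing' unless $attr{key};
    $attr{model} = 'text-embedding-ada-002' unless $attr{model};
    return bless \%attr, $class;
}

sub success { my $self = shift; return !$self->{error} }
sub error   { my $self = shift; return  $self->{error} }

package main;

my $ok  = My::Embedding->new(key => 'sk-test');
my $bad = My::Embedding->new();           # no key supplied
print $ok->success ? "ok\n" : "fail\n";   # prints "ok"
print $bad->error, "\n";                  # prints "API Key missing"
```

Callers are expected to check C<success()> after every call, mirroring the module's own C<success>/C<error> accessors.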
);
# Returns true if last operation was success
sub success {
my $self = shift;
return !$self->{'error'};
}
# Returns error if last operation failed
sub error {
my $self = shift;
return $self->{'error'};
}
# Header for calling OpenAI
sub _get_header_openai {
my $self = shift;
my $response = $self->_get_embedding($text);
if ($response->{'success'}) {
my $embedding = decode_json($response->{'content'});
return join (',', @{$embedding->{'data'}[0]->{'embedding'}});
}
$self->{'error'} = 'HTTP Error - ' . $response->{'reason'};
return $response if defined $verbose;
return undef;
}
# Return Embedding as an array
my $response = $self->_get_embedding($text);
if ($response->{'success'}) {
my $embedding = decode_json($response->{'content'});
return @{$embedding->{'data'}[0]->{'embedding'}};
}
$self->{'error'} = 'HTTP Error - ' . $response->{'reason'};
return $response if defined $verbose;
return undef;
}
# Return Test Embedding
sub test_embedding {
my ($self, $text, $dimension) = @_;
$self->{'error'} = '';
$dimension = 1536 unless defined $dimension;
if ($text) {
srand scalar split /\s+/, $text;
# Convert a CSV Embedding into a hashref
sub _make_vector {
my ($self, $embed_string) = @_;
if (!defined $embed_string) {
$self->{'error'} = 'Nothing to compare!';
return;
}
my %vector;
my @embed = split /,/, $embed_string;
}
# Return a comparator to compare to a set vector
sub comparator {
my($self, $embed) = @_;
$self->{'error'} = '';
my $vector1 = $self->_make_vector($embed);
return sub {
my($embed2) = @_;
my $vector2 = $self->_make_vector($embed2);
} else {
$vector2 = $self->{'comparator'};
}
if (!defined $vector2) {
$self->{'error'} = 'Nothing to compare!';
return;
}
if (scalar keys %$vector1 != scalar keys %$vector2) {
$self->{'error'} = 'Embeds are unequal length';
return;
}
return $self->_compare_vector($vector1, $vector2);
}
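A CSV embedding string is turned into a position-indexed hash, and two such hashes are compared. The sketch below assumes the comparison is cosine similarity; C<_compare_vector>'s body is not shown in this excerpt, so treat that as an assumption:

```perl
use strict;
use warnings;

# Build a position-indexed hash from a comma-separated embedding string,
# as _make_vector does above.
sub make_vector {
    my ($csv) = @_;
    my @vals = split /,/, $csv;
    my %vector;
    $vector{$_} = $vals[$_] for 0 .. $#vals;
    return \%vector;
}

# Cosine similarity between two equal-length vectors (assumed metric;
# the module's _compare_vector is not shown here).
sub cosine {
    my ($v1, $v2) = @_;
    my ($dot, $n1, $n2) = (0, 0, 0);
    for my $k (keys %$v1) {
        $dot += $v1->{$k} * $v2->{$k};
        $n1  += $v1->{$k} ** 2;
        $n2  += $v2->{$k} ** 2;
    }
    return $dot / (sqrt($n1) * sqrt($n2));
}

my $a = make_vector('1,0,0');
my $b = make_vector('0.5,0,0');
printf "%.1f\n", cosine($a, $b);   # parallel vectors -> prints "1.0"
```

This also shows why the module checks that both embeddings have the same length before comparing: the key sets must line up one-to-one.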
=head2 success
Returns true if the last method call was successful
=head2 error
Returns the last error message or an empty string if B<success> returned true
=head2 embedding
my $csv_embedding = $embedding->embedding('Some text passage', [$verbose]);
Generates an embedding for the given text and returns it as a comma-separated string. The C<embedding> method takes a single parameter, the text to generate the embedding for.
Returns a (rather long) string that can be stored in a C<TEXT> database field.
If the method call fails it sets the L</"error"> message and returns C<undef>. If the optional C<verbose> parameter is true, the complete L<HTTP::Tiny> response object is also returned to aid with debugging issues when using this module.
=head2 raw_embedding
my @raw_embedding = $embedding->raw_embedding('Some text passage', [$verbose]);
Generates an embedding for the given text and returns it as an array. The C<raw_embedding> method takes a single parameter, the text to generate the embedding for.
It is not normally necessary to use this method as the Embedding will almost always be used as a single homogeneous unit.
If the method call fails it sets the L</"error"> message and returns C<undef>. If the optional C<verbose> parameter is true, the complete L<HTTP::Tiny> response object is also returned to aid with debugging issues when using this module.
=head2 test_embedding
my $test_embedding = $embedding->test_embedding('Some text passage', $dimensions);
lib/AI/Evolve/Befunge/Critter.pm view on Meta::CPAN
local $@ = '';
eval {
$rv = $self->invoke($board);
};
if($@ ne '') {
debug("eval error $@\n");
$rv = Result->new(name => $self->blueprint->name, died => 1);
my $reason = $@;
chomp $reason;
$rv->fate($reason);
}
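The eval/C<$@> trap above is a generic Perl pattern; in isolation it looks like this (hypothetical C<run_guarded> helper, not the module's code):

```perl
use strict;
use warnings;

# Run risky code under eval, then inspect $@ and keep a chomped copy of
# the exception as the failure reason, as the Critter wrapper does above.
sub run_guarded {
    my ($code) = @_;
    local $@ = '';
    my $rv = eval { $code->() };
    if ($@ ne '') {
        my $reason = $@;
        chomp $reason;
        return { died => 1, fate => $reason };
    }
    return { died => 0, value => $rv };
}

my $ok  = run_guarded(sub { 42 });
my $bad = run_guarded(sub { die "out of fuel\n" });
print $ok->{value}, "\n";   # prints "42"
print $bad->{fate}, "\n";   # prints "out of fuel"
```

Localizing C<$@> before the eval avoids clobbering an exception a caller may still be examining.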
# sandboxing stuff
{
no warnings 'redefine';
# override Storage->expand() to impose bounds checking
my $_lbsgv_expand;
BEGIN { $_lbsgv_expand = \&Language::Befunge::Storage::Generic::Vec::expand; };
sub _expand {
my ($storage, $v) = @_;
if(exists($$storage{maxsize})) {
# redundant assignment avoids a "possible typo" warning
*Language::Befunge::Storage::Generic::Vec::XS::expand = \&_expand;
*Language::Befunge::Storage::Generic::Vec::XS::expand = \&_expand;
*Language::Befunge::Storage::Generic::Vec::expand = \&_expand;
# override IP->spush() to impose stack size checking
my $_lbip_spush;
BEGIN { $_lbip_spush = \&Language::Befunge::IP::spush; };
sub _spush {
my ($ip, @newvals) = @_;
my $critter = $$ip{_ai_critter};
my $rv = &$_lbip_spush(@_);
return $rv;
}
*Language::Befunge::IP::spush = \&_spush;
# override IP->ss_create() to impose stack count checking
sub _block_open {
my ($interp) = @_;
my $ip = $interp->get_curip;
my $critter = $$ip{_ai_critter};
my $count = $ip->svalue(1);
return $ip->dir_reverse unless $critter->spend($critter->stackcost * $count);
return Language::Befunge::Ops::block_open(@_);
}
# override op_flow_jump_to to impose skip count checking
sub _op_flow_jump_to_wrap {
my ($interp) = @_;
my $ip = $interp->get_curip;
my $critter = $$interp{_ai_critter};
my $count = $ip->svalue(1);
return $ip->dir_reverse unless $critter->spend($critter->repeatcost * abs($count));
return Language::Befunge::Ops::flow_jump_to(@_);
}
# override op_flow_repeat to impose loop count checking
sub _op_flow_repeat_wrap {
my ($interp) = @_;
my $ip = $interp->get_curip;
my $critter = $$interp{_ai_critter};
my $count = $ip->svalue(1);
return $ip->dir_reverse unless $critter->spend($critter->repeatcost * abs($count));
return Language::Befunge::Ops::flow_repeat(@_);
}
# override op_spawn_ip to impose thread count checking
sub _op_spawn_ip_wrap {
my ($interp) = @_;
my $ip = $interp->get_curip;
my $critter = $$interp{_ai_critter};
my $cost = $critter->threadcost;
inc/Module/Install.pm view on Meta::CPAN
# Whether or not inc::Module::Install is actually loaded, the
# $INC{inc/Module/Install.pm} is what will still get set as long as
# the caller loaded this module in the documented manner.
# If not set, the caller may NOT have loaded the bundled version, and thus
# they may not have a MI version that works with the Makefile.PL. This would
# result in false errors or unexpected behaviour. And we don't want that.
my $file = join( '/', 'inc', split /::/, __PACKAGE__ ) . '.pm';
unless ( $INC{$file} ) { die <<"END_DIE" }
Please invoke ${\__PACKAGE__} with:
# If the modification time is only slightly in the future,
# sleep briefly to remove the problem.
my $a = $s - time;
if ( $a > 0 and $a < 5 ) { sleep 5 }
# Too far in the future, throw an error.
my $t = time;
if ( $s > $t ) { die <<"END_DIE" }
Your installer $0 has a modification time in the future ($s > $t).
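The clock-skew guard can be factored into a pure function for illustration (a hypothetical refactor, not Module::Install's actual code; the sleep callback is injectable so the logic can be tested without waiting):

```perl
use strict;
use warnings;

# Tolerate a file whose mtime is slightly in the future by sleeping past
# it; flag one that is far ahead so the caller can die, as above.
sub check_mtime {
    my ($mtime, $now, $sleep) = @_;
    $sleep //= sub { };          # injectable for testing; real code sleeps
    my $ahead = $mtime - $now;
    if ($ahead > 0 and $ahead < 5) { $sleep->(5); return 'slept' }
    return 'future' if $mtime > $now;   # caller should die here
    return 'ok';
}

print check_mtime(103, 100), "\n";   # prints "slept"
print check_mtime(200, 100), "\n";   # prints "future"
```

The small-skew case typically comes from NFS mounts or VM clocks that drift by a second or two.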
lib/AI/ExpertSystem/Simple.pm view on Meta::CPAN
The correct number of arguments was supplied with the method call; however, the first
argument, FILENAME, was undefined.
=item Simple->load() XML parse failed
XML Twig encountered some errors when trying to parse the XML knowledgebase.
=item Simple->load() unable to use file
The file supplied to the load( ) method could not be used as it was either not a file
or not readable.
lib/AI/FANN/Evolving.pm view on Meta::CPAN
'FANN_SIN_SYMMETRIC' => FANN_SIN_SYMMETRIC,
'FANN_COS_SYMMETRIC' => FANN_COS_SYMMETRIC,
# 'FANN_SIN' => FANN_SIN, # range is between 0 and 1
# 'FANN_COS' => FANN_COS, # range is between 0 and 1
},
'errorfunc' => {
'FANN_ERRORFUNC_LINEAR' => FANN_ERRORFUNC_LINEAR,
'FANN_ERRORFUNC_TANH' => FANN_ERRORFUNC_TANH,
},
'stopfunc' => {
'FANN_STOPFUNC_MSE' => FANN_STOPFUNC_MSE,
$constant{$k} = $v;
}
}
my %default = (
'error' => 0.0001,
'epochs' => 5000,
'train_type' => 'ordinary',
'epoch_printfreq' => 100,
'neuron_printfreq' => 0,
'neurons' => 15,
}
sub _scalar_properties {
(
training_algorithm => 'train',
train_error_function => 'errorfunc',
train_stop_function => 'stopfunc',
learning_rate => \&_mutate_double,
learning_momentum => \&_mutate_double,
quickprop_decay => \&_mutate_double,
quickprop_mu => \&_mutate_double,
}
sub _init {
my $self = shift;
my %args = @_;
for ( qw(error epochs train_type epoch_printfreq neuron_printfreq neurons activation_function) ) {
$self->{$_} = $args{$_} // $default{$_};
}
return $self;
}
# train
$self->{'ann'}->cascadetrain_on_data(
$data,
$self->neurons,
$self->neuron_printfreq,
$self->error,
);
}
else {
$log->debug("normal training");
# train
$self->{'ann'}->train_on_data(
$data,
$self->epochs,
$self->epoch_printfreq,
$self->error,
);
}
}
=item enum_properties
Returns a hash whose keys are names of enums and values the possible states for the
enum
=cut
=item error
Getter/setter for the error rate. Default is 0.0001
=cut
sub error {
my $self = shift;
if ( @_ ) {
my $value = shift;
$log->debug("setting error threshold to $value");
return $self->{'error'} = $value;
}
else {
$log->debug("getting error threshold");
return $self->{'error'};
}
}
=item epochs
}
# this is here so that we can trap method calls that need to be
# delegated to the FANN object. at this point we're not even
# going to care whether the FANN object implements these methods:
# if it doesn't we get the normal error for unknown methods, which
# the user then will have to resolve.
sub AUTOLOAD {
my $self = shift;
my $method = $AUTOLOAD;
$method =~ s/.+://;
lib/AI/FANN.pm view on Meta::CPAN
Two classes are used: C<AI::FANN> that wraps the C C<struct fann> type
and C<AI::FANN::TrainData> that wraps C<struct fann_train_data>.
=item *
Prefixes and common parts of the C function names referring to those
structures have been removed. For instance,
C<fann_train_data_shuffle> becomes C<AI::FANN::TrainData::shuffle>, which
will usually be called as...
$train_data->shuffle;
Boolean methods return true on success and undef on failure.
=item *
Any error reported from the C side is automatically converted to a Perl
exception. No manual error checking is required after calling FANN
functions.
=item *
Memory management is automatic, no need to call destroy methods.
FANN_SIN_SYMMETRIC
FANN_COS_SYMMETRIC
FANN_SIN
FANN_COS
# enum fann_errorfunc_enum:
FANN_ERRORFUNC_LINEAR
FANN_ERRORFUNC_TANH
# enum fann_stopfunc_enum:
FANN_STOPFUNC_MSE
=item $ann->reset_MSE
-
=item $ann->train_on_file($filename, $max_epochs, $epochs_between_reports, $desired_error)
-
=item $ann->train_on_data($train_data, $max_epochs, $epochs_between_reports, $desired_error)
C<$train_data> is an AI::FANN::TrainData object.
=item $ann->cascadetrain_on_file($filename, $max_neurons, $neurons_between_reports, $desired_error)
-
=item $ann->cascadetrain_on_data($train_data, $max_neurons, $neurons_between_reports, $desired_error)
C<$train_data> is an AI::FANN::TrainData object.
=item $ann->train_epoch($train_data)
=item $ann->training_algorithm($training_algorithm)
-
=item $ann->train_error_function
=item $ann->train_error_function($error_function)
-
=item $ann->train_stop_function
of labels to the now independent Axis class. Axis will defer to the Label
itself to decide applicability, >,<,>=,<=, and the like.
- changed test.pl to work with the new setup
- added functions: greaterthan, greaterequal, lessthan, lessequal, and between
to AI::Fuzzy::Label
- added overriding of >,>=,<,<=, and <=> in AI::Fuzzy::Label.
0.03 Wed Oct 9 18:07:34 EDT 2002
- added functions: support, core, height, is_normal, is_subnormal
to AI::Fuzzy::Set
lib/AI/FuzzyEngine.pm view on Meta::CPAN
my $val = $var_a->defuzzify(); # $var_a returns a 1dim piddle with two elements
So do the fuzzy operations as provided by the fuzzy engine C<$fe> itself.
Any operation on more than one piddle expands those to common
dimensions, if possible, or throws a PDL error otherwise.
The way expansion is done is best explained by code
(see C<< AI::FuzzyEngine->_cat_array_of_piddles(@pdls) >>).
Assuming all piddles are in C<@pdls>,
calculation goes as follows:
FuzzyInference.pm view on Meta::CPAN
Finally, a defuzzification operator is applied to the aggregated fuzzy
set to generate a single crisp value for each of the output variables.
For a more detailed explanation of fuzzy inference, you can check out
the tutorial by Jerry Mendel at
S<http://sipi.usc.edu/~mendel/publications/FLS_Engr_Tutorial_Errata.pdf>.
Note: The terminology used in this module might differ from that used
in the above tutorial.
AI/Gene/Sequence.pm view on Meta::CPAN
$rt++;
}
return $rt;
}
# These are intended to be overridden; simple versions are
# provided for the sake of testing.
# Generates things to make up genes
# can be called with a token type to produce, or with none.
# if called with a token type, it will also be passed the original
=item valid_gene
=back
You may also want to override the following methods:
=over 4
=item new
This hash should contain keys which fit $1 in C<mutate_(.*)>
and values indicating the weight to be given to that method.
The module will normalise this nicely, so you do not have to.
This lets you define your own mutation methods in addition to
overriding any you do not like in the module.
=item C<mutate_insert([num, pos])>
Inserts a single token into the string at position I<pos>.
The token will be randomly generated by the calling object's
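An insertion mutation on a string genome can be sketched with four-argument C<substr> (a hypothetical helper; the real module delegates token generation to the calling object):

```perl
use strict;
use warnings;

# Splice one token into a gene string at a given (or random) position.
# Token generation here is a stand-in: the module lets the calling
# object supply its own generator.
sub mutate_insert {
    my ($gene, $pos, $token) = @_;
    $pos   //= int rand(length($gene) + 1);
    $token //= ('a' .. 'z')[int rand 26];
    substr($gene, $pos, 0) = $token;   # zero-length replacement = insert
    return $gene;
}

print mutate_insert('ACGT', 2, 'x'), "\n";   # prints "ACxGT"
```

The same four-argument C<substr> trick (length 0 for insert, token length for replace) covers most of the string-splice mutations the module describes.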
lib/AI/Genetic/Pro.pm view on Meta::CPAN
#=======================================================================
sub as_array {
my ($self, $chromosome) = @_;
if($self->type eq q/bitvector/){
# This could lead to an internal error, because of the underlying Tie::Array::Packed
#return @$chromosome if wantarray;
#return $chromosome;
my @chr = @$chromosome;
return @chr if wantarray;
# sub init():
# This method initializes the population to completely
# random individuals. It deletes all current individuals!!!
# It also examines the type of individuals we want, and
# require()s the proper class. Throws an error if it can't.
# Must pass to it an anon list that will be passed to the
# newRandom method of the individual.
# In case of bitvector, $newArgs is length of bitvector.
# In case of rangevector, $newArgs is anon list of anon lists.
# Takes a variable number of arguments. The first argument is the
# total number, N, of new individuals to add. The remaining arguments
# are genomes to inject. There must be at most N genomes to inject.
# If the number, n, of genomes to inject is less than N, N - n random
# genomes are added. Perhaps an example will help?
# returns 1 on success and undef on error.
sub inject {
my ($self, $count, @genomes) = @_;
unless ($self->{INIT}) {
lib/AI/Image.pm view on Meta::CPAN
# Create Image object
sub new {
my $class = shift;
my %attr = @_;
$attr{'error'} = '';
$attr{'api'} = 'OpenAI' unless $attr{'api'};
$attr{'error'} = 'Invalid API' unless $attr{'api'} eq 'OpenAI';
$attr{'error'} = 'API Key missing' unless $attr{'key'};
$attr{'model'} = 'dall-e-2' unless $attr{'model'};
$attr{'size'} = '512x512' unless $attr{'size'};
return bless \%attr, $class;
);
# Returns true if last operation was success
sub success {
my $self = shift;
return !$self->{'error'};
}
# Returns error if last operation failed
sub error {
my $self = shift;
return $self->{'error'};
}
# Header for calling OpenAI
sub _get_header_openai {
my $self = shift;
The size for the generated image (default: '512x512').
=item debug
Used for testing. If set to any true value, the image method
will return details of the error encountered instead of C<undef>
=back
=head2 image
my $success = $ai->success();
Returns true if the last operation was successful.
=head2 error
my $error = $ai->error();
Returns the error message if the last operation failed.
=head1 EXAMPLE
It is common to want to save the generated image as a file. This can be achieved easily
using the C<getstore> function of L<LWP::Simple>.
LibNeural.pm view on Meta::CPAN
=item $nn->train([I1,I2,...],[O1,O2,...],MINERR,TRAINRATE)
Completes a training cycle for the given inputs I1-IN, with the expected
results of O1-OM, where N is the number of inputs and M is the number of
outputs. MINERR is the mean squared error at the output that you wish to be achieved. TRAINRATE is the learning rate to be used.
=item (O1,O2) = $nn->run([I1,I2,...])
Calculate the corresponding outputs (O1-OM) for the given inputs (I1-IN) based
on the previous training. Should only be called after the network has been
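MINERR above is a mean-squared-error target; for reference, MSE over one example's outputs can be computed as:

```perl
use strict;
use warnings;
use List::Util qw(sum);

# Mean squared error between expected and actual output vectors: average
# of the squared per-output differences.
sub mse {
    my ($expected, $actual) = @_;
    my @sq = map { ($expected->[$_] - $actual->[$_]) ** 2 } 0 .. $#$expected;
    return sum(@sq) / @sq;
}

printf "%.2f\n", mse([1, 0], [0.9, 0.1]);   # prints "0.01"
```

Training keeps cycling until the network's MSE drops below the MINERR you pass to C<train>.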
# --- MakeMaker const_config section:
# These definitions are from config.sh (via /usr/lib/perl/5.10/Config.pm).
# They may have been overridden via Makefile.PL or on the command line.
AR = ar
CC = cc
CCCDLFLAGS = -fPIC
CCDLFLAGS = -Wl,-E
DLEXT = so
scripts/mnist.pl view on Meta::CPAN
my $res;
for my $key ( keys %opt ) {
my $file = "$url/$opt{$key}.gz";
my $ff = File::Fetch->new(uri => $file);
my $aux = $ff->fetch() or die $ff->error;
#print "$file\n";
#$res = $http->get("$file");
#my $content = $res->{content};
# # $res = $http->get("$route/".$opt{$key});
#print STDERR Dumper $content;
examples/char_lstm.pl view on Meta::CPAN
) or HelpMessage(1);
=head1 NAME
char_lstm.pl - Example of training a character-level LSTM RNN on tiny-shakespeare using the high-level RNN interface,
with optional inferred sampling (the RNN generates Shakespeare-like text)
=head1 SYNOPSIS
--num-layers number of stacked RNN layers, default=2
--num-hidden hidden layer size, default=256
--batch-size the batch size type, default=32
--bidirectional use bidirectional cell, default false (0)
--disp-batches show progress for every n batches, default=50
--chkp-prefix prefix for checkpoint files, default='lstm_'
--cell-mode RNN cell mode (LSTM, GRU, RNN, default=LSTM)
--sample-size a size of inferred sample text (default=10000) after each epoch
--chkp-epoch save checkpoint after this many epoch, default=1 (saving every checkpoint)
=cut
package AI::MXNet::RNN::IO::ASCIIIterator;
void* p_list_arguments;
void* p_declare_backward_dependency;
};
/*!
* \brief return str message of the last error
* all function in this file will return 0 when success
 * and -1 when an error occurred,
* MXGetLastError can be called to retrieve the error
*
* this function is threadsafe and can be called by different thread
* \return error info
*/
const char *MXGetLastError();
//-------------------------------------
// Part 0: Global State setups
const mx_uint ***aux_shape_data,
int *out);
/*!
* \brief partially infer shape of unknown input shapes given the known one.
*
* Return partially inferred results if not all shapes could be inferred.
* The shapes are packed into a CSR matrix represented by arg_ind_ptr and arg_shape_data
* The call will be treated as a kwargs call if key != nullptr or num_args==0, otherwise it is positional.
*
* \param sym symbol handle
 * \param num_args number of input arguments.
inc/Module/AutoInstall.pm view on Meta::CPAN
$TestOnly = 1;
}
}
}
# overrides MakeMaker's prompt() to automatically accept the default choice
sub _prompt {
goto &ExtUtils::MakeMaker::prompt unless $AcceptDefault;
my ( $prompt, $default ) = @_;
my $y = ( $default =~ /^[Yy]/ );
return
unless system( 'sudo', $^X, $0, "--config=$config",
"--installdeps=$missing" );
print << ".";
*** The 'sudo' command exited with error! Resuming...
.
}
return _prompt(
qq(
lib/AI/MegaHAL.pm view on Meta::CPAN
use vars qw(@EXPORT @ISA $VERSION $AUTOLOAD);
@EXPORT = qw(megahal_setnoprompt
megahal_setnowrap
megahal_setnobanner
megahal_seterrorfile
megahal_setstatusfile
megahal_initialize
megahal_initial_greeting
megahal_command
megahal_do_reply
# Bless ourselves into the AI::MegaHAL class.
$self = bless({ },$class);
# Make sure that we can find a brain or a training file somewhere;
# otherwise, die with an error.
my $path = $args{'Path'} || ".";
if(-e "$path/megahal.brn" || -e "$path/megahal.trn") {
chdir($path) || die("Error: chdir: $!\n");
} else {
die("Error: unable to locate megahal.brn or megahal.trn\n");
bin/from-folder.pl view on Meta::CPAN
our $cache = {};
our @target = split("\/",$opts{cache_file});
my $set = AI::MicroStructure::ObjectSet->new();
eval {
local $^W = 0; # because otherwise it doesn't pass errors
#`rm $opts{cache_file}`;
$cache = lock_retrieve($opts{cache_file});
$cache = {} unless $cache;
lib/AI/NNEasy.pm view on Meta::CPAN
sub NNEasy {
my $this = ref($_[0]) ? shift : undef ;
my $CLASS = ref($this) || __PACKAGE__ ;
my $file = shift(@_) ;
my @out_types = ref($_[0]) eq 'ARRAY' ? @{ shift(@_) } : ( ref($_[0]) eq 'HASH' ? %{ shift(@_) } : shift(@_) ) ;
my $error_ok = shift(@_) ;
my $in = shift(@_) ;
my $out = shift(@_) ;
my @layers = ref($_[0]) eq 'ARRAY' ? @{ shift(@_) } : ( ref($_[0]) eq 'HASH' ? %{ shift(@_) } : shift(@_) ) ;
my $conf = shift(@_) ;
@out_types = sort {$a <=> $b} @out_types ;
$this->{OUT_TYPES} = \@out_types ;
if ( $error_ok <= 0 ) {
my ($min_dif , $last) ;
my $i = -1 ;
foreach my $out_types_i ( @out_types ) {
++$i ;
if ($i > 0) {
my $dif = $out_types_i - $last ;
$min_dif = $dif if !defined $min_dif || $dif < $min_dif ;
}
$last = $out_types_i ;
}
$error_ok = $min_dif / 2 ;
$error_ok -= $error_ok*0.1 ;
}
$this->{ERROR_OK} = $error_ok ;
return $this ;
}
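When C<$error_ok> is not supplied (or is non-positive), the constructor above derives it as half the smallest gap between the sorted output types, shaved by 10%. As a standalone sketch (hypothetical helper name):

```perl
use strict;
use warnings;

# Derive an acceptable-error threshold from the output types: half the
# minimum gap between adjacent sorted values, reduced by 10%, mirroring
# the loop in NNEasy() above.
sub derive_error_ok {
    my @out_types = sort { $a <=> $b } @_;
    my ($min_dif, $last);
    for my $t (@out_types) {
        if (defined $last) {
            my $dif = $t - $last;
            $min_dif = $dif if !defined $min_dif || $dif < $min_dif;
        }
        $last = $t;
    }
    my $error_ok = $min_dif / 2;
    $error_ok -= $error_ok * 0.1;
    return $error_ok;
}

print derive_error_ok(0, 1), "\n";   # gap 1 -> 0.5 less 10% = prints "0.45"
```

The idea is that an output within less than half a gap of its target can never be confused with a neighboring output class.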
sub _layer_conf {
my $out = shift(@_) ;
my $n = shift(@_) ;
$n ||= 100 ;
my $err ;
for (1..$n) {
$this->{NN}->run($in) ;
$err = $this->{NN}->learn($out) ;
}
$err *= -1 if $err < 0 ;
return $err ;
}
*_learn_set_get_output_error = \&_learn_set_get_output_error_c ;
sub _learn_set_get_output_error_pl {
my $this = ref($_[0]) ? shift : undef ;
my $CLASS = ref($this) || __PACKAGE__ ;
my $set = shift(@_) ;
my $error_ok = shift(@_) ;
my $ins_ok = shift(@_) ;
my $verbose = shift(@_) ;
for (my $i = 0 ; $i < @$set ; $i+=2) {
$this->{NN}->run($$set[$i]) ;
$this->{NN}->learn($$set[$i+1]) ;
}
my ($err,$learn_ok,$print) ;
for (my $i = 0 ; $i < @$set ; $i+=2) {
$this->{NN}->run($$set[$i]) ;
my $er = $this->{NN}->RMSErr($$set[$i+1]) ;
$er *= -1 if $er < 0 ;
++$learn_ok if $er < $error_ok ;
$err += $er ;
$print .= join(' ',@{$$set[$i]}) ." => ". join(' ',@{$$set[$i+1]}) ." > $er\n" if $verbose ;
}
$err /= $ins_ok ;
return ( $err , $learn_ok , $print ) ;
}
my $ins_sz = @set / 2 ;
$ins_ok ||= $ins_sz ;
my $err_static_limit = 15 ;
my $err_static_limit_positive ;
if ( ref($limit) eq 'ARRAY' ) {
($limit,$err_static_limit,$err_static_limit_positive) = @$limit ;
}
$limit ||= 30000 ;
$err_static_limit_positive ||= $err_static_limit/2 ;
my $error_ok = $this->{ERROR_OK} ;
my $check_diff_count = 1000 ;
my ($learn_ok,$counter,$err,$err_last,$err_count,$err_static, $reset_count1 , $reset_count2 ,$print) ;
$err_static = 0 ;
while ( ($learn_ok < $ins_ok) && ($counter < $limit) ) {
($err , $learn_ok , $print) = $this->_learn_set_get_output_error(\@set , $error_ok , $ins_ok , $verbose) ;
++$counter ;
if ( !($counter % 100) || $learn_ok == $ins_ok ) {
my $err_diff = $err_last - $err ;
$err_diff *= -1 if $err_diff < 0 ;
$err_count += $err_diff ;
++$err_static if $err_diff <= 0.00001 || $err > 1 ;
print "err_static = $err_static\n" if $verbose && $err_static ;
$err_last = $err ;
my $reseted ;
if ( $err_static >= $err_static_limit || ($err > 1 && $err_static >= $err_static_limit_positive) ) {
$err_static = 0 ;
$counter -= 2000 ;
$reseted = 1 ;
++$reset_count1 ;
if ( ( $reset_count1 + $reset_count2 ) > 2 ) {
$this->{NN}->init ;
}
}
if ( !($counter % $check_diff_count) ) {
$err_count /= ($check_diff_count/100) ;
print "ERR COUNT> $err_count\n" if $verbose ;
if ( !$reseted && $err_count < 0.001 ) {
$err_static = 0 ;
$counter -= 1000 ;
++$reset_count2 ;
if ( ($reset_count1 + $reset_count2) > 2 ) {
$reset_count1 = $reset_count2 = 0 ;
print "** Resetting weights due to LOW diff...\n" if $verbose ;
$this->{NN}->init ;
}
}
$err_count = 0 ;
}
if ( $verbose ) {
print "\nepoch $counter : error_ok = $error_ok : error = $err : err_diff = $err_diff : err_static = $err_static : ok = $learn_ok\n" ;
print $print ;
}
}
print "epoch $counter : error = $err : ok = $learn_ok\n" if $verbose > 1 ;
}
}
sub get_set_error {
my $this = ref($_[0]) ? shift : undef ;
my $CLASS = ref($this) || __PACKAGE__ ;
my @set = ref($_[0]) eq 'ARRAY' ? @{ shift(@_) } : ( ref($_[0]) eq 'HASH' ? %{ shift(@_) } : shift(@_) ) ;
my $ins_ok = shift(@_) ;
my $ins_sz = @set / 2 ;
$ins_ok ||= $ins_sz ;
my $err ;
for (my $i = 0 ; $i < @set ; $i+=2) {
$this->{NN}->run($set[$i]) ;
my $er = $this->{NN}->RMSErr($set[$i+1]) ;
$er *= -1 if $er < 0 ;
$err += $er ;
}
$err /= $ins_ok ;
return $err ;
}
sub run {
my $this = ref($_[0]) ? shift : undef ;
my $CLASS = ref($this) || __PACKAGE__ ;
sub out_type_winner {
my $this = ref($_[0]) ? shift : undef ;
my $CLASS = ref($this) || __PACKAGE__ ;
my $val = shift(@_) ;
my ($out_type , %err) ;
foreach my $types_i ( @{ $this->{OUT_TYPES} } ) {
my $er = $types_i - $val ;
$er *= -1 if $er < 0 ;
$err{$types_i} = $er ;
}
my $min_type_err = (sort { $err{$a} <=> $err{$b} } keys %err)[0] ;
$out_type = $min_type_err ;
return $out_type ;
}
sv_catsv(ret , elem) ;
}
return ret ;
}
void _learn_set_get_output_error_c( SV* self , SV* set , double error_ok , int ins_ok , bool verbose ) {
dXSARGS;
STRLEN len;
int i ;
HV* self_hv = OBJ_HV( self );
AV* set_av = OBJ_AV( set ) ;
SV* nn = FETCH_ATTR(self_hv , "NN") ;
SV* print_verbose = verbose ? sv_2mortal(newSVpv("",0)) : NULL ;
SV* ret ;
double err = 0 ;
double er = 0 ;
int learn_ok = 0 ;
for (i = 0 ; i <= av_len(set_av) ; i+=2) {
SV* set_in = *av_fetch(set_av, i ,0) ;
SPAGAIN ;
ret = POPs ;
er = SvNV(ret) ;
if (er < 0) er *= -1 ;
if (er < error_ok) ++learn_ok ;
err += er ;
if ( verbose ) sv_catpvf(print_verbose , "%s => %s > %f\n" ,
SvPV( _av_join( OBJ_AV(set_in) ) , len) ,
SvPV( _av_join( OBJ_AV(set_out) ) , len) ,
er
) ;
}
err /= ins_ok ;
if (verbose) {
EXTEND(SP , 3) ;
ST(0) = sv_2mortal(newSVnv(err)) ;
ST(1) = sv_2mortal(newSViv(learn_ok)) ;
ST(2) = print_verbose ;
XSRETURN(3) ;
}
else {
EXTEND(SP , 2) ;
ST(0) = sv_2mortal(newSVnv(err)) ;
ST(1) = sv_2mortal(newSViv(learn_ok)) ;
XSRETURN(2) ;
}
}
Here's an example of an NN that computes XOR:
use AI::NNEasy ;
## Our maximal error for the output calculation.
my $ERR_OK = 0.1 ;
## Create the NN:
my $nn = AI::NNEasy->new(
'xor.nne' , ## file to save the NN.
[0,1] , ## Output types of the NN.
$ERR_OK , ## Maximal error for output.
2 , ## Number of inputs.
1 , ## Number of outputs.
[3] , ## Hidden layers. (this is setting 1 hidden layer with 3 nodes).
) ;
[0,1] => [1],
[1,0] => [1],
[1,1] => [0],
);
## Calculate the actual error for the set:
my $set_err = $nn->get_set_error(\@set) ;
## If the set error is bigger than the maximal error, let's learn this set:
if ( $set_err > $ERR_OK ) {
$nn->learn_set( \@set ) ;
## Save the NN:
$nn->save ;
}
An array of outputs that the NN can have, so the NN can find the nearest number in this
list to give you the right output.
=item ERROR_OK
The maximal error of the calculated output.
If not defined, ERROR_OK will be calculated as the minimal difference between 2 types in
@OUTPUT_TYPES divided by 2:
@OUTPUT_TYPES = [0 , 0.5 , 1] ;
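As a sketch of the rule just described (this mirrors the constructor logic, it is not the module's own code path), the default ERROR_OK for these output types can be computed like this:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use List::Util qw(min);

# Default ERROR_OK: half the minimal gap between sorted output
# types, minus 10% (as done in the new() constructor above).
sub default_error_ok {
    my @types = sort { $a <=> $b } @_;
    my $min_dif = min map { $types[$_] - $types[$_ - 1] } 1 .. $#types;
    my $error_ok = $min_dif / 2;
    $error_ok -= $error_ok * 0.1;
    return $error_ok;
}

print default_error_ok(0, 0.5, 1), "\n";   # 0.225
```

For types (0, 0.5, 1) the minimal gap is 0.5, so ERROR_OK becomes 0.25 minus 10%, i.e. 0.225.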
Here's a complete example of use:
my $nn = AI::NNEasy->new(
'xor.nne' , ## file to save the NN.
[0,1] , ## Output types of the NN.
0.1 , ## Maximal error for output.
2 , ## Number of inputs.
1 , ## Number of outputs.
[3] , ## Hidden layers. (this is setting 1 hidden layer with 3 nodes).
{random_connections=>0 , networktype=>'feedforward' , random_weights=>1 , learning_algorithm=>'backprop' , learning_rate=>0.1 , bias=>1} ,
) ;
=back
=head2 learn_set (@SET , OK_OUTPUTS , LIMIT , VERBOSE)
Learns a set of inputs until the outputs reach the accepted error.
=over 4
=item @SET
A list of inputs and outputs.
=item OK_OUTPUTS
Minimal number of outputs that should be OK when calculating the errors.
By default I<OK_OUTPUTS> is the number of different
inputs in @SET.
=item LIMIT
If TRUE, turns verbose mode ON when learning.
=back
=head2 get_set_error (@SET , OK_OUTPUTS)
Gets the actual error of a set in the NN. If the returned error is bigger than
I<ERROR_OK> defined in I<new()>, you should learn or relearn the set.
=head2 run (@INPUT)
Runs an input and returns the output calculated by the NN, based on what the NN has already learned.
Same as I<run()>, but the output is the nearest output value from the
I<@OUTPUT_TYPES> defined in I<new()>.
For example, an input I<[0,1]> that has learned the output I<[1]>
will actually return something like 0.98324 as output and
not 1, since the error should never be 0. So, with I<run_get_winner()>
we take the output of I<run()>, say 0.98324, and find the output type
nearest to this number, which in this case is 1. An output of [0] will be returned
by I<run()> as something like 0.078964, and I<run_get_winner()> returns 0.
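The nearest-output selection just described can be sketched as a small standalone function (an illustration of the idea, not the module's own C<out_type_winner>):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pick the output type with the smallest absolute distance
# to the raw NN output value.
sub nearest_type {
    my ($val, @types) = @_;
    my %err = map { $_ => abs($_ - $val) } @types;
    return (sort { $err{$a} <=> $err{$b} } keys %err)[0];
}

print nearest_type(0.98324,  0, 1), "\n";   # 1
print nearest_type(0.078964, 0, 1), "\n";   # 0
```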
=head1 Samples
Some functions of this module have I<Inline> functions written in C.
I have made a C version only for the functions that are called intensively, like:
AI::NNEasy::_learn_set_get_output_error
AI::NNEasy::NN::tanh
AI::NNEasy::NN::feedforward::run
examples/add.pl view on Meta::CPAN
[ 15, 15 ], [ 30 ],
[ 12, 8 ], [ 20 ],
]);
my $err = 10;
# Stop after 4096 epochs -- don't want to wait more than that
for ( my $i = 0; ($err > 0.0001) && ($i < 4096); $i++ ) {
$err = $dataset->learn($network);
print "Epoch = $i error = $err\n";
}
foreach (@{$dataset->run($network)})
{
foreach (@$_){print $_}
/*! \brief handle to a symbol that can be bind as operator */
typedef NNSymbol *SymbolHandle;
/*! \brief handle to Graph */
typedef NNGraph *GraphHandle;
/*!
* \brief Set the last error message needed by C API
* \param msg The error message to set.
*/
void NNAPISetLastError(const char* msg);
/*!
* \brief return str message of the last error
 * all functions in this file will return 0 on success
 * and -1 when an error occurred;
 * NNGetLastError can be called to retrieve the error
 *
 * this function is thread-safe and can be called by different threads
* \return error info
*/
const char *NNGetLastError(void);
/*!
 * \brief list all the available operator names, including entries.
* \param out_size the size of returned array
- fixing lexical sorting of version numbers
1.10 Tue Feb 22 09:31:12 AST 2011
- fixed testing problems due to differences in precision
- fixed podchecker warning (some space)
- better test error reporting in 2.t
- added t/pod.t, thanks to Michael Stevens
1.9 Tue Aug 31 09:27:51 ADT 2010
- fixed testing problems due to differences in precision in t/2.t
1.8 Fri Aug 21 06:36:34 ADT 2009
- fixed a pod documentation error
1.7 Thu Aug 20 14:20:15 ADT 2009
- improvements in documentation
- added method add_csv_file
- added method drop_attributes
- removed real_attr and added attribute_type field
1.6 Wed Aug 19 09:09:57 ADT 2009
- improved an error message
- fixed some testing problems due to whitespace
- small improvement in generating documentation
1.5 Wed Jan 30 08:06:22 AST 2008
- fixed testing problems due to differences in the lowest