view release on metacpan or search on metacpan
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
The precise terms and conditions for copying, distribution and
modification follow.
5. By copying, distributing or modifying the Program (or any work based
on the Program) you indicate your acceptance of this license to do so,
and all its terms and conditions.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the original
licensor to copy, distribute or modify the Program subject to these
terms and conditions. You may not impose any further restrictions on the
recipients' exercise of the rights granted herein.
7. The Free Software Foundation may publish revised and/or new versions
recipients of the item may redistribute it under the same conditions they
received it.
1. You may make and give away verbatim copies of the source form of the
Standard Version of this Package without restriction, provided that you
duplicate all of the original copyright notices and associated disclaimers.
2. You may apply bug fixes, portability fixes and other modifications derived
from the Public Domain or from the Copyright Holder. A Package modified in such
a way shall still be considered the Standard Version.
view all matches for this distribution
BackProp.pm view on Meta::CPAN
my $delta = $ammount * ($what - $value) * $self->{SYNAPSES}->{LIST}->[$i]->{INPUT};
$self->{SYNAPSES}->{LIST}->[$i]->{WEIGHT} += $delta;
$self->{SYNAPSES}->{LIST}->[$i]->{PKG}->weight($ammount,$what);
}
# This formula in use by default is original by me (Josiah Bryan) as far as I know.
# If it is equal, then don't adjust
#
### Disabled because this sometimes causes
### infinite loops when learning with range limits enabled
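Read on its own, the weight-update line above is a plain delta rule: the weight change is proportional to the error (target minus output) scaled by the synapse's input. A standalone sketch, with illustrative names rather than BackProp.pm's actual internals:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Standalone sketch of the update quoted above:
#   delta = amount * (target - value) * input
# Names are illustrative; this is not BackProp.pm's API.
sub adjust_weight {
    my ($weight, $amount, $target, $value, $input) = @_;
    my $delta = $amount * ($target - $value) * $input;
    return $weight + $delta;
}

# Example: a synapse whose output (0.3) falls short of the target (1.0)
# has its weight nudged upward in proportion to its input.
my $new = adjust_weight(0.5, 0.1, 1.0, 0.3, 2.0);
printf "%.2f\n", $new;    # prints 0.64
```

Note that when the output already equals the target, the delta is zero and the weight is left alone, which matches the "If it is equal, then don't adjust" comment in the source.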
- removed remaining Inline::C macros and INLINE.h
- moved train() into C
- now parsing input_ and output_dim parameters (finally!)
0.05 Mon Jul 20 13:20:06 2009
- re-added support for labels, originally in AI::NN::SOM
- added original AI::NN::SOM test suite (and it works!)
0.04 Sat Jul 18 16:45:27 2009
- removed dependence on Inline::C
- minor refactor
0.02 Wed Jul 15 18:56:13 2009
- moved data structures into C structs
0.01 Thu Jul 2 09:07:01 2009
- original version; created by h2xs 1.23 with options
-AXn AI::NeuralNet::FastSOM
0.011 Thu Mar 13 18:21:37 2003
Added unified distance matrix display.
0.01 Thu Mar 13 12:21:37 2003
- original version; created by h2xs 1.21 with options
-X -n AI::NeuralNet::Kohonen::Demo::RGB
0.011 Thu Mar 13 14:00:01 2003
Had forgotten to make the find_bmu method public.
Allowed for rectangular maps rather than just square.
0.01 Thu Mar 13 12:32:00 2003
- original version; created by h2xs 1.21 with options
-X -n AI::NeuralNet::Kohonen
In this module I have included a more accurate form of "learning" for the
mesh. This form performs descent toward a local error minimum (0) on a
directional delta, rather than toward the desired value for that node. This
allows for better, more accurate results with larger datasets. This module
also uses a simpler recursion technique which, surprisingly, is more accurate
than the original technique that I've used in other ANNs.
=head1 EXPORTS
This module exports three functions by default:
lib/AI/NeuralNet/SOM.pm view on Meta::CPAN
=over
=item maybe implement the SOM on top of PDL?
=item provide a ::SOM::Compat to have compatibility with the original AI::NeuralNet::SOM?
=item implement different window forms (bubble/gaussian), linear/random
=item implement the format mentioned in the original AI::NeuralNet::SOM
=item add methods as_html to individual topologies
=item add iterators through vector lists for I<initialize> and I<train>
Added learn_rate() method to expose the network.learn_rate. This
should help programmers who wish to fine-tune the network training.
0.01 Sun Oct 5 10:03:18 2003
- original version; created by h2xs 1.22 with options
-AX -n AI::NeuralNet::Simple
Permissions for Redistribution of the Standard Version
(2) You may Distribute verbatim copies of the Source form of the
Standard Version of this Package in any medium without restriction,
either gratis or for a Distributor Fee, provided that you duplicate
all of the original copyright notices and associated disclaimers. At
your discretion, such verbatim copies may or may not include a
Compiled form of the Package.
(3) You may apply any bug fixes, portability changes, and other
modifications made available from the Copyright Holder. The resulting
Revision history for Perl extension PBDD.
0.01 Wed Oct 5 17:14:59 2011
- original version; created by h2xs 1.23 with options
-AX PBDD
lib/AI/PSO.pm view on Meta::CPAN
my $meMax = 'null'; # 'individuality' maximum random weight (this should really be between 0, 1)
my $themWeight = 'null'; # 'social' weighting constant (higher weight (than individual) means trust group more, self less)
my $themMin = 'null'; # 'social' minimum random weight (this should really be between 0, 1)
my $themMax = 'null'; # 'social' maximum random weight (this should really be between 0, 1)
my $psoRandomRange = 'null'; # PSO::.86 new variable to support original unmodified algorithm
my $useModifiedAlgorithm = 'null';
#-#-# user/debug parameters #-#-#
my $verbose = 0; # This one defaults for obvious reasons...
#
# compute_fitness
# - computes the fitness of a particle by using the user-specified fitness function
#
# NOTE: I originally had a 'fitness cache' so that particles that stumbled upon the same
# position wouldn't have to recalculate their fitness (which is often expensive).
# However, this may be undesirable behavior for the user (if you come across the same position,
# then you may be settling in on a local maximum, so you might want to randomize things and
# keep searching). For this reason, I'm leaving the cache out. It would be trivial
# for users to implement their own cache since they are passed the same array of values.
themMax => 1.0, # 'social' maximum random weight
exitFitness => 0.9, # minimum fitness to achieve before exiting
verbose => 0, # 0 prints solution
# 1 prints (Y|N):particle:fitness at each iteration
# 2 dumps each particle (+1)
psoRandomRange => 4.0, # setting this enables the original PSO algorithm and
# also subsequently ignores the me*/them* parameters
);
sub custom_fitness_function(@input) {
location in the problem hyperspace. There are also some stochastic
weights involved in the positional updates so that each particle is
truly independent and can take its own search path while still
incorporating good information from other particles. In this
particular Perl module, the user is able to choose from two
implementations of the algorithm. One is the original implementation
from I<Swarm Intelligence> which requires the definition of a
'random range' to which the two stochastic weights are required to
sum. The other implementation allows the user to define the weighting
of how much a particle follows its own path versus following its
peers. In both cases there is an element of randomness.
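The weighted "me"/"them" variant described above can be sketched in one dimension as follows. The names and the single-dimension simplification are illustrative only, not AI::PSO's internal API:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# One-dimensional sketch of the "me"/"them" weighted position update
# described above. A real PSO tracks a velocity per dimension; here the
# step is applied directly to the position for brevity.
sub next_position {
    my ($pos, $own_best, $peer_best, $me_weight, $them_weight) = @_;
    # Stochastic factors keep each particle's search path independent.
    my $r1 = rand();
    my $r2 = rand();
    return $pos
         + $me_weight   * $r1 * ($own_best  - $pos)
         + $them_weight * $r2 * ($peer_best - $pos);
}

# A particle at 0.0 whose own and peer bests both sit at 1.0 can only
# move toward them; with unit weights the step never reaches 2.0.
my $p = next_position(0.0, 1.0, 1.0, 1.0, 1.0);
print "moved to $p\n";
```

Raising the "them" weight relative to the "me" weight pulls every particle harder toward the group's best-known position, which is exactly the trust trade-off the parameter comments above describe.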
lib/AI/ParticleSwarmOptimization/MCE.pm view on Meta::CPAN
return $sum;
}
=head1 Description
This module is an enhancement of the original AI::ParticleSwarmOptimization,
adding support for multi-core processing by means of MCE. Below you can find
the original documentation of that module, with one difference: there is a new
parameter, "-workers", which can be used to define the number of parallel
processes used during computations.
The Particle Swarm Optimization technique uses communication of the current best
lib/AI/ParticleSwarmOptimization/Pmap.pm view on Meta::CPAN
return $sum;
}
=head1 Description
This module is an enhancement of the original AI::ParticleSwarmOptimization,
adding support for multi-core processing by means of Pmap. Below you can find
the original documentation of that module, with one difference: there is a new
parameter, "-workers", which can be used to define the number of parallel
processes used during computations.
The Particle Swarm Optimization technique uses communication of the current best
Revision history for Perl module Win32::MSI::HighLevel
1.000 Sat Oct 18 16:49:30 2008
- original version; created by ExtUtils::ModuleMaker 0.49
1.002 Sun Oct 19 12:10:02 2008
- Fix manifest
1.003 Thu Oct 23 08:20:23 2008
Revision history for Perl extension AI::Pathfinding::AStar::Rectangle.
0.01 Wed Apr 1 13:54:05 2009
- original version; created by h2xs 1.23 with options
-A -n AI::Pathfinding::AStar::Rectangle
0.02 Wed Apr 1 13:54:05 2009
- Some bugfixes
0.16 September 25 23:34 2010
0.02 Thu Aug 26 09:30:00 2004
- restructured the module to act as a base class - It is now actually useful!!
- reworked the documentation to be more clear
0.01 Sat Sep 10 11:20:23 2003
- original version uploaded to CPAN
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
represent a state, or I<node>, in your search space. To use SMA* search to
find a shortest path from a starting node to a goal, you must first
define what a I<node> (or I<point>, or I<state>) is in your search space.
A common example used for informed search methods, and one that is used in Russell's
original paper, is optimal puzzle solving, such as solving an 8 or 15-tile puzzle
in the least number of moves. If trying to solve such a puzzle, a I<node> in the
search space could be defined as a configuration of that puzzle (a particular
ordering of the tiles).
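As a hypothetical illustration of the node idea for the 8-puzzle (the names are mine, and AI::Pathfinding::SMAstar's real node interface requires more callbacks than shown here):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Minimal sketch of an 8-puzzle node: a tile ordering plus the depth
# (number of moves taken so far). Illustrative only.
package Puzzle::Node;

sub new {
    my ($class, %args) = @_;
    return bless { state => $args{state}, depth => $args{depth} // 0 }, $class;
}

# f(n) = g(n) + h(n): depth so far plus an admissible heuristic,
# here the count of misplaced tiles (0 marks the blank).
sub f_cost {
    my $self  = shift;
    my $tiles = $self->{state};
    my $misplaced = grep { $tiles->[$_] != 0 && $tiles->[$_] != $_ + 1 }
                    0 .. $#$tiles;
    return $self->{depth} + $misplaced;
}

package main;

my $goal  = Puzzle::Node->new( state => [1,2,3,4,5,6,7,8,0], depth => 4 );
my $mixed = Puzzle::Node->new( state => [2,1,3,4,5,6,7,8,0] );
print $goal->f_cost, " ", $mixed->f_cost, "\n";    # prints "4 2"
```

A node for your own search space would carry whatever state description and cost function make sense there; the f-cost is what lets a best-first search such as SMA* rank nodes.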
There is an example provided in the /t directory of this module's distribution,
Simplified Memory-bounded A* search (or SMA* search) addresses some of the
limitations of conventional A* search, by bounding the amount of space required
to perform a shortest-path search. This module is an implementation of
SMA*, which was first introduced by Stuart Russell in 1992. SMA* is a simpler,
more efficient variation of the original MA* search introduced by P. Chakrabarti
et al. in 1989 (see references below).
=head2 Motivation and Comparison to A* Search
lib/AI/Perceptron/Simple.pm view on Meta::CPAN
# validate
$nerve->take_lab_test( ... );
$nerve->take_mock_exam( ... );
# fill results to original file
$nerve->validate( {
stimuli_validate => $validation_data_csv,
predicted_column_index => 4,
} );
# or
# processing data
use AI::Perceptron::Simple ":process_data";
shuffle_stimuli ( ... )
shuffle_data ( ORIGINAL_STIMULI, $new_file_1, $new_file_2, ... );
shuffle_data ( $original_stimuli => $new_file_1, $new_file_2, ... );
=head1 EXPORT
None by default.
I<This module can only process CSV files.>
Any field, i.e. column, that will be used for processing must be binary, i.e. C<0> or C<1> only. Your dataset can contain other columns with non-binary data as long as they are not one of the dendrites.
There are some sample datasets which can be found in the C<t> directory. The original dataset can also be found in C<docs/book_list.csv>. The files can also be found L<here|https://github.com/Ellednera/AI-Perceptron-Simple>.
=head1 PERCEPTRON DATA
The perceptron/neuron data is stored using the C<Storable> module.
=head2 shuffle_stimuli ( ... )
The parameters and usage are the same as C<shuffle_data>. See the next two subroutines.
=head2 shuffle_data ( $original_data => $shuffled_1, $shuffled_2, ... )
=head2 shuffle_data ( ORIGINAL_DATA, $shuffled_1, $shuffled_2, ... )
Shuffles C<$original_data> or C<ORIGINAL_DATA> and saves them to other files.
=cut
sub shuffle_stimuli {
shuffle_data( @_ );
}
sub shuffle_data {
my $stimuli = shift or croak "Please specify the original file name";
my @shuffled_stimuli_names = @_
or croak "Please specify the output files for the shuffled data";
my @aoa;
for ( @shuffled_stimuli_names ) {
Indicates the nerve was tuned up, down or no tuning needed
=item old sum
The original sum of all C<weightage * input> or C<dendrite_size * binary_input>
=item threshold
The threshold of the nerve
=head2 validate ( \%options )
This method validates the perceptron against another set of data after it has undergone the training process.
This method calculates the output of each row of data and writes the result into the predicted column. The data being written into the new file or the original file will maintain its sequence.
Please take note that this method will load all the data of the validation stimuli, so please split your stimuli into multiple files if possible and call this method a few more times.
For C<%options>, the followings are needed unless mentioned:
=item results_write_to => $new_csv_file
Optional.
The default behaviour will write the predicted output back into C<stimuli_validate>, i.e. the original data. The sequence of the data will be maintained.
=back
I<*This method will call C<_real_validate_or_test> to do the actual work.>
+ more documentation & bugfixes
+ wrote a proper test-suite
+ weights & threshold no longer random
0.01
+ original dist created Fri Feb 18 16:05:09 2000 by h2xs 1.19
+ project began July 20, 1999, originally designed on a napkin which,
sadly enough, I probably still have lying around somewhere.
lib/AI/Prolog.pm view on Meta::CPAN
the game. Typing C<halt.> and hitting return twice will allow you to exit.
See the C<bin/> and C<data/> directories in the distribution.
Additionally, you can read L<AI::Prolog::Article> for a better description of
how to use C<AI::Prolog>. This document is an article originally published in
The Perl Review (L<http://www.theperlreview.com/>) and which they have
graciously allowed me to redistribute.
See also Robert Pratte's perl.com article, "Logic Programming with Perl and
Prolog" (L<http://www.perl.com/pub/a/2005/12/15/perl_prolog.html>) for more
maint/process-notebook.pl view on Meta::CPAN
## Check output (if on TTY)
if [ -t 0 ]; then
perldoc $DST;
fi
## Check and run script in the directory of the original (e.g., to get data
## files).
perl -c $DST
#&& perl -MCwd -MPath::Tiny -E '
#my $nb = path(shift @ARGV);
#my $script = path(shift @ARGV)->absolute;
lib/AI/Termites.pm view on Meta::CPAN
C<samples/termites.pl> can run the simulation and generate nice PNGs.
An online Artificial Termites simulation can be found here:
L<http://www.permutationcity.co.uk/alife/termites.html>.
The origin of this module lies in the following PerlMonks post:
L<http://perlmonks.org/?node_id=908684>.
=head1 AUTHOR
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
Revision history for Perl extension AIIA::GMT.
0.01 Sat Jan 26 22:45:35 2008
- original version; created by h2xs 1.23 with options
-cfn AIIA::GMT
Revision history for Perl extension AIX::LPP.
0.4 Tue Apr 9 20:22:16 2002
- original version; created by h2xs 1.20 with options
-X -v 0.4 -n AIX::LPP
Revision history for Perl extension AIX::LVM.
1.1 Fixed print commands and some comments
1.0 Fri Dec 31 23:21:40 2010
- original version;
Revision history for Perl extension AIX::Perfstat.
0.01 Mon Jul 10 15:22:28 2006
- original version; created by h2xs 1.23 with options
-A -n AIX::Perfstat
0.02 Thu Jul 20 18:22:52 2006
- Added an example Perl script, updated the documentation,
and updated the copyright in all the files.