lib/AI/NeuralNet/Kohonen/Visual.pm view on Meta::CPAN
sub train { my ($self,$epochs) = (shift,shift);
$epochs = $self->{epochs} unless defined $epochs;
$self->{display_scale} = 10 if not defined $self->{display_scale};
&{$self->{train_start}} if exists $self->{train_start};
$self->prepare_display if not defined $self->{_mw} or ref $self->{_mw} ne 'MainWindow';
# Replaces Tk's MainLoop
for (1..$epochs) { # honour the $epochs argument resolved above
lib/AI/NeuralNet/Kohonen/Visual.pm view on Meta::CPAN
$self->{_mw}->destroy;
$self->{_mw} = undef;
return;
}
$self->{t}++; # Measure epoch
&{$self->{epoch_start}} if exists $self->{epoch_start};
for (0..$#{$self->{input}}){
my $target = $self->_select_target;
my $bmu = $self->find_bmu($target);
lib/AI/NeuralNet/Kohonen.pm view on Meta::CPAN
=item map_dim_x
=item map_dim_y
The dimensions of the feature map to create - defaults to a toy size of 19.
(note: this is Perl indexing, starting at zero).
=item epochs
Number of epochs to run for (see L<METHOD train>).
Minimum number is C<1>.
=item learning_rate
The initial learning rate.
=item train_start
Reference to code to call at the beginning of training.
=item epoch_start
Reference to code to call at the beginning of every epoch
(such as a colour calibration routine).
=item epoch_end
lib/AI/NeuralNet/Kohonen.pm view on Meta::CPAN
=cut
sub train { my ($self,$epochs) = (shift,shift);
$epochs = $self->{epochs} unless defined $epochs;
&{$self->{train_start}} if exists $self->{train_start};
for my $epoch (1..$epochs){
$self->{t} = $epoch;
&{$self->{epoch_start}} if exists $self->{epoch_start};
for (0..$#{$self->{input}}){
my $target = $self->_select_target;
my $bmu = $self->find_bmu($target);
$self->_adjust_neighbours_of($bmu,$target);
}
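The inner loop above repeatedly selects a target vector and asks C<find_bmu> for its best matching unit. As a self-contained illustration of that step (a toy sketch using squared Euclidean distance, not the module's actual C<find_bmu>), BMU selection might look like:

```perl
use strict;
use warnings;

# Toy best-matching-unit (BMU) selection: return the index of the map
# vector closest to the target by squared Euclidean distance.
sub find_bmu_index {
    my ($map, $target) = @_;
    my ($best_i, $best_d);
    for my $i (0 .. $#$map) {
        my $d = 0;
        $d += ($map->[$i][$_] - $target->[$_]) ** 2 for 0 .. $#$target;
        ($best_i, $best_d) = ($i, $d) if !defined $best_d or $d < $best_d;
    }
    return $best_i;
}

my @map = ([0, 0], [1, 1], [5, 5]);
print find_bmu_index(\@map, [0.9, 1.2]), "\n";   # nearest vector is [1, 1]
```

The BMU index is then used to decide which neighbourhood of map vectors gets pulled towards the target.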
# See POD for usage
sub run {
my $self = shift;
my $inputs = shift;
my $const = $self->{const};
#my $start = new Benchmark;
$inputs = $self->crunch($inputs) if($inputs == 0); # a plain string numifies to 0, so crunch() converts it to an array ref
no strict 'refs';
for my $x (0..$#{$inputs}) {
last if($x>$self->{inputs});
d("inputting $inputs->[$x] at index $x with ID $self->{input}->{IDs}->[$x].\n",1);
$self->{mesh}->[$x]->input($inputs->[$x],$self->{input}->{IDs}->[$x]);
}
# pad any remaining input nodes with a constant 1
for my $x ($#{$inputs}+1..$self->{inputs}-1) {
d("inputting 1 at index $x with ID $self->{input}->{IDs}->[$x].\n",1);
$self->{mesh}->[$x]->input(1,$self->{input}->{IDs}->[$x]);
}
#$self->{benchmark} = timestr(timediff(new Benchmark, $start));
return $self->{output}->get_outputs();
}
# See POD for usage
sub run_uc {
my $max = $args{max} || 1024; # max iterations
my $degrade = $args{degrade} || 0; # enable gradient degrading
my $error = (defined $args{error} && $args{error}>-1) ? $args{error} : -1;
my $dinc = 0.0002; # amount to adjust gradient by
my $diff = 100; # error margin between results
my $start = new Benchmark;
$inputs = $self->crunch($inputs) if($inputs == 0);
$outputs = $self->crunch($outputs) if($outputs == 0);
my ($flag,$ldiff,$cdiff,$_mi,$loop,$y);
while(!$flag && ($max ? $loop<$max : 1)) {
my $b = new Benchmark;
join_cols($outputs,($self->{col_width})?$self->{col_width}:5) if(d()==4);
d("\n",4);
d('.',12);
d('['.join(',',@{$got})."-".join(',',@{$outputs}).']',13);
}
my $str = "Learning took $loop loops and ".timestr(timediff(new Benchmark,$start))."\n";
d($str,3); $self->{benchmark} = "$loop loops and ".timestr(timediff(new Benchmark,$start))."\n";
return $str;
}
# See POD for usage
the ./examples/ directory.
See C<perldoc PCX::Loader> for information on the methods of the object returned.
You can download PCX::Loader from
http://www.josiah.countystart.com/modules/get.pl?pcx-loader:mpod
=head1 CUSTOM ACTIVATION FUNCTIONS
Included in this package are four custom activation functions meant to be used
The network is stored internally as one long array of node objects. The goal here
is to connect one range of nodes in that array to another range of nodes. The calling
function has already calculated the indices into the array, and passes them to you
as the four arguments after the $self reference. The first two arguments we will call
$r1a and $r1b. These define the start and end indices of the first range, or "layer." Likewise,
the next two arguments, $r2a and $r2b, define the start and end indices of the second
layer. We also grab a reference to the mesh array so we don't have to type the $self
reference over and over.
The loop that follows the arguments in the above example is very simple. It opens
a for() loop over the range of numbers, calculating the size instead of just going
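Under the layout described above, a minimal sketch of such a range-to-range connection (a hypothetical helper with toy hash-based nodes, not the module's actual code) could look like:

```perl
use strict;
use warnings;

# Hypothetical sketch: fully connect one index range [$r1a..$r1b] of a
# flat node array to another [$r2a..$r2b]. Nodes here are plain hashes
# carrying an 'out' list of target indices.
sub connect_ranges {
    my ($mesh, $r1a, $r1b, $r2a, $r2b) = @_;
    for my $i ($r1a .. $r1b) {
        for my $j ($r2a .. $r2b) {
            push @{ $mesh->[$i]{out} }, $j;
        }
    }
}

my @mesh = map { { out => [] } } 0 .. 5;
connect_ranges(\@mesh, 0, 2, 3, 5);        # "layer" 0-2 feeds "layer" 3-5
print scalar @{ $mesh[0]{out} }, "\n";     # each node in 0..2 now links to 3 nodes
```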
Thanks to Rodin for continual suggestions and questions about the module and more.
=head1 DOWNLOAD
You can always download the latest copy of AI::NeuralNet::Mesh
from http://www.josiah.countystart.com/modules/get.pl?mesh:pod
=head1 MAILING LIST
A mailing list has been set up for AI::NeuralNet::Mesh and AI::NeuralNet::BackProp.
lib/AI/NeuralNet/SOM.pm view on Meta::CPAN
others. So no use of files, no arcane dependencies, etc.
=head2 Scenario
The basic idea is that the neural network consists of a 2-dimensional
array of N-dimensional vectors. When the training is started these
vectors may be completely random, but over time the network learns
from the sample data, which is a set of N-dimensional vectors.
Slowly, the vectors in the network will try to approximate the sample
vectors fed in. If in the sample vectors there were clusters, then
lib/AI/NeuralNet/SOM.pm view on Meta::CPAN
like noise and the convergence is not good. To mitigate that effect, the learning rate is reduced
over the iterations.
=item C<sigma0>: (optional, defaults to radius)
A non-negative number representing the start value for the learning radius. Practically, the value
should be chosen in such a way to cover a larger part of the map. During the learning process this
value will be narrowed down, so that the learning radius impacts less and less neurons.
B<NOTE>: Do not choose C<1> as the C<log> function is used on this value.
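The note about C<log> can be seen in a common formulation of the shrinking radius (illustrative only; the module's exact schedule may differ): the decay time constant divides by C<log(sigma0)>, so a C<sigma0> of 1 would divide by zero.

```perl
use strict;
use warnings;

# Sketch: sigma decays exponentially from sigma0 over the iterations.
# The time constant lambda divides by log(sigma0), which is why
# sigma0 == 1 is forbidden (log(1) == 0).
my ($sigma0, $iterations) = (3, 100);
my $lambda = $iterations / log($sigma0);
for my $t (0, 50, 100) {
    my $sigma = $sigma0 * exp(-$t / $lambda);
    printf "t=%3d sigma=%.3f\n", $t, $sigma;
}
```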
lib/AI/NeuralNet/SOM.pm view on Meta::CPAN
=over
=item providing data vectors
If you provide a list of vectors, these will be used in turn to seed the neurons. If the list is
shorter than the number of neurons, the list will be started over. That way it is trivial to
zero everything:
$nn->initialize ( [ 0, 0, 0 ] );
=item providing no data
lib/AI/NeuralNet/Simple.pm view on Meta::CPAN
Our brains are comprised of neurons connected to one another by axons. The
axon makes the actual connection to a neuron via a synapse. When neurons
receive information, they process it and feed this information to other neurons
who in turn process the information and send it further until eventually
commands are sent to various parts of the body and muscles twitch, emotions are
felt and we start eyeing our neighbor's popcorn in the movie theater, wondering
if they'll notice if we snatch some while they're watching the movie.
=head2 A simple example of a neuron
Now that you have a solid biology background (uh, no), how does this work when
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
}
})->retain;
# Start our transaction
$self->emit(request => $tx);
$tx = $self->ua->start_p($tx)->then(sub($tx) {
$r1->resolve( $tx );
undef $r1;
})->catch(sub($err) {
$self->emit(response => $tx, $err);
$r1->fail( $err => $tx );
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
$r1->resolve( $tx );
undef $_tx;
undef $r1;
});
$self->emit(request => $tx);
$_tx = $self->ua->start_p($tx);
return $res
}
=head2 C<< build_copyModel_request >>
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
}
})->retain;
# Start our transaction
$self->emit(request => $tx);
$tx = $self->ua->start_p($tx)->then(sub($tx) {
$r1->resolve( $tx );
undef $r1;
})->catch(sub($err) {
$self->emit(response => $tx, $err);
$r1->fail( $err => $tx );
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
$r1->resolve( $tx );
undef $_tx;
undef $r1;
});
$self->emit(request => $tx);
$_tx = $self->ua->start_p($tx);
return $res
}
=head2 C<< build_deleteModel_request >>
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
}
})->retain;
# Start our transaction
$self->emit(request => $tx);
$tx = $self->ua->start_p($tx)->then(sub($tx) {
$r1->resolve( $tx );
undef $r1;
})->catch(sub($err) {
$self->emit(response => $tx, $err);
$r1->fail( $err => $tx );
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
$r1->resolve( $tx );
undef $_tx;
undef $r1;
});
$self->emit(request => $tx);
$_tx = $self->ua->start_p($tx);
return $res
}
=head2 C<< build_pullModel_request >>
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
$r1->resolve( $tx );
undef $_tx;
undef $r1;
});
$self->emit(request => $tx);
$_tx = $self->ua->start_p($tx);
return $res
}
=head2 C<< build_pushModel_request >>
lib/AI/Ollama/Client/Impl.pm view on Meta::CPAN
}
})->retain;
# Start our transaction
$self->emit(request => $tx);
$tx = $self->ua->start_p($tx)->then(sub($tx) {
$r1->resolve( $tx );
undef $r1;
})->catch(sub($err) {
$self->emit(response => $tx, $err);
$r1->fail( $err => $tx );
lib/AI/PBDD.pm view on Meta::CPAN
=item B<$cube = makeSet($vars,$size)>
=item B<$cube = makeSet($vars,$size,$offset)>
Create a cube (all-true minterm, e.g. C<$v1 AND $v2 AND $v3> where each C<$vi> is a BDD variable) of C<$size> variables from the array referenced by C<$vars>, starting at position 0 (or C<$offset>).
=item B<$bdd = exists($bdd1,$cube)>
BDD existential quantification. Parameter C<$cube> is an all-true minterm. The returned result is already referenced.
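Existential quantification computes C<exists x . f> as C<f[x:=0] OR f[x:=1]>. A self-contained illustration of that identity with plain code refs (not AI::PBDD's BDD representation or API) looks like:

```perl
use strict;
use warnings;

# (exists x . f)(rest) == f(x => 0, rest) || f(x => 1, rest).
# Boolean functions are modelled as code refs taking name => value pairs.
sub exists_var {
    my ($f, $var) = @_;
    return sub {
        my %a = @_;
        return $f->(%a, $var => 0) || $f->(%a, $var => 1);
    };
}

my $f = sub { my %a = @_; $a{x} && !$a{y} };   # f = x AND NOT y
my $g = exists_var($f, 'x');                   # exists x . f  ==  NOT y
print $g->(y => 0) ? 1 : 0, "\n";              # prints 1
print $g->(y => 1) ? 1 : 0, "\n";              # prints 0
```

BDD packages perform the same operation symbolically, which is what makes the C<$cube> of quantified variables above useful.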
lib/AI/PBDD.pm view on Meta::CPAN
=back
=head1 SEE ALSO
BDDs and their operations are described in many academic papers that can be found on the Internet. A good place to get started with BDDs is the wikipedia article L<http://en.wikipedia.org/wiki/Binary_decision_diagram>.
It can also be useful to look at the test code for this package in the C<t> directory, as well as at the JBDD documentation and examples at L<http://javaddlib.sourceforge.net/jbdd/>.
=head1 VERSION
lib/AI/PSO.pm view on Meta::CPAN
a local maximum. Having more particles active means there is more of
a chance that you will not be stuck in a local maximum. Oftentimes
different neighborhoods (when not configured in a global neighborhood
fashion) will converge to different maxima. It is quite interesting
to watch graphically. If the fitness function is expensive to
compute, then it is often useful to start out with a small number of
particles first and get a feel for how the algorithm converges.
The algorithm implemented in this module is taken from the book
I<Swarm Intelligence> by Russell Eberhart and James Kennedy.
I highly suggest you read the book if you are interested in this
lib/AI/ParticleSwarmOptimization/MCE.pm view on Meta::CPAN
Set true to initialize particles with a random velocity. Otherwise particle
velocity is set to 0 on initialization.
A range based on 1/100th of I<-posMax> - I<-posMin> is used for the initial
speed in each dimension of the velocity vector if a random start velocity is
used.
=item I<-stallSpeed>: positive number, optional
Speed below which a particle is considered to be stalled and is repositioned to
lib/AI/ParticleSwarmOptimization/Pmap.pm view on Meta::CPAN
Set true to initialize particles with a random velocity. Otherwise particle
velocity is set to 0 on initialization.
A range based on 1/100th of I<-posMax> - I<-posMin> is used for the initial
speed in each dimension of the velocity vector if a random start velocity is
used.
=item I<-stallSpeed>: positive number, optional
Speed below which a particle is considered to be stalled and is repositioned to
lib/AI/ParticleSwarmOptimization.pm view on Meta::CPAN
Set true to initialize particles with a random velocity. Otherwise particle
velocity is set to 0 on initialization.
A range based on 1/100th of I<-posMax> - I<-posMin> is used for the initial
speed in each dimension of the velocity vector if a random start velocity is
used.
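The random start velocity described above can be sketched as follows (illustrative variable names; the module computes this internally from its C<-posMax>/C<-posMin> options):

```perl
use strict;
use warnings;

# Each dimension gets a random initial speed drawn from a range equal to
# 1/100th of the position range (posMax - posMin).
my ($posMin, $posMax) = (-10, 10);
my $range = ($posMax - $posMin) / 100;                    # here: 0.2
my @velocity = map { $range * (2 * rand() - 1) } 1 .. 3;  # 3 dimensions
printf "v%d = %+.4f\n", $_ + 1, $velocity[$_] for 0 .. $#velocity;
```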
=item I<-stallSpeed>: positive number, optional
Speed below which a particle is considered to be stalled and is repositioned to
Benchmark/perl-vs-xs.pl view on Meta::CPAN
for my $y (0 .. WIDTH_Y - 1 )
{
$m->set_passability($x, $y, $map[$x][$y]) ;
}
}
my ( $x_start, $y_start ) = ( WIDTH_X >> 1, WIDTH_Y >> 1 );
my ( $x_end, $y_end ) = ( 0, 0 );
my $t0 = [gettimeofday];
my $path;
my $r = timethese( -1, {Perl=>sub { astar( $x_start, $y_start, $x_end, $y_end ) },
XS=>sub {$m->astar($x_start, $y_start, $x_end, $y_end);}});
cmpthese($r);
die;
for (0..99) {
$path = &astar( $x_start, $y_start, $x_end, $y_end );
}
print "Elapsed: ".tv_interval ( $t0 )."\n";
print "Path length: ".length($path)."\n";
# start end points
$map[ $x_start ][ $y_start ] = 3;
$map[ $x_end ][ $y_end ] = 4;
# draw path
my %vect = (
# x y
1 => [-1, 1, '|/'],
Benchmark/perl-vs-xs.pl view on Meta::CPAN
7 => [-1,-1, '|\\'],
8 => [ 0,-1, '\'|'],
9 => [ 1,-1, '|/']
);
my ( $x, $y ) = ( $x_start, $y_start );
for ( split //, $path )
{
$map[$x][$y] = '|o';
$x += $vect{$_}->[0];
$y += $vect{$_}->[1];
lib/AI/Pathfinding/AStar.pm view on Meta::CPAN
return $path;
}
sub findPath {
my ($map, $start, $target) = @_;
my $nodes = {};
my $curr_node = undef;
my $open = Heap::Binomial->new;
#add starting square to the open list
$curr_node = AI::Pathfinding::AStar::AStarNode->new($start,0,0); # AStarNode(id,g,h)
$curr_node->{parent} = undef;
$curr_node->{cost} = 0;
$curr_node->{g} = 0;
$curr_node->{h} = 0;
$curr_node->{inopen} = 1;
$nodes->{$start} = $curr_node;
$open->add($curr_node);
$map->doAStar($target,$open,$nodes,undef);
my $path = $map->fillPath($open,$nodes,$target);
return wantarray ? @{$path} : $path;
}
sub findPathIncr {
my ($map, $start, $target, $state, $max) = @_;
my $open = undef;
my $curr_node = undef;
my $nodes = {};
if (defined($state)) {
$nodes = $state->{'visited'};
$open = $state->{'open'};
}
else {
$open = Heap::Binomial->new;
#add starting square to the open list
$curr_node = AI::Pathfinding::AStar::AStarNode->new($start,0,0); # AStarNode(id,g,h)
$curr_node->{parent} = undef;
$curr_node->{cost} = 0;
$curr_node->{g} = 0;
$curr_node->{h} = 0;
$curr_node->{inopen} = 1;
$nodes->{$start} = $curr_node;
$open->add($curr_node);
}
$map->doAStar($target,$open,$nodes,$max);
lib/AI/Pathfinding/AStar.pm view on Meta::CPAN
package main;
use My::Map::Package;
my $map = My::Map::Package->new or die "No map for you!";
my $path = $map->findPath($start, $target);
print join(', ', @$path), "\n";
#Or you can do it incrementally, say 3 nodes at a time
my $state = $map->findPathIncr($start, $target, undef, 3);
while ($state->{path}->[-1] ne $target) {
print join(', ', @{$state->{path}}), "\n";
$state = $map->findPathIncr($start, $target, $state, 3);
}
print "Completed Path: ", join(', ', @{$state->{path}}), "\n";
=head1 DESCRIPTION
This module implements the A* pathfinding algorithm. It acts as a base class from which a custom map object can be derived. It requires from the map object a subroutine named C<getSurrounding> (described below) and provides to the object two routin...
AI::Pathfinding::AStar requires that the map object define a routine named C<getSurrounding> which accepts the starting and target node ids for which you are calculating the path. In return it should provide an array reference containing the followi...
=over
=item * Node ID
lib/AI/Pathfinding/AStar.pm view on Meta::CPAN
=back
Basically you should return an array reference like this: C<[ [$node1, $cost1, $h1], [$node2, $cost2, $h2], [...], ...];> For more information on heuristics and the best ways to calculate them, visit the links listed in the I<SEE ALSO> section below...
As mentioned earlier, AI::Pathfinding::AStar provides two routines named C<findPath> and C<findPathIncr>. C<findPath> requires as input the starting and target node identifiers. It is unimportant what format you choose for your node IDs. As long a...
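As a concrete (and entirely illustrative) example of the C<getSurrounding> contract, here is a toy 4-connected grid where node ids are C<"x,y"> strings, orthogonal steps cost 1, and the heuristic is Manhattan distance. The grid contents and names are assumptions, not part of the module:

```perl
use strict;
use warnings;

# Passable cells of a tiny grid; everything else is a wall.
my %passable = map { $_ => 1 } qw(0,0 1,0 2,0 2,1 2,2);

# Return [[$node, $cost, $h], ...] for the 4 neighbours of $node.
sub getSurrounding {
    my ($node, $target) = @_;
    my ($x, $y)   = split /,/, $node;
    my ($tx, $ty) = split /,/, $target;
    my @out;
    for my $d ([1,0], [-1,0], [0,1], [0,-1]) {
        my ($nx, $ny) = ($x + $d->[0], $y + $d->[1]);
        my $id = "$nx,$ny";
        next unless $passable{$id};
        my $h = abs($nx - $tx) + abs($ny - $ty);   # admissible Manhattan heuristic
        push @out, [$id, 1, $h];                   # orthogonal step costs 1
    }
    return \@out;
}

my $adj = getSurrounding('2,1', '2,2');
print scalar(@$adj), "\n";   # only '2,0' and '2,2' are passable: 2
```

In a real map class this sub would be a method on the object passed to C<findPath>.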
=head1 PREREQUISITES
This module requires Heap (specifically Heap::Binomial and Heap::Elem) to function.
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
has _quotas_are_cb => ( isa => 'Bool', is => 'rw' );
has _quotas_expr => ( isa => 'Maybe[Str]', is => 'rw' );
has _should_rle_be_done => ( isa => 'Bool', is => 'rw' );
has _should_trace_be_done => ( isa => 'Bool', is => 'rw' );
has _simulate_to => ( isa => 'Maybe[Str]', is => 'rw' );
has _start_board => ( isa => 'Int', is => 'rw' );
has _stats_factors =>
( isa => 'HashRef', is => 'rw', default => sub { return +{}; }, );
my $_component_re = qr/[A-Za-z][A-Za-z0-9_]*/;
my $_module_re = qr/$_component_re(?:::$_component_re)*/;
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
sub BUILD
{
my $self = shift;
# Command line parameters
my $_start_board = 1;
my $num_boards = 32000;
my $output_filename = "-";
my $should_trace_be_done = 0;
my $should_rle_be_done = 1;
my $_quotas_expr = undef;
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
man => \$man,
"o|output=s" => \$output_filename,
"num-boards=i" => \$num_boards,
"trace" => \$should_trace_be_done,
"rle!" => \$should_rle_be_done,
"start-board=i" => \$_start_board,
"quotas-expr=s" => \$_quotas_expr,
"quotas-are-cb" => \$quotas_are_cb,
"offset-quotas" => \$offset_quotas,
"opt-for=s" => \$optimize_for,
"simulate-to=s" => \$simulate_to,
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
--output=[filename] | -o [filename] - output to this file instead of STDOUT.
EOF
return;
}
$self->_start_board($_start_board);
$self->_num_boards($num_boards);
$self->_output_filename($output_filename);
$self->_should_trace_be_done($should_trace_be_done);
$self->_should_rle_be_done($should_rle_be_done);
$self->_quotas_expr($_quotas_expr);
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
# TODO : Restore later.
$self->_input_obj(
$class->new(
{
start_board => $self->_start_board(),
num_boards => $self->_num_boards(),
}
)
);
}
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
sub _get_line_of_command
{
my $self = shift;
my $args_string = join( " ",
$self->_start_board(),
$self->_start_board() + $self->_num_boards() - 1, 1 );
return "freecell-solver-range-parallel-solve $args_string";
}
sub _line_ends_mapping
{
lib/AI/Pathfinding/OptimizeMultiple/App/CmdLine.pm view on Meta::CPAN
my $self = shift;
my $board = shift;
my $results = $self->_arbitrator()->calc_board_iters($board);
print "\@info=" . join( ",", @{ $results->{per_scan_iters} } ) . "\n";
print +( $board + $self->_start_board() ) . ": "
. $results->{board_iters} . "\n";
}
sub _real_do_trace
{
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
###################################################################
#
# Add a state from which to begin the search. There can
# be multiple start-states.
#
###################################################################
sub add_start_state
{
my ($self, $state) = @_;
my $state_eval_func = $self->{_state_eval_func};
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
}
###################################################################
#
# start the SMAstar search process
#
###################################################################
sub start_search
{
my ($self,
$log_function,
$str_function,
$max_states_in_queue,
$max_cost,
) = @_;
if(!defined($str_function)){
croak "SMAstar start_search: str_function is not defined.\n";
}
sma_star_tree_search(\($self->{_priority_queue}),
\&AI::Pathfinding::SMAstar::Path::is_goal,
\&AI::Pathfinding::SMAstar::Path::get_descendants_iterator_smastar,
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
# gets called once per iteration, useful for showing algorithm progress
_show_prog_func => \&FrontierObj::progress_callback,
);
# You can start the search from multiple start-states.
# Add the initial states to the smastar object before starting the search.
foreach my $frontierObj (@start_states){
$smastar->add_start_state($frontierObj);
}
#
# Start the search. If successful, $frontierGoalPath will contain the
# goal path. The optimal path to the goal node will be encoded in the
# ancestry of the goal path. $frontierGoalPath->antecedent() contains
# the goal path's parent path, and so forth back to the start path, which
# contains only the start state.
#
# $frontierGoalPath->state() contains the goal FrontierObj itself.
#
my $frontierGoalPath = $smastar->start_search(
\&log_function, # returns a string used for logging progress
\&str_function, # returns a string used to *uniquely* identify a node
$max_states_in_queue, # indicate the maximum states allowed in memory
$MAX_COST, # indicate the maximum cost allowed in search
);
In the example above, a hypothetical object, C<FrontierObj>, is used to
represent a state, or I<node> in your search space. To use SMA* search to
find a shortest path from a starting node to a goal in your search space, you must
define what a I<node> is, in your search space (or I<point>, or I<state>).
A common example used for informed search methods, and one that is used in Russell's
original paper, is optimal puzzle solving, such as solving an 8 or 15-tile puzzle
in the least number of moves. If trying to solve such a puzzle, a I<node> in the
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
=head3 A* search
A* Search is an I<optimal> and I<complete> algorithm for computing a sequence of
operations leading from a system's start-state (node) to a specified goal.
In this context, I<optimal> means that A* search will return the shortest
(or cheapest) possible sequence of operations (path) leading to the goal,
and I<complete> means that A* will always find a path to
the goal if such a path exists.
lib/AI/Pathfinding/SMAstar.pm view on Meta::CPAN
my $smastar = AI::Pathfinding::SMAstar->new();
Creates a new SMA* search object.
=head2 start_search()
my $frontierGoalObj = $smastar->start_search(
\&log_function, # returns a string used for logging progress
\&str_function, # returns a string used to *uniquely* identify a node
$max_states_in_queue, # indicate the maximum states allowed in memory
$MAX_COST, # indicate the maximum cost allowed in search
);
lib/AI/Perceptron/Simple.pm view on Meta::CPAN
This is the CSV file containing the validation data. Make sure that it contains a column for the predicted values, as it is needed by the next key mentioned: C<predicted_column_index>
=item predicted_column_index => $column_number
This is the index of the column that contains the predicted output values. C<$column_number> starts from C<0>.
This column will be filled with binary numbers and the full new data will be saved to the file specified in the C<results_write_to> key.
=item results_write_to => $new_csv_file
lib/AI/Perceptron/Simple.pm view on Meta::CPAN
#####
my $stimuli_validate = $data_hash_ref->{ stimuli_validate };
my $predicted_index = $data_hash_ref->{ predicted_column_index };
# actual processing starts here
my $output_file = defined $data_hash_ref->{ results_write_to }
? $data_hash_ref->{ results_write_to }
: $stimuli_validate;
# open for writing results
lib/AI/Perceptron.pm view on Meta::CPAN
sub compute_output {
my $self = shift;
my @inputs = @_;
my $sum = $self->threshold; # start at threshold
for my $i (0 .. $self->num_inputs-1) {
$sum += $self->weights->[$i] * $inputs[$i];
}
# binary step output (returning the real $sum is not part of this model)
return $sum > 0 ? 1 : 0;
}
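The weighted-sum-plus-threshold computation above can be exercised with a self-contained sketch (illustrative function name, not AI::Perceptron's API) that implements an AND gate:

```perl
use strict;
use warnings;

# Step-activation perceptron output: 1 when the weighted sum of the
# inputs, starting from the threshold (bias) term, exceeds 0.
sub step_output {
    my ($threshold, $weights, $inputs) = @_;
    my $sum = $threshold;
    $sum += $weights->[$_] * $inputs->[$_] for 0 .. $#$weights;
    return $sum > 0 ? 1 : 0;
}

# AND gate: fires only when both inputs are 1 (-1.5 + 1 + 1 = 0.5 > 0)
print step_output(-1.5, [1, 1], [1, 1]), "\n";   # 1
print step_output(-1.5, [1, 1], [1, 0]), "\n";   # 0
```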
third parties under the terms of this General Public License (except
that you may choose to grant warranty protection to some or all
third parties, at your option).
c) If the modified program normally reads commands interactively when
run, you must cause it, when started running for such interactive use
in the simplest and most usual way, to print or display an
announcement including an appropriate copyright notice and a notice
that there is no warranty (or else, saying that you provide a
warranty) and that users may redistribute the program under these
conditions, and telling the user how to view a copy of this General
possible use to humanity, the best way to achieve this is to make it
free software which everyone can redistribute and change under these
terms.
To do so, attach the following notices to the program. It is safest to
attach them to the start of each source file to most effectively convey
the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) 19yy <name of author>
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) 19xx name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
lib/AI/Prolog.pm view on Meta::CPAN
join ", " => map { /^$RE{num}{real}$/ ? $_ : $proto->quote($_) } @_;
}
sub continue {
my $self = shift;
return 1 unless $self->{_engine}; # we haven't started yet!
!$self->{_engine}->halt;
}
1;
lib/AI/Prolog.pm view on Meta::CPAN
either game with the command:
aiprolog data/spider.pro
aiprolog data/sleepy.pro
When the C<aiprolog> shell starts, you can type C<start.> to see how to play
the game. Typing C<halt.> and hitting return twice will allow you to exit.
See the C<bin/> and C<data/> directories in the distribution.
Additionally, you can read L<AI::Prolog::Article> for a better description of
lib/AI/SimulatedAnnealing.pm view on Meta::CPAN
} # end while
push @lists, \@list;
} # next $number_spec
# Populate @cursors with the starting position for each list of numbers:
for (0..$#lists) {
push @cursors, 0;
} # next
# Perform the tests:
lib/AI/TensorFlow/Libtensorflow.pm view on Meta::CPAN
=head1 DESCRIPTION
The C<libtensorflow> library provides low-level C bindings
for TensorFlow with a stable ABI.
For more detailed information about this library including how to get started,
see L<AI::TensorFlow::Libtensorflow::Manual>.
=head1 CLASS METHODS
=head2 Version
lib/AI/XGBoost/DMatrix.pm view on Meta::CPAN
Get the weight of each instance
=head2 set_base_margin
Set base margin of booster to start from
=head3 Parameters
=over 4
ax|||n
bad_type|||
bind_match|||
block_end|||
block_gimme||5.004000|
block_start|||
boolSV|5.004000||p
boot_core_PerlIO|||
boot_core_UNIVERSAL|||
boot_core_xsutils|||
bytes_from_utf8||5.007001|
ix|||n
jmaybe|||
keyword|||
leave_scope|||
lex_end|||
lex_start|||
linklist|||
listkids|||
list|||
load_module_nocontext|||vn
load_module||5.006000|v
package|||
packlist||5.008001|
pad_add_anon|||
pad_add_name|||
pad_alloc|||
pad_block_start|||
pad_check_dup|||
pad_compname_type|||
pad_findlex|||
pad_findmy|||
pad_fixup_inner_anons|||
pv_display||5.006000|
pv_uni_display||5.007003|
qerror|||
re_croak2|||
re_dup|||
re_intuit_start||5.006000|
re_intuit_string||5.006000|
realloc||5.007002|n
reentrant_free|||
reentrant_init|||
reentrant_retry|||vn
simplify_sort|||
skipspace|||
sortsv||5.007003|
ss_dup|||
stack_grow|||
start_glob|||
start_subparse||5.004000|
stashpv_hvname_match||5.009003|
stdize_locale|||
strEQ|||
strGE|||
strGT|||
strnNE|||
study_chunk|||
sub_crush_depth|||
sublex_done|||
sublex_push|||
sublex_start|||
sv_2bool|||
sv_2cv|||
sv_2io|||
sv_2iuv_non_preserve|||
sv_2iv_flags||5.009001|
next unless $f =~ /$match/;
print "\n=== $f ===\n\n";
my $info = 0;
if ($API{$f}{base} || $API{$f}{todo}) {
my $base = format_version($API{$f}{base} || $API{$f}{todo});
print "Supported at least starting from perl-$base.\n";
$info++;
}
if ($API{$f}{provided}) {
my $todo = $API{$f}{todo} ? format_version($API{$f}{todo}) : "5.003";
print "Support by $ppport provided back to perl-$todo.\n";
/* Hint: newCONSTSUB
* Returns a CV* as of perl-5.7.1. This return value is not supported
* by Devel::PPPort.
*/
/* newCONSTSUB from IO.xs is in the core starting with 5.004_63 */
#if ((PERL_VERSION < 4) || ((PERL_VERSION == 4) && (PERL_SUBVERSION < 63))) && ((PERL_VERSION != 4) || (PERL_SUBVERSION != 5))
#if defined(NEED_newCONSTSUB)
static void DPPP_(my_newCONSTSUB)(HV *stash, char *name, SV *sv);
static
#else
PL_curstash = PL_curcop->cop_stash = stash;
newSUB(
#if ((PERL_VERSION < 3) || ((PERL_VERSION == 3) && (PERL_SUBVERSION < 22)))
start_subparse(),
#elif ((PERL_VERSION == 3) && (PERL_SUBVERSION == 22))
start_subparse(0),
#else /* 5.003_23 onwards */
start_subparse(FALSE, 0),
#endif
newSVOP(OP_CONST, 0, newSVpv(name,0)),
newSVOP(OP_CONST, 0, &PL_sv_no), /* SvPV(&PL_sv_no) == "" -- GMB */
newSTATEOP(0, Nullch, newSVOP(OP_CONST, 0, sv))
void
DPPP_(my_sv_catpvf_mg)(pTHX_ SV *sv, const char *pat, ...)
{
va_list args;
va_start(args, pat);
sv_vcatpvfn(sv, pat, strlen(pat), &args, Null(SV**), 0, Null(bool*));
SvSETMAGIC(sv);
va_end(args);
}
void
DPPP_(my_sv_catpvf_mg_nocontext)(SV *sv, const char *pat, ...)
{
dTHX;
va_list args;
va_start(args, pat);
sv_vcatpvfn(sv, pat, strlen(pat), &args, Null(SV**), 0, Null(bool*));
SvSETMAGIC(sv);
va_end(args);
}
void
DPPP_(my_sv_setpvf_mg)(pTHX_ SV *sv, const char *pat, ...)
{
va_list args;
va_start(args, pat);
sv_vsetpvfn(sv, pat, strlen(pat), &args, Null(SV**), 0, Null(bool*));
SvSETMAGIC(sv);
va_end(args);
}
void
DPPP_(my_sv_setpvf_mg_nocontext)(SV *sv, const char *pat, ...)
{
dTHX;
va_list args;
va_start(args, pat);
sv_vsetpvfn(sv, pat, strlen(pat), &args, Null(SV**), 0, Null(bool*));
SvSETMAGIC(sv);
va_end(args);
}
* which is why the stack variable has been renamed to 'xdigit'.
*/
#ifndef grok_bin
#if defined(NEED_grok_bin)
static UV DPPP_(my_grok_bin)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
static
#else
extern UV DPPP_(my_grok_bin)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
#endif
#ifdef grok_bin
# undef grok_bin
#endif
#define grok_bin(a,b,c,d) DPPP_(my_grok_bin)(aTHX_ a,b,c,d)
#define Perl_grok_bin DPPP_(my_grok_bin)
#if defined(NEED_grok_bin) || defined(NEED_grok_bin_GLOBAL)
UV
DPPP_(my_grok_bin)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result)
{
const char *s = start;
STRLEN len = *len_p;
UV value = 0;
NV value_nv = 0;
const UV max_div_2 = UV_MAX / 2;
|| (!overflowed && value > 0xffffffff )
#endif
) {
warn("Binary number > 0b11111111111111111111111111111111 non-portable");
}
*len_p = s - start;
if (!overflowed) {
*flags = 0;
return value;
}
*flags = PERL_SCAN_GREATER_THAN_UV_MAX;
#endif
#endif
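The overflow guard in C<my_grok_bin> above (accumulate into a UV only while the value is at most C<UV_MAX / 2>, otherwise mark the result as overflowed) can be mirrored in plain Perl. This is an illustrative sketch only, not Devel::PPPort code; the fixed C<$max> argument stands in for C<UV_MAX>:

```perl
use strict;
use warnings;

# Parse a binary digit string, flagging overflow past a fixed unsigned
# maximum, the way grok_bin stops UV accumulation once another shift
# could exceed UV_MAX (the real code then continues in an NV).
sub grok_bin_sketch {
    my ($str, $max) = @_;
    my $max_div_2 = int($max / 2);
    my ($value, $overflowed) = (0, 0);
    for my $bit (split //, $str) {
        die "not a binary digit: $bit" unless $bit eq '0' or $bit eq '1';
        if (!$overflowed && $value <= $max_div_2) {
            $value = $value * 2 + $bit;   # safe: result cannot exceed $max
        } else {
            $overflowed = 1;              # value no longer fits in the "UV"
        }
    }
    return ($value, $overflowed);
}
```

With an 8-bit maximum of 255, C<"11111111"> parses cleanly to 255, while a ninth bit trips the overflow flag instead of silently wrapping.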
#ifndef grok_hex
#if defined(NEED_grok_hex)
static UV DPPP_(my_grok_hex)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
static
#else
extern UV DPPP_(my_grok_hex)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
#endif
#ifdef grok_hex
# undef grok_hex
#endif
#define grok_hex(a,b,c,d) DPPP_(my_grok_hex)(aTHX_ a,b,c,d)
#define Perl_grok_hex DPPP_(my_grok_hex)
#if defined(NEED_grok_hex) || defined(NEED_grok_hex_GLOBAL)
UV
DPPP_(my_grok_hex)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result)
{
const char *s = start;
STRLEN len = *len_p;
UV value = 0;
NV value_nv = 0;
const UV max_div_16 = UV_MAX / 16;
|| (!overflowed && value > 0xffffffff )
#endif
) {
warn("Hexadecimal number > 0xffffffff non-portable");
}
*len_p = s - start;
if (!overflowed) {
*flags = 0;
return value;
}
*flags = PERL_SCAN_GREATER_THAN_UV_MAX;
#endif
#endif
#ifndef grok_oct
#if defined(NEED_grok_oct)
static UV DPPP_(my_grok_oct)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
static
#else
extern UV DPPP_(my_grok_oct)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result);
#endif
#ifdef grok_oct
# undef grok_oct
#endif
#define grok_oct(a,b,c,d) DPPP_(my_grok_oct)(aTHX_ a,b,c,d)
#define Perl_grok_oct DPPP_(my_grok_oct)
#if defined(NEED_grok_oct) || defined(NEED_grok_oct_GLOBAL)
UV
DPPP_(my_grok_oct)(pTHX_ char *start, STRLEN *len_p, I32 *flags, NV *result)
{
const char *s = start;
STRLEN len = *len_p;
UV value = 0;
NV value_nv = 0;
const UV max_div_8 = UV_MAX / 8;
|| (!overflowed && value > 0xffffffff )
#endif
) {
warn("Octal number > 037777777777 non-portable");
}
*len_p = s - start;
if (!overflowed) {
*flags = 0;
return value;
}
*flags = PERL_SCAN_GREATER_THAN_UV_MAX;
# Before `make install' is performed this script should be runnable with
# `make test'. After `make install' it should work as `perl test.pl'
######################### We start with some black magic to print on failure.
# Change 1..1 below to 1..last_test_to_print .
# (It may become useful if the test is moved to ./t subdirectory.)
BEGIN { $| = 1; print "1..1\n"; }
#!perl -w
# Before `make install' is performed this script should be runnable with
# `make test'. After `make install' it should work as `perl test.pl'
######################### We start with some black magic to print on failure.
# Change 1..1 below to 1..last_test_to_print .
# (It may become useful if the test is moved to ./t subdirectory.)
BEGIN { $| = 1; print "1..2\n"; }
lib/AIX/Perfstat.pm view on Meta::CPAN
The C<AIX::Perfstat::cpu>, C<AIX::Perfstat::disk>, and
C<AIX::Perfstat::netinterface> functions each take up to
two arguments and return a reference to an array of hashes. The
arguments specify the number of records to return, and the name
of the record to start with. These arguments are equivalent to the
C<desired_number> and C<name> parameters to the C<perfstat> functions.
Only valid data is returned (Example: If you call
C<AIX::Perfstat::netinterface(5)> on a machine with only 2 network
interfaces, the returned array will only contain two entries.) When
these functions are called with a variable for the name parameter
lib/ALBD.pm view on Meta::CPAN
# means that CUI C0000000 and C1111111 co-occurred 10 times).
#
# Now with an understanding of the data structures, below is a brief
# description of each:
#
# startingMatrix <- A matrix containing the explicit matrix rows for all of the
# start terms. This makes it easy to have multiple start terms
# and using this matrix as opposed to the entire explicit
# matrix drastically improves performance.
# explicitMatrix <- A matrix containing explicit connections (known connections)
# for every CUI in the dataset.
# implicitMatrix <- A matrix containing implicit connections (discovered
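The matrices described above are sparse, stored as hashes of hashes keyed by CUI (e.g. C<$matrix{$cui1}{$cui2}> = co-occurrence count). As a hedged sketch of how the starting matrix is carved out of the explicit matrix — the name C<getRows> comes from the listing, but this minimal implementation is an assumption, not the module's actual code:

```perl
use strict;
use warnings;

# Sparse co-occurrence matrix as a hash of hashes:
# $matrix{$cui1}{$cui2} = number of times the two CUIs co-occurred
my %explicit = (
    'C0000000' => { 'C1111111' => 10, 'C2222222' => 3 },
    'C1111111' => { 'C3333333' => 7 },
);

# Extract only the rows for the start CUIs, as Discovery::getRows does;
# working with these few rows is far cheaper than the full matrix.
sub get_rows {
    my ($start_cuis, $matrix) = @_;
    my %rows;
    for my $cui (@$start_cuis) {
        $rows{$cui} = $matrix->{$cui} if exists $matrix->{$cui};
    }
    return \%rows;
}
```

Calling C<get_rows(['C0000000'], \%explicit)> yields a one-row matrix holding just that start term's explicit links.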
# performs LBD
# input: none
# output: none, but a results file is written to disk
sub performLBD {
my $self = shift;
my $start; #used to record run times
#implicit matrix ranking requires a different set of procedures
if ($lbdOptions{'rankingProcedure'} eq 'implicitMatrix') {
$self->performLBD_implicitMatrixRanking();
return;
lib/ALBD.pm view on Meta::CPAN
}
print "Open Discovery\n";
print $self->_parametersToString();
#Get inputs
my $startCuisRef = $self->_getStartCuis();
my $linkingAcceptTypesRef = $self->_getAcceptTypes('linking');
my $targetAcceptTypesRef = $self->_getAcceptTypes('target');
print "startCuis = ".(join(',', @{$startCuisRef}))."\n";
print "linkingAcceptTypes = ".(join(',', keys %{$linkingAcceptTypesRef}))."\n";
print "targetAcceptTypes = ".(join(',', keys %{$targetAcceptTypesRef}))."\n";
#Get the Explicit Matrix
$start = time;
my $explicitMatrixRef;
if(!defined $lbdOptions{'explicitInputFile'}) {
die ("ERROR: explicitInputFile must be defined in LBD config file\n");
}
$explicitMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'explicitInputFile'});
print "Got Explicit Matrix in ".(time() - $start)."\n";
#Get the Starting Matrix
$start = time();
my $startingMatrixRef =
Discovery::getRows($startCuisRef, $explicitMatrixRef);
print "Got Starting Matrix in ".(time() - $start)."\n";
#if using average minimum weight, grab the a->b scores
my %abPairsWithScores = ();
if ($lbdOptions{'rankingProcedure'} eq 'averageMinimumWeight'
|| $lbdOptions{'rankingProcedure'} eq 'ltc_amw') {
if ((scalar keys %{$linkingAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
$explicitMatrixRef, $linkingAcceptTypesRef, $umls_interface);
}
#initialize the abPairs to frequency of co-occurrence
foreach my $row (keys %{$startingMatrixRef}) {
foreach my $col (keys %{${$startingMatrixRef}{$row}}) {
$abPairsWithScores{"$row,$col"} = ${${$startingMatrixRef}{$row}}{$col};
}
}
Rank::getBatchAssociationScores(\%abPairsWithScores, $explicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association);
}
#Apply Semantic Type Filter to the explicit matrix
if ((scalar keys %{$linkingAcceptTypesRef}) > 0) {
$start = time();
Filters::semanticTypeFilter_rowsAndColumns(
$explicitMatrixRef, $linkingAcceptTypesRef, $umls_interface);
print "Semantic Type Filter in ".(time() - $start)."\n";
}
#Get Implicit Connections
$start = time();
my $implicitMatrixRef;
if (defined $lbdOptions{'implicitInputFile'}) {
$implicitMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'implicitInputFile'});
} else {
$implicitMatrixRef = Discovery::findImplicit($explicitMatrixRef, $startingMatrixRef);
}
print "Got Implicit Matrix in ".(time() - $start)."\n";
#Remove Known Connections
$start = time();
$implicitMatrixRef = Discovery::removeExplicit($startingMatrixRef, $implicitMatrixRef);
print "Removed Known Connections in ".(time() - $start)."\n";
#Apply Semantic Type Filter
if ((scalar keys %{$targetAcceptTypesRef}) > 0) {
$start = time();
Filters::semanticTypeFilter_columns(
$implicitMatrixRef, $targetAcceptTypesRef, $umls_interface);
print "Semantic Type Filter in ".(time() - $start)."\n";
}
#Score Implicit Connections
$start = time();
my $scoresRef;
if ($lbdOptions{'rankingProcedure'} eq 'allPairs') {
$scoresRef = Rank::scoreImplicit_fromAllPairs($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association);
} elsif ($lbdOptions{'rankingProcedure'} eq 'averageMinimumWeight') {
$scoresRef = Rank::scoreImplicit_averageMinimumWeight($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association, \%abPairsWithScores);
} elsif ($lbdOptions{'rankingProcedure'} eq 'linkingTermCount') {
$scoresRef = Rank::scoreImplicit_linkingTermCount($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef);
} elsif ($lbdOptions{'rankingProcedure'} eq 'frequency') {
$scoresRef = Rank::scoreImplicit_frequency($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef);
} elsif ($lbdOptions{'rankingProcedure'} eq 'ltcAssociation') {
$scoresRef = Rank::scoreImplicit_ltcAssociation($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association);
} elsif ($lbdOptions{'rankingProcedure'} eq 'ltc_amw') {
$scoresRef = Rank::scoreImplicit_LTC_AMW($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association, \%abPairsWithScores);
} else {
die ("Error: Invalid Ranking Procedure\n");
}
print "Scored in: ".(time()-$start)."\n";
#Rank Implicit Connections
$start = time();
my $ranksRef = Rank::rankDescending($scoresRef);
print "Ranked in: ".(time()-$start)."\n";
#Output The Results
open OUT, ">$lbdOptions{implicitOutputFile}"
or die "unable to open implicit output file: "
."$lbdOptions{implicitOutputFile}\n";
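A minimal sketch of what C<Discovery::findImplicit> and C<Discovery::removeExplicit> appear to do in the flow above — implicit A→C links come from composing the start rows with the explicit matrix, and already-known A→B links are then removed. The implementations here are illustrative assumptions, not the module's code:

```perl
use strict;
use warnings;

# For each start term A with an explicit link A->B, add every explicit
# link B->C as an implicit A->C connection, accumulating weights
# (a sparse row-by-matrix product).
sub find_implicit {
    my ($explicit, $starting) = @_;
    my %implicit;
    for my $a (keys %$starting) {
        for my $b (keys %{ $starting->{$a} }) {
            next unless exists $explicit->{$b};
            for my $c (keys %{ $explicit->{$b} }) {
                $implicit{$a}{$c} += $starting->{$a}{$b} * $explicit->{$b}{$c};
            }
        }
    }
    return \%implicit;
}

# Drop implicit links that are already known explicitly from A,
# so only genuinely new (discovered) connections remain.
sub remove_explicit {
    my ($starting, $implicit) = @_;
    for my $a (keys %$implicit) {
        delete $implicit->{$a}{$_}
            for grep { exists $starting->{$a}{$_} } keys %{ $implicit->{$a} };
    }
    return $implicit;
}
```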
# performs LBD, closed discovery
# input: none
# output: none, but a results file is written to disk
sub performLBD_closedDiscovery {
my $self = shift;
my $start; #used to record run times
print "Closed Discovery\n";
print $self->_parametersToString();
#Get inputs
my $startCuisRef = $self->_getStartCuis();
my $targetCuisRef = $self->_getTargetCuis();
my $linkingAcceptTypesRef = $self->_getAcceptTypes('linking');
#Get the Explicit Matrix
$start = time;
my $explicitMatrixRef;
if(!defined $lbdOptions{'explicitInputFile'}) {
die ("ERROR: explicitInputFile must be defined in LBD config file\n");
}
$explicitMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'explicitInputFile'});
print "Got Explicit Matrix in ".(time() - $start)."\n";
#Get the Starting Matrix
$start = time();
my $startingMatrixRef =
Discovery::getRows($startCuisRef, $explicitMatrixRef);
print "Got Starting Matrix in ".(time() - $start)."\n";
print " numRows in startMatrix = ".(scalar keys %{$startingMatrixRef})."\n";
#Apply Semantic Type Filter to the explicit matrix
if ((scalar keys %{$linkingAcceptTypesRef}) > 0) {
$start = time();
Filters::semanticTypeFilter_rowsAndColumns(
$explicitMatrixRef, $linkingAcceptTypesRef, $umls_interface);
print "Semantic Type Filter in ".(time() - $start)."\n";
}
#Get the Target Matrix
$start = time();
my $targetMatrixRef =
Discovery::getRows($targetCuisRef, $explicitMatrixRef);
print "Got Target Matrix in ".(time() - $start)."\n";
print " numRows in targetMatrix = ".(scalar keys %{$targetMatrixRef})."\n";
#find the linking terms in common for starting and target matrices
print "Finding terms in common\n";
#get starting linking terms
my %startLinks = ();
foreach my $row (keys %{$startingMatrixRef}) {
foreach my $col (keys %{${$startingMatrixRef}{$row}}) {
$startLinks{$col} = ${${$startingMatrixRef}{$row}}{$col};
}
}
print " num start links = ".(scalar keys %startLinks)."\n";
#get target linking terms
my %targetLinks = ();
foreach my $row (keys %{$targetMatrixRef}) {
foreach my $col (keys %{${$targetMatrixRef}{$row}}) {
$targetLinks{$col} = ${${$targetMatrixRef}{$row}}{$col};
}
}
print " num target links = ".(scalar keys %targetLinks)."\n";
#find linking terms in common
my %inCommon = ();
foreach my $startLink (keys %startLinks) {
if (exists $targetLinks{$startLink}) {
$inCommon{$startLink} = $startLinks{$startLink} + $targetLinks{$startLink};
}
}
print " num in common = ".(scalar keys %inCommon)."\n";
#Score and Rank
#Score the linking terms in common
my $scoresRef = \%inCommon;
#TODO score is just summed frequency right now
#Rank Implicit Connections
$start = time();
my $ranksRef = Rank::rankDescending($scoresRef);
print "Ranked in: ".(time()-$start)."\n";
#Output The Results
open OUT, ">$lbdOptions{implicitOutputFile}"
or die "unable to open implicit output file: "
."$lbdOptions{implicitOutputFile}\n";
my $paramsString = $self->_parametersToString();
print OUT $paramsString;
print OUT $outputString;
print OUT "\n\n---------------------------------------\n\n";
print OUT "starting linking terms:\n";
print OUT join("\n", keys %startLinks);
print OUT "\n\n---------------------------------------\n\n";
print OUT "target linking terms:\n";
print OUT join("\n", keys %targetLinks);
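The closed-discovery intersection above — linking terms shared by the start and target rows, scored here by summed co-occurrence frequency — can be sketched as follows. This is a simplified rendering of the listing's loops, not the module's code:

```perl
use strict;
use warnings;

# Collect all linking terms (columns) reachable from a set of rows,
# summing co-occurrence counts across rows.
sub collect_links {
    my ($matrix) = @_;
    my %links;
    for my $row (keys %$matrix) {
        $links{$_} += $matrix->{$row}{$_} for keys %{ $matrix->{$row} };
    }
    return \%links;
}

# Linking terms present in both sets, scored by summed frequency
# (the listing notes this simple score is a placeholder).
sub links_in_common {
    my ($start_links, $target_links) = @_;
    my %common;
    for my $b (keys %$start_links) {
        $common{$b} = $start_links->{$b} + $target_links->{$b}
            if exists $target_links->{$b};
    }
    return \%common;
}
```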
# a new method has been created.
# input: none
# output: none, but a results file is written to disk
sub performLBD_implicitMatrixRanking {
my $self = shift;
my $start; #used to record run times
print $self->_parametersToString();
print "In Implicit Ranking\n";
#Get inputs
my $startCuisRef = $self->_getStartCuis();
my $linkingAcceptTypesRef = $self->_getAcceptTypes('linking');
my $targetAcceptTypesRef = $self->_getAcceptTypes('target');
print "startCuis = ".(join(',', @{$startCuisRef}))."\n";
print "linkingAcceptTypes = ".(join(',', keys %{$linkingAcceptTypesRef}))."\n";
print "targetAcceptTypes = ".(join(',', keys %{$targetAcceptTypesRef}))."\n";
#Score Implicit Connections
$start = time();
my $scoresRef;
$scoresRef = Rank::scoreImplicit_fromImplicitMatrix($startCuisRef, $lbdOptions{'implicitInputFile'}, $lbdOptions{'rankingMeasure'}, $umls_association);
print "Scored in: ".(time()-$start)."\n";
#Rank Implicit Connections
$start = time();
my $ranksRef = Rank::rankDescending($scoresRef);
print "Ranked in: ".(time()-$start)."\n";
#Output The Results
open OUT, ">$lbdOptions{implicitOutputFile}"
or die "unable to open implicit output file: "
."$lbdOptions{implicitOutputFile}\n";
print "In timeSlicing_generatePrecisionAndRecall\n";
my $numIntervals = 10;
#Get inputs
my $startAcceptTypesRef = $self->_getAcceptTypes('start');
my $linkingAcceptTypesRef = $self->_getAcceptTypes('linking');
my $targetAcceptTypesRef = $self->_getAcceptTypes('target');
#Get the Explicit Matrix
}
$explicitMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'explicitInputFile'});
#------------------------------------------
#create the starting matrix
my $startingMatrixRef
= TimeSlicing::generateStartingMatrix($explicitMatrixRef, \%lbdOptions, $startAcceptTypesRef, $NUM_SAMPLES, $umls_interface);
#get association scores for the starting matrix
my $assocScoresRef = TimeSlicing::getAssociationScores(
$startingMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association);
my ($min, $max) = TimeSlicing::getMinMax($assocScoresRef);
my $range = $max-$min;
#load the post cutoff matrix for the necessary rows
my $postCutoffMatrixRef
= TimeSlicing::loadPostCutOffMatrix($startingMatrixRef, $explicitMatrixRef, $lbdOptions{'postCutoffFileName'});
#apply a semantic type filter to the post cutoff matrix
if ((scalar keys %{$targetAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
$postCutoffMatrixRef, $targetAcceptTypesRef, $umls_interface);
if ($numSamples == 0) {
$numSamples = 10;
}
#apply a threshold (number of samples)
my $thresholdedStartingMatrixRef = TimeSlicing::grabKHighestRankedSamples($numSamples, $assocScoresRef, $startingMatrixRef);
#generate implicit knowledge
my $implicitMatrixRef = Discovery::findImplicit($explicitMatrixRef, $thresholdedStartingMatrixRef);
#Remove Known Connections
$implicitMatrixRef
= Discovery::removeExplicit($startingMatrixRef, $implicitMatrixRef);
#apply a semantic type filter to the implicit matrix
if ((scalar keys %{$targetAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
$implicitMatrixRef, $targetAcceptTypesRef, $umls_interface);
# output: none, but precision, recall, precision at k, and map values
# output to STDOUT
sub timeSlicing_generatePrecisionAndRecall_implicit {
my $NUM_SAMPLES = 200; #TODO, read from file the number of samples to average over for timeslicing
my $self = shift;
my $start; #used to record run times
print "In timeSlicing_generatePrecisionAndRecall_implicit\n";
#Get inputs
my $startAcceptTypesRef = $self->_getAcceptTypes('start');
my $linkingAcceptTypesRef = $self->_getAcceptTypes('linking');
my $targetAcceptTypesRef = $self->_getAcceptTypes('target');
#-----------
# Starting Matrix Creation
if(!defined $lbdOptions{'explicitInputFile'}) {
die ("ERROR: explicitInputFile must be defined in LBD config file\n");
}
$explicitMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'explicitInputFile'});
#create the starting matrix
print "generating starting\n";
my $startingMatrixRef
= TimeSlicing::generateStartingMatrix($explicitMatrixRef, \%lbdOptions, $startAcceptTypesRef, $NUM_SAMPLES, $umls_interface);
#----------
#--------
# Gold Loading/Creation
print "inputting gold\n";
$goldMatrixRef = Discovery::fileToSparseMatrix($lbdOptions{'goldInputFile'});
}
else {
print "loading post cutoff\n";
$goldMatrixRef = TimeSlicing::loadPostCutOffMatrix($startingMatrixRef, $explicitMatrixRef, $lbdOptions{'postCutoffFileName'});
#remove explicit knowledge from the post cutoff matrix
$goldMatrixRef = Discovery::removeExplicit($startingMatrixRef, $goldMatrixRef);
#apply a semantic type filter to the post cutoff matrix
print "applying semantic filter to post-cutoff matrix\n";
if ((scalar keys %{$targetAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
if ((scalar keys %{$linkingAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
$explicitMatrixRef, $linkingAcceptTypesRef, $umls_interface);
}
#initialize the abPairs to the frequency of co-occurrence
foreach my $row (keys %{$startingMatrixRef}) {
foreach my $col (keys %{${$startingMatrixRef}{$row}}) {
$abPairsWithScores{"$row,$col"} = ${${$startingMatrixRef}{$row}}{$col};
}
}
Rank::getBatchAssociationScores(
\%abPairsWithScores, $explicitMatrixRef, $lbdOptions{'rankingMeasure'}, $umls_association);
}
print "generating predictions\n";
#generate implicit knowledge
print "Squaring Matrix\n";
$predictionsMatrixRef = Discovery::findImplicit(
$explicitMatrixRef, $startingMatrixRef);
#Remove Known Connections
print "Removing Known from Predictions\n";
$predictionsMatrixRef
= Discovery::removeExplicit($startingMatrixRef, $predictionsMatrixRef);
#apply a semantic type filter to the predictions matrix
print "Applying Semantic Filter to Predictions\n";
if ((scalar keys %{$targetAcceptTypesRef}) > 0) {
Filters::semanticTypeFilter_columns(
lib/ALBD.pm view on Meta::CPAN
#-------------------------------------------
#At this point, the explicitMatrixRef has been filtered and thresholded
#The predictions matrix Ref has been generated from the filtered and
# thresholded explicitMatrixRef, only rows of starting terms remain, filtered, and
# had explicit removed
#Association scores are generated using the explicitMatrixRef
#--------------
# Get the ranks of all predictions
#--------------
#get the scores and ranks separately for each row
# thereby generating scores and ranks for each starting
# term individually
my %rowRanks = ();
my ($n1pRef, $np1Ref, $npp);
print "getting row ranks\n";
foreach my $rowKey (keys %{$predictionsMatrixRef}) {
#grab rows from start and implicit matrices
my %startingRow = ();
$startingRow{$rowKey} = ${$startingMatrixRef}{$rowKey};
my %implicitRow = ();
$implicitRow{$rowKey} = ${$predictionsMatrixRef}{$rowKey};
#Score Implicit Connections
my $scoresRef;
if ($lbdOptions{'rankingProcedure'} eq 'allPairs') {
#get stats just a single time
if (!defined $n1pRef || !defined $np1Ref || !defined $npp) {
($n1pRef, $np1Ref, $npp) = Rank::getAllStats($explicitMatrixRef);
}
$scoresRef = Rank::scoreImplicit_fromAllPairs(\%startingRow, $explicitMatrixRef, \%implicitRow, $lbdOptions{'rankingMeasure'}, $umls_association, $n1pRef, $np1Ref, $npp);
} elsif ($lbdOptions{'rankingProcedure'} eq 'averageMinimumWeight') {
#get stats just a single time
if (!defined $n1pRef || !defined $np1Ref || !defined $npp) {
($n1pRef, $np1Ref, $npp) = Rank::getAllStats($explicitMatrixRef);
}
$scoresRef = Rank::scoreImplicit_averageMinimumWeight(\%startingRow, $explicitMatrixRef, \%implicitRow, $lbdOptions{'rankingMeasure'}, $umls_association, \%abPairsWithScores, $n1pRef, $np1Ref, $npp);
} elsif ($lbdOptions{'rankingProcedure'} eq 'linkingTermCount') {
$scoresRef = Rank::scoreImplicit_linkingTermCount(\%startingRow, $explicitMatrixRef, \%implicitRow);
} elsif ($lbdOptions{'rankingProcedure'} eq 'frequency') {
$scoresRef = Rank::scoreImplicit_frequency(\%startingRow, $explicitMatrixRef, \%implicitRow);
} elsif ($lbdOptions{'rankingProcedure'} eq 'ltcAssociation') {
$scoresRef = Rank::scoreImplicit_ltcAssociation(\%startingRow, $explicitMatrixRef, \%implicitRow, $lbdOptions{'rankingMeasure'}, $umls_association);
} elsif ($lbdOptions{'rankingProcedure'} eq 'ltc_amw') {
#get stats just a single time
if (!defined $n1pRef || !defined $np1Ref || !defined $npp) {
($n1pRef, $np1Ref, $npp) = Rank::getAllStats($explicitMatrixRef);
}
$scoresRef = Rank::scoreImplicit_LTC_AMW(\%startingRow, $explicitMatrixRef, \%implicitRow, $lbdOptions{'rankingMeasure'}, $umls_association, \%abPairsWithScores, $n1pRef, $np1Ref, $npp);
} else {
die ("Error: Invalid Ranking Procedure\n");
}
#Rank Implicit Connections
close IN;
return \%optionsHash;
}
# transforms the string of start cuis to an array
# input: none
# output: an array ref of CUIs
sub _getStartCuis {
my $self = shift;
my @startCuis = split(',',$lbdOptions{'startCuis'});
return \@startCuis;
}
# transforms the string of target cuis to an array
# input: none
# output: an array ref of CUIs
# functions for debugging
##############################################################################
=comment
sub debugLBD {
my $self = shift;
my $startingCuisRef = shift;
print "Starting CUIs = ".(join(',', @{$startingCuisRef}))."\n";
#Get the Explicit Matrix
my ($explicitMatrixRef, $cuiToIndexRef, $indexToCuiRef, $matrixSize) =
Discovery::tableToSparseMatrix('N_11', $cuiFinder);
print "Explicit Matrix:\n";
_printMatrix($explicitMatrixRef, $matrixSize, $indexToCuiRef);
print "-----------------------\n";
#Get the Starting Matrix
my $startingMatrixRef =
Discovery::getRows($startingCuisRef, $explicitMatrixRef);
print "Starting Matrix:\n";
_printMatrix($startingMatrixRef, $matrixSize, $indexToCuiRef);
print "-----------------------\n";
#Get Implicit Connections
my $implicitMatrixRef
= Discovery::findImplicit($explicitMatrixRef, $startingMatrixRef,
$indexToCuiRef, $matrixSize);
print "Implicit Matrix:\n";
_printMatrix($implicitMatrixRef, $matrixSize, $indexToCuiRef);
print "-----------------------\n";
print "npp = $npp\n";
print "n1p = $n1p\n";
print "np1 = $np1\n";
#Test other rank methods
my $scoresRef = Rank::scoreImplicit_fromAllPairs($startingMatrixRef, $explicitMatrixRef, $implicitMatrixRef, $lbdOptions{rankingMethod}, $umls_association);
my $ranksRef = Rank::rankDescending($scoresRef);
print "Scores: \n";
foreach my $cui (keys %{$scoresRef}) {
print " scores{$cui} = ${$scoresRef}{$cui}\n";
}
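C<Rank::rankDescending>, used throughout the listing, presumably orders terms from best to worst score. A minimal assumed sketch (with ties broken by CUI for determinism, which is an addition of this example, not necessarily the module's behavior):

```perl
use strict;
use warnings;

# Return CUIs sorted by descending score; ties fall back to
# lexical CUI order so the ranking is reproducible.
sub rank_descending {
    my ($scores) = @_;
    return [ sort { $scores->{$b} <=> $scores->{$a} or $a cmp $b }
             keys %$scores ];
}
```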
lib/ALPM.pod view on Meta::CPAN
This version of ALPM is compatible with pacman 4.
=head1 SYNOPSIS
## We can start by setting options all by ourselves.
use ALPM;
my $alpm = ALPM->new('/', '/var/lib/db'); # root and dbpath
$alpm->set_cachedirs('/var/cache/pacman/pkg');
$alpm->set_logfile('/var/log/pacman.log');
lib/AMF/Connection.pm view on Meta::CPAN
If encoding is set to AMF3 the Flex Messaging framework is used on returned responses content (I.e. objects cast to "flex.messaging.messages.AcknowledgeMessage" and "flex.messaging.messages.ErrorMessage" are returned).
Simple batch requests and responses are also provided.
See the sample usage synopsis above to start using the module.
=head1 DATE TYPE SUPPORT
The latest 0.79 version of Storable::AMF added basic date support with the new_date() and perl_date() utility functions. This is just great. Internally an AMF Date Type represents a timestamp in milliseconds since the epoch in UTC ("neutral") timezo...
# ... prepare parameters...
my $searchAMFObject = bless( {
'searchId' => $searchId,
'startHit' => int($startHit),
'searchString' => $searchString,
'hitsPerPage' => ($hitsPerPage) ? int($hitsPerPage) : 20,
'sortId' => $sortId,
}, 'com.mycompany.application.flex.data.SearchQueryFx');