AI-NNFlex


CHANGES  view on Meta::CPAN


0.20
20050308

v0.17 was never released, as I rejigged the whole lot for
object inheritance before I got around to uploading it to CPAN.
Why, I hear you ask, when it worked OK already?
1) It's faster, a lot faster.
2) Feedforward isn't the only kind of network, and I wanted to
be free to overload some of the methods (especially init) to
simplify writing a Hopfield module (in progress).
3) It's more theoretically correct.

So now, AI::NNFlex is the base class for the other types of
networks, and you should never need to call the AI::NNFlex class
directly - instead, call the constructor of the subclass, such
as:
my $network = AI::NNFlex::momentum->new(params);

The upshot of that is that the network type and learning algorithm
parameters are now obsolete.
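
For example, under the new scheme (a sketch using the momentum
subclass named above; the parameter values are illustrative,
drawn from the bundled examples and tests):

use AI::NNFlex::momentum;

my $network = AI::NNFlex::momentum->new(learningrate=>.1,
				momentum=>0.6,
				randomweights=>1,
				bias=>1);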

CHANGES  view on Meta::CPAN

Added PNG support to AI::NNFlex::draw

Added AI::NNFlex::Dataset
This creates a dataset object that can be run against a
network.
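
For example (a minimal sketch; the flat input/target pair layout
follows the bundled tests, and an initialised network is assumed
to be in $network):

my $dataset = AI::NNFlex::Dataset->new([
		[0,0],[1,1],
		[0,1],[1,0]]);

my $err = $dataset->learn($network);	# one learning pass
my $result = $dataset->run($network);	# one run pass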

Added AI::NNFlex::lesion
Damages a network, with a specified probability of losing each
node or connection. See the perldoc.
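
For example (a sketch; the nodes/connections probability
parameters follow the perldoc, and the values here are
illustrative):

$network->lesion(nodes=>0.1, connections=>0.1);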

Cleaned up the POD docs a bit, although there's a lot still
to do.

################################################################

INSTALL  view on Meta::CPAN

Note: the dependency upon Math::Matrix is for the
Hopfield module only. If you want to use Backprop
you can safely leave it unresolved.
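
If you are not sure whether Math::Matrix is available, you can
probe for it at runtime, much as the bundled Hopfield test does:

my $matrixpresent = eval { require Math::Matrix; 1 };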

If you want to perform a standard install, placing
the modules etc in the perl standard library locations,
run:
perl Makefile.PL

followed by:
make install

++++++++++++++++++++++++++++++++++++++++++++++++++

TODO  view on Meta::CPAN

Put in some more error checking, particularly for attempts to create
connections between layers/nodes that don't exist.

Write a simple net simulator with syntax loosely based on xerion. At
present this lot is API driven; it should be straightforward to write
a basic simulator that calls the API in the backend.

Read & write methods for both networks and datasets, modelled on the SNNS format (for use with a frontend script). The data should be in SNNS format; the network definition file will probably have to differ.

Implement an error method in addition to dbug, and clean up the dbug & error calls.


examples/bp.pl  view on Meta::CPAN


        #display the overall network error
        #after each epoch
        calcOverallError();

        print "epoch = ".$j."  RMS Error = ".$RMSerror."\n";

    }

    #training has finished
    #display the results
    displayResults();

 }

#============================================================
#********** END OF THE MAIN PROGRAM **************************
#=============================================================



examples/bp.pl  view on Meta::CPAN

 }


#************************************
 sub initData()
 {

    print "initialising data\n";

    # the data here is the XOR data
    # it has been rescaled to the range
    # [-1,1]
    # an extra input valued 1 is also added
    # to act as the bias

    $trainInputs[0][0]  = 1;
    $trainInputs[0][1]  = -1;
    $trainInputs[0][2]  = 1;    #bias
    $trainOutput[0] = 1;

    $trainInputs[1][0]  = -1;

examples/lesion.pl  view on Meta::CPAN

				debug=>[],bias=>1,
				momentum=>0.6,
				round=>1);



$network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);


$network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);

$network->add_layer(	nodes=>1,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"linear",
			randomweights=>1);


$network->init();

my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[0],
			[0,1],[1],
			[1,0],[1],

examples/reinforceTest.pl  view on Meta::CPAN

# this is /really/ experimental - see perldoc NNFlex::reinforce
use AI::NNFlex;

my $object = AI::NNFlex->new([{"nodes"=>2,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"tanh","random weights"=>1},
                        {"nodes"=>2,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"tanh","random weights"=>1},
                       {"nodes"=>1,"persistent activation"=>0,"decay"=>0.0,"random activation"=>0,"threshold"=>0.0,"activation function"=>"linear","random weights"=>1}],{'random connections'=>0,'networktype'=>'feedforward', 'random weights'=>1,'learn...


$object->run([1,0]);
$output = $object->output();
foreach (@$output)
{
	print "1,0 - $_ ";
}
print "\n";

lib/AI/NNFlex.pm  view on Meta::CPAN

# fromnode=>[LAYER,NODE],tonode=>[LAYER,NODE]
#
# returns success or failure
#
#
#########################################################################
sub connect
{
	my $network = shift;
	my %params = @_;
	my $result = 0;

	if ($params{'fromnode'})
	{
		$result = $network->connectnodes(%params);
	}
	elsif ($params{'fromlayer'})
	{
		$result = $network->connectlayers(%params);
	}
	return $result;

}

########################################################################
# AI::NNFlex::connectlayers
########################################################################
sub connectlayers
{
	my $network=shift;
	my %params = @_;

lib/AI/NNFlex.pm  view on Meta::CPAN


=head2 AI::NNFlex->new ( parameter => value );
	

randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

debug=>[LIST OF CODES FOR MODULES TO DEBUG]

round=>0 or 1 - a true value sets the network to round output values to the nearest of 1, -1 or 0


The constructor implements a fairly generalised network object with a number of parameters.


The following parameters are optional:
 randomweights
 fixedweights
 debug
 round
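
For example (a sketch via a subclass, since the base class is not
normally constructed directly; the values are illustrative):

 my $network = AI::NNFlex::Backprop->new(randomweights=>1,
 				debug=>[],
 				round=>1);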

lib/AI/NNFlex.pm  view on Meta::CPAN

=head2 AI::NNFlex

=head3 add_layer

 Syntax:

 $network->add_layer(	nodes=>NUMBER OF NODES IN LAYER,
			persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
			decay=>RATE OF ACTIVATION DECAY PER PASS,
			randomactivation=>MAXIMUM STARTING ACTIVATION,
			threshold=>NYI,
			activationfunction=>"ACTIVATION FUNCTION",
			randomweights=>MAX VALUE OF STARTING WEIGHTS);

add_layer adds whatever parameters you specify as attributes of the layer, so if you want to implement additional parameters, simply use them in your calling code.

add_layer returns success or failure, and if successful adds a layer object to the $network->{'layers'} array. This layer object contains an attribute $layer->{'nodes'}, which is an array of the nodes in the layer.
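
For instance, after init you can walk that structure directly (a
sketch; this is the same traversal the distribution's own code
uses):

 foreach my $layer (@{$network->{'layers'}})
 {
 	foreach my $node (@{$layer->{'nodes'}})
 	{
 		# each node hashref carries the attributes
 		# given to add_layer
 	}
 }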

=head3 init

 Syntax:

lib/AI/NNFlex.pm  view on Meta::CPAN

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.

=head1 SEE ALSO

 AI::NNFlex::Backprop
 AI::NNFlex::Feedforward
 AI::NNFlex::Mathlib
 AI::NNFlex::Dataset

 AI::NNEasy - Developed by Graciliano M.Passos 
 (Shares some common code with NNFlex)
 

=head1 TODO

 Lots of things:

 clean up the perldocs some more
 write gamma modules
 write BPTT modules
 write a perceptron learning module

lib/AI/NNFlex.pm  view on Meta::CPAN

v0.11 introduces the lesion method, png support in the draw module and datasets.

v0.12 fixes a bug in reinforce.pm & adds a reflector in feedforward->run to make $network->run($dataset) work.

v0.13 introduces the momentum learning algorithm and fixes a bug that allowed training to proceed even if the node activation function module can't be loaded

v0.14 fixes momentum and backprop so they are no longer nailed to tanh hidden units only.

v0.15 fixes a bug in feedforward, and reduces the debug overhead

v0.16 changes some underlying addressing of weights, to simplify and speed things up

v0.17 is a bugfix release, plus some cleanup of the UI

v0.20 changes AI::NNFlex to be a base class, and ships three different network types (i.e. training algorithms). Backprop & momentum are both networks of the feedforward class, and inherit their 'run' method from feedforward.pm. 0.20 also fixes a who...

v0.21 cleans up the perldocs more, and makes nnflex more distinctly a base module. There are quite a number of changes in Backprop in the v0.21 distribution.

v0.22 introduces the ::connect method, to allow creation of recurrent connections, and manual control over connections between nodes/layers.

v0.23 includes a Hopfield module in the distribution.

v0.24 fixes a bug in the bias weight calculations

=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net

=cut

lib/AI/NNFlex/Backprop.pm  view on Meta::CPAN


	fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

	debug=>[LIST OF CODES FOR MODULES TO DEBUG]

	learningrate=>the learning rate of the network

	momentum=>the momentum value (momentum learning only)

	round=>0 or 1 - 1 sets the network to round output values to
		nearest of 1, -1 or 0

	fahlmanconstant=>0.1
		


The following parameters are optional:

 randomweights

 fixedweights

lib/AI/NNFlex/Backprop.pm  view on Meta::CPAN

=head2 AI::NNFlex::Backprop

=head2 add_layer

 Syntax:

 $network->add_layer(	nodes=>NUMBER OF NODES IN LAYER,
			persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
			decay=>RATE OF ACTIVATION DECAY PER PASS,
			randomactivation=>MAXIMUM STARTING ACTIVATION,
			threshold=>NYI,
			activationfunction=>"ACTIVATION FUNCTION",
			errorfunction=>'ERROR TRANSFORMATION FUNCTION',
			randomweights=>MAX VALUE OF STARTING WEIGHTS);


The activation function must be defined in AI::NNFlex::Mathlib. Valid predefined activation functions are tanh & linear.

The error transformation function defines a transform that is done on the error value. It must be a valid function in AI::NNFlex::Mathlib. Using a non linear transformation function on the error value can sometimes speed up training.
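
For example (a sketch; atanh is the predefined error function
listed in AI::NNFlex::Mathlib, and the other values are
illustrative):

 $network->add_layer(	nodes=>2,
 			activationfunction=>"tanh",
 			errorfunction=>'atanh',
 			randomweights=>1);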

The following parameters are optional:

 persistentactivation

 decay

 randomactivation

 threshold

 errorfunction

 randomweights



=head2 init

 Syntax:

lib/AI/NNFlex/Backprop.pm  view on Meta::CPAN


Graciliano M.Passos for suggestions & improved code (see SEE ALSO).

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.

=head1 SEE ALSO

 AI::NNFlex

 AI::NNEasy - Developed by Graciliano M.Passos 
 Shares some common code with NNFlex. 
 

=head1 TODO



=head1 CHANGES


=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net



=cut

lib/AI/NNFlex/Dataset.pm  view on Meta::CPAN


Method to delete existing dataset entries by index

Method to validate linear separability of a dataset.

=head1 CHANGES


=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify 
it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net



=cut

lib/AI/NNFlex/Feedforward.pm  view on Meta::CPAN

 AI::NNFlex::Backprop
 AI::NNFlex::Dataset 


=head1 CHANGES



=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net

=cut

lib/AI/NNFlex/Hopfield.pm  view on Meta::CPAN


	# Get a list of all the nodes in the network
	foreach my $layer (@{$network->{'layers'}})
	{
		foreach my $node (@{$layer->{'nodes'}})
		{
			# cover the assumption that some inherited code
			# will require an activation function
			if (!$node->{'activationfunction'})
			{
				$node->{'activationfunction'}= 'hopfield_threshold';
				$node->{'activation'} =0;
				$node->{'lastactivation'} = 0;
			}
			push @nodes,$node;
		}
	}

	# we'll probably need this later
	$network->{'nodes'} = \@nodes;

lib/AI/NNFlex/Hopfield.pm  view on Meta::CPAN

	my $product = $inversepattern->multiply($patternmatrix);

	my $weights = $product->subtract($minus);

	my @element = ('1');
	my @truearray;
	for (1..scalar @{$dataset->{'data'}}){push @truearray,"1"}
	
	my $truematrix = Math::Matrix->new(\@truearray);

	my $thresholds = $truematrix->multiply($patternmatrix);
	#$thresholds = $thresholds->transpose();

	my $counter=0;
	foreach (@{$network->{'nodes'}})
	{
		my @slice;
		foreach (@{$weights->slice($counter)})
		{
			push @slice,$$_[0];
		}

		push @slice,${$thresholds->slice($counter)}[0][0];

		$_->{'connectednodes'}->{'weights'} = \@slice;
		$counter++;
	}

	return 1;

}


lib/AI/NNFlex/Hopfield.pm  view on Meta::CPAN

=head1 TODO

More detailed documentation. Better tests. More examples.
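
In the meantime, a minimal usage sketch, following t/Hopfield.t:

 use AI::NNFlex::Hopfield;
 use AI::NNFlex::Dataset;

 my $network = AI::NNFlex::Hopfield->new();
 $network->add_layer(nodes=>2);
 $network->add_layer(nodes=>2);
 $network->init();

 my $dataset = AI::NNFlex::Dataset->new();
 $dataset->add([-1, 1, -1, 1]);
 $dataset->add([-1, -1, 1, 1]);

 $network->learn($dataset);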

=head1 CHANGES

v0.1 - new module

=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net



=cut

lib/AI/NNFlex/Mathlib.pm  view on Meta::CPAN

#######################################################
#
# Version history
# ===============
#
# 1.0 	CColbourn	20050315	Compiled into a
#					single module
#
# 1.1	CColbourn	20050321	added in sigmoid_slope
#
# 1.2	CColbourn	20050330	Added in hopfield_threshold
#
# 1.3	CColbourn	20050407	Changed sigmoid function to
#					a standard sigmoid. sigmoid2
#					now contains old sigmoid,
#					which is more used in BPTT
#					and I think needs cross 
#					entropy calc to work.
#
#######################################################
#Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify

package AI::NNFlex::Mathlib;
use strict;

#######################################################
# tanh activation function
#######################################################
sub tanh
{

lib/AI/NNFlex/Mathlib.pm  view on Meta::CPAN



	my $return = $value * (1-$value);
	if (scalar @debug > 0)
	{$network->dbug("sigmoid_slope returning $return",5);}

	return $return;
}

############################################################
# hopfield_threshold
# standard hopfield threshold activation - doesn't need a 
# slope (because hopfield networks don't use them!)
############################################################
sub hopfield_threshold
{
	my $network = shift;
	my $value = shift;

	if ($value <0){return -1}
	if ($value >0){return 1}
	return $value;
}

############################################################

lib/AI/NNFlex/Mathlib.pm  view on Meta::CPAN

1;

=pod

=head1 NAME

AI::NNFlex::Mathlib - miscellaneous mathematical functions for the AI::NNFlex NN package

=head1 DESCRIPTION

The AI::NNFlex::Mathlib package contains activation and error functions. At present there are the following:

Activation functions

=over

=item *
tanh

=item *
linear

=item *
hopfield_threshold

=back

Error functions

=over

=item *
atanh

=back

If you want to implement your own activation/error functions, you can add them to this module. All activation functions to be used by certain types of net (like Backprop) require an additional function <function name>_slope, which returns the 1st ord...

This rule doesn't apply to all network types; Hopfield, for example, requires no slope calculation.
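
For example, a user-supplied pair might look like this (a sketch:
the 'ramp' function and its maths are purely illustrative; only
the <function name>/<function name>_slope convention and the
calling signature follow the module):

 package AI::NNFlex::Mathlib;

 sub ramp
 {
 	my $network = shift;
 	my $value = shift;
 	return $value > 0 ? $value : 0;
 }

 sub ramp_slope
 {
 	my $network = shift;
 	my $value = shift;
 	# 1st order derivative of ramp
 	return $value > 0 ? 1 : 0;
 }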

=head1 CHANGES

v1.2 includes hopfield_threshold

=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net



=cut

lib/AI/NNFlex/Reinforce.pm  view on Meta::CPAN

#
##########################################################
# Versions
# ========
#
# 1.0	20041125	CColbourn	New module
# 1.1	20050116	CColbourn	Fixed reverse @layers
#					bug reported by GM Passos
#
# 1.2	20050218	CColbourn	Mod'd to change weight
#					addressing from hash to
#					array for nnf0.16
#
# 1.3	20050307	CColbourn	repackaged as a subclass
#					of nnflex
#
##########################################################
# ToDo
# ----
#
#

lib/AI/NNFlex/Reinforce.pm  view on Meta::CPAN

	
	randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

	fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

	debug=>[LIST OF CODES FOR MODULES TO DEBUG]

	learningrate=>the learning rate of the network

	round=>0 or 1 - 1 sets the network to round output values to
		nearest of 1, -1 or 0


The following parameters are optional:
 randomweights
 fixedweights
 debug
 round

(Note: if randomweights is not specified, the network will default to random values from 0 to 1.)

lib/AI/NNFlex/Reinforce.pm  view on Meta::CPAN

=head2 AI::NNFlex

=head3 add_layer

 Syntax:

 $network->add_layer(	nodes=>NUMBER OF NODES IN LAYER,
			persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
			decay=>RATE OF ACTIVATION DECAY PER PASS,
			randomactivation=>MAXIMUM STARTING ACTIVATION,
			threshold=>NYI,
			activationfunction=>"ACTIVATION FUNCTION",
			randomweights=>MAX VALUE OF STARTING WEIGHTS);

=head3 init

 Syntax:

 $network->init();

Initialises connections between nodes, sets initial weights and loads external components. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers...

lib/AI/NNFlex/Reinforce.pm  view on Meta::CPAN


=head1 SEE ALSO

 AI::NNFlex
 AI::NNFlex::Backprop
 AI::NNFlex::Dataset


=head1 COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

=head1 CONTACT

 charlesc@nnflex.g0n.net



=cut

t/Backprop.t  view on Meta::CPAN

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# add an extra layer to test out connect
$result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"sigmoid",
			randomweights=>1);


# Test initialise network
$result = $network->init();
ok($result); #test 3
##


# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); 

# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],

t/Backprop.t  view on Meta::CPAN

##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

# test saving weights
$result = $network->dump_state(filename=>'state.wts',activations=>1);
ok($result);

# test loading weights
$result = $network->load_state(filename=>'state.wts');
ok($result);

t/Dataset.t  view on Meta::CPAN

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# test adding an entry
$result = $dataset->add([[1,1],[0,1]]);
ok($result);

# test save
$result = $dataset->save(filename=>'test.pat');
ok ($result);

# test empty dataset
my $dataset2 = AI::NNFlex::Dataset->new();
ok($dataset2);

# test load
$result = $dataset2->load(filename=>'test.pat');
ok($result);

#  compare original & loaded dataset
my $comparison;
if (scalar @{$dataset->{'data'}} == scalar @{$dataset2->{'data'}}){$comparison=1}
ok($comparison);

# delete a pair from the dataset
$result = $dataset->delete([4,5]);
ok($result);

# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/Hopfield.t  view on Meta::CPAN

# example script to build a hopfield net
use strict;
use AI::NNFlex::Hopfield;
use AI::NNFlex::Dataset;
use Test;


BEGIN{plan tests=>4}
my $matrixpresent = eval { require Math::Matrix; 1 };
my $matrixabsent = !$matrixpresent;

my $network = AI::NNFlex::Hopfield->new();

skip($matrixabsent,$network);


$network->add_layer(nodes=>2);
$network->add_layer(nodes=>2);

my $result = $network->init();
skip($matrixabsent,$result);

my $dataset = AI::NNFlex::Dataset->new();

$dataset->add([-1, 1, -1, 1]);
$dataset->add([-1, -1, 1, 1]);

skip($matrixabsent,$dataset);

$network->learn($dataset);

t/backprop.t  view on Meta::CPAN

my $network = AI::NNFlex::Backprop->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1,
				momentum=>0.6);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# add an extra layer to test out connect
$result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);


# Test initialise network
$result = $network->init();
ok($result); #test 3
##


# test connect layer
$result = $network->connect(fromlayer=>1,tolayer=>1);
ok($result); 

# test connect node
$result = $network->connect(fromnode=>'1,0',tonode=>'1,1');
ok($result);






# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],

t/backprop.t  view on Meta::CPAN

##


# Test a learning pass
my $err = $dataset->learn($network);
ok($err); #test 5
##


# Test a run pass
$result = $dataset->run($network);
ok($result); #test 8
##

t/reinforce.t  view on Meta::CPAN

# test create network
my $network = AI::NNFlex::Reinforce->new(randomconnections=>0,
				randomweights=>1,
				learningrate=>.1,
				debug=>[],bias=>1);

ok($network); #test 1
##

# test add layer
my $result = $network->add_layer(	nodes=>2,
			persistentactivation=>0,
			decay=>0.0,
			randomactivation=>0,
			threshold=>0.0,
			activationfunction=>"tanh",
			randomweights=>1);
ok($result); #test 2
##

# Test initialise network
$result = $network->init();
ok($result); #test 3
##

# test create dataset
my $dataset = AI::NNFlex::Dataset->new([
			[0,0],[1,1],
			[0,1],[1,0],
			[1,0],[0,1],
			[1,1],[0,0]]);
ok ($dataset); #test 4
##

# Test a run pass
$result = $dataset->run($network);
ok($result); #test 5
##


