AI-MaxEntropy


README

    The entry "type => ..." specifies which algorithm is used for the
    optimization. Valid values include:

      'lbfgs'       Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS)
  'gis'         Generalized Iterative Scaling (GIS)

    If omitted, 'lbfgs' is used by default.
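
    For example (a minimal sketch, assuming these entries form the
    "algorithm" hash reference accepted by the constructor), GIS can be
    selected like this:

      use AI::MaxEntropy;

      # sketch: select GIS instead of the default L-BFGS optimizer
      my $me = AI::MaxEntropy->new(
          algorithm => { type => 'gis' }
      );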

   progress_cb
    The entry "progress_cb => ..." specifies a progress callback
    subroutine used to trace the progress of the optimization. The
    callback is called at each iteration of the algorithm.

    For L-BFGS, "progress_cb" will be passed directly to "fmin" in
    Algorithm::LBFGS. f(x) is the negative log-likelihood of the current
    lambda vector.

    For GIS, the "progress_cb" is supposed to have a prototype like

      progress_cb(i, lambda, d_lambda, lambda_norm, d_lambda_norm)

inc/Module/AutoInstall.pm

    return 1 if -w $path;

    print << ".";
*** You are not allowed to write to the directory '$path';
    the installation may fail due to insufficient permissions.
.

    if (
        eval '$>' and lc(`sudo -V`) =~ /version/ and _prompt(
            qq(
==> Should we try to re-execute the autoinstall process with 'sudo'?),
            ((-t STDIN) ? 'y' : 'n')
        ) =~ /^[Yy]/
      )
    {

        # try to bootstrap ourselves from sudo
        print << ".";
*** Trying to re-execute the autoinstall process with 'sudo'...
.
        my $missing = join( ',', @Missing );
        my $config = join( ',',
            UNIVERSAL::isa( $Config, 'HASH' ) ? %{$Config} : @{$Config} )
          if $Config;

        return
          unless system( 'sudo', $^X, $0, "--config=$config",
            "--installdeps=$missing" );

lib/AI/MaxEntropy.pm

       last_cut => -1,
       _c => {}
    };
    return bless $self, $class;
}

sub see {
    my ($self, $x, $y, $w) = @_;
    $w = 1 if not defined($w);
    my ($x1, $y1) = ([], undef);
    # preprocess if $x is hashref
    $x = [
        map {
            my $attr = $_;
            ref($x->{$attr}) eq 'ARRAY' ?
                map { "$attr:$_" } @{$x->{$attr}} : "$_:$x->{$_}"
        } keys %$x
    ] if ref($x) eq 'HASH';
    # update af_num
    $self->{af_num} = scalar(@$x) if $self->{af_num} == 0;
    $self->{af_num} = -1 if $self->{af_num} != scalar(@$x);
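
# Usage sketch (not part of the module source): the preprocessing above
# flattens hash-ref features into "attribute:value" strings, so the two
# see() calls below supply the same set of features (the label and weight
# values are illustrative only).
use AI::MaxEntropy;
my $me = AI::MaxEntropy->new;
$me->see(['color:yellow', 'shape:long', 'texture:smooth'] => 'banana' => 2);
$me->see({ color   => 'yellow',
           shape   => 'long',
           texture => ['smooth'] } => 'banana' => 2);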

lib/AI/MaxEntropy.pm

optimization. Valid values include:

  'lbfgs'       Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS)
  'gis'         Generalized Iterative Scaling (GIS)

If omitted, C<'lbfgs'> is used by default.

=head3 progress_cb

The entry C<progress_cb =E<gt> ...> specifies a progress callback
subroutine used to trace the progress of the optimization.
The callback is called at each iteration of the algorithm.

For L-BFGS, C<progress_cb> will be passed directly to
L<Algorithm::LBFGS/fmin>. C<f(x)> is the negative log-likelihood of the
current lambda vector.

For GIS, the C<progress_cb> is supposed to have a prototype like

  progress_cb(i, lambda, d_lambda, lambda_norm, d_lambda_norm)
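
A minimal sketch of such a callback, which reports the iteration number and
the two norms (the trailing C<return 0> is a conservative choice; whether GIS
inspects the return value is not specified here):

  my $progress_cb = sub {
      my ($i, $lambda, $d_lambda, $lambda_norm, $d_lambda_norm) = @_;
      # report the iteration number and the norms of lambda and its delta
      printf "iteration %d: |lambda| = %g, |d_lambda| = %g\n",
          $i, $lambda_norm, $d_lambda_norm;
      return 0;
  };

The subroutine reference would then be supplied as the C<progress_cb> entry
alongside C<type =E<gt> 'gis'>.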

lib/AI/MaxEntropy/Model.pm

    ];
    DumpFile($file, $data);    
}

sub all_x { @{$_[0]->{x_list}} }
sub all_labels { @{$_[0]->{y_list}} }

sub score {
    my $self = shift;
    my ($x, $y) = @_;
    # preprocess if $x is hashref
    $x = [
        map {
            my $attr = $_;
            ref($x->{$attr}) eq 'ARRAY' ?
                map { "$attr:$_" } @{$x->{$attr}} : "$_:$x->{$_}"
        } keys %$x
    ] if ref($x) eq 'HASH';
    # calculate score
    my @x1 = map { $self->{x_bucket}->{$_} } @$x;
    my $lambda_f = 0;
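
# Usage sketch (not part of the module source; $model is a hypothetical,
# already-loaded AI::MaxEntropy::Model object): score() accepts the same
# feature formats as see(), and all_labels() lists the known labels, so a
# model could be queried like this:
#
#   for my $y ($model->all_labels) {
#       printf "%s => %g\n", $y, $model->score({ color => 'yellow' } => $y);
#   }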

lib/AI/MaxEntropy/Util.pm

samples in the training set and tested with samples in the testing set.
Usually, two measures of performance are of interest.
One is precision, the percentage of samples in the testing set that are
correctly predicted. The other is recall, the percentage of samples with a
certain true label that are correctly predicted.
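
As a generic illustration (not tied to this module's result format), both
measures can be computed from paired true and predicted labels; the label
arrays below are hypothetical:

    my @true = qw(x x o x o);
    my @pred = qw(x o o x o);

    # precision: fraction of all test samples predicted correctly
    my $correct   = grep { $true[$_] eq $pred[$_] } 0 .. $#true;
    my $precision = $correct / @true;

    # recall for label 'x': fraction of the samples whose true label is
    # 'x' that are also predicted as 'x'
    my $label  = 'x';
    my @with_x = grep { $true[$_] eq $label } 0 .. $#true;
    my $hits   = grep { $pred[$_] eq $label } @with_x;
    my $recall = @with_x ? $hits / @with_x : 0;

    printf "precision = %.2f, recall(%s) = %.2f\n", $precision, $label, $recall;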

=head1 FUNCTIONS

=head2 train_and_test

This function automates the process of training and testing.

    my $me = AI::MaxEntropy->new;
    my $sample = [
        [ ['a', 'b'] => 'x' => 1.5 ],
        ...
    ];
    my ($result, $model) = train_and_test($me, $sample, 'xxxo');

First, the whole sample set will be divided into a training set and a
testing set according to the specified pattern. A pattern is a string,


