AI-DecisionTree

Changes

Revision history for Perl extension AI::DecisionTree.

0.11 - Fri Mar  2 21:26:51 CST 2012

 - Fixed typo in docs - 'leaf_color' should be 'leaf_colors'. [spotted
   by Julio Fraire]

 - Silence a couple of Perl::Critic warnings 

 - Convert to Dist::Zilla 

0.09 - Sat Jul 26 23:09:44 2008

 - Make a new release so that a META.yml file is generated. [Suggested
   by Alexandr Ciornii]

 - as_graphviz(%args) wasn't passing %args through properly to the
   GraphViz->new() method.  Fixed.  [Ian Knopke]

 - get_result() can now take a callback subroutine, rather than just a
   hash reference, to check the attributes of the given instance.
   This allows lazy instance checking.
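
   A minimal sketch of the callback form (the callback body and its
   lookup function are hypothetical):

     my $result = $dtree->get_result
       (callback => sub {
          my $attr = shift;                # attribute being tested
          return fetch_attribute($attr);   # computed only when needed
        });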

0.08  Mon Jul  7 18:01:16 CDT 2003

 - Added a 'leaf_color' parameter for making GraphViz objects more
   colorful.

0.07  Fri Jun  6 10:37:51 CDT 2003

 - Created tests for the set_results() and copy_instances() methods.

 - Added documentation for as_graphviz() and increased the information
   contained in the GraphViz object.

 - Added the ability to limit the absolute depth of the tree when
   training.

0.06  Wed Sep 18 13:59:24 EST 2002

 - Fixed an XS memory leak that was afflicting all training instances.
   Added tests to make sure leak stays plugged.

 - Added the 'purge' and 'verbose' parameters to new().

 - add_instance() now accepts a 'name' parameter.

 - Users can now control whether training instances are purged after
   training, using the 'purge' parameter to new() and/or the
   do_purge() method.

 - Added the set_results() and copy_instances() methods, which let you
   re-use training instances from one tree to another.

 - Added the instances() and purge() accessor methods.
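
   A rough sketch of re-using training instances across trees (the
   %new_results hash is hypothetical):

     my $t1 = AI::DecisionTree->new(purge => 0);
     # ... add_instance() calls, then:
     $t1->train;

     my $t2 = AI::DecisionTree->new(purge => 0);
     $t2->copy_instances(from => $t1);
     $t2->set_results(\%new_results);  # instance name => new result
     $t2->train;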

0.05  Thu Sep 12 01:22:34 AEST 2002

 - Fixed a concurrency problem that occurred when making more than one
   decision tree.  All tree data is now stored as member data, not
   class data.

 - DecisionTree.pm is now pure-perl again (though Instance.pm still
   has an XS component).

 - Fixed an off-by-one bug in the Instance.xs code that could create
   garbage data.

 - Handles "sparse" data better.  Sparse data means that not every
   attribute has to be defined for every training/test instance.
   Absence can now be a meaningful property - a missing value is
   currently equivalent to a special "<undef>" value.

 - Don't trigger warnings when undefined attribute values are
   encountered (as happens with sparse data).

 - Added documentation for the 'prune' parameter to new()

 - More consistent memory allocation in Instance.xs - it now uses the
   Perl memory macros/functions from `perldoc perlclib` instead of raw
   malloc/realloc/free.

 - Catches possible infinite loop situations when growing the tree
   (which shouldn't usually happen, but other mistakes can cause it).

 - The Instance class now has a numeric-only interface, without string
   translation.  String translation is done in the main DecisionTree
   class.  This isn't really a user-visible change.

0.04  Wed Sep  4 19:52:23 AEST 2002

 - Now uses regular XS instead of Inline for the C code parts. [patch
   by Matt Sergeant]

 - Converted the inner loop of the best_attr() method to C code,
   because it was spending a lot of time in accessor methods for the C
   structures it was using.  Don't worry, I'm not going C-crazy.  I
   won't be making many (any?) more of these kinds of changes, but
   these ones were probably necessary.

 - Removed a bit of debugging code that I left in for 0.03.

0.03  Mon Sep  2 11:41:18 AEST 2002

 - Added a 'prune' parameter to new(), which controls whether the tree
   will be pruned after training.  This is usually a good idea, so the
   default is to prune.  Currently we prune using a simple
   minimum-description-length criterion.

 - Training instances are now represented using a C struct rather than
   a Perl hash.  This can dramatically reduce memory usage, though it
   doesn't have much effect on speed.  Note that Inline.pm is now
   required.

 - The list of instances is now deleted after training, since it's no
   longer needed.

 - Small speedup to the train() method, achieved by less copying of data.

 - If get_result() is called in a list context, it now returns a list
   containing the assigned result, a "confidence" score (tentative,
   subject to change), and the tree depth of the leaf this instance
   ended up at.

 - Internally, each node in the tree now contains information about
   how many training examples contributed to training this node, and
   what the distribution of their classes was.

 - Added an as_graphviz() method, which will help visualize trees.
   They're not terribly pretty graphviz objects yet, but they're
   visual.

0.02  Sat Aug 10, 2002 21:02 AEST

 - Added support for noisy data, currently by picking the best (most
   common) result when noise is encountered.  See the 'noise_mode'
   parameter to new().
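
   For example (a sketch):

     my $dtree = AI::DecisionTree->new(noise_mode => 'pick_best');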

 - Added the rule_tree() method, which returns a data structure
   representing the tree. [James Smith]

 - Significantly sped up the train() method, especially for large data
   sets.

 - The get_result() method is no longer implemented recursively, which
   simplifies it and speeds it up.

 - Reformatted the documentation and added a TO DO section.

 - Documented the nodes() method.

0.01  Sat Jun  8 12:45:03 2002

 - original version; created by h2xs 1.21 with options
   -XA -n AI::DecisionTree

INSTALL


This is the Perl distribution AI-DecisionTree.

Installing AI-DecisionTree is straightforward.

## Installation with cpanm

If you have cpanm, you only need one line:

    % cpanm AI::DecisionTree

If you are installing into a system-wide directory, you may need to pass the
"-S" flag to cpanm, which uses sudo to install the module:

    % cpanm -S AI::DecisionTree

## Installing with the CPAN shell

Alternatively, if your CPAN shell is set up, you should just be able to do:

    % cpan AI::DecisionTree

## Manual installation

As a last resort, you can manually install it. Download the tarball, untar it,
then build it:

    % perl Makefile.PL
    % make && make test

Then install it:

    % make install

If you are installing into a system-wide directory, you may need to run:

    % sudo make install

## Documentation

AI-DecisionTree documentation is available as POD.
You can run perldoc from a shell to read the documentation:

    % perldoc AI::DecisionTree

Instance/Instance.pm


1;
__END__

=head1 NAME

AI::DecisionTree::Instance - C-struct wrapper for training instances

=head1 SYNOPSIS

  use AI::DecisionTree::Instance;
  
  my $i = new AI::DecisionTree::Instance([3,5], 7, 'this_instance');
  $i->value_int(0) == 3;
  $i->value_int(1) == 5;
  $i->result_int == 7;

=head1 DESCRIPTION

This class is just a simple Perl wrapper around a C struct embodying a
single training instance.  Its purpose is to reduce memory usage.  In
a "typical" training set with about 1000 instances, memory usage can
be reduced by about a factor of 5 (from 43.7M to 8.2M in my test
program).

A fairly tight loop is also implemented that helps speed up the
training process; see the C<tally()> XS routine below, which the
trainer uses to count attribute/result co-occurrences.

Instance/Instance.xs

extern "C" {
#endif
#include "EXTERN.h"
#include "perl.h"
#include "XSUB.h"
#ifdef __cplusplus
}
#endif

typedef struct {
  char *name;
  int result;
  int num_values;
  int *values;
} Instance;

MODULE = AI::DecisionTree::Instance         PACKAGE = AI::DecisionTree::Instance

PROTOTYPES: DISABLE

Instance *
new (class, values_ref, result, name)
    char * class
    SV *   values_ref
    int    result
    char * name
  CODE:
    {
      int i;
      Instance* instance;
      AV* values = (AV*) SvRV(values_ref);
      New(0, instance, 1, Instance);
    
      instance->name   = savepv(name);
      instance->result = result;
      instance->num_values = 1 + av_len(values);
      New(0, instance->values, instance->num_values, int);
    
      for(i=0; i<instance->num_values; i++) {
        instance->values[i] = (int) SvIV( *av_fetch(values, i, 0) );
      }
    
      RETVAL = instance;
    }
  OUTPUT:
    RETVAL

char *
name (instance)
    Instance*   instance
  CODE:
    {
      RETVAL = instance->name;
    }
  OUTPUT:
    RETVAL

void
set_result (instance, result)
    Instance*   instance
    int         result
  CODE:
    {
      instance->result = result;
    }

void
set_value (instance, attribute, value)
    Instance*   instance
    int         attribute
    int         value
  PPCODE:
    {
      int *new_values;
      int i;
    
      if (attribute >= instance->num_values) {
        if (!value) return; /* Nothing to do - absent values already read as 0 */
        
        /* Grow the array to hold indices 0..attribute */
        Renew(instance->values, attribute + 1, int);
        if (!instance->values)
          croak("Couldn't grab new memory to expand instance");
        
        /* Zero the newly added slots; values[attribute] is set below */
        for (i=instance->num_values; i<attribute; i++)
          instance->values[i] = 0;
        instance->num_values = 1 + attribute;
      }
    
      instance->values[attribute] = value;
    }

int
value_int (instance, attribute)
    Instance *  instance
    int         attribute
  CODE:
    {
      if (attribute >= instance->num_values) {
        RETVAL = 0;
      }
      else {
        RETVAL = instance->values[attribute];
      }
    }
  OUTPUT:
    RETVAL

int
result_int (instance)
    Instance *  instance
  CODE:
    {
      RETVAL = instance->result;
    }
  OUTPUT:
    RETVAL

void
DESTROY (instance)
    Instance *  instance
  PPCODE:
    {
      Safefree(instance->name);
      Safefree(instance->values);
      Safefree(instance);
    }

int
tally (pkg, instances_r, tallies_r, totals_r, attr)
    char * pkg
    SV *   instances_r
    SV *   tallies_r
    SV *   totals_r
    int    attr
  CODE:
    {
      AV *instances = (AV*) SvRV(instances_r);
      HV *tallies   = (HV*) SvRV(tallies_r);
      HV *totals    = (HV*) SvRV(totals_r);
      I32 top = av_len(instances);
      int num_undef = 0;
      
      I32 i, v;
      SV **instance_r, **hash_entry, **sub_hash_entry;
      Instance *instance;
      
      for (i=0; i<=top; i++) {
	instance_r = av_fetch(instances, i, 0);
	instance = (Instance *) SvIV(SvRV(*instance_r));
        v = attr < instance->num_values ? instance->values[attr] : 0;
	/* if (!v) { num_undef++; continue; } */
	
	/* $totals{$v}++ */
	hash_entry = hv_fetch(totals, (char *)&v, sizeof(I32), 1);
	if (!SvIOK(*hash_entry)) sv_setiv(*hash_entry, 0);
	sv_setiv( *hash_entry, 1+SvIV(*hash_entry) );
	
	/* $tallies{$v}{$_->result_int}++ */
	hash_entry = hv_fetch(tallies, (char *)&v, sizeof(I32), 0);
	
	if (!hash_entry) {
	  hash_entry = hv_store(tallies, (char *)&v, sizeof(I32), newRV_noinc((SV*) newHV()), 0);
	}
	
	sub_hash_entry = hv_fetch((HV*) SvRV(*hash_entry), (char *)&(instance->result), sizeof(int), 1);
	if (!SvIOK(*sub_hash_entry)) sv_setiv(*sub_hash_entry, 0);
	sv_setiv( *sub_hash_entry, 1+SvIV(*sub_hash_entry) );
      }

      RETVAL = num_undef;

	/*  Old code:
      foreach (@$instances) {
	my $v = $_->value_int($all_attr->{$attr});
	next unless $v;
	$totals{ $v }++;
	$tallies{ $v }{ $_->result_int }++;
      }
	*/
    }
  OUTPUT:
    RETVAL

Instance/Makefile.PL


use ExtUtils::MakeMaker;

my $module = 'AI::DecisionTree::Instance';
my ($name, $dir);
($name = $module) =~ s/.*::(\w+)/$1.pm/;
($dir  = $module) =~ s/::/-/g;

WriteMakefile
  (
   'NAME'       => $module,
   'VERSION_FROM' => $name, # finds $VERSION
  );

Instance/t/02-leaktest.t

#!/usr/bin/perl

use Test;
BEGIN { plan tests => 4 }

use AI::DecisionTree::Instance;
ok(1);

my $x = 0;
{
  local *{"AI::DecisionTree::Instance::DESTROY"} = sub { $x = 1 };
  {
    my $i = new AI::DecisionTree::Instance([3,4], 4, "foo");
    ok $x, 0;
  }
  ok $x, 1;
}
ok $x, 1;

Instance/typemap

TYPEMAP
Instance *		WRAPPED_C_OBJECT


INPUT
WRAPPED_C_OBJECT
	$var = (($type)SvIV(SvRV($arg)))

OUTPUT
WRAPPED_C_OBJECT
	/* 'class' must be an argument to any function that returns this datatype */
	sv_setref_pv($arg, class, (void*)$var);

LICENSE

This software is copyright (c) 2012 by Ken Williams.

This is free software; you can redistribute it and/or modify it under
the same terms as the Perl 5 programming language system itself.

Terms of the Perl programming language system itself

a) the GNU General Public License as published by the Free
   Software Foundation; either version 1, or (at your option) any
   later version, or
b) the "Artistic License"

--- The GNU General Public License, Version 1, February 1989 ---

This software is Copyright (c) 2012 by Ken Williams.

This is free software, licensed under:

  The GNU General Public License, Version 1, February 1989

                    GNU GENERAL PUBLIC LICENSE
                     Version 1, February 1989

 Copyright (C) 1989 Free Software Foundation, Inc.
                    51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA

 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The license agreements of most software companies try to keep users
at the mercy of those companies.  By contrast, our General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users.  The
General Public License applies to the Free Software Foundation's
software and to any other program whose authors commit to using it.
You can use it for your programs, too.

  When we speak of free software, we are referring to freedom, not
price.  Specifically, the General Public License is designed to make
sure that you have the freedom to give away or sell copies of free
software, that you receive source code or can get it if you want it,
that you can change the software or use pieces of it in new free
programs; and that you know you can do these things.

  To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

  For example, if you distribute copies of a such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have.  You must make sure that they, too, receive or can get the
source code.  And you must tell them their rights.

  We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

  Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software.  If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

  The precise terms and conditions for copying, distribution and
modification follow.

                    GNU GENERAL PUBLIC LICENSE
   TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

  0. This License Agreement applies to any program or other work which
contains a notice placed by the copyright holder saying it may be
distributed under the terms of this General Public License.  The
"Program", below, refers to any such program or work, and a "work based
on the Program" means either the Program or any work containing the
Program or a portion of it, either verbatim or with modifications.  Each
licensee is addressed as "you".

  1. You may copy and distribute verbatim copies of the Program's source
code as you receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice and
disclaimer of warranty; keep intact all the notices that refer to this
General Public License and to the absence of any warranty; and give any
other recipients of the Program a copy of this General Public License
along with the Program.  You may charge a fee for the physical act of
transferring a copy.

  2. You may modify your copy or copies of the Program or any portion of
it, and copy and distribute such modifications under the terms of Paragraph
1 above, provided that you also do the following:

    a) cause the modified files to carry prominent notices stating that
    you changed the files and the date of any change; and

    b) cause the whole of any work that you distribute or publish, that
    in whole or in part contains the Program or any part thereof, either
    with or without modifications, to be licensed at no charge to all
    third parties under the terms of this General Public License (except
    that you may choose to grant warranty protection to some or all
    third parties, at your option).

    c) If the modified program normally reads commands interactively when
    run, you must cause it, when started running for such interactive use
    in the simplest and most usual way, to print or display an
    announcement including an appropriate copyright notice and a notice
    that there is no warranty (or else, saying that you provide a
    warranty) and that users may redistribute the program under these
    conditions, and telling the user how to view a copy of this General
    Public License.

    d) You may charge a fee for the physical act of transferring a
    copy, and you may at your option offer warranty protection in
    exchange for a fee.

Mere aggregation of another independent work with the Program (or its
derivative) on a volume of a storage or distribution medium does not bring
the other work under the scope of these terms.

  3. You may copy and distribute the Program (or a portion or derivative of
it, under Paragraph 2) in object code or executable form under the terms of
Paragraphs 1 and 2 above provided that you also do one of the following:

    a) accompany it with the complete corresponding machine-readable
    source code, which must be distributed under the terms of
    Paragraphs 1 and 2 above; or,

    b) accompany it with a written offer, valid for at least three
    years, to give any third party free (except for a nominal charge
    for the cost of distribution) a complete machine-readable copy of the
    corresponding source code, to be distributed under the terms of
    Paragraphs 1 and 2 above; or,

    c) accompany it with the information you received as to where the
    corresponding source code may be obtained.  (This alternative is
    allowed only for noncommercial distribution and only if you
    received the program in object code or executable form alone.)

Source code for a work means the preferred form of the work for making
modifications to it.  For an executable file, complete source code means
all the source code for all modules it contains; but, as a special
exception, it need not include source code for modules which are standard
libraries that accompany the operating system on which the executable
file runs, or for standard header files or definitions files that
accompany that operating system.

  4. You may not copy, modify, sublicense, distribute or transfer the
Program except as expressly provided under this General Public License.
Any attempt otherwise to copy, modify, sublicense, distribute or transfer
the Program is void, and will automatically terminate your rights to use
the Program under this License.  However, parties who have received
copies, or rights to use copies, from you under this General Public
License will not have their licenses terminated so long as such parties
remain in full compliance.

  5. By copying, distributing or modifying the Program (or any work based
on the Program) you indicate your acceptance of this license to do so,
and all its terms and conditions.

  6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the original
licensor to copy, distribute or modify the Program subject to these
terms and conditions.  You may not impose any further restrictions on the
recipients' exercise of the rights granted herein.

  7. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time.  Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number.  If the Program
specifies a version number of the license which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation.  If the Program does not specify a version number of
the license, you may choose any version ever published by the Free Software
Foundation.

  8. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission.  For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this.  Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

                            NO WARRANTY

  9. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.  EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.  THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU.  SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

  10. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

                     END OF TERMS AND CONDITIONS

        Appendix: How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to humanity, the best way to achieve this is to make it
free software which everyone can redistribute and change under these
terms.

  To do so, attach the following notices to the program.  It is safest to
attach them to the start of each source file to most effectively convey
the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) 19yy  <name of author>

    This program is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation; either version 1, or (at your option)
    any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program; if not, write to the Free Software
    Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston MA  02110-1301 USA


Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

    Gnomovision version 69, Copyright (C) 19xx name of author
    Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the
appropriate parts of the General Public License.  Of course, the
commands you use may be called something other than `show w' and `show
c'; they could even be mouse-clicks or menu items--whatever suits your
program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary.  Here a sample; alter the names:

  Yoyodyne, Inc., hereby disclaims all copyright interest in the
  program `Gnomovision' (a program to direct compilers to make passes
  at assemblers) written by James Hacker.

  <signature of Ty Coon>, 1 April 1989
  Ty Coon, President of Vice

That's all there is to it!


--- The Artistic License 1.0 ---

This software is Copyright (c) 2012 by Ken Williams.

This is free software, licensed under:

  The Artistic License 1.0

The Artistic License

Preamble

The intent of this document is to state the conditions under which a Package
may be copied, such that the Copyright Holder maintains some semblance of
artistic control over the development of the package, while giving the users of
the package the right to use and distribute the Package in a more-or-less
customary fashion, plus the right to make reasonable modifications.

Definitions:

  - "Package" refers to the collection of files distributed by the Copyright
    Holder, and derivatives of that collection of files created through
    textual modification. 
  - "Standard Version" refers to such a Package if it has not been modified,
    or has been modified in accordance with the wishes of the Copyright
    Holder. 
  - "Copyright Holder" is whoever is named in the copyright or copyrights for
    the package. 
  - "You" is you, if you're thinking about copying or distributing this Package.
  - "Reasonable copying fee" is whatever you can justify on the basis of media
    cost, duplication charges, time of people involved, and so on. (You will
    not be required to justify it to the Copyright Holder, but only to the
    computing community at large as a market that must bear the fee.) 
  - "Freely Available" means that no fee is charged for the item itself, though
    there may be fees involved in handling the item. It also means that
    recipients of the item may redistribute it under the same conditions they
    received it. 

1. You may make and give away verbatim copies of the source form of the
Standard Version of this Package without restriction, provided that you
duplicate all of the original copyright notices and associated disclaimers.

2. You may apply bug fixes, portability fixes and other modifications derived
from the Public Domain or from the Copyright Holder. A Package modified in such
a way shall still be considered the Standard Version.

3. You may otherwise modify your copy of this Package in any way, provided that
you insert a prominent notice in each changed file stating how and when you
changed that file, and provided that you do at least ONE of the following:

  a) place your modifications in the Public Domain or otherwise make them
     Freely Available, such as by posting said modifications to Usenet or an
     equivalent medium, or placing the modifications on a major archive site
     such as ftp.uu.net, or by allowing the Copyright Holder to include your
     modifications in the Standard Version of the Package.

  b) use the modified Package only within your corporation or organization.

  c) rename any non-standard executables so the names do not conflict with
     standard executables, which must also be provided, and provide a separate
     manual page for each non-standard executable that clearly documents how it
     differs from the Standard Version.

  d) make other distribution arrangements with the Copyright Holder.

4. You may distribute the programs of this Package in object code or executable
form, provided that you do at least ONE of the following:

  a) distribute a Standard Version of the executables and library files,
     together with instructions (in the manual page or equivalent) on where to
     get the Standard Version.

  b) accompany the distribution with the machine-readable source of the Package
     with your modifications.

  c) accompany any non-standard executables with their corresponding Standard
     Version executables, giving the non-standard executables non-standard
     names, and clearly documenting the differences in manual pages (or
     equivalent), together with instructions on where to get the Standard
     Version.

  d) make other distribution arrangements with the Copyright Holder.

5. You may charge a reasonable copying fee for any distribution of this
Package.  You may charge any fee you choose for support of this Package. You
may not charge a fee for this Package itself. However, you may distribute this
Package in aggregate with other (possibly commercial) programs as part of a
larger (possibly commercial) software distribution provided that you do not
advertise this Package as a product of your own.

6. The scripts and library files supplied as input to or produced as output
from the programs of this Package do not automatically fall under the copyright

META.yml

---
abstract: 'Automatically Learns Decision Trees'
author:
  - 'Ken Williams <kwilliams@cpan.org>'
build_requires:
  English: 0
  Test: 0
  Test::More: 0
  warnings: 0
configure_requires:
  ExtUtils::MakeMaker: 6.30
dynamic_config: 0
generated_by: 'Dist::Zilla version 4.300009, CPAN::Meta::Converter version 2.120351'
license: perl
meta-spec:
  url: http://module-build.sourceforge.net/META-spec-v1.4.html
  version: 1.4
name: AI-DecisionTree
requires:
  Carp: 0
  GraphViz: 0
  strict: 0
  vars: 0
resources:
  bugtracker: http://rt.cpan.org/Public/Dist/Display.html?Name=AI-DecisionTree
  repository: git://github.com/kenahoo/perl-ai-decisiontree.git
version: 0.11

Makefile.PL

use strict;
use warnings;



use ExtUtils::MakeMaker 6.30;



my %WriteMakefileArgs = (
  "ABSTRACT" => "Automatically Learns Decision Trees",
  "AUTHOR" => "Ken Williams <kwilliams\@cpan.org>",
  "BUILD_REQUIRES" => {
    "English" => 0,
    "Test" => 0,
    "Test::More" => 0,
    "warnings" => 0
  },
  "CONFIGURE_REQUIRES" => {
    "ExtUtils::MakeMaker" => "6.30"
  },
  "DISTNAME" => "AI-DecisionTree",
  "EXE_FILES" => [],
  "LICENSE" => "perl",
  "NAME" => "AI::DecisionTree",
  "PREREQ_PM" => {
    "Carp" => 0,
    "GraphViz" => 0,
    "strict" => 0,
    "vars" => 0
  },
  "VERSION" => "0.11",
  "test" => {
    "TESTS" => "t/*.t"
  }
);


unless ( eval { ExtUtils::MakeMaker->VERSION(6.56) } ) {
  my $br = delete $WriteMakefileArgs{BUILD_REQUIRES};
  my $pp = $WriteMakefileArgs{PREREQ_PM};
  for my $mod ( keys %$br ) {
    if ( exists $pp->{$mod} ) {
      $pp->{$mod} = $br->{$mod} if $br->{$mod} > $pp->{$mod};
    }
    else {
      $pp->{$mod} = $br->{$mod};
    }
  }
}

delete $WriteMakefileArgs{CONFIGURE_REQUIRES}
  unless eval { ExtUtils::MakeMaker->VERSION(6.52) };

WriteMakefile(%WriteMakefileArgs);



README



This archive contains the distribution AI-DecisionTree,
version 0.11:

  Automatically Learns Decision Trees

This software is copyright (c) 2012 by Ken Williams.

This is free software; you can redistribute it and/or modify it under
the same terms as the Perl 5 programming language system itself.


SIGNATURE

This file contains message digests of all files listed in MANIFEST,
signed via the Module::Signature module, version 0.68.

To verify the content in this distribution, first make sure you have
Module::Signature installed, then type:

    % cpansign -v

It will check each file's integrity, as well as the signature's
validity.  If "==> Signature verified OK! <==" is not displayed,
the distribution may already have been compromised, and you should
not run its Makefile.PL or Build.PL.

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

SHA1 e01a2ac643b28c7aa7d0d07a8185899a6aa32b13 Changes

eg/example.pl

#!/usr/bin/perl

use AI::DecisionTree;

my @attributes = qw(outlook  temperature  humidity  wind    play_tennis);               
my @cases      = qw(
		    sunny    hot          high      weak    no
		    sunny    hot          high      strong  no
		    overcast hot          high      weak    yes
		    rain     mild         high      weak    yes
		    rain     cool         normal    weak    yes
		    rain     cool         normal    strong  no
		    overcast cool         normal    strong  yes
		    sunny    mild         high      weak    no
		    sunny    cool         normal    weak    yes
		    rain     mild         normal    weak    yes
		    sunny    mild         normal    strong  yes
		    overcast mild         high      strong  yes
		    overcast hot          normal    weak    yes
		    rain     mild         high      strong  no
		   );
my $outcome = pop @attributes;


my $dtree = new AI::DecisionTree;
while (@cases) {
  my @values = splice @cases, 0, 1 + scalar(@attributes);
  my $result = pop @values;
  my %pairs;
  @pairs{@attributes} = @values;

  $dtree->add_instance(attributes => \%pairs,
		       result => $result,
		      );
}
$dtree->train;


my $result;
# Try one of the training examples
$result = $dtree->get_result( attributes => {
					     outlook   => 'rain',
					     temperature => 'mild',
					     humidity => 'high',
					     wind => 'strong',
					    } );
print "Result 1: $result\n";  # no

# Try a new unseen example
$result = $dtree->get_result( attributes => {
					     outlook => 'sunny',
					     temperature => 'hot',
					     humidity => 'normal',
					     wind => 'strong',
					    } );
print "Result 2: $result\n";  # yes



# Show the created tree structure as rules
print map "$_\n", $dtree->rule_statements;


# Will barf on inconsistent data
my $t2 = new AI::DecisionTree;
$t2->add_instance( attributes => { foo => 'bar' },
		   result => 1 );
$t2->add_instance( attributes => { foo => 'bar' },
		   result => 0 );
eval {$t2->train};
print "$@\n";

lib/AI/DecisionTree.pm

use strict;

package AI::DecisionTree;
{
  $AI::DecisionTree::VERSION = '0.11';
}

use AI::DecisionTree::Instance;
use Carp;
use vars qw(@ISA);


sub new {
  my $package = shift;
  return bless {
		noise_mode => 'fatal',
		prune => 1,
		purge => 1,
		verbose => 0,
		max_depth => 0,
		@_,
		nodes => 0,
		instances => [],
		name_gen => 0,
	       }, $package;
}
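
# Construction sketch (values here are illustrative, not the defaults):
#
#   my $dtree = AI::DecisionTree->new(
#     noise_mode => 'pick_best',  # default is 'fatal'
#     prune      => 0,            # default is 1 (prune after training)
#     max_depth  => 3,            # default 0 = unlimited depth
#     verbose    => 1,            # print a progress dot per node
#   );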

sub nodes      { $_[0]->{nodes} }
sub noise_mode { $_[0]->{noise_mode} }
sub depth      { $_[0]->{depth} }

sub add_instance {
  my ($self, %args) = @_;
  croak "Missing 'attributes' parameter" unless $args{attributes};
  croak "Missing 'result' parameter" unless defined $args{result};
  $args{name} = $self->{name_gen}++ unless exists $args{name};
  
  my @attributes;
  while (my ($k, $v) = each %{$args{attributes}}) {
    $attributes[ _hlookup($self->{attributes}, $k) ] = _hlookup($self->{attribute_values}{$k}, $v);
  }
  $_ ||= 0 foreach @attributes;
  
  push @{$self->{instances}}, AI::DecisionTree::Instance->new(\@attributes, _hlookup($self->{results}, $args{result}), $args{name});
}

sub _hlookup {
  $_[0] ||= {}; # Autovivify as a hash
  my ($hash, $key) = @_;
  unless (exists $hash->{$key}) {
    $hash->{$key} = 1 + keys %$hash;
  }
  return $hash->{$key};
}
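
# _hlookup interns each distinct string as a small positive integer:
#   _hlookup($h, 'sunny')    # => 1 (first key seen)
#   _hlookup($h, 'overcast') # => 2
#   _hlookup($h, 'sunny')    # => 1 again
# 0 is never handed out, so it can mean "value absent" in the packed
# integer arrays passed to AI::DecisionTree::Instance above.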

sub _create_lookup_hashes {
  my $self = shift;
  my $h = $self->{results};
  $self->{results_reverse} = [ undef, sort {$h->{$a} <=> $h->{$b}} keys %$h ];
  
  foreach my $attr (keys %{$self->{attribute_values}}) {
    my $h = $self->{attribute_values}{$attr};
    $self->{attribute_values_reverse}{$attr} = [ undef, sort {$h->{$a} <=> $h->{$b}} keys %$h ];
  }
}

sub train {
  my ($self, %args) = @_;
  if (not @{ $self->{instances} }) {
    croak "Training data has been purged, can't re-train" if $self->{tree};
    croak "Must add training instances before calling train()";
  }
  
  $self->_create_lookup_hashes;
  local $self->{curr_depth} = 0;
  local $self->{max_depth} = $args{max_depth} if exists $args{max_depth};
  $self->{depth} = 0;
  $self->{tree} = $self->_expand_node( instances => $self->{instances} );
  $self->{total_instances} = @{$self->{instances}};
  
  $self->prune_tree if $self->{prune};
  $self->do_purge if $self->purge;
  return 1;
}
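
# Training sketch: a max_depth passed here is localized above, so it
# overrides the constructor's value for this call only, e.g.:
#
#   $dtree->train(max_depth => 2);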

sub do_purge {
  my $self = shift;
  delete @{$self}{qw(instances attribute_values attribute_values_reverse results results_reverse)};
}

sub copy_instances {
  my ($self, %opt) = @_;
  croak "Missing 'from' parameter to copy_instances()" unless exists $opt{from};
  my $other = $opt{from};
  croak "'from' parameter is not a decision tree" unless UNIVERSAL::isa($other, __PACKAGE__);

  foreach (qw(instances attributes attribute_values results)) {
    $self->{$_} = $other->{$_};
  }
  $self->_create_lookup_hashes;
}

sub set_results {
  my ($self, $hashref) = @_;
  foreach my $instance (@{$self->{instances}}) {
    my $name = $instance->name;
    croak "No result given for instance '$name'" unless exists $hashref->{$name};
    $instance->set_result( $self->{results}{ $hashref->{$name} } );
  }
}

sub instances { $_[0]->{instances} }

sub purge {
  my $self = shift;
  $self->{purge} = shift if @_;
  return $self->{purge};
}

# Each node contains:
#  { split_on => $attr_name,
#    children => { $attr_value1 => $node1,
#                  $attr_value2 => $node2, ... }
#  }
# or
#  { result => $result }

sub _expand_node {
  my ($self, %args) = @_;
  my $instances = $args{instances};
  print STDERR '.' if $self->{verbose};
  
  $self->{depth} = $self->{curr_depth} if $self->{curr_depth} > $self->{depth};
  local $self->{curr_depth} = $self->{curr_depth} + 1;
  $self->{nodes}++;

  my %results;
  $results{$self->_result($_)}++ foreach @$instances;
  my @results = map {$_,$results{$_}} sort {$results{$b} <=> $results{$a}} keys %results;
  my %node = ( distribution => \@results, instances => scalar @$instances );

  foreach (keys %results) {
    $self->{prior_freqs}{$_} += $results{$_};
  }

  if (keys(%results) == 1) {
    # All these instances have the same result - make this node a leaf
    $node{result} = $self->_result($instances->[0]);
    return \%node;
  }
  
  # Multiple values are present - find the best predictor attribute and split on it
  my $best_attr = $self->best_attr($instances);

  croak "Inconsistent data, can't build tree with noise_mode='fatal'"
    if $self->{noise_mode} eq 'fatal' and !defined $best_attr;

  if ( !defined($best_attr)
       or $self->{max_depth} && $self->{curr_depth} > $self->{max_depth} ) {
    # Pick the most frequent result for this leaf
    $node{result} = (sort {$results{$b} <=> $results{$a}} keys %results)[0];
    return \%node;
  }
  
  $node{split_on} = $best_attr;
  
  my %split;
  foreach my $i (@$instances) {
    my $v = $self->_value($i, $best_attr);
    push @{$split{ defined($v) ? $v : '<undef>' }}, $i;
  }
  die ("Something's wrong: attribute '$best_attr' didn't split ",
       scalar @$instances, " instances into multiple buckets (@{[ keys %split ]})")
    unless keys %split > 1;

  foreach my $value (keys %split) {
    $node{children}{$value} = $self->_expand_node( instances => $split{$value} );
  }
  
  return \%node;
}

sub best_attr {
  my ($self, $instances) = @_;

  # 0 is a perfect score, entropy(#instances) is the worst possible score
  
  my ($best_score, $best_attr) = (@$instances * $self->entropy( map $_->result_int, @$instances ), undef);
  my $all_attr = $self->{attributes};
  foreach my $attr (keys %$all_attr) {

    # %tallies is correlation between each attr value and result
    # %total is number of instances with each attr value
    my (%totals, %tallies);
    my $num_undef = AI::DecisionTree::Instance::->tally($instances, \%tallies, \%totals, $all_attr->{$attr});
    next unless keys %totals; # Make sure at least one instance defines this attribute
    
    my $score = 0;
    while (my ($opt, $vals) = each %tallies) {
      $score += $totals{$opt} * $self->entropy2( $vals, $totals{$opt} );
    }

    ($best_attr, $best_score) = ($attr, $score) if $score < $best_score;
  }
  
  return $best_attr;
}

sub entropy2 {
  shift;
  my ($counts, $total) = @_;

  # Entropy is defined with log base 2 - we just divide by log(2) at the end to adjust.
  my $sum = 0;
  $sum += $_ * log($_) foreach values %$counts;
  return +(log($total) - $sum/$total)/log(2);
}

sub entropy {
  shift;

  my %count;
  $count{$_}++ foreach @_;

  # Entropy is defined with log base 2 - we just divide by log(2) at the end to adjust.
  my $sum = 0;
  $sum += $_ * log($_) foreach values %count;
  return +(log(@_) - $sum/@_)/log(2);
}
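
# Sketch (not executed): the closed form above equals textbook entropy.
# For counts c_i summing to N,
#   -sum_i (c_i/N)*log2(c_i/N) == ( log(N) - (sum_i c_i*log(c_i))/N ) / log(2)
# e.g. results (a,a,b,c): counts (2,1,1), N=4
#   direct:      -(0.5*-1 + 0.25*-2 + 0.25*-2)    = 1.5 bits
#   closed form: (log(4) - (2*log(2))/4) / log(2) = 2 - 0.5 = 1.5 bits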

sub prune_tree {
  my $self = shift;

  # We use a minimum-description-length approach.  We calculate the
  # score of each node:
  #  n = number of nodes below
  #  r = number of results (categories) in the entire tree
  #  i = number of instances in the entire tree
  #  e = number of errors below this node

  # Hypothesis description length (MML):
  #  describe tree: number of nodes + number of edges
  #  describe exceptions: num_exceptions * log2(total_num_instances) * log2(total_num_results)
  
  my $r = keys %{ $self->{results} };
  my $i = $self->{tree}{instances};
  my $exception_cost = log($r) * log($i) / log(2)**2;
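  # e.g. with (hypothetical) r=2 results and i=100 instances:
  #   exception_cost = log2(2) * log2(100) ~= 6.64 bits per exception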

  # Pruning can turn a branch into a leaf
  my $maybe_prune = sub {
    my ($self, $node) = @_;
    return unless $node->{children};  # Can't prune leaves

    my $nodes_below = $self->nodes_below($node);
    my $tree_cost = 2 * $nodes_below - 1;  # $edges_below == $nodes_below - 1
    
    my $exceptions = $self->exceptions( $node );
    my $simple_rule_exceptions = $node->{instances} - $node->{distribution}[1];

    my $score = -$nodes_below - ($exceptions - $simple_rule_exceptions) * $exception_cost;
    #warn "Score = $score = -$nodes_below - ($exceptions - $simple_rule_exceptions) * $exception_cost\n";
    if ($score < 0) {
      delete @{$node}{'children', 'split_on', 'exceptions', 'nodes_below'};
      $node->{result} = $node->{distribution}[0];
      # XXX I'm not cleaning up 'exceptions' or 'nodes_below' keys up the tree
    }
  };

  $self->_traverse($maybe_prune);
}

sub exceptions {
  my ($self, $node) = @_;
  return $node->{exceptions} if exists $node->{exceptions};
  
  my $count = 0;
  if ( exists $node->{result} ) {
    $count = $node->{instances} - $node->{distribution}[1];
  } else {
    foreach my $child ( values %{$node->{children}} ) {
      $count += $self->exceptions($child);
    }
  }
  
  return $node->{exceptions} = $count;
}

sub nodes_below {
  my ($self, $node) = @_;
  return $node->{nodes_below} if exists $node->{nodes_below};

  my $count = 0;
  $self->_traverse( sub {$count++}, $node );

  return $node->{nodes_below} = $count - 1;
}

# This is *not* for external use, I may change it.
sub _traverse {
  my ($self, $callback, $node, $parent, $node_name) = @_;
  $node ||= $self->{tree};
  
  ref($callback) ? $callback->($self, $node, $parent, $node_name) : $self->$callback($node, $parent, $node_name);
  
  return unless $node->{children};
  foreach my $child ( keys %{$node->{children}} ) {
    $self->_traverse($callback, $node->{children}{$child}, $node, $child);
  }
}

sub get_result {
  my ($self, %args) = @_;
  croak "Missing 'attributes' or 'callback' parameter" unless $args{attributes} or $args{callback};

  $self->train unless $self->{tree};
  my $tree = $self->{tree};
  
  while (1) {
    if (exists $tree->{result}) {
      my $r = $tree->{result};
      return $r unless wantarray;

      my %dist = @{$tree->{distribution}};
      my $confidence = $tree->{distribution}[1] / $tree->{instances};

#      my $confidence = P(H|D) = [P(D|H)P(H)]/[P(D|H)P(H)+P(D|H')P(H')]
#                              = [P(D|H)P(H)]/P(D);
#      my $confidence = 
#      $confidence *= $self->{prior_freqs}{$r} / $self->{total_instances};
      
      return ($r, $confidence, \%dist);
    }
    
    my $instance_val = (exists $args{callback} ? $args{callback}->($tree->{split_on}) :
			exists $args{attributes}{$tree->{split_on}} ? $args{attributes}{$tree->{split_on}} :
			'<undef>');
    ## no critic (ProhibitExplicitReturnUndef)
    $tree = $tree->{children}{ $instance_val }
      or return undef;
  }
}
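
# List-context sketch (attribute values are illustrative):
#
#   my ($result, $confidence, $dist) = $dtree->get_result
#     (attributes => { outlook => 'rain', wind => 'strong' });
#   # $confidence = fraction of this leaf's training instances that
#   # carried $result; %$dist maps each result to its training count.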

sub as_graphviz {
  my ($self, %args) = @_;
  my $colors = delete $args{leaf_colors} || {};
  require GraphViz;
  my $g = GraphViz->new(%args);

  my $id = 1;
  my $add_edge = sub {
    my ($self, $node, $parent, $node_name) = @_;
    # We use stringified reference names for node names, as a convenient hack.

    if ($node->{split_on}) {
      $g->add_node( "$node",
		    label => $node->{split_on},
		    shape => 'ellipse',
		  );
    } else {
      my $i = 0;
      my $distr = join ',', grep {$i++ % 2} @{$node->{distribution}};
      my %fill = (exists $colors->{$node->{result}} ?
		  (fillcolor => $colors->{$node->{result}},
		   style => 'filled') :
		  ()
		 );
      $g->add_node( "$node",
		    label => "$node->{result} ($distr)",
		    shape => 'box',
		    %fill,
		  );
    }
    $g->add_edge( "$parent" => "$node",
		  label => $node_name,
		) if $parent;
  };

  $self->_traverse( $add_edge );
  return $g;
}
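
# Rendering sketch (assumes GraphViz is installed; the colors and the
# output path are hypothetical):
#
#   my $g = $dtree->as_graphviz(leaf_colors => { yes => 'green',
#                                                no  => 'red' });
#   open my $fh, '>', '/tmp/tree.png' or die $!;
#   print $fh $g->as_png;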

sub rule_tree {
  my $self = shift;
  my ($tree) = @_ ? @_ : $self->{tree};
  
  # build tree:
  # [ question, { results => [ question, { ... } ] } ]
  
  return $tree->{result} if exists $tree->{result};
  
  return [
	  $tree->{split_on}, {
			      map { $_ => $self->rule_tree($tree->{children}{$_}) } keys %{$tree->{children}},
			     }
	 ];
}
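
# Shape sketch, for the tennis data used elsewhere in this dist:
#
#   [ 'outlook', {
#       sunny    => [ 'humidity', { high => 'no', normal => 'yes' } ],
#       overcast => 'yes',
#       rain     => [ 'wind', { strong => 'no', weak => 'yes' } ],
#   } ]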

sub rule_statements {
  my $self = shift;
  my ($stmt, $tree) = @_ ? @_ : ('', $self->{tree});
  return("$stmt -> '$tree->{result}'") if exists $tree->{result};
  
  my @out;
  my $prefix = $stmt ? "$stmt and" : "if";
  foreach my $val (keys %{$tree->{children}}) {
    push @out, $self->rule_statements("$prefix $tree->{split_on}='$val'", $tree->{children}{$val});
  }
  return @out;
}
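
# Sample output shape (actual rules depend on the trained tree):
#
#   if outlook='overcast' -> 'yes'
#   if outlook='rain' and wind='strong' -> 'no'
#   if outlook='rain' and wind='weak' -> 'yes'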

### Some instance accessor stuff:

sub _result {
  my ($self, $instance) = @_;
  my $int = $instance->result_int;
  return $self->{results_reverse}[$int];
}

sub _delete_value {
  my ($self, $instance, $attr) = @_;
  my $val = $self->_value($instance, $attr);
  return unless defined $val;
  
  $instance->set_value($self->{attributes}{$attr}, 0);
  return $val;
}

sub _value {
  my ($self, $instance, $attr) = @_;
  return unless exists $self->{attributes}{$attr};
  my $val_int = $instance->value_int($self->{attributes}{$attr});
  return $self->{attribute_values_reverse}{$attr}[$val_int];
}



1;
__END__

=head1 NAME

AI::DecisionTree - Automatically Learns Decision Trees

=head1 VERSION

version 0.11

=head1 SYNOPSIS

  use AI::DecisionTree;
  my $dtree = new AI::DecisionTree;
  
  # A set of training data for deciding whether to play tennis
  $dtree->add_instance
    (attributes => {outlook     => 'sunny',
                    temperature => 'hot',
                    humidity    => 'high'},
     result => 'no');
  
  $dtree->add_instance
    (attributes => {outlook     => 'overcast',
                    temperature => 'hot',
                    humidity    => 'normal'},
     result => 'yes');

  ... repeat for several more instances, then:
  $dtree->train;
  
  # Find results for unseen instances
  my $result = $dtree->get_result
    (attributes => {outlook     => 'sunny',
                    temperature => 'hot',
                    humidity    => 'normal'});
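
  # In list context, get_result() also returns a confidence score and
  # the leaf's result distribution (a sketch; values are illustrative):
  my ($res, $confidence, $dist) = $dtree->get_result
    (attributes => {outlook     => 'sunny',
                    temperature => 'hot',
                    humidity    => 'normal'});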

=head1 DESCRIPTION

The C<AI::DecisionTree> module automatically creates so-called
"decision trees" to explain a set of training data.  A decision tree
is a kind of categorizer that uses a flowchart-like process for
categorizing new instances.  For instance, a learned decision tree
might look like the following, which classifies for the concept "play
tennis":

                   OUTLOOK
                   /  |  \
                  /   |   \
                 /    |    \
           sunny/  overcast \rainy
               /      |      \
          HUMIDITY    |       WIND
          /  \       *no*     /  \
         /    \              /    \
    high/      \normal      /      \
       /        \    strong/        \weak
     *no*      *yes*      /          \
                        *no*        *yes*

(This example, and the inspiration for the C<AI::DecisionTree> module,
come directly from Tom Mitchell's excellent book "Machine Learning",
available from McGraw Hill.)

A decision tree like this one can be learned from training data, and
then applied to previously unseen data to obtain results that are
consistent with the training data.

The usual goal of a decision tree is to somehow encapsulate the
training data in the smallest possible tree.

t/02-noisy.t


my $dtree = AI::DecisionTree->new(noise_mode => 'pick_best');
ok $dtree;

my @names = split /, /, <DATA>;
chomp $names[-1];

# Train on first 600 instances
printf "Loading 600 training instances with %d attribute types each\n", scalar @names;
while (<DATA>) {
  last unless 2..601;
  
  chomp;
  my @values = split /, /, $_;
  my $result = pop @values;
  my %pairs = map {$names[$_], $values[$_]} 0..$#names;

  $dtree->add_instance(attributes => \%pairs,
		       result => $result,
		      );
}
print "Building decision tree\n";
$dtree->train;
ok(1);

# Test on rest of data, get at least 80%
print "Testing on remainder of data\n";
my ($good, $bad) = (0,0);
while (<DATA>) {
  chomp;
  my @values = split /, /, $_;
  my $result = pop @values;
  my %pairs = map {$names[$_], $values[$_]} 0..$#names;

  my ($guess, $confidence) = $dtree->get_result(attributes => \%pairs);
  $guess ||= '';  $confidence ||= '';
  ($guess eq $result ? $good : $bad)++;
  
  #print "$guess : $result : $confidence\n";
}
my $accuracy = $good/($good + $bad);
ok $accuracy > .8;
print "Accuracy=$accuracy\n";

#use YAML; print Dump($dtree->rule_tree);
#print map "$_\n", $dtree->rule_statements;

if (eval "use GraphViz; 1") {
  my $graphviz = $dtree->as_graphviz;
  ok $graphviz;

  if (0) {
    # Only works on Mac OS X
    my $file = '/tmp/tree2.png';
    open my($fh), "> $file" or die "$file: $!";
    print $fh $graphviz->as_png;
    close $fh;
    system('open', $file);
  }
} else {
  skip("Skipping: GraphViz is not installed", 0);
}

# The following data comes from the "C4.5" software package, in the
# "soybean.data" data file.  It is somewhat noisy.  I chose it because
# it was a pretty big data set, and because there are published
# results on it that I can compare to.  Since the data seemed to be in
# order from most-information to least-information or something in the
# C4.5 distribution, I randomized the order of the instances.  Note
# also that I'm treating the '?' value as a normal string value.

t/author-critic.t

#!perl

BEGIN {
  unless ($ENV{AUTHOR_TESTING}) {
    require Test::More;
    Test::More::plan(skip_all => 'these tests are for testing by the author');
  }
}


use strict;
use warnings;

use Test::More;
use English qw(-no_match_vars);

eval "use Test::Perl::Critic";
