=item verbose
If set to a true value, some status information will be output while
training a decision tree. Default is false.
=item purge
If set to a true value, the C<do_purge()> method will be invoked
during C<train()>. The default is true.
=item max_depth
Controls the maximum depth of the tree that will be created during
C<train()>. The default is 0, which means that trees of unlimited
depth can be constructed.
=back
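For example, a tree with status output enabled, purging disabled, and
a depth limit of three might be constructed like this (a minimal
sketch; the option values are illustrative):

  use AI::DecisionTree;

  my $dtree = AI::DecisionTree->new(
      verbose   => 1,   # report progress during train()
      purge     => 0,   # keep training instances after train()
      max_depth => 3,   # allow at most three decisions per path
  );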
=item add_instance(attributes => \%hash, result => $string, name => $string)
Adds a training instance to the set of instances which will be used to
form the tree. An C<attributes> parameter specifies a hash of
attribute-value pairs for the instance, and a C<result> parameter
specifies the result.
An optional C<name> parameter lets you give a unique name to each
training instance. This can be used in coordination with the
C<set_results()> method below.
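Using the weather attributes from the tree shown under C<rule_tree()>
below, a call might look like this (the attribute values and the
C<'day1'> name are illustrative):

  $dtree->add_instance(
      attributes => { outlook  => 'sunny',
                      humidity => 'high',
                      wind     => 'weak' },
      result     => 'no',
      name       => 'day1',
  );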
=item train()
Builds the decision tree from the list of training instances. If a
numeric C<max_depth> parameter is supplied, the maximum tree depth can
be controlled (see also the C<new()> method).
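A minimal sketch of both forms:

  $dtree->train;                    # depth limited only by the new() setting
  $dtree->train(max_depth => 3);    # limit the depth for this call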
=item get_result(attributes => \%hash)
Returns the most likely result (from the set of all results given to
C<add_instance()>) for the set of attribute values given. An
C<attributes> parameter specifies a hash of attribute-value pairs for
the instance. If the decision tree doesn't have enough information to
find a result, it will return C<undef>.
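For example, following the C<rule_tree()> structure shown below, an
instance with C<outlook> of C<'rain'> and C<wind> of C<'strong'>
would classify as C<'no'>:

  my $result = $dtree->get_result(
      attributes => { outlook => 'rain',
                      wind    => 'strong' },
  );
  print defined($result) ? $result : "no prediction", "\n";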
=item do_purge()
Purges training instances and their associated information from the
DecisionTree object. This can save memory after training, and since
the training instances are implemented as C structs, this turns the
DecisionTree object into a pure-perl data structure that can be more
easily saved with C<Storable.pm>, for instance.
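A sketch of the purge-then-serialize idiom mentioned above:

  use Storable qw(nstore);

  $dtree->do_purge;                  # drop the C-level instance data
  nstore($dtree, 'dtree.storable');  # now a pure-perl structure
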
=item purge()
Returns true or false depending on the value of the tree's C<purge>
property. An optional boolean argument sets the property.
=item copy_instances(from =E<gt> $other_tree)
Allows two trees to share the same set of training instances. More
commonly, this lets you train one tree, then re-use its instances in
another tree (possibly changing the instance C<result> values using
C<set_results()>), which is much faster than re-populating the second
tree's instances from scratch.
=item set_results(\%results)
Given a hash that relates instance names to instance result values,
change the result values as specified.
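Together with C<copy_instances()>, this supports the retraining idiom
described above. A minimal sketch, assuming the instances were
originally added with C<name> parameters like C<'day1'>:

  my $tree2 = AI::DecisionTree->new;
  $tree2->copy_instances(from => $dtree);
  $tree2->set_results({ day1 => 'yes', day2 => 'no' });
  $tree2->train;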
=back
=head2 Tree Introspection
=over 4
=item instances()
Returns a reference to an array of the training instances used to
build this tree.
=item nodes()
Returns the number of nodes in the trained decision tree.
=item depth()
Returns the depth of the tree. This is the maximum number of
decisions that would need to be made to classify an unseen instance,
i.e. the length of the longest path from the tree's root to a leaf. A
tree with a single node would have a depth of zero.
=item rule_tree()
Returns a data structure representing the decision tree. For
instance, for the tree diagram above, the following data structure
is returned:
  [ 'outlook', {
      'rain'     => [ 'wind', {
          'strong' => 'no',
          'weak'   => 'yes',
      } ],
      'sunny'    => [ 'humidity', {
          'normal' => 'yes',
          'high'   => 'no',
      } ],
      'overcast' => 'yes',
  } ]
This is slightly reminiscent of how XML::Parser returns the parsed
XML tree.
Note that while the ordering in the hashes is unpredictable, the
nesting is in the order in which the criteria will be checked at
decision-making time.
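Since leaves are plain result strings and interior nodes are
C<[ $attribute, \%branches ]> pairs, the structure is easy to walk
recursively. A sketch of a hypothetical classifier built on it:

  sub classify {
      my ($node, $attrs) = @_;
      return $node unless ref $node;     # leaf: the result string
      my ($attribute, $branches) = @$node;
      my $subtree = $branches->{ $attrs->{$attribute} };
      return defined $subtree ? classify($subtree, $attrs) : undef;
  }

  my $answer = classify($dtree->rule_tree,
                        { outlook => 'sunny', humidity => 'normal' });  # 'yes'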
=item rule_statements()
Returns a list of strings that describe the tree in rule-form. For
instance, for the tree diagram above, the following list would be
returned (though not necessarily in this order - the order is
unpredictable):
  if outlook='rain' and wind='strong' -> 'no'
  if outlook='rain' and wind='weak' -> 'yes'
  if outlook='sunny' and humidity='normal' -> 'yes'
  if outlook='sunny' and humidity='high' -> 'no'
  if outlook='overcast' -> 'yes'
This can be helpful for scrutinizing the structure of a tree.
Note that while the order of the rules is unpredictable, the order of
criteria within each rule reflects the order in which the criteria
will be checked at decision-making time.
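For example, to dump the rules one per line:

  print "$_\n" for $dtree->rule_statements;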
=item as_graphviz()
Returns a C<GraphViz> object representing the tree. Requires that the
GraphViz module is already installed, of course. The object returned
will allow you to create PNGs, GIFs, image maps, or whatever graphical
representation of your tree you might want.
A C<leaf_colors> argument can specify a fill color for each leaf node
in the tree. The keys of the hash should be the same as the strings
appearing as the C<result> parameters given to C<add_instance()>, and
the values should be any GraphViz-style color specification.
Any additional arguments given to C<as_graphviz()> will be passed on
to GraphViz's C<new()> method. See the L<GraphViz> docs for more
info.
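A sketch that renders the tree to a PNG file, using the C<yes>/C<no>
results from the example tree as C<leaf_colors> keys (the color names
are illustrative):

  my $graph = $dtree->as_graphviz(
      leaf_colors => { yes => 'palegreen', no => 'lightpink' },
  );

  open my $fh, '>', 'tree.png' or die "Can't write tree.png: $!";
  binmode $fh;
  print $fh $graph->as_png;
  close $fh;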
=back
=head1 LIMITATIONS
A few limitations exist in the current version. All of them could be
removed in future versions - especially with your help. =)
=over 4
=item No continuous attributes
In the current implementation, only discrete-valued attributes are
supported. This means that an attribute like "temperature" can have
values like "cool", "medium", and "hot", but using actual temperatures
like 87 or 62.3 is not going to work. This is because the values
would split the data too finely - the tree-building process would
probably think that it could make all its decisions based on the exact
temperature value alone, ignoring all other attributes, because each
temperature would have only been seen once in the training data.
The usual way to deal with this problem is for the tree-building
process to figure out how to place the continuous attribute values
into a set of bins (like "cool", "medium", and "hot") and then build
the tree based on these bin values. Future versions of
C<AI::DecisionTree> may provide support for this. For now, you have
to do it yourself.
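A hand-rolled version of that binning might look like this (the
cutoff values are arbitrary and would need tuning for real data):

  # Map a raw temperature to a discrete bin before add_instance().
  sub temperature_bin {
      my $degrees = shift;
      return $degrees < 65 ? 'cool'
           : $degrees < 80 ? 'medium'
           :                 'hot';
  }

  $dtree->add_instance(
      attributes => { temperature => temperature_bin(62.3) },  # 'cool'
      result     => 'yes',
  );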
=back
=head1 TO DO
All the stuff in the LIMITATIONS section. Also, revisit the pruning
algorithm to see how it can be improved.
=head1 AUTHOR
Ken Williams, ken@mathforum.org
=head1 SEE ALSO
Mitchell, Tom (1997). Machine Learning. McGraw-Hill. pp 52-80.
Quinlan, J. R. (1986). Induction of decision trees. Machine
Learning, 1(1), pp 81-106.
L<perl>, L<GraphViz>
=cut