AI-Perceptron-Simple


Changes  view on Meta::CPAN


Version 1.04    17 SEPTEMBER 2021
* fixed some critical problems
    * yml nerve not loading back as an AI::Perceptron::Simple object
    * fix docs: missing parameter $nerve for:
        * save_perceptron
        * save_perceptron_yaml
* changed die to croak for file opening

Version 1.03    9 SEPTEMBER 2021
* data processing subroutine available:
    * shuffle data
    * added import tag ":process_data"
* added more useful data to the confusion matrix:
    * sums of columns and rows to make it look more classic :)
    * more_stats option to show more stats:
        * precision, specificity, F1 score, negative predicted value, false negative rate, false positive rate
        * false discovery rate, false omission rate, balanced accuracy

Version 1.02    26 AUGUST 2021
* minimum perl version changed to 5.8.1 due to YAML
* fix test for display_confusion_matrix
    * modifier "n" ( >= perl 5.22 ) changed to primitive '?:', 5.22 is too high
    * fixed inaccurate test for output
* YAML (nerve file) for portability supported
    * make subroutines exportable, the names are too long
       * :local_data
       * :portable_data
* cleaned & refactored code
    * &display_confusion_matrix
* improved the documentation

Version 1.01    24 AUGUST 2021
* Fixed some technical issues
    * fixed test scripts not running in the correct sequence
        * must be creation -> train -> validate/test

Changes  view on Meta::CPAN

        * read only expected and predicted columns, line by line
        * return a hash of data
            * TP, TN, FP, FN
            * total entries
            * accuracy
            * sensitivity
    * display confusion matrix data to console
        * use Text::Matrix

    * synonyms
        * synonyms MUST call actual subroutines and not copy pasting!
        * train: tame, exercise
        * validate: take_mock_exam, take_lab_test
        * test:  take_real_exam, work_in_real_world
        * generate_confusion_matrix: get_exam_results
        * display_confusion_matrix: display_exam_results
        * save_perceptron: preserve
        * load_perceptron: revive



LICENSE  view on Meta::CPAN

(2)  You may Distribute verbatim copies of the Source form of the
Standard Version of this Package in any medium without restriction,
either gratis or for a Distributor Fee, provided that you duplicate
all of the original copyright notices and associated disclaimers.  At
your discretion, such verbatim copies may or may not include a
Compiled form of the Package.

(3)  You may apply any bug fixes, portability changes, and other
modifications made available from the Copyright Holder.  The resulting
Package will still be considered the Standard Version, and as such
will be subject to the Original License.


Distribution of Modified Versions of the Package as Source 

(4)  You may Distribute your Modified Version as Source (either gratis
or for a Distributor Fee, and with or without a Compiled form of the
Modified Version) provided that you clearly document how it differs
from the Standard Version, including, but not limited to, documenting
any non-standard features, executables, or modules, and provided that
you do at least ONE of the following:

LICENSE  view on Meta::CPAN

build stand-alone binary or bytecode versions of applications that
include the Package, and Distribute the result without restriction,
provided the result does not expose a direct interface to the Package.


Items That are Not Considered Part of a Modified Version 

(9) Works (including, but not limited to, modules and scripts) that
merely extend or make use of the Package, do not, by themselves, cause
the Package to be a Modified Version.  In addition, such works are not
considered parts of the Package itself, and are not subject to the
terms of this license.


General Provisions

(10)  Any use, modification, and distribution of the Standard or
Modified Versions is governed by this Artistic License. By using,
modifying or distributing the Package, you accept this license. Do not
use, modify, or distribute the Package, if you do not accept this
license.

README  view on Meta::CPAN

AI-Perceptron-Simple (v1.04)

A Newbie Friendly Module to Create, Train, Validate and Test Perceptrons / Neurons

This module provides methods to build, train, validate and test a perceptron. It can also save the data of the perceptron for future use for any actual AI programs.

This module is also aimed at helping newbies grasp the concepts of perceptron, training, validation and testing as much as possible. Hence, all the methods and subroutines in this module are decoupled as much as possible so that the actual script...

INSTALLATION

To install this module, run the following commands:

	perl Makefile.PL
	make
	make test
	make install

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN

    # processing data
    use AI::Perceptron::Simple ":process_data";
    shuffle_stimuli ( ... )
    shuffle_data ( ORIGINAL_STIMULI, $new_file_1, $new_file_2, ... );
    shuffle_data ( $original_stimuli =&gt; $new_file_1, $new_file_2, ... );</code></pre>

<h1 id="EXPORT">EXPORT</h1>

<p>None by default.</p>

<p>All the subroutines from <code>DATA PROCESSING RELATED SUBROUTINES</code>, <code>NERVE DATA RELATED SUBROUTINES</code> and <code>NERVE PORTABILITY RELATED SUBROUTINES</code> sections are importable through tags or manually specifying them.</p>

<p>The tags available include the following:</p>

<dl>

<dt id="process_data---subroutines-under-DATA-PROCESSING-RELATED-SUBROUTINES-section"><code>:process_data</code> - subroutines under <code>DATA PROCESSING RELATED SUBROUTINES</code> section.</dt>
<dd>

</dd>
<dt id="local_data---subroutines-under-NERVE-DATA-RELATED-SUBROUTINES-section"><code>:local_data</code> - subroutines under <code>NERVE DATA RELATED SUBROUTINES</code> section.</dt>
<dd>

</dd>
<dt id="portable_data---subroutines-under-NERVE-PORTABILITY-RELATED-SUBROUTINES-section"><code>:portable_data</code> - subroutines under <code>NERVE PORTABILITY RELATED SUBROUTINES</code> section.</dt>
<dd>

</dd>
</dl>

<p>Most of the remaining subroutines are OO methods.</p>

<h1 id="DESCRIPTION">DESCRIPTION</h1>

<p>This module provides methods to build, train, validate and test a perceptron. It can also save the data of the perceptron for future use for any actual AI programs.</p>

<p>This module is also aimed at helping newbies grasp the concepts of perceptron, training, validation and testing as much as possible. Hence, all the methods and subroutines in this module are decoupled as much as possible so that the actual scr...

<p>The implementation here is super basic as it only takes in the input of the dendrites and calculates the output. If the output is higher than the threshold, the final result (category) will be 1 aka perceptron is activated. If not, then the result will be 0 (not activated).</p>

<p>Depending on how you view or categorize the final result, the perceptron will fine tune itself (aka train) based on the learning rate until the desired result is met. Everything from here on is all mathematics and numbers which only makes sense to the computer and not humans anymore.</p>

<p>Whenever the perceptron fine tunes itself, it will increase/decrease all the dendrites that are significant (attributes labelled 1) for each input. This means that even when the perceptron successfully fine tunes itself to suit all the data in your file for the first round, the perceptron might still get some of the things wrong for the next round of training. Therefore, the perceptron should be trained for as many rounds as possible. The more "confusion" the perceptron is able to correctly handle, the more "mature" the perceptron is. No one defines how "mature" it is except the programmer himself/herself :)</p>

<h1 id="CONVENTIONS-USED">CONVENTIONS USED</h1>

<p>Please take note that not all subroutines/methods must be used to make things work. All the subroutines and methods are listed out for the sake of complete documentation.</p>

<p>Private methods/subroutines are prefixed with <code>_</code> or <code>&amp;_</code> and they aren&#39;t meant to be called directly. You can if you want to. There are quite a number of them to be honest, just ignore them if you happen to see them :)</p>

<p>Synonyms are placed before the actual ie. technical subroutines/methods. You will see <code>...</code> as the parameters if they are synonyms. Move to the next subroutine/method until you find something like <code>\%options</code> as the parameter...

<h1 id="DATASET-STRUCTURE">DATASET STRUCTURE</h1>

<p><i>This module can only process CSV files.</i></p>

<p>Any field ie. column that will be used for processing must be binary ie. <code>0</code> or <code>1</code> only. Your dataset can contain other columns with non-binary data as long as they are not one of the dendrites.</p>

<p>There are some sample datasets which can be found in the <code>t</code> directory. The original dataset can also be found in <code>docs/book_list.csv</code>. The files can also be found <a href="https://github.com/Ellednera/AI-Perceptron-Simple">here</a>.</p>

<h1 id="PERCEPTRON-DATA">PERCEPTRON DATA</h1>

<p>The perceptron/neuron data is stored using the <code>Storable</code> module.</p>

<p>See <code>Portability of Nerve Data</code> section below for more info on some known issues.</p>

<h1 id="DATA-PROCESSING-RELATED-SUBROUTINES">DATA PROCESSING RELATED SUBROUTINES</h1>

<p>These subroutines can be imported using the tag <code>:process_data</code>.</p>

<p>These subroutines should be called in the procedural way.</p>

<h2 id="shuffle_stimuli">shuffle_stimuli ( ... )</h2>

<p>The parameters and usage are the same as <code>shuffle_data</code>. See the next two subroutines.</p>

<h2 id="shuffle_data-original_data-shuffled_1-shuffled_2">shuffle_data ( $original_data =&gt; $shuffled_1, $shuffled_2, ... )</h2>

<h2 id="shuffle_data-ORIGINAL_DATA-shuffled_1-shuffled_2">shuffle_data ( ORIGINAL_DATA, $shuffled_1, $shuffled_2, ... )</h2>

<p>Shuffles <code>$original_data</code> or <code>ORIGINAL_DATA</code> and saves them to other files.</p>

<h1 id="CREATION-RELATED-SUBROUTINES-METHODS">CREATION RELATED SUBROUTINES/METHODS</h1>

<h2 id="new-options">new ( \%options )</h2>

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN

<p>If <code>$display_stats</code> is specified ie. set to <code>1</code>, then you <b>MUST</b> specify the <code>$identifier</code>. <code>$identifier</code> is the column / header name that is used to identify a specific row of data in <code>$stimuli_train_csv</code>.</p>

<h2 id="calculate_output-self-stimuli_hash">&amp;_calculate_output( $self, \%stimuli_hash )</h2>

<p>Calculates and returns the <code>sum(weightage*input)</code> for each individual row of data. Actually, it just adds up all the existing weights since the <code>input</code> is always 1 for now :)</p>

<p><code>%stimuli_hash</code> is the actual data to be used for training. It might contain useless columns.</p>

<p>This will get all the available dendrites using the <code>get_attributes</code> method and then use all the keys ie. headers to access the corresponding values.</p>

<p>This subroutine should be called in the procedural way for now.</p>

<h2 id="tune-self-stimuli_hash-tune_up_or_down">&amp;_tune( $self, \%stimuli_hash, $tune_up_or_down )</h2>

<p>Fine tunes the nerve. This will directly alter the attribute values in <code>$self</code> according to the attributes / dendrites specified in <code>new</code>.</p>

<p>The <code>%stimuli_hash</code> here is the same as the one in the <code>_calculate_output</code> method.</p>

<p><code>%stimuli_hash</code> will be used to determine which dendrite in <code>$self</code> needs to be fine-tuned. As long as the value of any key in <code>%stimuli_hash</code> returns true (1) then that dendrite in <code>$self</code> will be tuned...

<p>Tuning up or down depends on <code>$tune_up_or_down</code> specified by the <code>train</code> method. The following constants can be used for <code>$tune_up_or_down</code>:</p>

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN


</dd>
<dt id="TUNE_DOWN">TUNE_DOWN</dt>
<dd>

<p>Value is <code>0</code></p>

</dd>
</dl>

<p>This subroutine should be called in the procedural way for now.</p>

<h1 id="VALIDATION-RELATED-METHODS">VALIDATION RELATED METHODS</h1>

<p>All the validation methods here have the same parameters as the actual <code>validate</code> method and they all do the same stuff. They are also used in the same way.</p>

<h2 id="take_mock_exam">take_mock_exam (...)</h2>

<h2 id="take_lab_test">take_lab_test (...)</h2>

<h2 id="validate-options">validate ( \%options )</h2>

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN

<p>This method works and behaves the same way as the <code>validate</code> method. See <code>validate</code> for the details.</p>

<p><i>*This method will call &amp;_real_validate_or_test to do the actual work.</i></p>

<h2 id="real_validate_or_test-data_hash_ref">_real_validate_or_test ( $data_hash_ref )</h2>

<p>This is where the actual validation or testing takes place.</p>

<p><code>$data_hash_ref</code> is the list of parameters passed into the <code>validate</code> or <code>test</code> methods.</p>

<p>This is a <b>method</b>, so use the OO way. This is one of the exceptions to the rules where private subroutines are treated as methods :)</p>

<h2 id="fill_predicted_values-self-stimuli_validate-predicted_index-aoa">&amp;_fill_predicted_values ( $self, $stimuli_validate, $predicted_index, $aoa )</h2>

<p>This is where the filling in of the predicted values takes place. Take note that the parameter names are the same as the ones used in the <code>validate</code> and <code>test</code> methods.</p>

<p>This subroutine should be called in the procedural way.</p>

<h1 id="RESULTS-RELATED-SUBROUTINES-METHODS">RESULTS RELATED SUBROUTINES/METHODS</h1>

<p>This part is related to generating the confusion matrix.</p>

<h2 id="get_exam_results">get_exam_results ( ... )</h2>

<p>The parameters and usage are the same as <code>get_confusion_matrix</code>. See the next method.</p>

<h2 id="get_confusion_matrix-options">get_confusion_matrix ( \%options )</h2>

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN

<p>Returns a list <code>( $matrix, $c_matrix )</code> which can directly be passed to <code>_print_extended_matrix</code>.</p>

<h2 id="print_extended_matrix-matrix-c_matrix">&amp;_print_extended_matrix ( $matrix, $c_matrix )</h2>

<p>Extends and outputs the matrix on the screen.</p>

<p><code>$matrix</code> and <code>$c_matrix</code> are the same as returned by <code>&amp;_build_matrix</code>.</p>

<h1 id="NERVE-DATA-RELATED-SUBROUTINES">NERVE DATA RELATED SUBROUTINES</h1>

<p>This part is about saving the data of the nerve. These subroutines can be imported using the <code>:local_data</code> tag.</p>

<p><b>The subroutines are to be called in the procedural way</b>. No checking is done currently.</p>

<p>See <code>PERCEPTRON DATA</code> and <code>KNOWN ISSUES</code> sections for more details on the subroutines in this section.</p>

<h2 id="preserve">preserve ( ... )</h2>

<p>The parameters and usage are the same as <code>save_perceptron</code>. See the next subroutine.</p>

<h2 id="save_perceptron-nerve-nerve_file">save_perceptron ( $nerve, $nerve_file )</h2>

<p>Saves the <code>AI::Perceptron::Simple</code> object into a <code>Storable</code> file. There shouldn&#39;t be a need to call this method manually since after every training process this will be called automatically.</p>

<h2 id="revive">revive (...)</h2>

<p>The parameters and usage are the same as <code>load_perceptron</code>. See the next subroutine.</p>

<h2 id="load_perceptron-nerve_file_to_load">load_perceptron ( $nerve_file_to_load )</h2>

<p>Loads the data and turns it into a <code>AI::Perceptron::Simple</code> object as the return value.</p>

<h1 id="NERVE-PORTABILITY-RELATED-SUBROUTINES">NERVE PORTABILITY RELATED SUBROUTINES</h1>

<p>These subroutines can be imported using the <code>:portable_data</code> tag.</p>

<p>The file type currently supported is YAML. Please be careful with the data as you won&#39;t want the nerve data accidentally modified.</p>

<h2 id="preserve_as_yaml">preserve_as_yaml ( ... )</h2>

<p>The parameters and usage are the same as <code>save_perceptron_yaml</code>. See the next subroutine.</p>

<h2 id="save_perceptron_yaml-nerve-yaml_nerve_file">save_perceptron_yaml ( $nerve, $yaml_nerve_file )</h2>

<p>Saves the <code>AI::Perceptron::Simple</code> object into a <code>YAML</code> file.</p>

<h2 id="revive_from_yaml">revive_from_yaml (...)</h2>

<p>The parameters and usage are the same as <code>load_perceptron_yaml</code>. See the next subroutine.</p>

<h2 id="load_perceptron_yaml-yaml_nerve_file">load_perceptron_yaml ( $yaml_nerve_file )</h2>

<p>Loads the YAML data and turns it into a <code>AI::Perceptron::Simple</code> object as the return value.</p>

<h1 id="TO-DO">TO DO</h1>

<p>These are the to-do&#39;s that <b>MIGHT</b> be done in the future. Don&#39;t put too much hope in them please :)</p>

<ul>

docs/AI-Perceptron-Simple-1.04.html  view on Meta::CPAN


</li>
</ul>

<h1 id="KNOWN-ISSUES">KNOWN ISSUES</h1>

<h2 id="Portability-of-Nerve-Data">Portability of Nerve Data</h2>

<p>Take note that the <code>Storable</code> nerve data is not compatible across different versions.</p>

<p>If you really need to send the nerve data to different computers with different versions of <code>Storable</code> module, see the docs of the following subroutines:</p>

<ul>

<li><p><code>&amp;preserve_as_yaml</code> or <code>&amp;save_perceptron_yaml</code> for storing data.</p>

</li>
<li><p><code>&amp;revive_from_yaml</code> or <code>&amp;load_perceptron_yaml</code> for retrieving the data.</p>

</li>
</ul>

docs/specifications.t  view on Meta::CPAN

#       [v] return a hash of data
#           [v] TP, TN, FP, FN
#           [v] accuracy
#           [v] sensitivity
#   [v] remove the return value for "train" method
#   [v] display confusion matrix data to console
#       [v] use Text::Matrix
#
# Version 0.04 / Version 1.0 - completed on 23 AUGUST 2021
#   [v] add synonyms
#       [v] synonyms MUST call actual subroutines and not copy pasting!
#       train: [v] tame  [v] exercise
#       validate: [v] take_mock_exam  [v] take_lab_test
#       test: [v] take_real_exam  [v] work_in_real_world
#       generate_confusion_matrix: [v] get_exam_results
#       display_confusion_matrix: [v] display_exam_results
#       save_perceptron: [v] preserve
#       load_perceptron: [v] revive
#
# Version 1.01
#   [v] fixed currently known issues as much as possible (see 'Changes')
#       - "long size integer" === "byte order not compatible"
#
# Version 1.02
#   [v] minimum perl version changed to 5.8 due to Test::Output
#   [v] YAML (nerve file) for portability
#       [v] make subroutines exportable, the names are too long
#           [v] :local_data
#           [v] :portable_data
#   [v] fix test for display_confusion_matrix
#       [v] modifier "n" (perl 5.22 and above) changed to primitive '?:', 5.22 is too high
#       [v] fixed inaccurate test for output part
#   [v] clean & refactor codes
#       [v] refactored &display_confusion_matrix
#   [v] improve the documentation
#
# Version 1.03

docs/specifications.t  view on Meta::CPAN

# these three steps could be done in separate scripts if necessary
# &train and &validate could be put inside a loop or something
# the parameters make more sense when they are taken from @ARGV
    # so when it's the first time training, it will create the nerve_file,
    # the second time and up it will directly override that file since everything is read from it
    # ... anyway :) after all, the training stage wasn't meant to be a fully working program, so it shouldn't be a problem
# just assume that 
$perceptron->train( $stimuli_train, $save_nerve_to_file ); 
    # reads training stimuli from csv
    # tune attributes based on csv data
        # calls the same subroutine to do the calculation
    # shouldn't give any output upon completion
    # should save a copy of itself into a new file
    # returns the nerve's data filename to be used in validate()
        # these two can go into a loop with conditions checking
        # which means that we can actually write this
            # $perceptron->validate( $stimuli_validate, 
            #                        $perceptron->train( $stimuli_train, $save_nerve_to_file ) 
            #                       );
            # and then check the confusion matrix, if not satisfied, run the loop again :)
$perceptron->validate( $stimuli_validate, $nerve_data_to_read );
$perceptron->test( $stimuli_test ); # loads nerve data from data file, turn into a object, then do the following:
    # reads from csv :
        # validation stimuli
        # testing stimuli
    # both will call the same subroutine to do calculation
    # both will write predicted data into the original data file

# show results ie confusion matrix (TP-true positive, TN-true negative, FP-false positive, FN-false negative)
# this should only be done during validation and testing
$perceptron->generate_confusion_matrix( { 1 => $csv_header_true, 0 => $csv_header_false } );
    # calculates the 4 thingy based on the current data on hand (RAM), don't read from file again, it shouldn't be a problem
        # returns a hash
    # ie it must be used together with validate() and test() to avoid problems
        # ie validate() and test() must be in different scripts, which makes sense
        # unless, create 3 similar objects to do the work in one go

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    # processing data
    use AI::Perceptron::Simple ":process_data";
    shuffle_stimuli ( ... )
    shuffle_data ( ORIGINAL_STIMULI, $new_file_1, $new_file_2, ... );
    shuffle_data ( $original_stimuli => $new_file_1, $new_file_2, ... );

=head1 EXPORT

None by default.

All the subroutines from C<DATA PROCESSING RELATED SUBROUTINES>, C<NERVE DATA RELATED SUBROUTINES> and C<NERVE PORTABILITY RELATED SUBROUTINES> sections are importable through tags or manually specifying them.

The tags available include the following:

=over 4

=item C<:process_data> - subroutines under C<DATA PROCESSING RELATED SUBROUTINES> section.

=item C<:local_data> - subroutines under C<NERVE DATA RELATED SUBROUTINES> section.

=item C<:portable_data> - subroutines under C<NERVE PORTABILITY RELATED SUBROUTINES> section.

=back

Most of the remaining subroutines are OO methods.
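
For example, a script could import everything by tag or pick individual subroutines (a minimal sketch):

    use AI::Perceptron::Simple qw( :process_data :local_data :portable_data );

    # or cherry-pick the long names manually
    use AI::Perceptron::Simple qw( shuffle_data save_perceptron load_perceptron );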

=cut

use Exporter qw( import );
our @EXPORT_OK = qw( 
    shuffle_data shuffle_stimuli

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

our %EXPORT_TAGS = ( 
    process_data => [ qw( shuffle_data shuffle_stimuli ) ],
    local_data => [ qw( preserve save_perceptron revive load_perceptron ) ],
    portable_data => [ qw( preserve_as_yaml save_perceptron_yaml revive_from_yaml load_perceptron_yaml ) ],
);

=head1 DESCRIPTION

This module provides methods to build, train, validate and test a perceptron. It can also save the data of the perceptron for future use for any actual AI programs.

This module is also aimed at helping newbies grasp the concepts of perceptron, training, validation and testing as much as possible. Hence, all the methods and subroutines in this module are decoupled as much as possible so that the actual script...

The implementation here is super basic as it only takes in the input of the dendrites and calculates the output. If the output is higher than the threshold, the final result (category) will 
be 1 aka perceptron is activated. If not, then the result will be 0 (not activated).

Depending on how you view or categorize the final result, the perceptron will fine tune itself (aka train) based on the learning rate until the desired result is met. Everything from 
here on is all mathematics and numbers which only makes sense to the computer and not humans anymore.

Whenever the perceptron fine tunes itself, it will increase/decrease all the dendrites that are significant (attributes labelled 1) for each input. This means that even when the 
perceptron successfully fine tunes itself to suit all the data in your file for the first round, the perceptron might still get some of the things wrong for the next round of training. 
Therefore, the perceptron should be trained for as many rounds as possible. The more "confusion" the perceptron is able to correctly handle, the more "mature" the perceptron is. 
No one defines how "mature" it is except the programmer himself/herself :)
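
As a rough illustration with made-up numbers, the activation decision boils down to something like this:

    # three dendrites and their current weights (made-up values)
    my %weights   = ( has_trees => 0.3, has_river => 0.2, has_birds => 0.4 );
    # one row of stimuli: 1 means the attribute is significant for this row
    my %stimuli   = ( has_trees => 1,   has_river => 0,   has_birds => 1 );
    my $threshold = 0.5;

    my $sum = 0;
    $sum += $weights{ $_ } for grep { $stimuli{ $_ } } keys %weights;   # 0.3 + 0.4 = 0.7

    my $category = $sum >= $threshold ? 1 : 0;                          # 0.7 >= 0.5, so 1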

=head1 CONVENTIONS USED

Please take note that not all subroutines/methods must be used to make things work. All the subroutines and methods are listed out for the sake of complete documentation.

Private methods/subroutines are prefixed with C<_> or C<&_> and they aren't meant to be called directly. You can if you want to. There are quite a number of them to be honest, just ignore them if you happen to see them :)

Synonyms are placed before the actual ie. technical subroutines/methods. You will see C<...> as the parameters if they are synonyms. Move to the next subroutine/method until you find something like C<\%options> as the parameter or anything that isn't...

=head1 DATASET STRUCTURE

I<This module can only process CSV files.>

Any field ie. column that will be used for processing must be binary ie. C<0> or C<1> only. Your dataset can contain other columns with non-binary data as long as they are not one of the dendrites.

There are some sample datasets which can be found in the C<t> directory. The original dataset can also be found in C<docs/book_list.csv>. The files can also be found L<here|https://github.com/Ellednera/AI-Perceptron-Simple>.
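
A tiny made-up example of a valid dataset (the header names here are hypothetical); only the columns used as dendrites have to be binary:

    book_name,has_glossary,has_exercises,is_thick,genre
    "Programming Perl",1,1,1,programming
    "Pocket Atlas",0,0,0,reference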

=head1 PERCEPTRON DATA

The perceptron/neuron data is stored using the C<Storable> module. 

See C<Portability of Nerve Data> section below for more info on some known issues.

=head1 DATA PROCESSING RELATED SUBROUTINES

These subroutines can be imported using the tag C<:process_data>.

These subroutines should be called in the procedural way.

=head2 shuffle_stimuli ( ... )

The parameters and usage are the same as C<shuffle_data>. See the next two subroutines.

=head2 shuffle_data ( $original_data => $shuffled_1, $shuffled_2, ... )

=head2 shuffle_data ( ORIGINAL_DATA, $shuffled_1, $shuffled_2, ... )

Shuffles C<$original_data> or C<ORIGINAL_DATA> and saves them to other files.
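
A minimal usage sketch (file names are hypothetical); each output file receives its own shuffled copy of the original data:

    use AI::Perceptron::Simple ":process_data";

    shuffle_data( "book_list.csv" => "shuffled_1.csv", "shuffled_2.csv", "shuffled_3.csv" );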

=cut

sub shuffle_stimuli {
    shuffle_data( @_ );
}

sub shuffle_data {
    my $stimuli = shift or croak "Please specify the original file name";
    my @shuffled_stimuli_names = @_ 
        or croak "Please specify the output files for the shuffled data";
    
    my @aoa;
    for ( @shuffled_stimuli_names ) {
        # copied from _real_validate_or_test
        # open for shuffling
        my $aoa = csv (in => $stimuli, encoding => ":encoding(utf-8)");
        my $attrib_array_ref = shift @$aoa; # 'remove' the header, it's annoying :)

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

Optional. The default is C<0.5>

This is the passing rate to determine the neuron output (C<0> or C<1>).

Generally speaking, this value is usually between C<0> and C<1>. However, it all depends on your combination of numbers for the other options.

=back
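
A minimal construction sketch; C<initial_value> and C<attribs> are mandatory, the attribute names below are made up, and the optional keys fall back to the defaults described above:

    use AI::Perceptron::Simple;

    my @attributes = qw( has_trees has_river has_birds );

    my $nerve = AI::Perceptron::Simple->new( {
        initial_value => 0.01,
        attribs       => \@attributes,
        learning_rate => 0.05,    # optional
        threshold     => 0.5,     # optional
    } );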

=cut

sub new {
    my $class = shift;
    
    my $data_ref = shift;
    my %data = %{ $data_ref };
    
    # check keys
    $data{ learning_rate } = LEARNING_RATE if not exists $data{ learning_rate };
    $data{ threshold } = THRESHOLD if not exists $data{ threshold };
    
    #####
    # don't pack this key checking process into a subroutine for now
    # this is also used in &_real_validate_or_test
    my @missing_keys;
    for ( qw( initial_value attribs ) ) {
        push @missing_keys, $_ unless exists $data{ $_ };
    }
    
    croak "Missing keys: @missing_keys" if @missing_keys;
    #####
    
    # continue to process the rest of the data

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    
    bless \%processed_data, $class;
}

=head2 get_attributes

Obtains a hash of all the attributes of the perceptron

=cut

sub get_attributes {
    my $self = shift;
    %{ $self->{attributes_hash_ref} };
}

=head2 learning_rate ( $value )

=head2 learning_rate

If C<$value> is given, sets the learning rate to C<$value>. If not, then it returns the learning rate.
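
For example:

    $nerve->learning_rate( 0.05 );       # set the learning rate
    my $rate = $nerve->learning_rate;    # get it back, 0.05 here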

=cut

sub learning_rate {
    my $self = shift;
    if ( @_ ) {
        $self->{learning_rate} = shift;
    } else {
        $self->{learning_rate}
    }
}

=head2 threshold ( $value )

=head2 threshold

If C<$value> is given, sets the threshold / passing rate to C<$value>. If not, then it returns the passing rate.

=cut

sub threshold {
    my $self = shift;
    if ( @_ ) {
        $self->{ threshold } = shift;
    } else {
        $self->{ threshold };
    }
}

=head1 TRAINING RELATED SUBROUTINES/METHODS

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=item new sum

The new sum of all C<weightage * input> after fine-tuning the nerve

=back

If C<$display_stats> is specified ie. set to C<1>, then you B<MUST> specify the C<$identifier>. C<$identifier> is the column / header name that is used to identify a specific row of data in C<$stimuli_train_csv>.
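
A hedged example call, assuming a C<$nerve> created with C<new> and hypothetical file and header names:

    # quiet training
    $nerve->train( "train.csv", "expected_category", "nerve_file.dat" );

    # verbose training; the identifier column is then mandatory
    $nerve->train( "train.csv", "expected_category", "nerve_file.dat", 1, "book_name" );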

=cut

sub tame {
    train( @_ );
}

sub exercise {
    train( @_ );
}

sub train {
    my $self = shift;
    my( $stimuli_train_csv, $expected_output_header, $save_nerve_to_file, $display_stats, $identifier ) = @_;
    
    $display_stats = 0 if not defined $display_stats;
    if ( $display_stats and not defined $identifier ) {
        croak "Please specifiy a string for \$identifier if you are trying to display stats";
    }
    
    # CSV processing is all according to the documentation of Text::CSV
    open my $data_fh, "<:encoding(UTF-8)", $stimuli_train_csv 

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

}

=head2 &_calculate_output( $self, \%stimuli_hash )

Calculates and returns the C<sum(weightage*input)> for each individual row of data. Actually, it just adds up all the existing weights since the C<input> is always 1 for now :)

C<%stimuli_hash> is the actual data to be used for training. It might contain useless columns.

This will get all the available dendrites using the C<get_attributes> method and then use all the keys ie. headers to access the corresponding values.

This subroutine should be called in the procedural way for now.

=cut

sub _calculate_output {
    my $self = shift; 
    my $stimuli_hash_ref = shift;
    
    my %dendrites = $self->get_attributes;
    my $sum; # this is the output
    
    for ( keys %dendrites ) {
        # if input is 1 for a dendrite, then calculate it
        if ( $stimuli_hash_ref->{ $_ } ) {
            # $sum += $dendrites{ $_ } * 1; # no need, if 1 then it is always the value itself

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=item TUNE_UP

Value is C<1>

=item TUNE_DOWN

Value is C<0>

=back

This subroutine should be called in the procedural way for now.

=cut

sub _tune {
    my $self = shift; 
    my ( $stimuli_hash_ref, $tuning_status ) = @_;

    my %dendrites = $self->get_attributes;

    for ( keys %dendrites ) {
        if ( $tuning_status == TUNE_DOWN ) {
            
            if ( $stimuli_hash_ref->{ $_ } ) { # must check this one, it must be 1 before we can alter the actual dendrite size in the nerve :)
                $self->{ attributes_hash_ref }{ $_ } -= $self->learning_rate;

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

Optional.

The default behaviour will write the predicted output back into C<stimuli_validate> ie the original data. The sequence of the data will be maintained.

=back

I<*This method will call C<_real_validate_or_test> to do the actual work.>
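
A minimal sketch with a hypothetical file name; C<stimuli_validate> and C<predicted_column_index> are the mandatory keys:

    $nerve->validate( {
        stimuli_validate       => "validate.csv",
        predicted_column_index => 4,
    } );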

=cut

sub take_mock_exam {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

sub take_lab_test {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

sub validate {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

=head1 TESTING RELATED SUBROUTINES/METHODS

All the testing methods here have the same parameters as the actual C<test> method and they all do the same stuff. They are also used in the same way.

=head2 take_real_exam (...)

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

This method is used to put the trained nerve to the test. You can think of it as deploying the nerve for the actual work or maybe putting the nerve into an empty brain and see how 
well the brain survives :)

This method works and behaves the same way as the C<validate> method. See C<validate> for the details.

I<*This method will call &_real_validate_or_test to do the actual work.>

=cut

# redirect to _real_validate_or_test
sub take_real_exam {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

sub work_in_real_world {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

sub test {
    my ( $self, $data_hash_ref ) = @_;
    $self->_real_validate_or_test( $data_hash_ref );
}

=head2 _real_validate_or_test ( $data_hash_ref )

This is where the actual validation or testing takes place. 

C<$data_hash_ref> is the list of parameters passed into the C<validate> or C<test> methods.

This is a B<method>, so use the OO way. This is one of the exceptions to the rules where private subroutines are treated as methods :)

=cut

sub _real_validate_or_test {

    my $self = shift;   my $data_hash_ref = shift;
    
    #####
    my @missing_keys;
    for ( qw( stimuli_validate predicted_column_index ) ) {
        push @missing_keys, $_ unless exists $data_hash_ref->{ $_ };
    }
    
    croak "Missing keys: @missing_keys" if @missing_keys;

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    print "Saving data to $output_file\n";
    csv( in => $aoa, out => $output_file, encoding => ":encoding(utf-8)" );
    print "Done saving!\n";

}

=head2 &_fill_predicted_values ( $self, $stimuli_validate, $predicted_index, $aoa )

This is where the filling in of the predicted values takes place. Take note that the parameter names are the same as the ones used in the C<validate> and C<test> methods.

This subroutine should be called in the procedural way.

=cut

sub _fill_predicted_values {
    my ( $self, $stimuli_validate, $predicted_index, $aoa ) = @_;

    # CSV processing is all according to the documentation of Text::CSV
    open my $data_fh, "<:encoding(UTF-8)", $stimuli_validate 
        or croak "Can't open $stimuli_validate: $!";
    
    my $csv = Text::CSV->new( {auto_diag => 1, binary => 1} );
    
    my $attrib = $csv->getline($data_fh);
    

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=item more_stats => 1

Optional.

Setting it to C<1> will process more stats that are usually not so important eg. C<precision>, C<specificity> and C<F1_Score>

=back
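
A hedged example with hypothetical file and header names:

    my %c_matrix = $nerve->get_confusion_matrix( {
        full_data_file          => "validate.csv",
        actual_output_header    => "expected_category",
        predicted_output_header => "predicted",
        more_stats              => 1,    # optional
    } );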

=cut

sub get_exam_results {

    my ( $self, $info ) = @_;
    
    $self->get_confusion_matrix( $info );
}

sub get_confusion_matrix {

    my ( $self, $info ) = @_;

    my %c_matrix = _collect_stats( $info ); # processes total_entries, accuracy, sensitivity etc
    
    %c_matrix;
}


=head2 &_collect_stats ( \%options )

Generates a hash of confusion matrix based on C<%options> given in the C<get_confusion_matrix> method.

=cut

sub _collect_stats {
    my $info = shift;
    my $file = $info->{ full_data_file };
    my $actual_header = $info->{ actual_output_header };
    my $predicted_header = $info->{ predicted_output_header };
    my $more_stats = defined ( $info->{ more_stats } ) ? 1 : 0;
    
    my %c_matrix = ( 
        true_positive => 0, true_negative => 0, false_positive => 0, false_negative => 0,
        accuracy => 0, sensitivity => 0
    );

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    my $csv = Text::CSV->new( {auto_diag => 1, binary => 1} );
    
    my $attrib = $csv->getline($data_fh); # get the row of headers, can't specify any column
    # shouldn't be a problem, since we're reading line by line :)

    $csv->column_names( $attrib );

    # individual row
    while ( my $row = $csv->getline_hr($data_fh) ) {
        
        # don't pack this part into another subroutine, number of rows can be very big
        if ( $row->{ $actual_header } == 1 and $row->{ $predicted_header } == 1 ) {

            # true positive
            $c_matrix{ true_positive }++;
            
        } elsif ( $row->{ $actual_header } == 0 and $row->{ $predicted_header } == 0 ) {
            
            # true negative
            $c_matrix{ true_negative }++;
            

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    
    %c_matrix;
}

=head2 &_calculate_total_entries ( $c_matrix_ref )

Calculates and adds the data for the C<total_entries> key in the confusion matrix hash.

=cut

sub _calculate_total_entries {

    my $c_matrix = shift;
    my $total = $c_matrix->{ true_negative } + $c_matrix->{ false_positive };
       $total += $c_matrix->{ false_negative } + $c_matrix->{ true_positive };

    $c_matrix->{ total_entries } = $total;

}

=head2 &_calculate_accuracy ( $c_matrix_ref )

Calculates and adds the data for the C<accuracy> key in the confusion matrix hash.

=cut

sub _calculate_accuracy {

    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ true_positive } + $c_matrix->{ true_negative };
    my $denominator = $numerator + $c_matrix->{ false_positive } + $c_matrix->{ false_negative };
    
    $c_matrix->{ accuracy } = $numerator / $denominator * 100;
    
    # no need to return anything, we're using ref
}

=head2 &_calculate_sensitivity ( $c_matrix_ref )

Calculates and adds the data for the C<sensitivity> key in the confusion matrix hash.

=cut

sub _calculate_sensitivity {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ true_positive };
    my $denominator = $numerator + $c_matrix->{ false_negative };
    
    $c_matrix->{ sensitivity } = $numerator / $denominator * 100;

    # no need to return anything, we're using ref
}

=head2 &_calculate_precision ( $c_matrix_ref )

Calculates and adds the data for the C<precision> key in the confusion matrix hash.

=cut

sub _calculate_precision {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ true_positive };
    my $denominator = $numerator + $c_matrix->{ false_positive };
    
    $c_matrix->{ precision } = $numerator / $denominator * 100;
}

=head2 &_calculate_specificity ( $c_matrix_ref )

Calculates and adds the data for the C<specificity> key in the confusion matrix hash.

=cut

sub _calculate_specificity {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ true_negative };
    my $denominator = $numerator + $c_matrix->{ false_positive };
    
    $c_matrix->{ specificity } = $numerator / $denominator * 100;
}

=head2 &_calculate_f1_score ( $c_matrix_ref )

Calculates and adds the data for the C<F1_Score> key in the confusion matrix hash.

=cut

sub _calculate_f1_score {
    my $c_matrix = shift;
    
    my $numerator = 2 * $c_matrix->{ true_positive };
    my $denominator = $numerator + $c_matrix->{ false_positive } + $c_matrix->{ false_negative };
    
    $c_matrix->{ F1_Score } = $numerator / $denominator * 100;
}       

=head2  &_calculate_negative_predicted_value( $c_matrix_ref )

Calculates and adds the data for the C<negative_predicted_value> key in the confusion matrix hash.

=cut

sub _calculate_negative_predicted_value {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ true_negative };
    my $denominator = $numerator + $c_matrix->{ false_negative };
    
    $c_matrix->{ negative_predicted_value } = $numerator / $denominator * 100;
}

=head2  &_calculate_false_negative_rate( $c_matrix_ref )

Calculates and adds the data for the C<false_negative_rate> key in the confusion matrix hash.

=cut

sub _calculate_false_negative_rate {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ false_negative };
    my $denominator = $numerator + $c_matrix->{ true_positive };
    
    $c_matrix->{ false_negative_rate } = $numerator / $denominator * 100;
}

=head2  &_calculate_false_positive_rate( $c_matrix_ref )

Calculates and adds the data for the C<false_positive_rate> key in the confusion matrix hash.

=cut

sub _calculate_false_positive_rate {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ false_positive };
    my $denominator = $numerator + $c_matrix->{ true_negative };
    
    $c_matrix->{ false_positive_rate } = $numerator / $denominator * 100;
}

=head2  &_calculate_false_discovery_rate( $c_matrix_ref )

Calculates and adds the data for the C<false_discovery_rate> key in the confusion matrix hash.

=cut

sub _calculate_false_discovery_rate {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ false_positive };
    my $denominator = $numerator + $c_matrix->{ true_positive };
    
    $c_matrix->{ false_discovery_rate } = $numerator / $denominator * 100;
}

=head2  &_calculate_false_omission_rate( $c_matrix_ref )

Calculates and adds the data for the C<false_omission_rate> key in the confusion matrix hash.

=cut

sub _calculate_false_omission_rate {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ false_negative };
    my $denominator = $numerator + $c_matrix->{ true_negative };
    
    $c_matrix->{ false_omission_rate } = $numerator / $denominator * 100;
}

=head2  &_calculate_balanced_accuracy( $c_matrix_ref )

Calculates and adds the data for the C<balanced_accuracy> key in the confusion matrix hash.

=cut

sub _calculate_balanced_accuracy {
    my $c_matrix = shift;
    
    my $numerator = $c_matrix->{ sensitivity } + $c_matrix->{ specificity };
    my $denominator = 2;
    
    $c_matrix->{ balanced_accuracy } = $numerator / $denominator; # numerator already in %
}

=head2 display_exam_results ( ... )

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=item one_as => $category_one_name

=back

Please take note that non-ascii characters ie. non-English alphabets B<might> cause the output to go off :)

For the C<%labels>, there is no need to enter "actual X", "predicted X" etc. It will be prefixed with C<A: > for actual and C<P: > for the predicted values by default.
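
For example, with made-up category names:

    $nerve->display_confusion_matrix( \%c_matrix, {
        zero_as => "not a good book",
        one_as  => "good book",
    } );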

=cut

sub display_exam_results {

    my ( $self, $c_matrix, $labels ) = @_;
    
    $self->display_confusion_matrix( $c_matrix, $labels );
}

sub display_confusion_matrix {
    my ( $self, $c_matrix, $labels ) = @_;
    
    #####
    my @missing_keys;
    for ( qw( zero_as one_as ) ) {
        push @missing_keys, $_ unless exists $labels->{ $_ };
    }
    
    croak "Missing keys: @missing_keys" if @missing_keys;
    #####

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=head2 &_build_matrix ( $c_matrix, $labels )

Builds the matrix using C<Text::Matrix> module.

C<$c_matrix> and C<$labels> are the same as the ones passed to C<display_exam_results> and C<display_confusion_matrix>.

Returns a list C<( $matrix, $c_matrix )> which can directly be passed to C<_print_extended_matrix>.

=cut

sub _build_matrix {

    my ( $c_matrix, $labels ) = @_;

    my $predicted_columns = [ "P: ".$labels->{ zero_as }, "P: ".$labels->{ one_as }, "Sum" ];
    my $actual_rows = [ "A: ".$labels->{ zero_as }, "A: ".$labels->{ one_as }, "Sum"];
    
    # row sum
    my $actual_0_sum = $c_matrix->{ true_negative } + $c_matrix->{ false_positive };
    my $actual_1_sum = $c_matrix->{ false_negative } + $c_matrix->{ true_positive };
    # column sum

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

}

=head2 &_print_extended_matrix ( $matrix, $c_matrix )

Extends and outputs the matrix on the screen.

C<$matrix> and C<$c_matrix> are the same as returned by C<&_build_matrix>.

=cut

sub _print_extended_matrix {

    my ( $matrix, $c_matrix ) = @_;
    
    print "~~" x24, "\n";
    print "CONFUSION MATRIX (A:actual  P:predicted)\n";
    print "~~" x24, "\n";

    print $matrix->matrix();

    print "~~" x24, "\n";

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

    print "  False Negative Rate: $c_matrix->{ false_negative_rate } %\n" if exists $c_matrix->{ false_negative_rate };
    print "  False Positive Rate: $c_matrix->{ false_positive_rate } %\n" if exists $c_matrix->{ false_positive_rate };
    print "  False Discovery Rate: $c_matrix->{ false_discovery_rate } %\n" if exists $c_matrix->{ false_discovery_rate };
    print "  False Omission Rate: $c_matrix->{ false_omission_rate } %\n" if exists $c_matrix->{ false_omission_rate };
    print "  Balanced Accuracy: $c_matrix->{ balanced_accuracy } %\n" if exists $c_matrix->{ balanced_accuracy };
    print "~~" x24, "\n";
}

=head1 NERVE DATA RELATED SUBROUTINES

This part is about saving the data of the nerve. These subroutines can be imported using the C<:local_data> tag.

B<The subroutines are to be called in the procedural way>. No checking is done currently.

See C<PERCEPTRON DATA> and C<KNOWN ISSUES> sections for more details on the subroutines in this section.

=head2 preserve ( ... )

The parameters and usage are the same as C<save_perceptron>. See the next subroutine.

=head2 save_perceptron ( $nerve, $nerve_file )

Saves the C<AI::Perceptron::Simple> object into a C<Storable> file. There shouldn't be a need to call this method manually since after every training 
process this will be called automatically.
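
Should you need to call it yourself, a minimal sketch with a hypothetical file name:

    use AI::Perceptron::Simple ":local_data";

    save_perceptron( $nerve, "nerve_file.dat" );
    # ... later, possibly in another script
    my $revived_nerve = load_perceptron( "nerve_file.dat" );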

=cut

sub preserve {
    save_perceptron( @_ );
}

sub save_perceptron {
    my $self = shift;
    my $nerve_file = shift;
    use Storable;
    store $self, $nerve_file;
    no Storable;
}

=head2 revive (...)

The parameters and usage are the same as C<load_perceptron>. See the next subroutine.

=head2 load_perceptron ( $nerve_file_to_load )

Loads the data and turns it into a C<AI::Perceptron::Simple> object as the return value.

=cut

sub revive {
    load_perceptron( @_ );
}

sub load_perceptron {
    my $nerve_file_to_load = shift;
    use Storable;
    my $loaded_nerve = retrieve( $nerve_file_to_load );
    no Storable;
    
    $loaded_nerve;
}

=head1 NERVE PORTABILITY RELATED SUBROUTINES

These subroutines can be imported using the C<:portable_data> tag.

The file type currently supported is YAML. Please be careful with the data as you won't want the nerve data accidentally modified.

=head2 preserve_as_yaml ( ... )

The parameters and usage are the same as C<save_perceptron_yaml>. See the next subroutine.

=head2 save_perceptron_yaml ( $nerve, $yaml_nerve_file )

Saves the C<AI::Perceptron::Simple> object into a C<YAML> file.
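
A minimal sketch with a hypothetical file name:

    use AI::Perceptron::Simple ":portable_data";

    save_perceptron_yaml( $nerve, "portable_nerve.yaml" );
    my $revived_nerve = load_perceptron_yaml( "portable_nerve.yaml" );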

=cut

sub preserve_as_yaml {
    save_perceptron_yaml( @_ );
}

sub save_perceptron_yaml {
    my $self = shift;
    my $nerve_file = shift;
    use YAML;
    YAML::DumpFile( $nerve_file, $self );
    no YAML;
}

=head2 revive_from_yaml (...)

The parameters and usage are the same as C<load_perceptron_yaml>. See the next subroutine.

=head2 load_perceptron_yaml ( $yaml_nerve_file )

Loads the YAML data and turns it into a C<AI::Perceptron::Simple> object as the return value.

=cut

sub revive_from_yaml {
    load_perceptron_yaml( @_ );
}

sub load_perceptron_yaml {
    my $nerve_file_to_load = shift;
    use YAML;
    local $YAML::LoadBlessed = 1;
    my $loaded_nerve = YAML::LoadFile( $nerve_file_to_load );
    no YAML;
    
    $loaded_nerve;
}

=head1 TO DO

lib/AI/Perceptron/Simple.pm  view on Meta::CPAN

=item * and something yet to be known...

=back

=head1 KNOWN ISSUES

=head2 Portability of Nerve Data

Take note that the C<Storable> nerve data is not compatible across different versions.

If you really need to send the nerve data to different computers with different versions of C<Storable> module, see the docs of the following subroutines: 

=over 4

=item * C<&preserve_as_yaml> or C<&save_perceptron_yaml> for storing data.

=item * C<&revive_from_yaml> or C<&load_perceptron_yaml> for retrieving the data.

=back

=head1 AUTHOR

t/02-creation.t  view on Meta::CPAN

is( $perceptron->threshold, 0.85, "Correct custom passing rate -> ".$perceptron->threshold );

# get_attributes()
my %attributes = $perceptron->get_attributes;
for ( @attributes ) {
    ok( $attributes{ $_ }, "Attribute \'$_\' present" );
    is( $attributes{ $_ }, $initial_value, "Correct initial value (".$attributes{$_}.") for  \'$_\'" );
}

# don't try to use Test::Carp, it won't work, it only tests for direct calling of carp and croak etc
subtest "Caught missing mandatory parameters" => sub {
    eval {
        my $no_attribs = AI::Perceptron::Simple->new( { initial_value => $initial_value} );
    };
    like( $@, qr/attribs/, "Caught missing attribs" );
    
    eval {
        my $perceptron = AI::Perceptron::Simple->new( { attribs => \@attributes} );
    };
    like($@, qr/initial_value/, "Caught missing initial_value");
    

t/02-state_portable.t  view on Meta::CPAN


my @attributes = qw ( has_trees trees_coverage_more_than_half has_other_living_things );

my $total_headers = scalar @attributes;

my $perceptron = AI::Perceptron::Simple->new( {
    initial_value => 0.01,
    attribs => \@attributes
} );

subtest "All data related subroutines found" => sub {
    # this only checks if the subroutines are contained in the package
    ok( AI::Perceptron::Simple->can("preserve_as_yaml"), "&preserve_as_yaml is present" );
    ok( AI::Perceptron::Simple->can("save_perceptron_yaml"), "&save_perceptron_yaml is persent" );

    ok( AI::Perceptron::Simple->can("revive_from_yaml"), "&revive_from_yaml is present" );
    ok( AI::Perceptron::Simple->can("load_perceptron_yaml"), "&load_perceptron_yaml is present" );

};

my $yaml_nerve_file = $FindBin::Bin . "/portable_nerve.yaml";

t/04-train.t  view on Meta::CPAN

local $@ = "";
eval { $perceptron->train( TRAINING_DATA, "brand", $nerve_file, WANT_STATS, IDENTIFIER) };
is ( $@, "", "No problem with \'train\' method (verbose) so far" );
}

ok ( $perceptron->train( TRAINING_DATA, "brand", $nerve_file), "No problem with \'train\' method (non-verbose) so far" );

# no longer returns the file anymore since v0.03
# is ( $perceptron->train( TRAINING_DATA, "brand", $nerve_file), $nerve_file, "\'train\' method returns the correct value" );

subtest "Data related subroutine found" => sub {
    ok( AI::Perceptron::Simple->can("save_perceptron"), "&save_perceptron is persent" );
    ok( AI::Perceptron::Simple->can("load_perceptron"), "&loaded_perceptron is present" );
};


ok( save_perceptron( $perceptron, $nerve_file ), "save_perceptron is working good so far" );
ok( -e $nerve_file, "Found the perceptron file" );

ok( load_perceptron( $nerve_file ), "Perceptron loaded" );
my $loaded_perceptron = load_perceptron( $nerve_file );

t/08-confusion_matrix.t  view on Meta::CPAN

    
    eval {
        $perceptron->display_confusion_matrix( \%c_matrix );
    };
    
    like ( $@, qr/zero_as one_as/, "Both keys not found" );
}

# more_stats enabled

subtest "More stats" => sub {

    my %c_matrix_more_stats = $perceptron->get_confusion_matrix( { 
            full_data_file => TEST_FILE, 
            actual_output_header => "brand",
            predicted_output_header => "predicted",
            more_stats => 1,
        } );

    like ( $c_matrix_more_stats{ precision }, qr/66.66/, "Precision seems correct to me" );
    is ( $c_matrix_more_stats{ specificity }, 80, "Specificity seems correct to me" );

t/08-confusion_matrix_synonyms.t  view on Meta::CPAN

    local $@;
    
    eval {
        $perceptron->display_exam_results( \%c_matrix );
    };
    
    like ( $@, qr/zero_as one_as/, "Both keys not found" );
}
# more_stats enabled

subtest "More stats" => sub {

    my %c_matrix_more_stats = $perceptron->get_confusion_matrix( { 
            full_data_file => TEST_FILE, 
            actual_output_header => "brand",
            predicted_output_header => "predicted",
            more_stats => 1,
        } );

    like ( $c_matrix_more_stats{ precision }, qr/66.66/, "Precision seems correct to me" );
    is ( $c_matrix_more_stats{ specificity }, 80, "Specificity seems correct to me" );

xt/boilerplate.t  view on Meta::CPAN

#!perl
use 5.006;
use strict;
use warnings;
use Test::More;

plan tests => 3;

sub not_in_file_ok {
    my ($filename, %regex) = @_;
    open( my $fh, '<', $filename )
        or die "couldn't open $filename for reading: $!";

    my %violated;

    while (my $line = <$fh>) {
        while (my ($desc, $regex) = each %regex) {
            if ($line =~ $regex) {
                push @{$violated{$desc}||=[]}, $.;

xt/boilerplate.t  view on Meta::CPAN

    }

    if (%violated) {
        fail("$filename contains boilerplate text");
        diag "$_ appears on lines @{$violated{$_}}" for keys %violated;
    } else {
        pass("$filename contains no boilerplate text");
    }
}

sub module_boilerplate_ok {
    my ($module) = @_;
    not_in_file_ok($module =>
        'the great new $MODULENAME'   => qr/ - The great new /,
        'boilerplate description'     => qr/Quick summary of what the module/,
        'stub function definition'    => qr/function[12]/,
    );
}

TODO: {
  local $TODO = "Need to replace the boilerplate text";


