AI-NeuralNet-Mesh


Mesh.pm  view on Meta::CPAN

		my $self	=	shift;
		my $file	=	shift;
		eval('use PCX::Loader');
		if($@) {
			$self->{error}="Cannot load PCX::Loader module: $@";
			return undef;
		}
		return PCX::Loader->new($self,$file);
	}	
	
	# Crunch a string of words into a map
	sub crunch {
		my $self	=	shift;
		my @ws 		=	split(/[\s\t]/,shift);
		my (@map,$ic);
		for my $a (0..$#ws) {
			$ic=$self->crunched($ws[$a]);
			if(!defined $ic) {
				$self->{_crunched}->{list}->[$self->{_crunched}->{_length}++]=$ws[$a];
				$map[$a]=$self->{_crunched}->{_length};
			} else {
				$map[$a]=$ic;
            }
		}
		return \@map;
	}
	
	# Finds if a word has been crunched.
	# Returns undef on failure, word index for success.
	sub crunched {
		my $self	=	shift;
		for my $a (0..$self->{_crunched}->{_length}-1) {
			return $a+1 if($self->{_crunched}->{list}->[$a] eq $_[0]);
		}
		$self->{error} = "Word \"$_[0]\" not found.";
		return undef;
	}
	
	# Alias for crunched(), above
	sub word { crunched(@_) }
	
	# Uncrunches a map (array ref) into a string of words separated
	# by spaces and returns the string
	sub uncrunch {
		my $self	=	shift;
		my $map = shift;
		my ($c,$el,$x);
		foreach $el (@{$map}) {
			$c .= $self->{_crunched}->{list}->[$el-1].' ';
		}
		return $c;
	}
	
	# Sets/gets the randomness factor in the network. Setting a value of 0 
	# disables random factors.
	sub random {
		my $self	=	shift;
		my $rand	=	shift;


	}
	
	# Used to format array ref into columns
	# Usage: 
	#	join_cols(\@array,$row_length_in_elements,$high_state_character,$low_state_character);
	# Can also be called as method of your neural net.
	# If $high_state_character is null, prints actual numerical values of each element.
	sub join_cols {
		no strict 'refs';
		shift if(substr($_[0],0,4) eq 'AI::'); 
		my $map		=	shift;
		my $break   =	shift;
		my $a		=	shift;
		my $b		=	shift;
		my $x;
		foreach my $el (@{$map}) { 
			my $str = ((int($el))?$a:$b);
			$str=$el."\0" if(!$a);
			print $str;	$x++;
			if($x>$break-1) { print "\n"; $x=0;	}
		}
		print "\n";
	}
	
	# Returns percentage difference between all elements of two
	# array refs of the exact same length (in elements).


Three of the activation syntaxes are shown in the first constructor above, the "linear",
"sigmoid" and code ref types.

You can also set the activation and threshold values after network creation with the
activation() and threshold() methods. 

	



=item $net->learn($input_map_ref, $desired_result_ref [, options ]);

NOTE: learn_set() now has increment-degrading turned OFF by default. See note
on the degrade flag, below.

This will 'teach' a network to associate a new input map with a desired 
result. It will return a string containing benchmarking information. 

You can also specify strings as inputs and outputs to learn(), and they will be 
crunched automatically. Example:

	$net->learn('corn', 'cob');
	
	
Note, the old method of calling crunch on the values still works just as well.	


	my @data = (
		[ 0,1 ], [ 1 ],
		[ 1,0 ], [ 0 ]
	);
	
	$net->learn_set(\@data);
	
Same effect as above, but not the same data (obviously).


=item $net->run($input_map_ref);

This method will apply the given array ref at the input layer of the neural network, and
it will return an array ref to the output of the network. run() will now automatically crunch() 
a string given as an input (See the crunch() method for info on crunching).

Example Usage:
	
	my $inputs  = [ 1,1,0,1 ];
	my $outputs = $net->run($inputs);

You can also do this with a string:
                                                                                  
	my $outputs = $net->run('cloudy - wind is 5 MPH NW');
	

See also run_uc() and run_set() below.


=item $net->run_uc($input_map_ref);

This method does the same thing as this code:
	
	$net->uncrunch($net->run($input_map_ref));

All run_uc() does is automatically call uncrunch() on the output, regardless
of whether the input was crunch()-ed or not.
	

=item $net->run_set($set);
                                                                                    
This takes an array ref of the same structure as the learn_set() method, above. It returns
an array ref. Each element in the returned array ref represents the output for the corresponding
element in the dataset passed. Uses run() internally.
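The loop inside run_set() amounts to the following standalone sketch. This is an illustration of the logic only, not the module's code: `run` is stubbed here, whereas the real method uses the trained network.

```perl
use strict;
use warnings;

# Standalone sketch of run_set()'s logic: the set is a flat list of
# [ input, target ] pairs, and run() is called on each input map.
sub run { my ($in) = @_; return [ map { $_ ? 1 : 0 } @$in ] }   # stub for illustration

sub run_set {
    my ($set) = @_;
    my @outs;
    for (my $i = 0; $i < @$set; $i += 2) {   # input maps sit at even indexes
        push @outs, run($set->[$i]);
    }
    return \@outs;
}

my $res = run_set([ [0,1], [1],   [1,0], [0] ]);
print scalar(@$res), "\n";      # one output per input map: 2
print "@{ $res->[0] }\n";       # output for the first input map
```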


This will dump a simple listing of all the weights of all the connections of every neuron
in the network to STDOUT.




=item $net->crunch($string);

This splits a string passed with /[\s\t]/ into an array ref containing unique indexes
to the words. The words are stored in an internal array and preserved across load() and save()
calls. This is designed to be used to generate unique maps suitable for passing to learn() and 
run() directly. It returns an array ref.

The words are not duplicated internally. For example:

	$net->crunch("How are you?");

Will probably return an array ref containing 1,2,3. A subsequent call of:

    $net->crunch("How is Jane?");

Will probably return an array ref containing 1,4,5. Notice, the first element stayed
the same. That is because it already stored the word "How". So, each word is stored
only once internally and the returned array ref reflects that.
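The word-interning behavior can be illustrated with a standalone sketch. This reimplements the idea purely for demonstration (it is not the module's internal code): each unique word receives a persistent 1-based index, so repeated words map to the same number.

```perl
use strict;
use warnings;

my @list;    # stands in for the internal word list

# Sketch of crunch(): split on whitespace, intern each new word,
# and return 1-based indexes into the word list.
sub crunch_sketch {
    my ($str) = @_;
    my @map;
    for my $w (split /\s+/, $str) {
        my ($i) = grep { $list[$_] eq $w } 0 .. $#list;
        if (defined $i) { push @map, $i + 1 }
        else            { push @list, $w; push @map, scalar @list }
    }
    return \@map;
}

print "@{ crunch_sketch('How are you?') }\n";   # 1 2 3
print "@{ crunch_sketch('How is Jane?') }\n";   # 1 4 5 -- "How" keeps index 1
```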


=item $net->uncrunch($array_ref);

Uncrunches a map (array ref) into a scalar string of words separated by ' ' and returns the 
string. This is meant to be used as a counterpart to the crunch() method, above, possibly to 
uncrunch() the output of a run() call. Consider the code below (also in ./examples/ex1.pl):
                           
	use AI::NeuralNet::Mesh;
	my $net = AI::NeuralNet::Mesh->new(2,3);
	
	for (0..3) {
		$net->learn_set([
			$net->crunch("I love chips."),  $net->crunch("That's Junk Food!"),
			$net->crunch("I love apples."), $net->crunch("Good, Healthy Food."),


that you can retrieve with the error() method, below.

=item $net->word($word);

A function alias for crunched().


=item $net->col_width($width);

This is useful for formatting the debugging output of Level 4 if you are learning simple 
bitmaps. This will set the debugger to automatically insert a line break after that many
elements in the map output when dumping the currently run map during a learn loop.

It will return the current width when called with a 0 or undef value.

The column width is preserved across load() and save() calls.
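The line-breaking rule that col_width() controls can be sketched on its own. This is a standalone illustration of the logic join_cols() applies when dumping a map, not the module's code:

```perl
use strict;
use warnings;

# Sketch of the column-width rule: emit a line break after every
# $width elements, so a flat map prints as a bitmap grid.
my $width = 5;
my @map   = (0,1,1,1,0,  1,0,0,0,1);
my ($x, $out) = (0, '');
for my $el (@map) {
    $out .= $el;
    $out .= "\n" if ++$x % $width == 0;
}
print $out;    # two rows of five: 01110 then 10001
```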


=item $net->random($rand);

This will set the randomness factor of the network. Default is 0. When called 
with no arguments, or an undef value, it will return the current randomness value. When

examples/ex_alpha.pl  view on Meta::CPAN

    
    File:	examples/ex_alpha.pl
	Author: Josiah Bryan, <jdb@wcoil.com>
	Desc: 

		This demonstrates the ability of a neural net to 
		generalize and predict what the correct result is 
		for inputs that it has never seen before.
		
		This teaches the network to classify twenty-nine
		separate 35-byte bitmaps, and then it inputs a
		never-before-seen bitmap and displays the 
		classification the network gives for the unknown bitmap.

=cut

	use AI::NeuralNet::Mesh;

	# Create a new network with 35 neurons in input layer and 1 output neuron
	my $net = new AI::NeuralNet::Mesh([
		{
			nodes		=>	35,
			activation	=>	'linear'


        0,0,1,0,2
        ]
     ];
	
	if(!$net->load("alpha.mesh")) {
		#$net->range(0..29);
		$net->learn_set($letters);
		$net->save("alpha.mesh");
	}
			
	# Build a test map 
	my $tmp	=	[0,1,1,1,0,
				 1,0,0,0,1,
				 1,0,0,0,1,
				 1,1,1,1,1,
				 1,0,0,0,1,
				 1,0,0,0,1,
				 1,0,0,0,1];
	
	# Display test map
	print "\nTest map:\n";
	$net->join_cols($tmp,5);
	
	# Display network results
	print "Letter index matched: ",$net->run($tmp)->[0],"\n";
	

examples/ex_bmp.pl  view on Meta::CPAN

=begin

	File:   examples/ex_bmp.pl
	Author: Josiah Bryan, <jdb@wcoil.com>
	Desc:
	
		This demonstrates simple classification of 5x5 bitmaps.

=cut

	use AI::NeuralNet::Mesh;
	use Benchmark;

	# Set resolution
	my $xres=5;
	my $yres=5;
	

examples/ex_bmp2.pl  view on Meta::CPAN

=begin
    
    File:	examples/ex_bmp2.pl
	Author: Josiah Bryan, <jdb@wcoil.com>
	Desc: 

		This demonstrates the ability of a neural net 
		to generalize and predict what the correct 
		result is for inputs that it has never seen before.
				
		This teaches a network to recognize a 5x7 bitmap of 
		the letter "J", then presents the network with a 
		corrupted "J" and displays the results of the network's 
		output.

=cut

    use AI::NeuralNet::Mesh;

	# Create a new network with 1 layer of 35 neurons and 1 output neuron.
    my $net = new AI::NeuralNet::Mesh(1,35,1);
	
	# Debug level of 4 gives JUST learn loop iteration benchmark and comparison data 
	# as learning progresses.
	$net->debug(4);
	
	# Create our model input
	my @map	=	(1,1,1,1,1,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 1,0,1,0,0,
				 1,0,1,0,0,
				 1,1,1,0,0);
				 
	
	print "\nLearning started...\n";
	
	print $net->learn(\@map,'J');
	
	print "Learning done.\n";
		
	# Build a test map 
	my @tmp	=	(0,0,1,1,1,
				 1,1,1,0,0,
				 0,0,0,1,0,
				 0,0,0,1,0,
				 0,0,0,1,0,                                          
				 0,0,0,0,0,
				 0,1,1,0,0);
	
	# Display test map
	print "\nTest map:\n";
	$net->join_cols(\@tmp,5,'');
	
	print "Running test...\n";
		                    
	# Run the actual test and get network output
	print "Result: ",$net->run_uc(\@tmp),"\n";
	
	print "Test run complete.\n";
	
	

examples/ex_pcx.pl  view on Meta::CPAN

=begin
	
	File:   examples/ex_pcx.pl
	Author: Josiah Bryan, <jdb@wcoil.com>
	Desc:
		This teaches the network to classify 
		10x10 bitmaps from a PCX file based on their 
		whiteness. (This was taught on a b&w 320x200 
		PCX of the author at an early age :-)
=cut
	
	use AI::NeuralNet::Mesh;
	
	# Set block sizes
	my ($bx,$by)=(10,10);
	
	print "Creating Neural Net...";
	my $net=AI::NeuralNet::Mesh->new(1,$bx*$by,1);
	$net->{col_width} = $bx;
	print "Done!\n";
	
	print "Loading bitmap...";
	my $img = $net->load_pcx("josiah.pcx");             
	print "Done!\n";
	
	print "Comparing blocks...\n";
	my $white = $img->get_block([0,0,$bx,$by]);
	
	my ($x,$y,$tmp,@scores,$s,@blocks,$b);
	for ($x=0;$x<320;$x+=$bx) {
		for ($y=0;$y<200;$y+=$by) {
			$blocks[$b++]=$img->get_block([$x,$y,$x+$bx,$y+$by]);


	
	print "Result: ",$net->run($blocks[rand()*$b])->[0],"\n";
	
	print "Benchmark for run: ", $net->benchmarked(), "\n";
	
	$net->save("pcx2.net");
	
	sub print_ref {
		no strict 'refs';
		shift if(substr($_[0],0,4) eq 'AI::'); 
		my $map		=	shift;
		my $break   =	shift;
		my $x;
		my @els = (' ','.',',',':',';','%','#');
		foreach my $el (@{$map}) { 
			my $str = int($el/255*6);
			print $els[$str];
			$x++;
			if($x>$break-1) {
				print "\n";
				$x=0;
			}
		}
		print "\n";
	}

mesh.htm  view on Meta::CPAN

The code ref also has access to all the data of that node through the second argument,
a blessed hash reference to that node.</P>
<P>See CUSTOM ACTIVATION FUNCTIONS for information on several included activation functions
other than the ones listed above.</P>
<P>Three of the activation syntaxes are shown in the first constructor above, the ``linear'',
``sigmoid'' and code ref types.</P>
<P>You can also set the activation and threshold values after network creation with the
<A HREF="#item_activation"><CODE>activation()</CODE></A> and <A HREF="#item_threshold"><CODE>threshold()</CODE></A> methods.</P>
<P></P>
<P></P>
<DT><STRONG><A NAME="item_learn">$net-&gt;learn($input_map_ref, $desired_result_ref [, options ]);</A></STRONG><BR>
<DD>
NOTE: <A HREF="#item_learn_set"><CODE>learn_set()</CODE></A> now has increment-degrading turned OFF by default. See note
on the degrade flag, below.
<P>This will 'teach' a network to associate a new input map with a desired 
result. It will return a string containing benchmarking information.</P>
<P>You can also specify strings as inputs and outputs to learn, and they will be 
crunched automatically. Example:</P>
<PRE>
        $net-&gt;learn('corn', 'cob');
</PRE>
<P>Note, the old method of calling crunch on the values still works just as well.</P>
<P>The first two arguments may be array refs (or now, strings), and they may be 
of different lengths.</P>
<P>Options should be written on hash form. There are three options:


prediction for the next month. A simpler set definition would be as such:</P>
<PRE>
        my @data = (
                [ 0,1 ], [ 1 ],
                [ 1,0 ], [ 0 ]
        );

        $net-&gt;learn_set(\@data);</PRE>
<P>Same effect as above, but not the same data (obviously).</P>
<P></P>
<DT><STRONG><A NAME="item_run">$net-&gt;run($input_map_ref);</A></STRONG><BR>
<DD>
This method will apply the given array ref at the input layer of the neural network, and
it will return an array ref to the output of the network. <A HREF="#item_run"><CODE>run()</CODE></A> will now automatically <A HREF="#item_crunch"><CODE>crunch()</CODE></A> 
a string given as an input (See the <A HREF="#item_crunch"><CODE>crunch()</CODE></A> method for info on crunching).
<P>Example Usage:
</P>
<PRE>
        my $inputs  = [ 1,1,0,1 ];
        my $outputs = $net-&gt;run($inputs);</PRE>
<P>You can also do this with a string:
</P>
<PRE>
        my $outputs = $net-&gt;run('cloudy - wind is 5 MPH NW');</PRE>
<P>See also <A HREF="#item_run_uc"><CODE>run_uc()</CODE></A> and <A HREF="#item_run_set"><CODE>run_set()</CODE></A> below.</P>
<P></P>
<DT><STRONG><A NAME="item_run_uc">$net-&gt;run_uc($input_map_ref);</A></STRONG><BR>
<DD>
This method does the same thing as this code:

<PRE>
        $net-&gt;uncrunch($net-&gt;run($input_map_ref));</PRE>
<P>All that <A HREF="#item_run_uc"><CODE>run_uc()</CODE></A> does is that it automatically calls <A HREF="#item_uncrunch"><CODE>uncrunch()</CODE></A> on the output, regardless
of whether the input was <A HREF="#item_crunch"><CODE>crunch()</CODE></A> -ed or not.</P>
<P></P>
<DT><STRONG><A NAME="item_run_set">$net-&gt;run_set($set);</A></STRONG><BR>
<DD>
<P>This takes an array ref of the same structure as the learn_set() method, above. It returns
an array ref. Each element in the returned array ref represents the output for the corresponding
element in the dataset passed. Uses run() internally.</P>
<DT><STRONG><A NAME="item_get_outs">$net-&gt;get_outs($set);</A></STRONG><BR>
<DD>


<P></P>
<DT><STRONG><A NAME="item_show">$net-&gt;show();</A></STRONG><BR>
<DD>
This will dump a simple listing of all the weights of all the connections of every neuron
in the network to STDOUT.
<P></P>
<DT><STRONG><A NAME="item_crunch">$net-&gt;crunch($string);</A></STRONG><BR>
<DD>
This splits a string passed with /[\s\t]/ into an array ref containing unique indexes
to the words. The words are stored in an internal array and preserved across <A HREF="#item_load"><CODE>load()</CODE></A> and <A HREF="#item_save"><CODE>save()</CODE></A>
calls. This is designed to be used to generate unique maps suitable for passing to <A HREF="#item_learn"><CODE>learn()</CODE></A> and 
<A HREF="#item_run"><CODE>run()</CODE></A> directly. It returns an array ref.
<P>The words are not duplicated internally. For example:</P>
<PRE>
        $net-&gt;crunch(&quot;How are you?&quot;);</PRE>
<P>Will probably return an array ref containing 1,2,3. A subsequent call of:</P>
<PRE>
    $net-&gt;crunch(&quot;How is Jane?&quot;);</PRE>
<P>Will probably return an array ref containing 1,4,5. Notice, the first element stayed
the same. That is because it already stored the word ``How''. So, each word is stored
only once internally and the returned array ref reflects that.</P>
<P></P>
<DT><STRONG><A NAME="item_uncrunch">$net-&gt;uncrunch($array_ref);</A></STRONG><BR>
<DD>
Uncrunches a map (array ref) into a scalar string of words separated by ' ' and returns the 
string. This is meant to be used as a counterpart to the <A HREF="#item_crunch"><CODE>crunch()</CODE></A> method, above, possibly to 
<A HREF="#item_uncrunch"><CODE>uncrunch()</CODE></A> the output of a <A HREF="#item_run"><CODE>run()</CODE></A> call. Consider the code below (also in ./examples/ex1.pl):

<PRE>
        use AI::NeuralNet::Mesh;
        my $net = AI::NeuralNet::Mesh-&gt;new(2,3);

        for (0..3) {
                $net-&gt;learn_set([
                        $net-&gt;crunch(&quot;I love chips.&quot;),  $net-&gt;crunch(&quot;That's Junk Food!&quot;),


<P>If the word is not in the list, it will set the internal error value with a text message
that you can retrieve with the <A HREF="#item_error"><CODE>error()</CODE></A> method, below.</P>
<P></P>
<DT><STRONG><A NAME="item_word">$net-&gt;word($word);</A></STRONG><BR>
<DD>
A function alias for crunched().
<P></P>
<DT><STRONG><A NAME="item_col_width">$net-&gt;col_width($width);</A></STRONG><BR>
<DD>
This is useful for formatting the debugging output of Level 4 if you are learning simple 
bitmaps. This will set the debugger to automatically insert a line break after that many
elements in the map output when dumping the currently run map during a learn loop.
<P>It will return the current width when called with a 0 or undef value.</P>
<P>The column width is preserved across <A HREF="#item_load"><CODE>load()</CODE></A> and <A HREF="#item_save"><CODE>save()</CODE></A> calls.</P>
<P></P>
<DT><STRONG><A NAME="item_random">$net-&gt;random($rand);</A></STRONG><BR>
<DD>
This will set the randomness factor of the network. Default is 0. When called 
with no arguments, or an undef value, it will return the current randomness value. When
called with a 0 value, it will disable randomness in the network. The randomness factor
is preserved across <A HREF="#item_load"><CODE>load()</CODE></A> and <A HREF="#item_save"><CODE>save()</CODE></A> calls.
<P></P>


