AI-NeuralNet-Mesh


Mesh.pm

    
    use strict;
    use Benchmark; 

   	# See POD for usage of this variable.
	$AI::NeuralNet::Mesh::Connector = '_c';
	
	# Debugging subs
	$AI::NeuralNet::Mesh::DEBUG  = 0;
	sub whowasi { (caller(1))[3] . '()' }
	sub debug { shift; $AI::NeuralNet::Mesh::DEBUG = shift || 0; } 
	# Print the first arg only when the current debug level equals the second arg; returns the level.
	sub d { shift if(substr($_[0],0,4) eq 'AI::'); my ($msg,$lvl,$dbg)=(shift,shift,$AI::NeuralNet::Mesh::DEBUG); print $msg if($dbg == $lvl); return $dbg }
	sub verbose {debug @_};
	sub verbosity {debug @_};
	sub v {debug @_};
	
	
	# Return the version from the ::ID string passed, or the current version of this
	# module if no string is passed. Used in load() to detect file versions.
	sub version {
		shift if(substr($_[0],0,4) eq 'AI::');
		substr((split(/\s/,(shift || $AI::NeuralNet::Mesh::ID)))[2],1);
	}                                  
	
	# Rounds a floating-point number to an integer with int() and sprintf()

Mesh.pm

	
	# Sets/gets the randomness factor in the network. Setting a value of 0 
	# disables random factors.
	sub random {
		my $self	=	shift;
		my $rand	=	shift;
		return $self->{random}	if(!(defined $rand));
		$self->{random}	=	$rand;
	}
	
	# Sets/gets column width for printing lists in debug modes 1, 3, and 4.
	sub col_width {
		my $self	=	shift;
		my $width	=	shift;
		return $self->{col_width}	if(!$width);
		$self->{col_width}	=	$width;
	} 

	# Sets/gets the run const. factor in the network. Setting a value of 0 
	# disables the run const. factor. 
	sub const {

Mesh.pm

	print "Last learn() took ",$net->benchmark(),"\n";



=item $net->verbose($level);

=item $net->verbosity($level);

=item $net->v($level);

=item $net->debug($level);

Note: verbose(), verbosity(), and v() are all functional aliases for debug().

Toggles debugging off if called with $level = 0 or no arguments. There are several levels
of debugging. 

NOTE: Debugging verbosity has been toned down somewhat from AI::NeuralNet::BackProp,
but level 4 still prints the same amount of information as you were used to. The other
levels, however, are mostly for advanced use. There is not much explanation in the
other levels, but they are included for those of you who feel daring (or just plain bored).

Level 0 ($level = 0) : Default, no debugging information printed. All printing is 
left to calling script.

Level 1 ($level = 1) : Displays the activity between nodes, printing what values were
received and what they were weighted to.

Level 2 ($level = 2) : Prints progress info from the learn() loop, in the form of "got: X,
wanted Y" messages. This is about the third most useful debugging level, after level 12 and
level 4.

Level 3 ($level = 3) : I don't think I included any level 3 debugs in this version.

Level 4 ($level = 4) : This level is the one I use most. It is only used during learning. It
displays the current error (difference between actual outputs and the target outputs you
asked for), as well as the current loop number and the benchmark time for the last learn cycle.
Also printed are the actual outputs and the target outputs below the benchmark times.

Level 12 ($level = 12) : Level 12 prints a dot (period) [.] after each learning loop is
complete. This is useful for letting the user know that stuff is happening, but without
having to display any of the internal variables. I use this in the ex_aln.pl demo,
as well as the ex_agents.pl demo.

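Here is a minimal sketch of the levels described above (the network dimensions
are arbitrary):

	use AI::NeuralNet::Mesh;
	
	my $net = AI::NeuralNet::Mesh->new(1,5,1);
	
	# Level 4: current error, loop counter, and benchmark times during learn().
	$net->debug(4);
	
	# verbose(), verbosity(), and v() are interchangeable aliases:
	$net->v(12);        # level 12: a dot after each completed learn loop
	
	# Called with no arguments (or 0), debugging is toggled off:
	$net->debug();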



=item $net->save($filename);

This will save the complete state of the network to disk, including all weights and any
words crunched with crunch(). Also saves the layer sizes and activations of the network.

NOTE: The only activation type NOT saved is the CODE ref type, which must be set again
after loading.

This uses a simple flat-file text storage format, and therefore the network files should
be fairly portable. This method will return undef if there was a problem writing the file.
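
A minimal save/load round trip might look like this (the filename and the
pass-through activation are only illustrative):

	$net->save("mynet.mesh") or print "save failed: ", $net->error(), "\n";
	
	# Later, or in another script:
	$net->load("mynet.mesh") or die $net->error();
	
	# CODE ref activations are not saved, so reattach them after load().
	# (A pass-through sub is shown; see the activation() docs for the exact
	# arguments your code ref receives.)
	$net->activation(1, sub { $_[0] });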

Mesh.pm

If the word is not in the list, it will set the internal error value with a text message
that you can retrieve with the error() method, below.

=item $net->word($word);

A function alias for crunched().
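
For example (the sentences are arbitrary; crunch() is assumed here to return an
array ref of word indices, as used with learn_set() in the examples):

	my $map = $net->crunch("I love pop.");   # crunch a whole sentence
	
	# Look up a single word; word() is a plain alias for crunched():
	my $idx  = $net->crunched("pop.");
	my $same = $net->word("pop.");
	
	# A word that was never crunched sets the error message instead,
	# retrievable via $net->error().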


=item $net->col_width($width);

This is useful for formatting the debugging output of Level 4 if you are learning simple 
bitmaps. This will set the debugger to automatically insert a line break after that many
elements in the map output when dumping the currently run map during a learn loop.

It will return the current width when called with a 0 or undef value.

The column width is preserved across load() and save() calls.
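
For example, when learning 5*7 bitmaps like those in ex_alpha.pl:

	$net->col_width(5);          # line break after every 5 map elements
	my $w = $net->col_width(0);  # 0 (or undef) returns the current width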


=item $net->random($rand);

This will set the randomness factor of the network. Default is 0. When called 
with no arguments, or an undef value, it will return the current randomness value. When
called with a 0 value, it will disable randomness in the network. The randomness factor
is preserved across load() and save() calls.
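
For example (the factor shown is arbitrary):

	$net->random(0.001);     # enable a small random factor
	print $net->random();    # no arguments: returns the current value, 0.001
	$net->random(0);         # disable randomness entirely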

Mesh.pm

	$AI::NeuralNet::Mesh::Connector = 'main::tree';
	
The tree() function is called as a blessed method when it is used internally, providing
access to the blessed reference in the first argument. See the notes on CUSTOM NETWORK
CONNECTORS, below, for more information on creating your own custom connector.


=item $AI::NeuralNet::Mesh::DEBUG

This variable controls the verbosity level. It will not hurt anything to set this 
directly, but most people find it easier to set it using the debug() method or 
any of its aliases.
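
As the source above shows, debug() simply assigns to this variable, so these
two lines are equivalent:

	$AI::NeuralNet::Mesh::DEBUG = 4;    # set the package variable directly
	$net->debug(4);                     # or use the method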


=head1 CUSTOM NETWORK CONNECTORS

Creating custom network connectors is a step up from average use of this module. 
However, it can be very useful in creating styles of neural networks other
than the default fully-connected feed-forward network. 

You create a custom connector by setting the variable $AI::NeuralNet::Mesh::Connector
to the fully qualified name of the function used to make the actual connections.
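
A bare-bones sketch of the shape this takes. Only what is documented above is
assumed: the connector is called as a blessed method, so the first argument is
the network object; the remaining arguments and the actual linking logic must
be filled in from the module source (see the built-in _c connector):

	package main;
	
	sub my_connector {
		my $self = shift;   # blessed network reference
		my @args = @_;      # connection arguments; mirror _c's handling
		# ... walk the node ranges here and link them as desired ...
	}
	
	$AI::NeuralNet::Mesh::Connector = 'main::my_connector';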

examples/ex_add2.pl

	  }
	 }
	}
	
	#....................................................
	sub addnet
	{
	 print "\nCreate a new net with $layers layers, 3 inputs, and 1 output\n";
	 my $net = AI::NeuralNet::Mesh->new($layers,3,1);
	
	 # Disable debugging
	 $net->debug(0);
	
	
	 my @data = (
	  [ 2633, 2665, 2685 ],  [ 2633 + 2665 + 2685 ],
	  [ 2623, 2645, 2585 ],  [ 2623 + 2645 + 2585 ],
	  [ 2627, 2633, 2579 ],  [ 2627 + 2633 + 2579 ],
	  [ 2611, 2627, 2563 ],  [ 2611 + 2627 + 2563 ],
	  [ 2640, 2637, 2592 ],  [ 2640 + 2637 + 2592 ]
	 );
	

examples/ex_alpha.pl

			activation	=>	'linear'
		},
		{
			nodes		=>	1,
			activation	=>	'linear',
		}
	]);
	
	# Debug level of 4 gives JUST the learn loop iteration benchmark and comparison data 
	# as learning progresses.
	$net->debug(4);

	my $letters = [            # All prototype inputs        
        [
        0,1,1,1,0,             # Inputs are
        1,0,0,0,1,             #  5*7 digitized characters
        1,0,0,0,1,              
        1,1,1,1,1,
        1,0,0,0,1,             # This is the alphabet of the
        1,0,0,0,1,             # HP 28S                      
        1,0,0,0,1,

examples/ex_bmp.pl

	use AI::NeuralNet::Mesh;
	use Benchmark;

	# Set resolution
	my $xres=5;
	my $yres=5;
	
	# Create a new net with 1 layer, $xres*$yres inputs, and 1 output
	my $net = AI::NeuralNet::Mesh->new(1,$xres*$yres,1);
	
	# Enable debugging
	$net->debug(4);
	
	# Create datasets.
	my @data = ( 
		[	0,1,1,0,0,
			0,0,1,0,0,
			0,0,1,0,0,
			0,0,1,0,0,
			0,1,1,1,0	],		[	1	],
		
		[	1,1,1,0,0,

examples/ex_bmp2.pl


=cut

    use AI::NeuralNet::Mesh;

	# Create a new network with 1 layer, 35 inputs, and 1 output.
    my $net = AI::NeuralNet::Mesh->new(1,35,1);
	
	# Debug level of 4 gives JUST the learn loop iteration benchmark and comparison data 
	# as learning progresses.
	$net->debug(4);
	
	# Create our model input
	my @map	=	(1,1,1,1,1,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 1,0,1,0,0,
				 1,0,1,0,0,
				 1,1,1,0,0);
				 

examples/ex_crunch.pl

	my $bad  = $net->crunch("That's Junk Food!");
	my $good = $net->crunch("Good, Healthy Food.");
	
	my $set  = [
		"I love chips.",	$bad,
		"I love apples.",	$good,
		"I love pop.",		$bad,
		"I love oranges.",	$good
	];
	
	#$net->debug(4);
	for (0..2) {
		my $f = $net->learn_set($set);
		print "Forgotten: $f%\n";
	}
	
	# run() automatically crunches the string (run_uc() uses run() internally) and
	# run_uc() automatically uncrunches the results.
	print $net->run_uc("I love pop-tarts.");

examples/ex_dow.pl

		AI::NeuralNet::Mesh module.

=cut

    use AI::NeuralNet::Mesh;
	use Benchmark;

	# Create a new net with 2 layers, 9 inputs, and 1 output
        my $net = AI::NeuralNet::Mesh->new(2,9,1);
	
	# Set debugging to level 2 (learn() progress info)
        $net->debug(2);
	
	# Create datasets.
	#	Note that these are fictitious values shown for illustration purposes
	#	only.  In the example, CPI is a certain month's consumer price
	#	index, CPI-1 is the index one month before, CPI-3 is the index 3
	#	months before, etc.

	my @data = ( 
		#	Mo  CPI  CPI-1 CPI-3 	Oil  Oil-1 Oil-3    Dow   Dow-1 Dow-3   Dow Ave (output)
		[	1, 	229, 220,  146, 	20.0, 21.9, 19.5, 	2645, 2652, 2597], 	[	2647  ],

examples/ex_pcx.pl

			print "Block at [$x,$y], index [$s] scored ".$score[$s-1]."%\n";
		}
	}
	print "Done!";
	
	print "High score:\n";
	print_ref($blocks[$net->high(\@score)],$bx); 
	print "Low score:\n";
	print_ref($blocks[$net->low(\@score)],$bx); 
	
	$net->debug(4);
	
	if(!$net->load("pcx.mesh")) {
		print "Learning high block...\n";
		print $net->learn($blocks[$net->high(\@score)],"highest");
		
		$net->save("pcx.mesh");
		
		print "Learning low block...\n";
		$net->learn($blocks[$net->low(\@score)],"lowest");
	}

examples/ex_wine.pl

    use AI::NeuralNet::Mesh;
	use Benchmark;

	# Create a new net
    my $net = AI::NeuralNet::Mesh->new([13,45,1]);

	# Set activation on the output node to constrain values
	# to a specific range of values.    
    $net->activation(2,range(1..3));
	
	# Enable debugging at level 4
	$net->verbose(4);

	# Load the data set
	my $data = $net->load_set('wine.dat',0);
	
	# Separate data based on class
	my $sets=[];
	for my $i (0..$#{$data}/2) {
		my $c = $data->[$i*2+1]->[0];
		print "Class of set $i: $c                                  \r";
