AI-NeuralNet-Mesh


Mesh.pm

		$self->_c($n,$n+$self->{layers}->[$layer-1],$#{$self->{mesh}}-$more+1,$#{$self->{mesh}});
		$self->_c($#{$self->{mesh}}-$more+1,$#{$self->{mesh}},$n+$self->{layers}->[$layer],$n+$self->{layers}->[$layer]+$self->{layers}->[$layer+1]);
    }
        
        
    # See POD for usage
    sub run {
    	my $self	=	shift;
    	my $inputs	=	shift;
    	my $const	=	$self->{const};
    	#my $start	=	new Benchmark;
    	$inputs		=	$self->crunch($inputs) if(!ref($inputs));	# a plain string needs crunching first
    	no strict 'refs';
    	for my $x (0..$#{$inputs}) {
    		last if($x >= $self->{inputs});	# don't run past the input layer
    		d("inputting $inputs->[$x] at index $x with ID $self->{input}->{IDs}->[$x].\n",1);
    		$self->{mesh}->[$x]->input($inputs->[$x]+$const,$self->{input}->{IDs}->[$x]);
    	}
    	# any input nodes left unfed get a constant 1
    	if($#{$inputs}<$self->{inputs}-1) {
	    	for my $x ($#{$inputs}+1..$self->{inputs}-1) {
	 	    	d("inputting 1 at index $x with ID $self->{input}->{IDs}->[$x].\n",1);
	    		$self->{mesh}->[$x]->input(1,$self->{input}->{IDs}->[$x]);
	    	}
	    }
    	#$self->{benchmark} = timestr(timediff(new Benchmark, $start));
    	return $self->{output}->get_outputs();
    }    
    
    # See POD for usage
    sub run_uc {
    	$_[0]->uncrunch(run(@_));
    }
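
    # Usage sketch (hypothetical $net): run() takes an array ref of raw
    # inputs, or a plain string, which is crunch()ed first; run_uc()
    # uncrunch()es the result back into a string.
    #   my $out  = $net->run([1,2,3]);
    #   my $text = $net->run_uc("red green");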

	# See POD for usage
	sub learn {
    	my $self	=	shift;					
    	my $inputs	=	shift;					# input set
    	my $outputs	=	shift;					# target outputs
    	my %args	=	@_;						# get args into hash
    	my $inc		=	$args{inc} || 0.002;	# learning gradient
    	my $max     =   $args{max} || 1024;     # max iterations
    	my $degrade =   $args{degrade} || 0;    # enable gradient degrading
		my $error   = 	(defined $args{error} && $args{error}>-1) ? $args{error} : -1;
  		my $dinc	=	0.0002;					# amount to adjust gradient by
		my $diff	=	100;					# error margin between results
		my $start	=	new Benchmark;			
		$inputs		=	$self->crunch($inputs)  if(!ref($inputs));	# crunch plain strings
		$outputs	=	$self->crunch($outputs) if(!ref($outputs));
		my ($flag,$ldiff,$cdiff,$_mi,$y);
		my $loop	=	0;						# iteration counter
		while(!$flag && ($max ? $loop<$max : 1)) {
    		my $b	=	new Benchmark;
    		my $got	=	$self->run($inputs);
    		$diff 	=	pdiff($got,$outputs);
		    $flag	=	1;
    		    		
		    if(($error>-1 ? $diff<$error : 0) || !$diff) {

Mesh.pm

   			d("Benchmark: ".timestr(timediff(new Benchmark,$b))."\n",4);
   			d("============================Results, [$loop]===================================\n",4);
   			d("Actual: ",4);	
   			join_cols($got,($self->{col_width})?$self->{col_width}:5) if(d()==4);
   			d("Target: ",4);	
   			join_cols($outputs,($self->{col_width})?$self->{col_width}:5) if(d()==4);
   			d("\n",4);
   			d('.',12);
   			d('['.join(',',@{$got})."-".join(',',@{$outputs}).']',13);
   		}  
   		my $str = "Learning took $loop loops and ".timestr(timediff(new Benchmark,$start))."\n";
   		d($str,3); $self->{benchmark} = "$loop loops and ".timestr(timediff(new Benchmark,$start))."\n";
   		return $str;
   	}
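
	# Usage sketch (illustrative values): train until the percentage
	# difference pdiff($got,$outputs) drops below 1, or 4096 passes run:
	#   $net->learn([1,1], [2], inc => 0.01, max => 4096, error => 1);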


	# See POD for usage
	sub learn_set {
		my $self	=	shift;
		my $data	=	shift;
		my %args	=	@_;
		my $len		=	$#{$data}/2;
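
		# $data is a flat list of alternating pairs, as in the bundled
		# examples: ( \@inputs0, \@targets0, \@inputs1, \@targets1, ... )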

Mesh.pm

format ... but it is the only one I could find specs for to write it in Perl. If 
anyone can get specs for any other formats, or could write a loader for them, I 
would be very grateful!) Anyway, it loads a PCX-format file that is exactly 320x200 
with 8 bits per pixel, in pure Perl. It returns a blessed reference to a PCX::Loader 
object, which supports the following routines/members. See the example files ex_pcx.pl 
and ex_pcxl.pl in the ./examples/ directory.

See C<perldoc PCX::Loader> for information on the methods of the object returned.

You can download PCX::Loader from 
	http://www.josiah.countystart.com/modules/get.pl?pcx-loader:mpod
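
For illustration, a minimal call might look like this (the method name
load_pcx() is my assumption, since this fragment does not show the item
heading; see the example files for the real call):

	my $img = $net->load_pcx('ex.pcx');	# load_pcx() name assumed; 320x200, 8-bit PCX
	die $net->error() unless defined $img;	# undef on failure; see error()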


=head1 CUSTOM ACTIVATION FUNCTIONS 

Included in this package are four custom activation functions, meant both as a guide
for creating your own and to be useful in normal use of the module. Only one function
is exported into your namespace by default: the range() function. These are not meant
to be used as methods, but as plain functions. Each returns a code ref to a Perl
closure that does the actual work when the time comes.
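
For example, here is the closure-returning pattern in miniature. This is an
illustrative function of my own (nearest_of()), not the module's range()
itself:

	# Returns a code ref; the closure remembers @allowed and, when called
	# with a node's output, snaps it to the closest allowed value.
	sub nearest_of {
		my @allowed = @_;
		return sub {
			my $value = shift;
			my ($best) = sort { abs($a - $value) <=> abs($b - $value) } @allowed;
			return $best;
		};
	}

	my $snap = nearest_of(0, 0.5, 1);
	print $snap->(0.72);	# prints 0.5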

Mesh.pm

	}
	
This is a very simple example. It feeds the output of every node in the first layer
to the node directly above it, as well as to the nodes on either side of the node
directly above it, checking the range boundaries, of course.

The network is stored internally as one long array of node objects. The goal here
is to connect one range of nodes in that array to another range of nodes. The calling
function has already calculated the indices into the array, and it passes them to you
as the four arguments after the $self reference. The first two arguments we will call
$r1a and $r1b. These define the start and end indices of the first range, or "layer." Likewise,
the next two arguments, $r2a and $r2b, define the start and end indices of the second
layer. We also grab a reference to the mesh array so we don't have to type the $self
reference over and over.

The loop that follows the arguments in the above example is very simple. It opens
a for() loop over 0..($r1b-$r1a), the size of the range, instead of just going
$r1a..$r1b, because we use the loop index with the next layer up as well.

$y + $r1a gives the index into the mesh array of the current node to connect the output FROM.
We need to connect this node's output lines to the next layer's input nodes. We do this
with a simple method on the outputting node (the node at $y+$r1a), called add_output_node().
add_output_node() takes one simple argument: a blessed reference to the node it is supposed
to output its final value TO. We get this blessed reference with more simple addition.
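
Putting those pieces together, a connector along these lines would do what the
example describes (this is my reconstruction from the fragment above, not the
verbatim example code; the sub name is mine):

	sub connect_to_above_neighbors {
		my ($self, $r1a, $r1b, $r2a, $r2b) = @_;
		my $mesh = $self->{mesh};
		# assumes both layers are the same size
		for my $y (0..($r1b - $r1a)) {
			# the node directly above
			$mesh->[$y+$r1a]->add_output_node($mesh->[$y+$r2a]);
			# its left neighbor, unless we are at the left edge
			$mesh->[$y+$r1a]->add_output_node($mesh->[$y+$r2a-1]) if($y > 0);
			# its right neighbor, unless we are at the right edge
			$mesh->[$y+$r1a]->add_output_node($mesh->[$y+$r2a+1]) if($y < ($r2b-$r2a));
		}
	}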

Mesh.pm

	Rodin Porrata, rodin@ursa.llnl.gov
	Randal L. Schwartz, merlyn@stonehenge.com
	Michiel de Roo, michiel@geo.uu.nl
	
Thanks to Randal and Michiel for spotting some documentation and makefile bugs in the last release.
Thanks to Rodin for continual suggestions and questions about the module and more.

=head1 DOWNLOAD

You can always download the latest copy of AI::NeuralNet::Mesh
from http://www.josiah.countystart.com/modules/get.pl?mesh:pod


=head1 MAILING LIST

A mailing list has been set up for AI::NeuralNet::Mesh and AI::NeuralNet::BackProp. 
The list is for discussion of AI and neural net related topics as they pertain to 
AI::NeuralNet::BackProp and AI::NeuralNet::Mesh. I will also announce in the group
each time a new release of AI::NeuralNet::Mesh is available.

The list address is: ai-neuralnet-backprop@egroups.com

To subscribe, send a blank email to: ai-neuralnet-backprop-subscribe@egroups.com

README

not math theory. Any die-hard neural networking gurus out 
there? Let me know how far off I am with
this code! :-)
	
Regards,

        ~ Josiah Bryan, <jdb@wcoil.com>

Latest Version:

        http://www.josiah.countystart.com/modules/get.pl?mesh:README

examples/ex_add2.pl

	
	
	 my @data = (
	  [ 2633, 2665, 2685 ],  [ 2633 + 2665 + 2685 ],
	  [ 2623, 2645, 2585 ],  [ 2623 + 2645 + 2585 ],
	  [ 2627, 2633, 2579 ],  [ 2627 + 2633 + 2579 ],
	  [ 2611, 2627, 2563 ],  [ 2611 + 2627 + 2563 ],
	  [ 2640, 2637, 2592 ],  [ 2640 + 2637 + 2592 ]
	 );
	
	 print "Learning started, will cycle $top times with inc = $inc\n";
	
	  # Make it learn the whole dataset $top times
	  my @list;
	
	 my $t1=new Benchmark;
	 for my $a (1..$top)
	 {
	  print "Outer Loop: $a : ";
	
	  $forgetfulness = $net->learn_set( \@data,

examples/ex_bmp.pl

			1,0,0,0,0,
			1,1,1,1,0,
			0,0,0,1,0,
			1,1,1,1,2	],		[	5	],
		
	);
    
    
	# If we haven't saved the net already, do the learning
	if(!$net->load('images.mesh')) {
		print "\nLearning started...\n";
		
		# Make it learn the whole dataset $top times
		my @list;
		my $top=3;
		for my $a (1..$top) {
			my $t1=new Benchmark;
			print "\n\nOuter Loop: $a\n";
			
			# Test forgetfulness
			my $f = $net->learn_set(\@data,	inc => 0.1);

examples/ex_bmp2.pl

	# Create our model input
	my @map	=	(1,1,1,1,1,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 0,0,1,0,0,
				 1,0,1,0,0,
				 1,0,1,0,0,
				 1,1,1,0,0);
				 
	
	print "\nLearning started...\n";
	
	print $net->learn(\@map,'J');
	
	print "Learning done.\n";
		
	# Build a test map 
	my @tmp	=	(0,0,1,1,1,
				 1,1,1,0,0,
				 0,0,0,1,0,
				 0,0,0,1,0,

examples/ex_dow.pl

		[	3, 	244, 235,  164, 	19.6, 19.8, 18.1, 	2627, 2633, 2579], 	[	2630  ],
		[	4, 	261, 244,  181, 	19.6, 19.6, 18.1, 	2611, 2627, 2563], 	[	2620  ],
		[	5, 	276, 261,  196, 	19.5, 19.6, 18.0, 	2630, 2611, 2582], 	[	2638  ],
		[	6, 	287, 276,  207, 	19.5, 19.5, 18.0, 	2637, 2630, 2589], 	[	2635  ],
		[	7, 	296, 287,  212, 	19.3, 19.5, 17.8, 	2640, 2637, 2592], 	[	2641  ] 		
	);
    
    
	# If we haven't saved the net already, do the learning
	if(!$net->load('DOW.mesh')) {
		print "\nLearning started...\n";
		
		# Make it learn the whole dataset $top times
		my @list;
		my $top=1;
		for my $a (1..$top) {
			my $t1=new Benchmark;
			print "\n\nOuter Loop: $a\n";
			
			# Test forgetfulness
			my $f = $net->learn_set(\@data,	inc		=>	0.2,	

examples/ex_wine.pl

	}                                  
	
			
	for(0..$#{$sets}) {
		next if(!defined $sets->[$_]->[0]);
		print "Size of set $_: ",$#{$sets->[$_]}/2,"\n";
	}
	
	# If we haven't saved the net already, do the learning
    if(!$net->load('wine.mesh')) {
		print "\nLearning started...\n";
		
		# Make it learn the whole dataset $top times
		my @list;
		my $top=5;
		for my $a (1..$top) {
			print "\n\nOuter Loop: $a\n";
			
			for(0..$#{$sets}) {
				next if(!defined $sets->[$_]->[0]);
				my $t1=new Benchmark;



