AI-NeuralNet-Mesh


Mesh.pm

the $Connection variable and using custom activation functions, as
well as basic package inheritance, you can simulate many different
types of neural network structures with very little new code written
by you.

In this module I have included a more accurate form of "learning" for the
mesh. This form performs descent toward a local error minimum (0) on a
directional delta, rather than toward the desired value for that node. This
allows for better, more accurate results with larger datasets. This module
also uses a simpler recursion technique which, surprisingly, is more accurate
than the original technique I have used in other ANNs.
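The descent described above can be illustrated with a minimal, self-contained sketch. This is a hypothetical delta-rule update for a single linear node, not the module's internal code; the learning rate and initial weight are arbitrary assumptions. Each step moves the weight along the error gradient toward a local error minimum (0), rather than jumping straight to the target value:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch (not the module's internals): delta-rule descent
# on a single linear node. The weight is nudged in the direction of the
# error ("directional delta") until the error approaches 0.
my $weight = 0.1;                  # initial weight, arbitrary
my $alpha  = 0.25;                 # learning rate, assumed
my ($input, $target) = (1.0, 0.8);

for (1 .. 50) {
    my $output = $weight * $input;          # node output
    my $delta  = $target - $output;         # directional delta
    $weight   += $alpha * $delta * $input;  # descend toward error == 0
}

printf "weight after training: %.4f\n", $weight;  # converges near 0.8
```

Because the update is proportional to the remaining error, each pass takes a smaller step, which is what makes this descent better behaved on larger datasets than snapping directly to the desired value.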

=head1 EXPORTS

This module exports three functions by default:

	range
	intr
	pdiff
	
See range(), intr(), and pdiff() for descriptions of their respective functions.
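Two of the exported helpers can be approximated in a few lines. The subs below are rough standalone re-implementations of the documented behavior (intr() rounds to the nearest integer; pdiff() returns the percentage difference between two values), not the module's own source, and the exact rounding and formula may differ; range(), which builds a constrained output set for use as an activation type, is omitted here:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Rough standalone approximations of two exported helpers.
# These are NOT the module's implementations.

# intr(): round a floating-point value to the nearest integer.
sub intr { sprintf("%.0f", $_[0]) + 0 }

# pdiff(): percentage difference between two values (assumed formula:
# absolute difference relative to the larger magnitude, times 100).
sub pdiff {
    my ($a, $b) = @_;
    return 0 if $a == 0 && $b == 0;
    my $max = abs($a) > abs($b) ? abs($a) : abs($b);
    return abs($a - $b) / $max * 100;
}

print intr(3.7), "\n";       # 4
print pdiff(100, 90), "\n";  # 10
```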

examples/ex_aln.pl

						$self->{mesh}->[$n]->{_inputs}->[$z]->{weight} = $l[$z];
					}
					my $z = $self->{layers}->[$x-1];
					$self->{mesh}->[$n]->{activation} = $l[$z];
					$self->{mesh}->[$n]->{threshold}  = $l[$z+1];
					$self->{mesh}->[$n]->{mean}       = $l[$z+2];
					$n++;
				}
			}
			
			$self->extend($self->{_original_specs});
	
			return $self;
	    };
	    
		# If $leaves is a string, then it will be numerically equal to 0, so 
		# try to load it as a network file.
		if($leaves == 0) {  
		    # We use a "1" flag as the second argument to indicate that we 
		    # want load() to call the new constructor to make a network the
		    # same size as in the file and return a reference to the network,

examples/ex_aln.pl

			} else {
				$self->{_inputs}->[0]->{node}->adjust_weight($inc,$target) if($self->{_inputs}->[1]->{node});
			}
		};

	    # Set our custom node connector
		$AI::NeuralNet::Mesh::Connector = 'main::_c_tree'; 
		
		# Create a new network from our specs
		my $net = AI::NeuralNet::Mesh->new($specs);
		$net->{_original_specs} = $specs;
		
		# Return our new network
		return $net;
	}
	
	# Our custom node connector for the tree function, above.
	# This connects every two nodes from the first range
	# to one node of the second range. This is only meant
	# to be used in a factored layer mesh, such as one with a
	# [8,4,2,1] node specification array. You should never
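The two-to-one pattern described in the comment above can be sketched as a plain index mapping, assuming a halving layer spec such as [8,4,2,1]. The helper below is a hypothetical illustration of the pairing, not the module's _c_tree connector itself:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of the two-to-one connection pattern for a factored layer mesh
# such as [8,4,2,1]: node $i of the lower layer feeds node int($i / 2)
# of the layer above it. (Hypothetical illustration, not _c_tree.)
sub pair_connections {
    my ($lower_size) = @_;
    my @edges;
    for my $i (0 .. $lower_size - 1) {
        push @edges, [ $i, int($i / 2) ];
    }
    return @edges;
}

# For an 8-node layer feeding a 4-node layer:
for my $e (pair_connections(8)) {
    printf "lower %d -> upper %d\n", @$e;
}
```

Because each upper node receives exactly two lower nodes, this mapping only makes sense when every layer is half the size of the one before it, which is why the comment restricts it to a factored layer mesh.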

mesh.htm

neural network simulation toolkit. Through a combination of setting
the $Connection variable and using custom activation functions, as
well as basic package inheritance, you can simulate many different
types of neural network structures with very little new code written
by you.</P>
<P>In this module I have included a more accurate form of ``learning'' for the
mesh. This form performs descent toward a local error minimum (0) on a
directional delta, rather than toward the desired value for that node. This
allows for better, more accurate results with larger datasets. This module
also uses a simpler recursion technique which, surprisingly, is more accurate
than the original technique I have used in other ANNs.</P>
<P>
<HR>
<H1><A NAME="exports">EXPORTS</A></H1>
<P>This module exports three functions by default:</P>
<PRE>
        range
        intr
        pdiff
</PRE>
<P>See range(), intr(), and pdiff() for descriptions of their respective functions.</P>


