AI-NeuralNet-Mesh


MANIFEST

Changes
Mesh.pm
Makefile.PL
MANIFEST
README
test.pl

Mesh.pm


    # Create a mesh with 2 layers, 2 nodes/layer, and one output node.
	my $net = new AI::NeuralNet::Mesh(2,2,1);
	
	# Teach the network the AND function
	$net->learn([0,0],[0]);
	$net->learn([0,1],[0]);
	$net->learn([1,0],[0]);
	$net->learn([1,1],[1]);
	
	# Present it with two test cases
	my $result_bit_1 = $net->run([0,1])->[0];
	my $result_bit_2 = $net->run([1,1])->[0];
	
	# Display the results
	print "AND test with inputs (0,1): $result_bit_1\n";
	print "AND test with inputs (1,1): $result_bit_2\n";
	

=head1 VERSION & UPDATES

This is version B<0.44>, an update release to version 0.43.

This release fixes the usage conflict with perl 5.3.3.

With this version I have gone through and tuned up many areas
of this module, including the descent algorithm in learn(),

Mesh.pm


$degrade_increment_flag is a simple flag used to allow or disallow increment degrading
during learning, based on a product of the error difference with several other factors.
$degrade_increment_flag is off by default. Setting $degrade_increment_flag to a true
value turns increment degrading on. 

In previous module releases $degrade_increment_flag was not used, as increment degrading
was always on. For this release I looked at several other network types as well
as several texts and decided it would be better not to use increment degrading by default. The
option is still there for those who feel inclined to use it. I have found some cases
that need the degrade flag in order to learn at a reasonable speed. See test.pl for an example: if
the degrade flag weren't used in test.pl, it would take a very long time to learn.



=item $net->learn_set(\@set, [ options ]);

This takes the same options as learn() (learn_set() uses learn() internally) 
and allows you to specify a set to learn, rather than individual patterns. 
A dataset is an array reference with at least two elements in the array, 
each element being another array reference (or, now, a scalar string). For 
each pattern to learn, you must specify an input array ref, and an output 
array ref as the next element.
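A dataset built this way for the AND function from the SYNOPSIS might look like the following sketch (the learn_set() call itself is shown commented, since it needs a constructed network):

```perl
use strict;
use warnings;

# A learn_set() dataset: a flat list of input array refs, each
# followed immediately by its desired-output array ref. This one
# encodes the AND function from the SYNOPSIS above.
my @and_set = (
	[0,0] => [0],
	[0,1] => [0],
	[1,0] => [0],
	[1,1] => [1],
);

# Sanity check: input/output pairs mean an even element count.
die "dataset must hold input/output pairs\n" if @and_set % 2;

# With a network from the SYNOPSIS this would be taught in one call:
#   $net->learn_set(\@and_set);
printf "%d patterns of %d inputs each\n",
	@and_set / 2, scalar @{ $and_set[0] };
```

Note that the fat comma here is purely cosmetic: `[0,0] => [0]` is the same flat list as `[0,0], [0]`.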

Mesh.pm


	Rodin Porrata, rodin@ursa.llnl.gov
	Randal L. Schwartz, merlyn@stonehedge.com
	Michiel de Roo, michiel@geo.uu.nl
	
Thanks to Randal and Michiel for spotting some documentation and makefile bugs in the last release.
Thanks to Rodin for continual suggestions and questions about the module and more.

=head1 DOWNLOAD

You can always download the latest copy of AI::NeuralNet::Mesh
from http://www.josiah.countystart.com/modules/get.pl?mesh:pod


=head1 MAILING LIST

A mailing list has been set up for AI::NeuralNet::Mesh and AI::NeuralNet::BackProp. 
The list is for discussion of AI and neural net related topics as they pertain to 
AI::NeuralNet::BackProp and AI::NeuralNet::Mesh. I will also announce on the list
each time a new release of AI::NeuralNet::Mesh is available.

README

anything else. Don't expect a classicist view of neural 
networking here. I simply wrote from operating theory, 
not math theory. Any die-hard neural networking gurus out 
there? Let me know how far off I am with
this code! :-)
	
Regards,

        ~ Josiah Bryan, <jdb@wcoil.com>

Latest Version:

        http://www.josiah.countystart.com/modules/get.pl?mesh:README

examples/ex_add2.pl

=begin
    
    File:	examples/ex_add2.pl
	Author: Rodin Porrata, <rodin@ursa.llnl.gov>
	Desc: 

		This script runs a test of the network's ability to add 
		and remember data sets, as well as testing the optimum "inc" to 
		learn and the optimum number of layers for a network.

=cut

	use AI::NeuralNet::Mesh;
	use Benchmark;
	use English;
	
	my $ofile = "addnet_data.txt";
	

examples/ex_add2.pl

	
	 $runtime = timediff($t2,$t1);
	 print "run took ",timestr($runtime),"\n";
	
	
	 my @input = ( [ 2222, 3333, 3200 ],
	      [ 1111, 1222, 3211 ],
	      [ 2345, 2543, 3000 ],
	      [ 2654, 2234, 2534 ] );
	
	    test_net( $net, @input );
	}
	#.....................................................................
	 sub test_net {
	  my @set;
	  my $fb;
	  my $net = shift;
	  my @data = @_;
	  undef @percent_diff;
	
	  for( $i=0; defined( $data[$i] ); $i++ ){
	   @set = @{ $data[$i] };
	   $fb = $net->run(\@set)->[0];
	   # Print output
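The Benchmark usage near the top of this excerpt ($t1/$t2 marks fed to timediff() and timestr()) is the standard core-module pattern; a minimal standalone sketch, with a placeholder workload that is not from the script:

```perl
use strict;
use warnings;
use Benchmark;    # core module; exports timediff() and timestr()

# Capture a mark, do some work, capture a second mark.
my $t1 = new Benchmark;
my $sum = 0;
$sum += $_ for 1 .. 100_000;    # placeholder workload
my $t2 = new Benchmark;

# Report the elapsed time, as ex_add2.pl does after its run.
my $runtime = timediff($t2, $t1);
print "run took ", timestr($runtime), "\n";
print "sum: $sum\n";
```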

examples/ex_alpha.pl

        0,0,1,0,2
        ]
     ];
	
	if(!$net->load("alpha.mesh")) {
		#$net->range(0..29);
		$net->learn_set($letters);
		$net->save("alpha.mesh");
	}
			
	# Build a test map 
	my $tmp	=	[0,1,1,1,0,
				 1,0,0,0,1,
				 1,0,0,0,1,
				 1,1,1,1,1,
				 1,0,0,0,1,
				 1,0,0,0,1,
				 1,0,0,0,1];
	
	# Display test map
	print "\nTest map:\n";
	$net->join_cols($tmp,5);
	
	# Display network results
	print "Letter index matched: ",$net->run($tmp)->[0],"\n";
	

examples/ex_bmp2.pl

				 1,0,1,0,0,
				 1,1,1,0,0);
				 
	
	print "\nLearning started...\n";
	
	print $net->learn(\@map,'J');
	
	print "Learning done.\n";
		
	# Build a test map 
	my @tmp	=	(0,0,1,1,1,
				 1,1,1,0,0,
				 0,0,0,1,0,
				 0,0,0,1,0,
				 0,0,0,1,0,                                          
				 0,0,0,0,0,
				 0,1,1,0,0);
	
	# Display test map
	print "\nTest map:\n";
	$net->join_cols(\@tmp,5,'');
	
	print "Running test...\n";
		                    
	# Run the actual test and get network output
	print "Result: ",$net->run_uc(\@tmp),"\n";
	
	print "Test run complete.\n";
	
	

examples/ex_synop.pl


	# Make a data set from the array refs
	my @phrases = (
		$phrase1, $phrase2,
		$phrase3, $phrase4
	);

	# Learn the data set	
	$net->learn_set(\@phrases);
	
	# Run a test phrase through the network
	my $test_phrase = $net->crunch("I love neural networking!");
	my $result = $net->run($test_phrase);
	
	# Get this, it prints "Jay Leno is  networking!" ...  LOL!
	print $net->uncrunch($result),"\n";

mesh.htm


        # Create a mesh with 2 layers, 2 nodes/layer, and one output node.
        my $net = new AI::NeuralNet::Mesh(2,2,1);

        # Teach the network the AND function
        $net-&gt;learn([0,0],[0]);
        $net-&gt;learn([0,1],[0]);
        $net-&gt;learn([1,0],[0]);
        $net-&gt;learn([1,1],[1]);

        # Present it with two test cases
        my $result_bit_1 = $net-&gt;run([0,1])-&gt;[0];
        my $result_bit_2 = $net-&gt;run([1,1])-&gt;[0];

        # Display the results
        print &quot;AND test with inputs (0,1): $result_bit_1\n&quot;;
        print &quot;AND test with inputs (1,1): $result_bit_2\n&quot;;

</PRE>
<P>
<HR>
<H1><A NAME="version & updates">VERSION &amp; UPDATES</A></H1>
<P>This is version <STRONG>0.43</STRONG>, the second release of this module.</P>
<P>With this version I have gone through and tuned up many areas
of this module, including the descent algorithm in learn(),
as well as four custom activation functions, and several export 
tag sets. With this release, I have also included a few

mesh.htm

then <A HREF="#item_learn"><CODE>learn()</CODE></A> will not return until it gets an exact match for the desired result OR it
reaches $maximum_iterations.</P>
<P>$degrade_increment_flag is a simple flag used to allow or disallow increment degrading
during learning, based on a product of the error difference with several other factors.
$degrade_increment_flag is off by default. Setting $degrade_increment_flag to a true
value turns increment degrading on.</P>
<P>In previous module releases $degrade_increment_flag was not used, as increment degrading
was always on. For this release I looked at several other network types as well
as several texts and decided it would be better not to use increment degrading by default. The
option is still there for those who feel inclined to use it. I have found some cases
that need the degrade flag in order to learn at a reasonable speed. See test.pl for an example: if
the degrade flag weren't used in test.pl, it would take a very long time to learn.</P>
<P></P>
<DT><STRONG><A NAME="item_learn_set">$net-&gt;learn_set(\@set, [ options ]);</A></STRONG><BR>
<DD>
This takes the same options as <A HREF="#item_learn"><CODE>learn()</CODE></A> (learn_set() uses <A HREF="#item_learn"><CODE>learn()</CODE></A> internally) 
and allows you to specify a set to learn, rather than individual patterns. 
A dataset is an array reference with at least two elements in the array, 
each element being another array reference (or, now, a scalar string). For 
each pattern to learn, you must specify an input array ref, and an output 
array ref as the next element. Example:

mesh.htm

        Randal L. Schwartz, merlyn@stonehedge.com
        Michiel de Roo, michiel@geo.uu.nl
</PRE>
<PRE>

Thanks to Randal and Michiel for spotting some documentation and makefile bugs in the last release.
Thanks to Rodin for continual suggestions and questions about the module and more.</PRE>
<P>
<HR>
<H1><A NAME="download">DOWNLOAD</A></H1>
<P>You can always download the latest copy of AI::NeuralNet::Mesh
from <A HREF="http://www.josiah.countystart.com/modules/get.pl?mesh:pod">http://www.josiah.countystart.com/modules/get.pl?mesh:pod</A></P>
<P>
<HR>
<H1><A NAME="mailing list">MAILING LIST</A></H1>
<P>A mailing list has been set up for AI::NeuralNet::Mesh and AI::NeuralNet::BackProp. 
The list is for discussion of AI and neural net related topics as they pertain to 
AI::NeuralNet::BackProp and AI::NeuralNet::Mesh. I will also announce on the list
each time a new release of AI::NeuralNet::Mesh is available.</P>
The list address is: <A HREF="mailto:ai-neuralnet-backprop@egroups.com">ai-neuralnet-backprop@egroups.com</A> <BR>
To subscribe, send a blank email to: <A HREF="mailto:ai-neuralnet-backprop-subscribe@egroups.com">ai-neuralnet-backprop-subscribe@egroups.com</A> 

test.pl

# Before `make install' is performed this script should be runnable with
# `make test'. After `make install' it should work as `perl test.pl'

BEGIN { $| = 1; print "1..13\n"; }
END {print "not ok 1\n" unless $loaded;}

sub t { my $f=shift;$t++;my $str=($f)?"ok $t":"not ok $t";print $str,"\n";}

use AI::NeuralNet::Mesh;
$loaded = 1;
t 1;


