AI-PredictionClient

=head3 In the examples above, the following points are demonstrated:

If you don't have an image handy, --debug_camel will provide a sample image to send to the server.
An image file argument still needs to be provided to keep the command line parser happy.

If you don't have a server to talk to, but want to see if most everything else is working, use
the --debug_loopback_interface option. This will provide a sample response you can test the client with.
The module can use the same loopback interface for debugging your bespoke clients.

The --debug_verbose option will dump the request and response data structures so you can see
what is going on.
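
For example, the sample image, the loopback interface, and the verbose dump can be combined in a
single invocation. This is an illustrative command line built from the options described above:

 Inception.pl --image_file=zzzzz --debug_camel --debug_loopback_interface --debug_verbose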

=head3 The response from a live server to the camel image looks like this:

 Inception.pl --image_file=zzzzz --debug_camel --host=107.170.xx.xxx --port=9000    
 Sending image zzzzz to server at host:107.170.xx.xxx  port:9000
 .===========================================================================.
 | Class                                                     | Score         |
 |-----------------------------------------------------------+---------------|
 | Arabian camel, dromedary, Camelus dromedarius             | 11.968746     |

To start a TensorFlow Serving model server for the Inception model (assuming a Bazel build of
TensorFlow Serving under /serving):

 $ cd /serving
 $ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=inception --model_base_path=inception-export &> inception_log &

A longer article on setting up a server is here:

 https://www.tomstall.com/content/create-a-globally-distributed-tensorflow-serving-cluster-with-nearly-no-pain/

=head1 ADDITIONAL INFO

This client is designed to make it fairly easy for a developer to see how the data is formed and received.
The TensorFlow interface is based on Protocol Buffers and gRPC, and that implementation is built on
a complex architecture of nested .proto files.

In this design I flattened the architecture out, and where Perl's native data handling works best,
the modules use plain old Perl data structures rather than creating another layer of accessors.

The Tensor interface is used repeatedly, so this package includes a simplified Tensor class
to pack and unpack data to and from the models.

For most clients, the Tensor class simply sends and receives rank-one tensors, that is, vectors.
Higher-rank tensors are sent and received flattened; the size property is used when importing or
exporting the tensors into or out of a math package.
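
As a rough illustration of that flattened representation, here is a minimal sketch in plain Perl.
It is not the package's own Tensor class, and the variable names are made up for the example:

 use strict;
 use warnings;

 # A rank-two 2x3 structure as it might live in a math package.
 my @matrix = ([1, 2, 3], [4, 5, 6]);

 # What travels through the client: the values flattened row by row,
 # with the shape carried alongside (the role of the size property).
 my @size = (2, 3);
 my @flat = map { @$_ } @matrix;    # (1, 2, 3, 4, 5, 6)

 # Rebuilding the rank-two structure on the way back out.
 my @rows;
 push @rows, [ splice @flat, 0, $size[1] ] while @flat;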

The design takes advantage of the native JSON serialization capabilities built into the C++ Protocol Buffers library.
Serialization allows a much simpler and more robust interface between the Perl environment and the C++ environment.
One of the biggest advantages is for the developer who would like to quickly extend what this package does:
you can see how the data structures are built and manipulate them directly in Perl.
Of course, if you want to be more forward looking, building proper roles and classes and contributing them would be great.
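
As a sketch of what that looks like from the Perl side, the following uses the core JSON::PP module
on a response shaped like the loopback test data shown further down this page. The hash keys come
from that data; everything else is illustrative:

 use strict;
 use warnings;
 use JSON::PP;

 # A serialized response in the same shape as the loopback test data.
 # The stringVal entries are base64 encoded, as in the Protocol Buffers
 # JSON mapping for bytes.
 my $serialized = '{"Status":"OK","StatusCode":"42","StatusMessage":"",'
   . '"Result":{"outputs":{"classes":{"dtype":"DT_STRING",'
   . '"tensorShape":{"dim":[{"size":"1"},{"size":"2"}]},'
   . '"stringVal":["Y2FtZWw=","bGxhbWE="]}}}}';

 # decode_json hands back plain nested hashes and arrays (no accessor
 # layer), so the structure can be inspected and manipulated directly.
 my $response = decode_json($serialized);
 my $classes  = $response->{Result}{outputs}{classes};

 print "dtype:  $classes->{dtype}\n";
 print "values: @{ $classes->{stringVal} }\n";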

=head1 DEPENDENCIES

This module depends on gRPC. It uses the CPAN module Alien::Google::GRPC, which either uses an
existing gRPC installation on your system or, if none is found, downloads and builds a private copy.
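
Installing through a CPAN client should pull in Alien::Google::GRPC automatically; for example,
with the cpanm client:

 $ cpanm AI::PredictionClient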

The system dependencies needed to build this module are most often already installed.
If not, the following dependencies need to be installed.

The loopback interface is implemented in lib/AI/PredictionClient/Testing/PredictionLoopback.pm:

# Accept a single bare scalar as the server port, in addition to the
# usual named-argument form of the constructor.
around BUILDARGS => sub {
  my ($orig, $class) = (shift, shift);

  if (@_ == 1 && !ref $_[0]) {
    return $class->$orig(server_port => $_[0]);
  } else {
    return $class->$orig(@_);
  }
};

has server_port => (is => 'rw',);

# Return a canned, serialized response so a client can be exercised
# without a live prediction server; the incoming request is echoed
# back under the DebugRequestLoopback key.
sub callPredict {
  my ($self, $request_data) = @_;

  my $test_return01
    = '{"outputs":{"classes":{"dtype":"DT_STRING","tensorShape":{"dim":[{"size":"1"},{"size":"6"}]},"stringVal":["bG9vcGJhY2sgdGVzdCBkYXRhCg==","bWlsaXRhcnkgdW5pZm9ybQ==","Ym93IHRpZSwgYm93LXRpZSwgYm93dGll","bW9ydGFyYm9hcmQ=","c3VpdCwgc3VpdCBvZiBjbG90...

  my $test_return02
    = '{"outputs":{"classes":{"dtype":"DT_STRING","tensorShape":{"dim":[{"size":"1"},{"size":"5"}]},"stringVal":["bG9hZCBpdAo=","Y2hlY2sgaXQK","cXVpY2sgLSByZXdyaXRlIGl0Cg==","dGVjaG5vbG9naWMK","dGVjaG5vbG9naWMK"]},"scores":{"dtype":"DT_FLOAT","tensor...

  # Wrap one of the canned results in the same envelope a real call returns.
  my $return_ser = '{"Status": "OK", ';
  $return_ser .= '"StatusCode": "42", ';
  $return_ser .= '"StatusMessage": "", ';
  $return_ser .= '"DebugRequestLoopback": ' . $request_data . ', ';

  if ($self->server_port eq 'technologic:2004') {
    $return_ser .= '"Result": ' . $test_return02 . '}';
  } else {
    $return_ser .= '"Result": ' . $test_return01 . '}';
  }

  return $return_ser;
}
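
A minimal usage sketch for the loopback class above, run from a separate script. The request
string is just a stand-in; any serialized request data will do:

 use strict;
 use warnings;
 use AI::PredictionClient::Testing::PredictionLoopback;

 # Construct with a bare 'host:port' string, which BUILDARGS maps to server_port.
 my $loopback = AI::PredictionClient::Testing::PredictionLoopback->new('localhost:9000');

 # callPredict echoes the request back inside a canned, serialized response.
 my $reply = $loopback->callPredict('{"inputs":{}}');
 print "$reply\n";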


