AI-PredictionClient


bin/Inception.pl


my $default_host            = '127.0.0.1';
my $default_port            = '9000';
my $default_model           = 'inception';
my $default_model_signature = 'predict_images';

option image_file => (
  is       => 'ro',
  required => 1,
  format   => 's',
  doc      => '* Required: Path to image to be processed'
);
option host => (
  is       => 'ro',
  required => 0,
  format   => 's',
  default  => $default_host,
  doc      => "IP address of the server [Default: $default_host]"
);
option port => (
  is       => 'ro',
  required => 0,
  format   => 's',
  default  => $default_port,
  doc      => "Port number of the server [Default: $default_port]"
);
option model_name => (
  is       => 'ro',
  required => 0,
  format   => 's',
  default  => $default_model,
  doc      => "Model to process image [Default: $default_model]"
);
option model_signature => (
  is       => 'ro',
  required => 0,
  format   => 's',
  default  => $default_model_signature,
  doc      => "API signature for model [Default: $default_model_signature]"
);
option debug_verbose => (is => 'ro', doc => 'Verbose output');
option debug_loopback_interface => (
  is  => 'ro',
  doc => 'Test loopback through dummy server'
);

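The declarations above follow the MooX::Options pattern: each `option` call defines a Moo attribute that is also exposed as a command-line flag, and the generated usage text (shown below) is assembled from the `doc` strings. A minimal self-contained sketch of the same pattern, assuming Moo and MooX::Options are installed (the Demo package and its run method are hypothetical, not part of this distribution):

 #!/usr/bin/env perl
 package Demo;
 use Moo;
 use MooX::Options;

 # 'format => "s"' means the flag takes a string value; 'doc' feeds the usage text.
 option image_file => (
   is       => 'ro',
   required => 1,
   format   => 's',
   doc      => 'Path to image to be processed'
 );

 sub run {
   my ($self) = @_;
   print 'Would process: ', $self->image_file, "\n";
 }

 package main;
 # new_with_options parses @ARGV; a missing required flag prints the usage message.
 Demo->new_with_options->run;
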
lib/AI/PredictionClient.pm


 $ Inception.pl 
 image_file is missing
 USAGE: Inception.pl [-h] [long options ...]

    --debug_camel               Test using camel image
    --debug_loopback_interface  Test loopback through dummy server
    --debug_verbose             Verbose output
    --host=String               IP address of the server [Default:
                                127.0.0.1]
    --image_file=String         * Required: Path to image to be processed
    --model_name=String         Model to process image [Default: inception]
    --model_signature=String    API signature for model [Default:
                                predict_images]
    --port=String               Port number of the server [Default: 9000]
    -h                          show a compact help message

Some typical command-line examples:

 Inception.pl --image_file=anything --debug_camel --host=xx7.x11.xx3.x14 --port=9000
 Inception.pl --image_file=grace_hopper.jpg --host=xx7.x11.xx3.x14 --port=9000
 Inception.pl --image_file=anything --debug_camel --debug_loopback --port 2004 --host technologic
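Inception.pl is a thin wrapper around the client class shipped in this distribution, which talks to a TensorFlow Serving model server (the default port 9000 and the 'predict_images' signature match the TensorFlow Serving Inception example). The sketch below shows how programmatic use might look; the class name AI::PredictionClient::InceptionClient comes from this distribution, but the method names call_inception and inception_results are assumptions inferred from the CLI options and should be verified against the shipped module:

 use strict;
 use warnings;
 use AI::PredictionClient::InceptionClient;

 # Connection settings mirror the CLI defaults above.
 my $client = AI::PredictionClient::InceptionClient->new(
   host => '127.0.0.1',
   port => '9000',
 );

 $client->model_name('inception');
 $client->model_signature('predict_images');

 # Read raw image bytes; the client may expect image contents rather than a path.
 open my $fh, '<:raw', 'grace_hopper.jpg' or die "open: $!";
 my $image = do { local $/; <$fh> };
 close $fh;

 # Assumed request/result methods -- hypothetical names, confirm before relying on them.
 if ($client->call_inception($image)) {
   print $client->inception_results, "\n";
 } else {
   warn "Prediction request failed\n";
 }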
