AI-Ollama-Client


lib/AI/Ollama/GenerateCompletionRequest.pm


=head2 C<< model >>

The model name.

Model names follow a `model:tag` format. Some examples are `orca-mini:3b-q4_1` and `llama2:70b`. The tag is optional and, if not provided, will default to `latest`. The tag is used to identify a specific version.

=cut

has 'model' => (
    is       => 'ro',
    isa      => Str,
    required => 1,
);

=head2 C<< options >>

Additional model parameters listed in the documentation for the Modelfile such as `temperature`.

=cut

has 'options' => (
    is       => 'ro',
    isa      => HashRef,
);

=head2 C<< prompt >>

The prompt to generate a response for.

=cut

has 'prompt' => (
    is       => 'ro',
    isa      => Str,
    required => 1,
);

=head2 C<< raw >>

If `true`, no formatting will be applied to the prompt and no context will be returned.

You may choose to use the `raw` parameter if you are specifying a full templated prompt in your request to the API, and are managing history yourself.

=cut

has 'raw' => (
    is       => 'ro',
);

=head2 C<< stream >>

If `false`, the response will be returned as a single response object; otherwise, the response will be streamed as a series of objects.

=cut

has 'stream' => (
    is       => 'ro',
);

=head2 C<< system >>

The system prompt (overrides what is defined in the Modelfile).

=cut

has 'system' => (
    is       => 'ro',
    isa      => Str,
);

=head2 C<< template >>

The full prompt or prompt template (overrides what is defined in the Modelfile).

=cut

has 'template' => (
    is       => 'ro',
    isa      => Str,
);


1;
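A minimal usage sketch, assuming this class is available from the AI-Ollama-Client distribution; the specific model name, prompt text, and option values below are illustrative, not part of the module. `model` and `prompt` are required; the remaining attributes are optional, and all accessors are read-only:

```perl
use AI::Ollama::GenerateCompletionRequest;

# Construct a request object. model and prompt are required;
# options, raw, stream, system, and template are optional.
my $req = AI::Ollama::GenerateCompletionRequest->new(
    model   => 'llama2:70b',                 # `model:tag` format; tag defaults to `latest`
    prompt  => 'Why is the sky blue?',
    options => { temperature => 0.7 },       # Modelfile parameters such as `temperature`
    stream  => 0,                            # return a single response object
);

# Accessors are read-only ('ro'); attempting to set them throws.
print $req->model, "\n";
```

Dispatching the request against an Ollama server (e.g. via the client class in this distribution) is not shown here.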
