Enum LlmModelSettings.Parameters.OutputTokenLimit (0.89.0)

public enum LlmModelSettings.Parameters.OutputTokenLimit extends Enum<LlmModelSettings.Parameters.OutputTokenLimit> implements ProtocolMessageEnum

The output token limit for a single LLM call. The limits are subject to change; for per-model limits, see https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models.

Protobuf enum google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit

Implements

ProtocolMessageEnum

Static Fields

Name Description
OUTPUT_TOKEN_LIMIT_LONG

Output token limit of up to 2k tokens.

OUTPUT_TOKEN_LIMIT_LONG = 3;

OUTPUT_TOKEN_LIMIT_LONG_VALUE

Output token limit of up to 2k tokens.

OUTPUT_TOKEN_LIMIT_LONG = 3;

OUTPUT_TOKEN_LIMIT_MEDIUM

Output token limit of up to 1k tokens.

OUTPUT_TOKEN_LIMIT_MEDIUM = 2;

OUTPUT_TOKEN_LIMIT_MEDIUM_VALUE

Output token limit of up to 1k tokens.

OUTPUT_TOKEN_LIMIT_MEDIUM = 2;

OUTPUT_TOKEN_LIMIT_SHORT

Output token limit of up to 512 tokens.

OUTPUT_TOKEN_LIMIT_SHORT = 1;

OUTPUT_TOKEN_LIMIT_SHORT_VALUE

Output token limit of up to 512 tokens.

OUTPUT_TOKEN_LIMIT_SHORT = 1;

OUTPUT_TOKEN_LIMIT_UNSPECIFIED

Limit not specified.

OUTPUT_TOKEN_LIMIT_UNSPECIFIED = 0;

OUTPUT_TOKEN_LIMIT_UNSPECIFIED_VALUE

Limit not specified.

OUTPUT_TOKEN_LIMIT_UNSPECIFIED = 0;

UNRECOGNIZED

Returned for numeric wire values not defined in this version of the enum. Calling getNumber() on this constant throws an exception.

Static Methods

Name Description
forNumber(int value)

Returns the enum constant for the given numeric wire value, or null if no constant has that value.

getDescriptor()

Returns the Descriptors.EnumDescriptor for this enum type.

internalGetValueMap()

Returns the internal value map. For use by generated code.

valueOf(Descriptors.EnumValueDescriptor desc)

Returns the enum constant for the given value descriptor.

valueOf(int value)

Deprecated. Use #forNumber(int) instead.

valueOf(String name)

Returns the enum constant with the given name.

values()

Returns an array containing the constants of this enum, in the order they are declared.
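As an illustration of the number-to-constant mapping these methods expose, here is a minimal standalone sketch. The class OutputTokenLimitDemo and its nested enum are hypothetical mirrors written for this example; the real class is generated by protoc from the proto above and additionally implements ProtocolMessageEnum.

```java
// Hypothetical standalone mirror of the generated enum, for illustration only.
// Names and numbers are taken from the Static Fields table above.
public class OutputTokenLimitDemo {
    enum OutputTokenLimit {
        OUTPUT_TOKEN_LIMIT_UNSPECIFIED(0),
        OUTPUT_TOKEN_LIMIT_SHORT(1),
        OUTPUT_TOKEN_LIMIT_MEDIUM(2),
        OUTPUT_TOKEN_LIMIT_LONG(3);

        private final int number;

        OutputTokenLimit(int number) { this.number = number; }

        // Mirrors getNumber(): the numeric wire value of this constant.
        int getNumber() { return number; }

        // Mirrors the generated forNumber(int): null for unknown numbers.
        static OutputTokenLimit forNumber(int value) {
            for (OutputTokenLimit v : values()) {
                if (v.number == value) return v;
            }
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(OutputTokenLimit.forNumber(2));  // OUTPUT_TOKEN_LIMIT_MEDIUM
        System.out.println(OutputTokenLimit.OUTPUT_TOKEN_LIMIT_LONG.getNumber());  // 3
        System.out.println(OutputTokenLimit.forNumber(99));  // null: no such wire value
    }
}
```

With the real generated enum, forNumber(int) is the preferred lookup because, unlike valueOf(int), it is not deprecated and returns null rather than throwing for unknown values.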

Methods

Name Description
getDescriptorForType()

Returns the Descriptors.EnumDescriptor for this enum type.

getNumber()

Returns the numeric wire value of this constant. Throws IllegalArgumentException if called on UNRECOGNIZED.

getValueDescriptor()

Returns the Descriptors.EnumValueDescriptor for this constant.