Class LlmModelSettings.Parameters.Builder (0.94.0)

public static final class LlmModelSettings.Parameters.Builder extends GeneratedMessage.Builder<LlmModelSettings.Parameters.Builder> implements LlmModelSettings.ParametersOrBuilder

Generative model parameters that control the model's behavior.

Protobuf type google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters
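
A minimal usage sketch: obtain a builder from newBuilder(), set the fields documented below, and call build(). The enum constants INPUT_TOKEN_LIMIT_SMALL and OUTPUT_TOKEN_LIMIT_MEDIUM are assumptions about the generated v3beta1 enums, so verify them against your client library version.

import com.google.cloud.dialogflow.cx.v3beta1.LlmModelSettings;

// Configure sampling temperature and token limits, then build the message.
LlmModelSettings.Parameters parameters =
    LlmModelSettings.Parameters.newBuilder()
        .setTemperature(0.2f) // lower temperature, less random token selection
        .setInputTokenLimit(
            LlmModelSettings.Parameters.InputTokenLimit.INPUT_TOKEN_LIMIT_SMALL) // assumed constant
        .setOutputTokenLimit(
            LlmModelSettings.Parameters.OutputTokenLimit.OUTPUT_TOKEN_LIMIT_MEDIUM) // assumed constant
        .build();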

Static Methods

getDescriptor()

public static final Descriptors.Descriptor getDescriptor()
Returns
Descriptor

Methods

build()

public LlmModelSettings.Parameters build()
Returns
LlmModelSettings.Parameters

buildPartial()

public LlmModelSettings.Parameters buildPartial()
Returns
LlmModelSettings.Parameters
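
As with any generated protobuf builder, build() checks that the message is fully initialized (trivially true for this proto3 message) and throws if it is not, while buildPartial() returns the message without that check. A short sketch:

LlmModelSettings.Parameters.Builder builder =
    LlmModelSettings.Parameters.newBuilder().setTemperature(0.7f);

// build() validates initialization; buildPartial() skips the validation step.
LlmModelSettings.Parameters built = builder.build();
LlmModelSettings.Parameters partial = builder.buildPartial();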

clear()

public LlmModelSettings.Parameters.Builder clear()
Returns
LlmModelSettings.Parameters.Builder
Overrides

clearInputTokenLimit()

public LlmModelSettings.Parameters.Builder clearInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.

clearOutputTokenLimit()

public LlmModelSettings.Parameters.Builder clearOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.

clearTemperature()

public LlmModelSettings.Parameters.Builder clearTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.
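
A sketch of the clear methods in combination with the has* accessors (imports as in the first sketch):

LlmModelSettings.Parameters.Builder builder =
    LlmModelSettings.Parameters.newBuilder().setTemperature(0.9f);

builder.clearTemperature();                   // unsets this optional field
boolean stillSet = builder.hasTemperature();  // false after clearing
builder.clear();                              // resets every field at once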

getDefaultInstanceForType()

public LlmModelSettings.Parameters getDefaultInstanceForType()
Returns
LlmModelSettings.Parameters

getDescriptorForType()

public Descriptors.Descriptor getDescriptorForType()
Returns
Descriptor
Overrides

getInputTokenLimit()

public LlmModelSettings.Parameters.InputTokenLimit getInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
LlmModelSettings.Parameters.InputTokenLimit: The inputTokenLimit.

getInputTokenLimitValue()

public int getInputTokenLimitValue()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
int: The enum numeric value on the wire for inputTokenLimit.
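
The *Value accessors read and write the raw wire number rather than the enum constant, which is useful when a message carries a number that this client version has no constant for. A sketch; the numeric value 1 is an assumption about the proto's enum numbering:

LlmModelSettings.Parameters.Builder builder =
    LlmModelSettings.Parameters.newBuilder();

builder.setInputTokenLimitValue(1); // set by wire number, no enum lookup

int wireNumber = builder.getInputTokenLimitValue(); // 1, even if unrecognized
LlmModelSettings.Parameters.InputTokenLimit limit =
    builder.getInputTokenLimit(); // UNRECOGNIZED when the number has no constant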

getOutputTokenLimit()

public LlmModelSettings.Parameters.OutputTokenLimit getOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
LlmModelSettings.Parameters.OutputTokenLimit: The outputTokenLimit.

getOutputTokenLimitValue()

public int getOutputTokenLimitValue()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
int: The enum numeric value on the wire for outputTokenLimit.

getTemperature()

public float getTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Returns
float: The temperature.

hasInputTokenLimit()

public boolean hasInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
boolean: Whether the inputTokenLimit field is set.

hasOutputTokenLimit()

public boolean hasOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
boolean: Whether the outputTokenLimit field is set.

hasTemperature()

public boolean hasTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Returns
boolean: Whether the temperature field is set.
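
Because these fields are declared optional, the has* accessors distinguish "explicitly set to the default" from "never set", which a plain getter cannot do. A sketch:

LlmModelSettings.Parameters.Builder builder =
    LlmModelSettings.Parameters.newBuilder();

boolean before = builder.hasTemperature(); // false: nothing set yet
builder.setTemperature(0.0f);              // explicitly set to the default value
boolean after = builder.hasTemperature();  // true: field presence is tracked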

internalGetFieldAccessorTable()

protected GeneratedMessage.FieldAccessorTable internalGetFieldAccessorTable()
Returns
FieldAccessorTable
Overrides

isInitialized()

public final boolean isInitialized()
Returns
boolean
Overrides

mergeFrom(LlmModelSettings.Parameters other)

public LlmModelSettings.Parameters.Builder mergeFrom(LlmModelSettings.Parameters other)
Parameter
other (LlmModelSettings.Parameters)
Returns
LlmModelSettings.Parameters.Builder

mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)

public LlmModelSettings.Parameters.Builder mergeFrom(CodedInputStream input, ExtensionRegistryLite extensionRegistry)
Parameters
input (CodedInputStream)
extensionRegistry (ExtensionRegistryLite)
Returns
LlmModelSettings.Parameters.Builder
Overrides
Exceptions
IOException
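
This overload reads wire-format bytes into the builder and throws IOException on malformed input. A minimal sketch using an empty extension registry:

import com.google.protobuf.CodedInputStream;
import com.google.protobuf.ExtensionRegistryLite;
import java.io.IOException;

LlmModelSettings.Parameters source =
    LlmModelSettings.Parameters.newBuilder().setTemperature(0.4f).build();
byte[] serialized = source.toByteArray();

LlmModelSettings.Parameters.Builder builder =
    LlmModelSettings.Parameters.newBuilder();
try {
  builder.mergeFrom(
      CodedInputStream.newInstance(serialized),
      ExtensionRegistryLite.getEmptyRegistry());
} catch (IOException e) {
  // Thrown when the bytes are not valid wire format for this message.
}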

mergeFrom(Message other)

public LlmModelSettings.Parameters.Builder mergeFrom(Message other)
Parameter
other (Message)
Returns
LlmModelSettings.Parameters.Builder
Overrides
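
Merging follows standard protobuf semantics: fields set in other overwrite the builder's values, while fields left unset in other keep the builder's existing values. A sketch layering an override on top of defaults:

LlmModelSettings.Parameters defaults =
    LlmModelSettings.Parameters.newBuilder().setTemperature(0.5f).build();
LlmModelSettings.Parameters override =
    LlmModelSettings.Parameters.newBuilder().setTemperature(0.9f).build();

// The later mergeFrom wins for fields that `override` actually sets.
LlmModelSettings.Parameters merged =
    LlmModelSettings.Parameters.newBuilder()
        .mergeFrom(defaults)
        .mergeFrom(override) // merged.getTemperature() == 0.9f
        .build();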

setInputTokenLimit(LlmModelSettings.Parameters.InputTokenLimit value)

public LlmModelSettings.Parameters.Builder setInputTokenLimit(LlmModelSettings.Parameters.InputTokenLimit value)

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Parameter
value (LlmModelSettings.Parameters.InputTokenLimit): The inputTokenLimit to set.
Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.

setInputTokenLimitValue(int value)

public LlmModelSettings.Parameters.Builder setInputTokenLimitValue(int value)

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Parameter
value (int): The enum numeric value on the wire for inputTokenLimit to set.
Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.

setOutputTokenLimit(LlmModelSettings.Parameters.OutputTokenLimit value)

public LlmModelSettings.Parameters.Builder setOutputTokenLimit(LlmModelSettings.Parameters.OutputTokenLimit value)

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Parameter
value (LlmModelSettings.Parameters.OutputTokenLimit): The outputTokenLimit to set.
Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.
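
A sketch of setting the enum-based output limit; OUTPUT_TOKEN_LIMIT_MEDIUM is an assumed constant name, and per the field comment this must not be combined with max_output_tokens:

LlmModelSettings.Parameters params =
    LlmModelSettings.Parameters.newBuilder()
        // Do not also set max_output_tokens when this field is set.
        .setOutputTokenLimit(
            LlmModelSettings.Parameters.OutputTokenLimit.OUTPUT_TOKEN_LIMIT_MEDIUM)
        .build();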

setOutputTokenLimitValue(int value)

public LlmModelSettings.Parameters.Builder setOutputTokenLimitValue(int value)

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Parameter
value (int): The enum numeric value on the wire for outputTokenLimit to set.
Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.

setTemperature(float value)

public LlmModelSettings.Parameters.Builder setTemperature(float value)

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values yield less random output, higher values yield more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Parameter
value (float): The temperature to set.
Returns
LlmModelSettings.Parameters.Builder: This builder for chaining.
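
Since the valid range is [0.0, 1.0], one defensive pattern is to clamp caller input before setting the field. A sketch:

float requested = 1.4f; // out-of-range caller input
float clamped = Math.max(0.0f, Math.min(1.0f, requested));

LlmModelSettings.Parameters params =
    LlmModelSettings.Parameters.newBuilder()
        .setTemperature(clamped) // 1.0f after clamping
        .build();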