Reference documentation and code samples for the Gemini Enterprise for Customer Experience V1 API class Google::Cloud::Ces::V1::ModelSettings.
ModelSettings contains configuration options for the LLM model used by an agent.
Inherits
- Object
Extended By
- Google::Protobuf::MessageExts::ClassMethods
Includes
- Google::Protobuf::MessageExts
Methods
#model
def model() -> ::String
Returns
- (::String) — Optional. The LLM model that the agent should use. If not set, the agent will inherit the model from its parent agent.
#model=
def model=(value) -> ::String
Parameter
- value (::String) — Optional. The LLM model that the agent should use. If not set, the agent will inherit the model from its parent agent.
Returns
- (::String) — Optional. The LLM model that the agent should use. If not set, the agent will inherit the model from its parent agent.
#temperature
def temperature() -> ::Float
Returns
- (::Float) — Optional. If set, this temperature will be used for the LLM model. Temperature controls the randomness of the model's responses. Lower temperatures produce responses that are more predictable. Higher temperatures produce responses that are more creative.
#temperature=
def temperature=(value) -> ::Float
Parameter
- value (::Float) — Optional. If set, this temperature will be used for the LLM model. Temperature controls the randomness of the model's responses. Lower temperatures produce responses that are more predictable. Higher temperatures produce responses that are more creative.
Returns
- (::Float) — Optional. If set, this temperature will be used for the LLM model. Temperature controls the randomness of the model's responses. Lower temperatures produce responses that are more predictable. Higher temperatures produce responses that are more creative.
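The two fields above can be sketched as follows. This is a minimal illustrative stand-in written in plain Ruby, since the real Google::Cloud::Ces::V1::ModelSettings class is generated from its protobuf definition and requires the google-cloud-ces gem; the `resolved_model` helper and the model names are assumptions for illustration, not part of the API.

```ruby
# Illustrative stand-in for ModelSettings (the real class is
# protobuf-generated). Shows the documented semantics: #model falls back
# to the parent agent's model when unset, and #temperature tunes how
# predictable or creative responses are.
ModelSettings = Struct.new(:model, :temperature) do
  # Protobuf string fields default to ""; treat empty as "not set",
  # in which case the agent inherits the model from its parent agent.
  # This helper is hypothetical, added only to demonstrate the fallback.
  def resolved_model(parent_model)
    model.to_s.empty? ? parent_model : model
  end
end

settings = ModelSettings.new("", nil)
settings.resolved_model("parent-model")  # => "parent-model" (inherited)

settings.model = "child-model"
settings.temperature = 0.2               # lower => more predictable output
settings.resolved_model("parent-model")  # => "child-model"
```

Because both fields are optional, leaving `model` empty and `temperature` nil simply defers to the parent agent's settings and the model's defaults.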