Vertex AI V1 API - Class Google::Cloud::AIPlatform::V1::EventMetadata (v1.37.0)

Reference documentation and code samples for the Vertex AI V1 API class Google::Cloud::AIPlatform::V1::EventMetadata.

Metadata relating to an LLM response event.

Inherits

  • Object

Extended By

  • Google::Protobuf::MessageExts::ClassMethods

Includes

  • Google::Protobuf::MessageExts

Methods

#branch

def branch() -> ::String
Returns
  • (::String) — Optional. The branch of the event, in the form agent_1.agent_2.agent_3, where agent_1 is the parent of agent_2 and agent_2 is the parent of agent_3. Branch is used when multiple child agents shouldn't see their siblings' conversation history.

#branch=

def branch=(value) -> ::String
Parameter
  • value (::String) — Optional. The branch of the event, in the form agent_1.agent_2.agent_3, where agent_1 is the parent of agent_2 and agent_2 is the parent of agent_3. Branch is used when multiple child agents shouldn't see their siblings' conversation history.
Returns
  • (::String) — Optional. The branch of the event, in the form agent_1.agent_2.agent_3, where agent_1 is the parent of agent_2 and agent_2 is the parent of agent_3. Branch is used when multiple child agents shouldn't see their siblings' conversation history.
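
To illustrate the branch format, here is a plain-Ruby sketch (no client library required) that interprets a dotted branch path as an ancestry chain. The helper names and the exact visibility rule shown here are illustrative assumptions, not part of the API:

```ruby
# Hypothetical helpers illustrating the agent_1.agent_2.agent_3 branch format.
# Each dot-separated segment is an agent; earlier segments are ancestors.
def ancestors(branch)
  segments = branch.split(".")
  # An agent's ancestry is every prefix of the branch path.
  (1..segments.length).map { |n| segments.first(n).join(".") }
end

# One plausible reading of the visibility rule: an event is visible to an
# agent only if the event's branch is an ancestor of (or equal to) the
# agent's branch -- so siblings' history stays hidden.
def visible_to_agent?(event_branch, agent_branch)
  ancestors(agent_branch).include?(event_branch)
end

puts ancestors("agent_1.agent_2.agent_3").inspect
# ["agent_1", "agent_1.agent_2", "agent_1.agent_2.agent_3"]
puts visible_to_agent?("agent_1", "agent_1.agent_2")         # true  (ancestor)
puts visible_to_agent?("agent_1.agent_3", "agent_1.agent_2") # false (sibling)
```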

#custom_metadata

def custom_metadata() -> ::Google::Protobuf::Struct
Returns
  • (::Google::Protobuf::Struct)

#custom_metadata=

def custom_metadata=(value) -> ::Google::Protobuf::Struct
Parameter
  • value (::Google::Protobuf::Struct)
Returns
  • (::Google::Protobuf::Struct)

#grounding_metadata

def grounding_metadata() -> ::Google::Cloud::AIPlatform::V1::GroundingMetadata
Returns
  • (::Google::Cloud::AIPlatform::V1::GroundingMetadata)

#grounding_metadata=

def grounding_metadata=(value) -> ::Google::Cloud::AIPlatform::V1::GroundingMetadata
Parameter
  • value (::Google::Cloud::AIPlatform::V1::GroundingMetadata)
Returns
  • (::Google::Cloud::AIPlatform::V1::GroundingMetadata)

#input_transcription

def input_transcription() -> ::Google::Cloud::AIPlatform::V1::Transcription
Returns
  • (::Google::Cloud::AIPlatform::V1::Transcription)

#input_transcription=

def input_transcription=(value) -> ::Google::Cloud::AIPlatform::V1::Transcription
Parameter
  • value (::Google::Cloud::AIPlatform::V1::Transcription)
Returns
  • (::Google::Cloud::AIPlatform::V1::Transcription)

#interrupted

def interrupted() -> ::Boolean
Returns
  • (::Boolean) — Optional. Flag indicating that the LLM was interrupted while generating the content, usually due to user interruption during bidirectional streaming.

#interrupted=

def interrupted=(value) -> ::Boolean
Parameter
  • value (::Boolean) — Optional. Flag indicating that the LLM was interrupted while generating the content, usually due to user interruption during bidirectional streaming.
Returns
  • (::Boolean) — Optional. Flag indicating that the LLM was interrupted while generating the content, usually due to user interruption during bidirectional streaming.

#long_running_tool_ids

def long_running_tool_ids() -> ::Array<::String>
Returns
  • (::Array<::String>) — Optional. Set of IDs of the long-running function calls. The agent client uses this field to determine which function calls are long-running. Only valid for function call events.

#long_running_tool_ids=

def long_running_tool_ids=(value) -> ::Array<::String>
Parameter
  • value (::Array<::String>) — Optional. Set of IDs of the long-running function calls. The agent client uses this field to determine which function calls are long-running. Only valid for function call events.
Returns
  • (::Array<::String>) — Optional. Set of IDs of the long-running function calls. The agent client uses this field to determine which function calls are long-running. Only valid for function call events.
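
A short plain-Ruby sketch of how a client might use this field: given the IDs of the function calls in an event, long_running_tool_ids lets the client separate calls it can poll later from calls it should await inline. The event shape and variable names here are illustrative assumptions:

```ruby
# Hypothetical data: IDs from an event's function calls, plus the
# long_running_tool_ids list from its EventMetadata.
long_running_tool_ids = ["tool-call-7", "tool-call-9"]
function_call_ids     = ["tool-call-7", "tool-call-8", "tool-call-9"]

# Partition the calls: long-running ones can be tracked asynchronously,
# the rest awaited inline.
long_running, inline = function_call_ids.partition do |id|
  long_running_tool_ids.include?(id)
end

puts long_running.inspect # ["tool-call-7", "tool-call-9"]
puts inline.inspect       # ["tool-call-8"]
```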

#output_transcription

def output_transcription() -> ::Google::Cloud::AIPlatform::V1::Transcription
Returns
  • (::Google::Cloud::AIPlatform::V1::Transcription)

#output_transcription=

def output_transcription=(value) -> ::Google::Cloud::AIPlatform::V1::Transcription
Parameter
  • value (::Google::Cloud::AIPlatform::V1::Transcription)
Returns
  • (::Google::Cloud::AIPlatform::V1::Transcription)

#partial

def partial() -> ::Boolean
Returns
  • (::Boolean) — Optional. Indicates whether the text content is part of an unfinished text stream. Only used in streaming mode when the content is plain text.

#partial=

def partial=(value) -> ::Boolean
Parameter
  • value (::Boolean) — Optional. Indicates whether the text content is part of an unfinished text stream. Only used in streaming mode when the content is plain text.
Returns
  • (::Boolean) — Optional. Indicates whether the text content is part of an unfinished text stream. Only used in streaming mode when the content is plain text.

#turn_complete

def turn_complete() -> ::Boolean
Returns
  • (::Boolean) — Optional. Indicates whether the response from the model is complete. Only used in streaming mode.

#turn_complete=

def turn_complete=(value) -> ::Boolean
Parameter
  • value (::Boolean) — Optional. Indicates whether the response from the model is complete. Only used in streaming mode.
Returns
  • (::Boolean) — Optional. Indicates whether the response from the model is complete. Only used in streaming mode.