You can use Agent Studio to design, test, and manage prompts for Google's Gemini large language models (LLMs) and for partner and open models. Agent Studio supports certain partner and open models offered on Agent Platform as models as a service (MaaS), such as Anthropic's Claude models and Meta's Llama models.
In this quickstart, you:
- Send prompts to the Gemini API using samples from the Prompt Gallery, including the following:
- A summarization text prompt
- A code generation prompt
- View the code used to generate the responses
Sample prompts in Agent Studio
A prompt is a natural language request submitted to a language model that generates a response. Prompts can contain questions, instructions, contextual information, few-shot examples, and partial input for the model to complete. After the model receives a prompt, depending on the type of model used, it can generate text, embeddings, code, images, videos, music, and more.
The sample prompts in the Agent Studio Prompt Gallery are predesigned to demonstrate model capabilities. Each prompt is preconfigured with specific model and parameter values, so you can open a sample prompt and click Submit to generate a response.
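As a sketch of what such a prompt can contain, the following builds a single prompt string that combines an instruction, one few-shot example, and partial input for the model to complete. The wording and formatting are illustrative assumptions, not the exact format used by the gallery samples:

```python
# Build a prompt string that combines an instruction, few-shot examples,
# and partial input for the model to complete. The instruction and
# example text below are illustrative.
def build_prompt(instruction: str, examples: list[tuple[str, str]], query: str) -> str:
    parts = [instruction]
    for source, summary in examples:
        parts.append(f"Text: {source}\nSummary: {summary}")
    # End with partial input so the model completes the final summary.
    parts.append(f"Text: {query}\nSummary:")
    return "\n\n".join(parts)

prompt = build_prompt(
    "Summarize each text in one sentence.",
    [("The meeting covered Q3 revenue, hiring plans, and the office move.",
      "The meeting reviewed Q3 revenue, hiring, and the office relocation.")],
    "The team shipped the new release and fixed two critical bugs.",
)
print(prompt)
```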
Test a Gemini model using a summarization text prompt
Send a summarization text prompt to the Gemini API in Colab Enterprise. A summarization task extracts the most important information from text. You can provide information in the prompt to help the model create a summary, or ask the model to create a summary on its own.
Go to the Prompt Gallery page from the Agent Platform section in the Google Cloud console.
Go to Prompt Gallery
In the Tasks drop-down menu, select Summarize.
Open the Audio Summarization card.
This sample prompt includes an audio file and requests a summary of the file contents in a bulleted list.
Notice that in the settings panel, the model's default value is set to gemini-3.1-pro-preview. You can choose a different Gemini model by clicking Switch model.
Click Submit to generate the summary.
The output is displayed in the response.
To view the Gemini Enterprise Agent Platform code used to generate the transcript summary, click Code.
In the Get code panel, you can choose your preferred language to get the sample code for the prompt, or you can open the Python code in a Colab Enterprise notebook.
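To make the shape of such a request concrete, here is a minimal sketch of a `generateContent` request for a summarization prompt, following the public Gemini REST API's request-body shape. The endpoint, model name, and parameter values shown are assumptions for illustration; the actual sample code from the Get code panel may use a different endpoint and SDK:

```python
import json

# Sketch of a generateContent request for a summarization prompt.
# The endpoint shown is the public Gemini API endpoint; the model name
# and parameter values are illustrative defaults, not the sample's own.
def summarization_request(prompt: str,
                          model: str = "gemini-3.1-pro-preview",
                          temperature: float = 0.2,
                          max_output_tokens: int = 1024) -> tuple[str, dict]:
    url = (f"https://generativelanguage.googleapis.com/v1beta/"
           f"models/{model}:generateContent")
    body = {
        "contents": [{"role": "user", "parts": [{"text": prompt}]}],
        "generationConfig": {
            "temperature": temperature,
            "maxOutputTokens": max_output_tokens,
        },
    }
    return url, body

url, body = summarization_request(
    "Summarize the attached audio file as a bulleted list."
)
print(url)
print(json.dumps(body, indent=2))
```

The audio file itself would be attached as an additional part of the request; the sketch above shows only the text portion.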
Test a Gemini model using a code generation prompt
Send a code generation prompt to the Gemini API in Colab Enterprise. A code generation task generates code using a natural language description.
Go to the Prompt Gallery page from the Agent Platform section in the Google Cloud console.
Go to Prompt Gallery
In the Tasks drop-down menu, select Code.
Open the Generate code from comments card.
This sample prompt includes a system instruction that tells the model how to respond and some incomplete Java methods.
Notice that in the settings panel, the model's default value is set to gemini-3.1-pro-preview. You can choose a different Gemini model by clicking Switch model.
Click Submit to complete each method by generating code in the areas marked <WRITE CODE HERE>.
The output is displayed in the response.
To view the Gemini Enterprise Agent Platform code used to generate the response, click Code.
In the Get code panel, you can choose your preferred language to get the sample code for the prompt, or you can open the Python code in a Colab Enterprise notebook.
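The code generation sample pairs a system instruction with a user prompt containing the incomplete methods. A minimal sketch of that request body, following the public Gemini REST API's `systemInstruction` field, might look as follows; the instruction wording and the Java snippet are illustrative assumptions, not the sample's actual content:

```python
# Sketch of a generateContent request body for a code generation prompt:
# a system instruction that tells the model how to respond, plus a user
# prompt holding incomplete Java methods. The texts below are illustrative.
INCOMPLETE_JAVA = """\
public int add(int a, int b) {
    // <WRITE CODE HERE>
}
"""

def code_generation_request(system_instruction: str, code: str) -> dict:
    return {
        "systemInstruction": {"parts": [{"text": system_instruction}]},
        "contents": [{"role": "user", "parts": [{"text": code}]}],
    }

body = code_generation_request(
    "You are a Java assistant. Complete each method where marked.",
    INCOMPLETE_JAVA,
)
print(body["systemInstruction"]["parts"][0]["text"])
```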
What's next
Deploy your prompt as a web application
Follow the guide to deploy your prompt as a web application from Agent Studio.