A Model Context Protocol (MCP) server acts as a proxy between an AI application and an external service, providing context, data, or capabilities to a Large Language Model (LLM). MCP servers connect AI applications to external systems such as databases and web services, translating their responses into a format that the AI application can understand.
Server Setup
You must enable MCP servers and set up authentication before use. For more information about using Google and Google Cloud remote MCP servers, see Google Cloud MCP servers overview.
The composer.googleapis.com MCP server lets you interact with Cloud Composer resources.
Server Endpoints
An MCP service endpoint is the network address and communication interface (usually a URL) of the MCP server that an AI application (the Host for the MCP client) uses to establish a secure, standardized connection. It is the point of contact for the LLM to request context, call a tool, or access a resource. Google MCP endpoints can be global or regional.
The composer.googleapis.com MCP server uses only regional endpoints:
- https://composer.{region}.rep.googleapis.com/mcp
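As a sketch, the endpoint URL for a particular region is obtained by substituting the region into the template above (us-central1 is used here only as an illustrative region; pick the region where your environment runs):

```shell
# Substitute a concrete region into the regional endpoint template.
# "us-central1" is an illustrative choice, not a recommendation.
REGION="us-central1"
MCP_ENDPOINT="https://composer.${REGION}.rep.googleapis.com/mcp"
echo "${MCP_ENDPOINT}"
```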
MCP Tools
An MCP tool is a function or executable capability that an MCP server exposes to an LLM or AI application to perform an action in the real world.
The composer.googleapis.com MCP server has the following tools:
| Tool | Description |
|---|---|
| create_environment | Initiates the creation of a new Cloud Composer environment with default settings. This tool creates a small private IP environment with a default configuration, including default networking settings. Creating public IP environments, or customizing most settings (such as network, subnetwork, network attachment, environment size, CMEK, or a custom environment bucket), is not supported. If the user requests settings that aren't supported by the tool parameters, inform them that this tool can't create an environment in the requested configuration, and that they should use the gcloud CLI or the Google Cloud console instead. Some environment configuration parameters are mandatory; always ask the user for them before calling the tool. This tool creates an environment without pre-installed PyPI packages; use the `manage_pypi_packages` tool to install packages after creation. This tool triggers the environment creation process, which is a long-running operation. The tool returns an operation object; use the `get_operation` tool to check its status. |
| get_environment | Retrieves details about a specific Cloud Composer environment, including its configuration and status. Use this tool to check the current state of an environment, its Airflow version, Airflow UI URL, the environment's bucket, or other configuration details. |
| list_environments | Lists all Cloud Composer environments in a given project and location. Use this tool when you need to know which environments exist in a specific project and location. Results are paginated; use the page token from a previous response to retrieve subsequent pages. |
| delete_environment | Initiates the deletion of a Cloud Composer environment. Warning: this is a destructive action that permanently deletes an environment and can't be undone. Always ask the user to confirm the deletion before proceeding. This tool triggers the environment deletion process, which is a long-running operation that typically takes 10-20 minutes. The tool returns an operation object; use the `get_operation` tool to check its status. |
| manage_pypi_packages | Manages PyPI packages in a Cloud Composer environment. Use this tool to install and uninstall PyPI packages in the environment, or to change PyPI package versions or extras. This tool triggers an environment update, which is a long-running operation that can take several minutes. The tool returns an operation object; use the `get_operation` tool to check its status. |
| get_operation | Gets the latest status of a long-running operation. Several tools, such as `create_environment`, `delete_environment`, and `manage_pypi_packages`, start long-running operations and return an operation object. Provide the operation's name to this tool to check its latest status. A completed operation may include details specific to the operation, as described for the tool that initiated it. |
| get_dag_source_code | Retrieves the DAG source code from a Cloud Composer environment. Use this tool when you want to understand DAG implementation details, for example to understand why a DAG run has failed or to debug a DAG's behavior. The tool returns the latest version of the source code. |
| get_dag_run | Retrieves details of a specific DAG run from a Cloud Composer environment. Use this tool when you need details of a specific DAG run, for example its state, start time, or end time. |
| get_task_instance | Retrieves details of a specific task instance from a Cloud Composer environment, such as its state, start time, end time, or parameters, to understand its status or to debug a failure. This tool does not return task instance logs. |
| find_last_failed_dag_runs | Finds failed DAG runs in a Cloud Composer environment. Use this tool to find recent DAG run failures, for example to begin troubleshooting. If the user wants to troubleshoot a specific DAG, provide the DAG ID to get the details of its last failed run. If the user doesn't know which DAG is failing, or wants to see recent failures across all DAGs, leave the DAG ID empty; in this case, the tool returns the last failed DAG run for every DAG in the environment that has at least one failure. If the response contains more than one DAG run and the user asked for troubleshooting or an explanation of the failure, ask the user which DAG run to troubleshoot. |
| list_failed_task_instances | Lists failed task instances of a specific DAG run in a Cloud Composer environment. If a DAG run has failed, use this tool to get the list of task instances of this DAG run that are in the FAILED state and may have caused the DAG run failure. This tool does not return task instance logs. |
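Individual tools are invoked with the JSON-RPC `tools/call` method defined by the MCP specification. The sketch below only builds and validates such a request body for the get_environment tool; the argument names (`project`, `location`, `environment`) are illustrative guesses, so confirm the actual parameter schema from the `tools/list` output before sending the request to the server.

```shell
# Build a JSON-RPC 2.0 `tools/call` request body for the get_environment tool.
# The argument keys below are illustrative, not the confirmed schema;
# check the tool's inputSchema in the tools/list response first.
REQUEST_BODY='{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_environment",
    "arguments": {
      "project": "my-project",
      "location": "us-central1",
      "environment": "my-environment"
    }
  }
}'

# Confirm the body is well-formed JSON before sending it with curl.
printf '%s' "${REQUEST_BODY}" | python3 -m json.tool > /dev/null && echo "valid JSON"
```

The body would then be sent with the same curl headers shown in the next section, plus your authentication credentials.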
Get MCP tool specifications
To get the MCP tool specifications for all tools in an MCP server, use the tools/list method. The following example demonstrates how to use curl to list all tools and their specifications currently available within the MCP server.
```shell
curl --location 'https://composer.{region}.rep.googleapis.com/mcp' \
  --header 'content-type: application/json' \
  --header 'accept: application/json, text/event-stream' \
  --data '{
    "method": "tools/list",
    "jsonrpc": "2.0",
    "id": 1
  }'
```
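A successful `tools/list` reply is a JSON-RPC result whose `result.tools` array carries one entry per tool. As a sketch of post-processing, the snippet below extracts the tool names from a saved response; the `response.json` content here is a hand-written two-tool sample standing in for real server output, not an actual reply from the server.

```shell
# Write a hand-crafted stand-in for a tools/list response.
# A real reply would list all tools with their full input schemas.
cat > response.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {"name": "get_environment", "description": "..."},
      {"name": "list_environments", "description": "..."}
    ]
  }
}
EOF

# Print one tool name per line.
# Prints: get_environment, then list_environments.
python3 -c '
import json
with open("response.json") as f:
    reply = json.load(f)
for tool in reply["result"]["tools"]:
    print(tool["name"])
'
```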