Tool: execute_sql
Execute any valid SQL statement, including data definition language (DDL), data control language (DCL), data query language (DQL), or data manipulation language (DML) statements, on a Cloud SQL instance.
To support the execute_sql tool, a Cloud SQL instance must meet the following requirements:
- The value of data_api_access must be set to ALLOW_DATA_API.
- For a MySQL instance, the database flag cloudsql_iam_authentication must be set to on. For a PostgreSQL instance, the database flag cloudsql.iam_authentication must be set to on (a gcloud sketch for checking and setting this flag follows this list).
- An IAM user account or IAM service account (CLOUD_IAM_USER or CLOUD_IAM_SERVICE_ACCOUNT) is required to call the execute_sql tool. The tool executes the SQL statements using the privileges of the database user logged in with IAM database authentication.
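The following is a minimal gcloud sketch for checking and enabling the IAM authentication flag; the instance name my-instance is a placeholder, and settings.dataApiAccess is an assumed field name for where the data_api_access setting appears in the describe output.

```bash
# Inspect the instance's current database flags (and, if present,
# its data_api_access setting) before changing anything.
gcloud sql instances describe my-instance \
  --format="yaml(settings.databaseFlags, settings.dataApiAccess)"

# Enable IAM database authentication on a PostgreSQL instance.
# For MySQL, use cloudsql_iam_authentication=on instead.
# Note: --database-flags replaces all existing flags, so include
# any flags you want to keep.
gcloud sql instances patch my-instance \
  --database-flags=cloudsql.iam_authentication=on
```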
After you use the create_instance tool to create an instance, you can use the create_user tool to create an IAM user account for the user currently logged in to the project.
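As a sketch of an alternative path, you can also create the IAM database user with gcloud instead of the create_user tool; the user email and instance name below are placeholders.

```bash
# Create an IAM user account on the instance for the principal
# that will call execute_sql (replace the placeholders).
gcloud sql users create someone@example.com \
  --instance=my-instance \
  --type=cloud_iam_user
```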
The execute_sql tool has the following limitations:
- If a SQL statement returns a response larger than 10 MB, then the response is truncated.
- The execute_sql tool has a default timeout of 30 seconds. If a query runs longer than 30 seconds, then the tool returns a DEADLINE_EXCEEDED error.
- The execute_sql tool isn't supported for SQL Server.
If you receive errors similar to "IAM authentication is not enabled for the instance", then you can use the get_instance tool to check the value of the IAM database authentication flag for the instance.
If you receive errors like "The instance doesn't allow using executeSql to access this instance", then you can use the get_instance tool to check the data_api_access setting.
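For example, a get_instance call over the MCP endpoint might look like the following sketch; the project and instance argument names are assumptions modeled on the execute_sql input schema, and the values are placeholders.

```bash
# Ask the MCP server to describe the instance so you can check its
# IAM authentication flag and data_api_access setting.
curl --location 'https://sqladmin.googleapis.com/mcp' \
  --header 'content-type: application/json' \
  --header 'accept: application/json, text/event-stream' \
  --data '{
    "method": "tools/call",
    "params": {
      "name": "get_instance",
      "arguments": {
        "project": "my-project",
        "instance": "my-instance"
      }
    },
    "jsonrpc": "2.0",
    "id": 1
  }'
```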
When you receive authentication errors:
- Check if the currently logged-in user account exists as an IAM user on the instance by using the list_users tool.
- If the IAM user account doesn't exist, then use the create_user tool to create the IAM user account for the logged-in user.
- If the currently logged-in user doesn't have the proper database user roles, then you can use the update_user tool to grant database roles to the user. For example, the cloudsqlsuperuser role can provide an IAM user with many required permissions.
- Check if the currently logged-in user has the correct IAM permissions assigned for the project. You can use the gcloud projects get-iam-policy [PROJECT_ID] command to check whether the user has the proper IAM roles or permissions assigned for the project (see the sketch after this list).
  - The user must have the cloudsql.instances.login permission to use automatic IAM database authentication.
  - The user must have the cloudsql.instances.executeSql permission to execute SQL statements using the execute_sql tool or the executeSql API.
  - Common IAM roles that contain the required permissions are Cloud SQL Instance User (roles/cloudsql.instanceUser) and Cloud SQL Admin (roles/cloudsql.admin).
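The following gcloud sketch checks and, if needed, grants the project-level role; the project ID and user email are placeholders.

```bash
# List the IAM roles bound to one user on the project.
gcloud projects get-iam-policy my-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:user:someone@example.com" \
  --format="table(bindings.role)"

# Grant the Cloud SQL Instance User role, which the list above
# names as containing the required permissions.
gcloud projects add-iam-policy-binding my-project \
  --member="user:someone@example.com" \
  --role="roles/cloudsql.instanceUser"
```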
When you receive an ExecuteSqlResponse, always check the messages and status fields within the response body. A successful HTTP status code doesn't guarantee that all SQL statements succeeded. The messages and status fields indicate whether there were any partial errors or warnings during SQL statement execution.
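As a sketch, assuming the ExecuteSqlResponse body has been saved to a file named response.json (a hypothetical name), jq can surface these fields:

```bash
# Print the overall database error, if any, plus all notice and
# warning messages returned with the response.
jq '{status: .status, messages: [.messages[]?.message]}' response.json

# Show the per-result error details for any truncated results.
jq '.results[]? | select(.partialResult == true) | .status' response.json
```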
The following sample demonstrates how to use curl to invoke the execute_sql MCP tool.
| Curl Request |
|---|
curl --location 'https://sqladmin.googleapis.com/mcp' \ --header 'content-type: application/json' \ --header 'accept: application/json, text/event-stream' \ --data '{ "method": "tools/call", "params": { "name": "execute_sql", "arguments": { // provide these details according to the tool's MCP specification } }, "jsonrpc": "2.0", "id": 1 }' |
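For illustration, here is the same call with the arguments filled in according to the Input Schema below; the project, instance, database, and statement values are placeholders, and depending on your setup you may also need an Authorization header with an OAuth access token.

```bash
curl --location 'https://sqladmin.googleapis.com/mcp' \
  --header 'content-type: application/json' \
  --header 'accept: application/json, text/event-stream' \
  --data '{
    "method": "tools/call",
    "params": {
      "name": "execute_sql",
      "arguments": {
        "project": "my-project",
        "instance": "my-instance",
        "database": "mydb",
        "sqlStatement": "SELECT 1;"
      }
    },
    "jsonrpc": "2.0",
    "id": 1
  }'
```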
Input Schema
Instance execute sql request for MCP.
SqlInstancesExecuteSqlMcpRequest
| JSON representation |
|---|
{ "instance": string, "project": string, "sqlStatement": string, "database": string } |
| Fields | |
|---|---|
instance |
Required. Database instance ID. This does not include the project ID. |
project |
Required. Project ID of the project that contains the instance. |
sqlStatement |
Required. SQL statements to run on the database. It can be a single statement or a sequence of statements separated by semicolons. |
database |
Optional. Name of the database on which the statement will be executed. For PostgreSQL it's required; for MySQL it's optional. For PostgreSQL, if your query is not scoped to an existing database (for example, listing databases, creating a database, or granting roles), you can pass the default value postgres. |
Output Schema
Execute SQL statements response.
SqlInstancesExecuteSqlResponse
| JSON representation |
|---|
{ "messages": [ { object ( |
| Fields | |
|---|---|
messages[] |
A list of notices and warnings generated during query execution. For PostgreSQL, this includes all notices and warnings. For MySQL, this includes warnings generated by the last executed statement. To retrieve all warnings for a multi-statement query, execute the statements individually. |
metadata |
The additional metadata information regarding the execution of the SQL statements. |
results[] |
The list of results after executing all the SQL statements. |
status |
Contains the error from the database if the SQL execution failed. |
Message
| JSON representation |
|---|
{ "message": string, "severity": string } |
| Fields | |
|---|---|
message |
The full message string. For PostgreSQL, this is a formatted string that may include severity, code, and the notice/warning message. For MySQL, this contains the warning message. |
severity |
The severity of the message (e.g., "NOTICE" for PostgreSQL, "WARNING" for MySQL). |
Metadata
| JSON representation |
|---|
{ "sqlStatementExecutionTime": string } |
| Fields | |
|---|---|
sqlStatementExecutionTime |
The time taken to execute the SQL statements. A duration in seconds with up to nine fractional digits, ending with 's'. Example: "3.5s". |
Duration
| JSON representation |
|---|
{ "seconds": string, "nanos": integer } |
| Fields | |
|---|---|
seconds |
Signed seconds of the span of time. Must be from -315,576,000,000 to +315,576,000,000 inclusive. Note: these bounds are computed from: 60 sec/min * 60 min/hr * 24 hr/day * 365.25 days/year * 10000 years |
nanos |
Signed fractions of a second at nanosecond resolution of the span of time. Durations less than one second are represented with a 0 seconds field and a positive or negative nanos field. For durations of one second or more, a non-zero value for the nanos field must be of the same sign as the seconds field. Must be from -999,999,999 to +999,999,999 inclusive. |
QueryResult
| JSON representation |
|---|
{ "columns": [ { object ( |
| Fields | |
|---|---|
columns[] |
List of columns included in the result. This also includes the data type of the column. |
rows[] |
Rows returned by the SQL statement. |
message |
Message related to the SQL execution result. |
partialResult |
Set to true if the SQL execution's result is truncated due to size limits or an error retrieving results. |
status |
If results were truncated due to an error, details of that error. |
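As a sketch, assuming an ExecuteSqlResponse saved to response.json (a hypothetical file name), jq can flatten the first result's columns and rows:

```bash
# Print the column names of the first result as one tab-separated line.
jq -r '[.results[0].columns[]?.name] | @tsv' response.json

# Print each row as tab-separated values, rendering SQL NULLs as
# the literal string NULL.
jq -r '.results[0].rows[]?
  | [.values[] | if .nullValue == true then "NULL" else .value end]
  | @tsv' response.json
```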
Column
| JSON representation |
|---|
{ "name": string, "type": string } |
| Fields | |
|---|---|
name |
Name of the column. |
type |
Datatype of the column. |
Row
| JSON representation |
|---|
{ "values": [ { object (Value) } ] } |
| Fields | |
|---|---|
values[] |
The values for the row. |
Value
| JSON representation |
|---|
{ "value": string, "nullValue": boolean } |
| Fields | |
|---|---|
value |
The cell value in string format. |
nullValue |
If the cell value is null, then this flag is set to true. |
Status
| JSON representation |
|---|
{ "code": integer, "message": string, "details": [ { "@type": string, field1: ..., ... } ] } |
| Fields | |
|---|---|
code |
The status code, which should be an enum value of google.rpc.Code. |
message |
A developer-facing error message, which should be in English. Any user-facing error message should be localized and sent in the google.rpc.Status.details field, or localized by the client. |
details[] |
A list of messages that carry the error details. There is a common set of message types for APIs to use. Each entry is an object containing fields of an arbitrary type. An additional field "@type" contains a URI identifying the type. Example: { "id": 1234, "@type": "types.example.com/standard/id" }. |
Any
| JSON representation |
|---|
{ "typeUrl": string, "value": string } |
| Fields | |
|---|---|
typeUrl |
A URL/resource name that uniquely identifies the type of the serialized protocol buffer message. This string must contain at least one "/" character. The last segment of the URL's path must represent the fully qualified name of the type (as in path/google.protobuf.Duration). The name should be in a canonical form (e.g., leading "." is not accepted). In practice, teams usually precompile into the binary all types that they expect it to use in the context of Any. However, for URLs which use the scheme http, https, or no scheme, one can optionally set up a type server that maps type URLs to message definitions; if no scheme is provided, https is assumed.
Note: this functionality is not currently available in the official protobuf release, and it is not used for type URLs beginning with type.googleapis.com. As of May 2023, there are no widely used type server implementations and no plans to implement one. Schemes other than http, https (or the empty scheme) might be used with implementation specific semantics. |
value |
Must be a valid serialized protocol buffer of the above specified type. A base64-encoded string. |
Tool Annotations
Destructive Hint: ✅ | Idempotent Hint: ❌ | Read Only Hint: ❌ | Open World Hint: ❌