Use the BigQuery remote MCP server

This document describes how to use the BigQuery remote Model Context Protocol (MCP) server to connect to BigQuery from AI applications such as Gemini CLI, agent mode in Gemini Code Assist, and Claude Code, or from AI applications that you're developing.

Model Context Protocol (MCP) standardizes how large language models (LLMs) and AI applications or agents connect to outside data sources. MCP servers let you use their tools, resources, and prompts to take actions and get updated data from their backend service.

Local MCP servers typically run on your local machine and use the standard input and output streams (stdio) for communication between services on the same device. Remote MCP servers run on the service's infrastructure and offer an HTTPS endpoint to AI applications for communication between the AI MCP client and the MCP server. For more information on MCP architecture, see MCP architecture.

Google and Google Cloud remote MCP servers have the following features and benefits:

  • Simplified, centralized discovery
  • Managed global or regional HTTPS endpoints
  • Fine-grained authorization
  • Optional prompt and response security with Model Armor protection
  • Centralized audit logging

For information about other MCP servers and about the security and governance controls available for Google Cloud MCP servers, see Google Cloud MCP servers overview.

You might use the BigQuery local MCP server for the following reasons:

  • You need to build a custom tool over a parameterized SQL query.
  • You don't have permissions to enable or use the remote MCP server in your project.

For more information about how to use our local MCP server, see Connect LLMs to BigQuery with MCP. The following sections apply only to the BigQuery remote MCP server.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.

    Go to project selector

  3. If you're using an existing project for this guide, verify that you have the permissions required to complete this guide. If you created a new project, then you already have the required permissions.

  4. Enable the BigQuery API. You can also enable the API from the command line, as shown in the gcloud example after this list.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the API

    For new projects, the BigQuery API is automatically enabled.

  5. Optional: Enable billing for the project. If you don't want to enable billing or provide a credit card, the steps in this document still work. BigQuery provides a sandbox that lets you perform the steps. For more information, see Enable the BigQuery sandbox.
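
If you prefer the command line, you can also enable the BigQuery API with the Google Cloud CLI. This is a minimal sketch that assumes the gcloud CLI is installed and that you're authenticated:

gcloud services enable bigquery.googleapis.com \
    --project=PROJECT_ID

Replace PROJECT_ID with your Google Cloud project ID.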

Required roles

To get the permissions that you need to enable the BigQuery MCP server, ask your administrator to grant you the following IAM roles on the project where you want to enable the BigQuery MCP server:

For more information about granting roles, see Manage access to projects, folders, and organizations.

These predefined roles contain the permissions required to enable the BigQuery MCP server. To see the exact permissions that are required, review the following Required permissions section:

Required permissions

The following permissions are required to enable the BigQuery MCP server:

  • Enable MCP servers in a project:
    • serviceusage.mcppolicy.get
    • serviceusage.mcppolicy.update
  • Make MCP tool calls: mcp.tools.call
  • Run BigQuery jobs: bigquery.jobs.create
  • Query BigQuery data: bigquery.tables.getData

You might also be able to get these permissions with custom roles or other predefined roles.

Enable or disable the BigQuery MCP server

You can enable or disable the BigQuery MCP server in a project with the gcloud beta services mcp enable and gcloud beta services mcp disable commands. For more information, see the following sections.

Enable the BigQuery MCP server in a project

If you use one project for your client credentials (such as service account keys, OAuth client IDs, or API keys) and a different project for hosting your resources, then you must enable the BigQuery service and the BigQuery remote MCP server in both projects.

To enable the BigQuery MCP server in your Google Cloud project, run the following command:

gcloud beta services mcp enable SERVICE \
    --project=PROJECT_ID

Replace the following:

  • PROJECT_ID: the Google Cloud project ID
  • SERVICE: bigquery.googleapis.com (the global service name for BigQuery)
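
For example, a fully substituted command might look like the following. The project ID my-project is a hypothetical value:

gcloud beta services mcp enable bigquery.googleapis.com \
    --project=my-project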

The BigQuery remote MCP server is enabled for use in your Google Cloud project. If the BigQuery service isn't enabled for your Google Cloud project, you're prompted to enable the service before enabling the BigQuery remote MCP server.

As a security best practice, we recommend that you enable MCP servers only for the services required for your AI application to function.

Disable the BigQuery MCP server in a project

To disable the BigQuery MCP server in your Google Cloud project, run the following command:

gcloud beta services mcp disable SERVICE \
    --project=PROJECT_ID

Replace the following:

  • PROJECT_ID: the Google Cloud project ID
  • SERVICE: bigquery.googleapis.com

The BigQuery MCP server is disabled for use in your Google Cloud project.

Authentication and authorization

BigQuery MCP servers use the OAuth 2.0 protocol with Identity and Access Management (IAM) for authentication and authorization. All Google Cloud identities are supported for authentication to MCP servers.

The BigQuery remote MCP server doesn't accept API keys.
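
If your host asks you to provide credentials directly, one option for testing is to supply a short-lived OAuth 2.0 access token from your gcloud account as a bearer token. This is only a sketch of one approach; how credentials are supplied depends on your host:

gcloud auth print-access-token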

BigQuery MCP OAuth scopes

OAuth 2.0 uses scopes and credentials to determine if an authenticated principal is authorized to take a specific action on a resource. For more information about OAuth 2.0 scopes at Google, read Using OAuth 2.0 to access Google APIs.

BigQuery has the following MCP tool OAuth scopes:

  • Scope URI for the gcloud CLI: https://www.googleapis.com/auth/bigquery
  • Description: View and manage your data in BigQuery and see the email address for your Google Account.

Additional scopes might be required on the resources accessed during a tool call. To view a list of scopes required for BigQuery, see OAuth 2.0 scopes for BigQuery API v2.
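
If you use Application Default Credentials for local development, you can request the BigQuery scope explicitly when you sign in. This is a minimal sketch; the cloud-platform scope is included only as a broad fallback and might be wider than you need:

gcloud auth application-default login \
    --scopes=https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/cloud-platform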

Configure an MCP client to use the BigQuery MCP server

Host programs, such as Claude or Gemini CLI, can instantiate MCP clients that connect to a single MCP server. A host program can have multiple clients that connect to different MCP servers. To connect to a remote MCP server, the MCP client must know at a minimum the URL of the remote MCP server.

In your host, look for a way to connect to a remote MCP server. You're prompted to enter details about the server, such as its name and URL.

For the BigQuery MCP server, enter the following as required:

  • Server name: BigQuery MCP server
  • Server URL or endpoint: https://bigquery.googleapis.com/mcp
  • Transport: HTTP
  • Authentication details: your Google Cloud credentials, your OAuth Client ID and secret, or an agent identity and credentials

    The authentication details that you choose depend on how you want to authenticate. For more information, see Authenticate to MCP servers.
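
For example, in a host that reads MCP server definitions from a JSON settings file (such as the Gemini CLI), the entry might look similar to the following sketch. The exact keys vary by host, and the Authorization header assumes that your host lets you supply a bearer token directly instead of running its own OAuth flow:

{
  "mcpServers": {
    "bigquery": {
      "httpUrl": "https://bigquery.googleapis.com/mcp",
      "headers": {
        "Authorization": "Bearer ACCESS_TOKEN"
      }
    }
  }
}

Replace ACCESS_TOKEN with an OAuth 2.0 access token for your Google Cloud credentials.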

For host-specific guidance, see the following:

For more general guidance, see Connect to remote MCP servers.

Available tools

MCP tools that are read-only have the MCP attribute mcp.tool.isReadOnly set to true. You might want to allow only read-only tools in certain environments through an organization policy.

To view details of available MCP tools and their descriptions for the BigQuery MCP server, see the BigQuery MCP reference.

List tools

Use the MCP inspector to list tools, or send a tools/list HTTP request directly to the BigQuery remote MCP server. The tools/list method doesn't require authentication.

POST /mcp HTTP/1.1
Host: bigquery.googleapis.com
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
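
For example, you can send the same request with curl. The Accept header is included because clients that use the MCP streamable HTTP transport typically advertise support for both JSON responses and server-sent events:

curl https://bigquery.googleapis.com/mcp \
    -H "Content-Type: application/json" \
    -H "Accept: application/json, text/event-stream" \
    -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'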

Sample use cases

The following are sample use cases for the BigQuery MCP server:

  • Build workflows that use insights from BigQuery data to trigger certain actions such as creating issues and composing emails.

  • Use BigQuery's advanced capabilities like forecasting for higher-order insights.

  • Build a conversational experience for your users with custom agent instructions.

Sample prompts

You can use the following sample prompts to get information about BigQuery resources, gain insights, and analyze BigQuery data:

  • List the datasets in project PROJECT_ID.
  • Find all the queries that I ran in project PROJECT_ID using the MCP server in the REGION region. Use the tag goog-mcp-server:true to identify the query jobs that ran through the MCP server.
  • Find the top orders by volume from DATASET_ID in project PROJECT_ID. Identify the appropriate tables, find the correct schema, and show the results.
  • Create a forecast on the table PROJECT_ID.DATASET_ID.TABLE_ID for future years. Use COLUMN_NAME as the data column and COLUMN_NAME as the timestamp column. Show the top 10 forecasts.

In the prompts, replace the following:

  • PROJECT_ID: the Google Cloud project ID
  • REGION: the name of the region
  • DATASET_ID: the name of the dataset
  • TABLE_ID: the name of the table
  • COLUMN_NAME: the name of the column

Optional security and safety configurations

MCP introduces new security risks and considerations due to the wide variety of actions that you can take with MCP tools. To minimize and manage these risks, Google Cloud offers defaults and customizable policies to control the use of MCP tools in your Google Cloud organization or project.

For more information about MCP security and governance, see AI security and safety.

Model Armor

Model Armor is a Google Cloud service designed to enhance the security and safety of your AI applications. It works by proactively screening LLM prompts and responses, protecting against various risks and supporting responsible AI practices. Whether you deploy AI in your cloud environment or on external cloud providers, Model Armor can help you prevent malicious input, verify content safety, protect sensitive data, maintain compliance, and enforce your AI safety and security policies consistently across your diverse AI landscape.

Model Armor is only available in specific regional locations. If Model Armor is enabled for a project, and a call to that project comes from an unsupported region, Model Armor makes a cross-regional call. For more information, see Model Armor locations.

Enable Model Armor

To enable Model Armor, complete the following steps:

  1. Enable Model Armor on your Google Cloud project.

    gcloud services enable modelarmor.googleapis.com \
        --project=PROJECT_ID
    

    Replace PROJECT_ID with your Google Cloud project ID.

  2. Configure the recommended floor settings for Model Armor.

    gcloud model-armor floorsettings update \
        --full-uri='projects/PROJECT_ID/locations/global/floorSetting' \
        --mcp-sanitization=ENABLED \
        --malicious-uri-filter-settings-enforcement=ENABLED \
        --pi-and-jailbreak-filter-settings-enforcement=ENABLED \
        --pi-and-jailbreak-filter-settings-confidence-level=MEDIUM_AND_ABOVE
    

    Replace PROJECT_ID with your Google Cloud project ID.

    Model Armor is configured to scan for malicious URLs and prompt injection and jailbreak attempts.

    For more information about configurable Model Armor filters, see Model Armor filters.

  3. Add Model Armor as a content security provider for MCP services.

    gcloud beta services mcp content-security add modelarmor.googleapis.com \
        --project=PROJECT_ID
    

    Replace PROJECT_ID with the Google Cloud project ID.

  4. Confirm that MCP traffic is sent to Model Armor.

    gcloud beta services mcp content-security get \
        --project=PROJECT_ID
    

    Replace PROJECT_ID with the Google Cloud project ID.

Model Armor logging

For information about Model Armor audit and platform logs, see Model Armor audit logging.

Disable Model Armor in a project

To disable Model Armor in a Google Cloud project, run the following command:

gcloud beta services mcp content-security remove modelarmor.googleapis.com \
    --project=PROJECT_ID

Replace PROJECT_ID with the Google Cloud project ID.

MCP traffic on Google Cloud is no longer scanned by Model Armor for the specified project.

Disable scanning MCP traffic with Model Armor

If you still want to use Model Armor in a project, but you want to stop scanning MCP traffic with Model Armor, then run the following command:

gcloud model-armor floorsettings update \
    --full-uri='projects/PROJECT_ID/locations/global/floorSetting' \
    --mcp-sanitization=DISABLED

Replace PROJECT_ID with the Google Cloud project ID.

Model Armor won't scan MCP traffic on Google Cloud.

Organization-level MCP control

You can create custom organization policies to control the use of MCP servers in your Google Cloud organization using the gcp.managed.allowedMCPService constraint. For more information and usage examples, see Access control with IAM.
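
For example, you can view the policy that is currently set for this constraint on a project with the gcloud CLI. This assumes that you have permission to view organization policies in the project:

gcloud org-policies describe gcp.managed.allowedMCPService \
    --project=PROJECT_ID

Replace PROJECT_ID with the Google Cloud project ID.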

What's next