Enable the Conversational Analytics API

This page describes the steps for enabling the Conversational Analytics API (accessed through geminidataanalytics.googleapis.com) for your Google Cloud project. The Conversational Analytics API is the first capability under the Data Analytics API with Gemini in the Google Cloud console.

Before you begin

Before you can use the Conversational Analytics API, you must have a Google Cloud project that can access the data that you want to work with. To set up your project, complete the following steps:

  1. Identify the data that you want to use with the Conversational Analytics API. You can use data sources such as BigQuery tables, Looker Explores, and Looker Studio reports, as well as data from AlloyDB, GoogleSQL for Spanner, Cloud SQL, and Cloud SQL for PostgreSQL.
  2. In the Google Cloud console, on the project selector page, create a Google Cloud project that can access the data that you want to use with the Conversational Analytics API. If you prefer to work from the command line, you can instead create the project with the gcloud CLI, as sketched after this list.

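The following is a minimal sketch of creating and selecting a project with the gcloud CLI. It assumes that the gcloud CLI is installed and authenticated and that billing and organization settings are handled separately; replace project_id with an ID of your choosing:

gcloud projects create project_id
gcloud config set project project_id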

Looker and Looker Studio Pro requirements

If you use the Conversational Analytics API with Looker, your Looker instance must meet specific requirements.

To use the Conversational Analytics API with Looker Studio Pro, your Looker Studio Pro subscription must be outside of a VPC Service Controls (VPC-SC) perimeter.

Enable the required APIs

To use the Conversational Analytics API, you must enable a set of required APIs in your Google Cloud project. If you plan to use the Conversational Analytics API from a Colab Enterprise environment, you must also enable additional APIs.

Required APIs

Console

Enable the following APIs in the Google Cloud console for the Google Cloud project that you will use with the Conversational Analytics API:

  - Gemini Data Analytics API
  - Gemini for Google Cloud API
  - BigQuery API

gcloud

With the Google Cloud CLI, run the following gcloud services enable commands to enable the Gemini Data Analytics API, the Gemini for Google Cloud API, and the BigQuery API, respectively:

gcloud services enable geminidataanalytics.googleapis.com --project=project_id
gcloud services enable cloudaicompanion.googleapis.com --project=project_id
gcloud services enable bigquery.googleapis.com --project=project_id

In the preceding gcloud CLI commands, replace project_id with your Google Cloud project ID.
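Because gcloud services enable accepts multiple service names in a single invocation, you can also enable all three APIs at once. The following single command is equivalent to the three commands above:

gcloud services enable geminidataanalytics.googleapis.com cloudaicompanion.googleapis.com bigquery.googleapis.com --project=project_id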

APIs for Colab Enterprise

If you plan to use the Conversational Analytics API from a Colab Enterprise environment, also enable the following APIs:

  - Dataform API
  - Compute Engine API
  - Vertex AI API
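If you prefer the gcloud CLI for this step as well, the following command is a sketch that enables these three APIs by their standard service endpoints (dataform.googleapis.com, compute.googleapis.com, and aiplatform.googleapis.com); replace project_id with your Google Cloud project ID:

gcloud services enable dataform.googleapis.com compute.googleapis.com aiplatform.googleapis.com --project=project_id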

Grant required roles

To use the Conversational Analytics API, grant the required Identity and Access Management (IAM) roles in the Google Cloud console to the principals (such as users or service accounts) that need to interact with the API.

To use Conversational Analytics and the Conversational Analytics API, a Looker user must be assigned a role that contains the gemini_in_looker permission for the models that they interact with.

In addition, permissions for actions like managing agents are controlled by Gemini Data Analytics roles, as described in Grant Conversational Analytics API IAM roles and permissions.
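As an illustration only, the following sketch grants a role to a user with gcloud projects add-iam-policy-binding. The user email is hypothetical, and role_name is a placeholder for the Gemini Data Analytics role that matches your use case (see the roles and permissions guidance referenced above); replace project_id with your Google Cloud project ID:

gcloud projects add-iam-policy-binding project_id --member="user:analyst@example.com" --role="role_name"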

Additional configuration for VPC Service Controls

The Conversational Analytics API is integrated with VPC Service Controls. You can add the Conversational Analytics API to your service perimeters to enhance the security of your data and services and help mitigate the risk of data exfiltration. When you include geminidataanalytics.googleapis.com as a restricted service in a perimeter, VPC Service Controls protects the Conversational Analytics API.

Protect data sources

While the service perimeter restricts access to the Conversational Analytics API, you must also include its data source services within the same service perimeter to protect the data that those sources contain. Common data sources for the Conversational Analytics API include BigQuery, Looker, and Looker Studio. Ensure that the services corresponding to these data sources (for example, bigquery.googleapis.com) are also restricted in the service perimeter.
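As a sketch of one way to configure this with the gcloud CLI, the following command adds the Conversational Analytics API and BigQuery to an existing perimeter's restricted services. The perimeter_name and policy_id values are placeholders for your own perimeter and access policy, and the command assumes that the perimeter already exists:

gcloud access-context-manager perimeters update perimeter_name --add-restricted-services=geminidataanalytics.googleapis.com,bigquery.googleapis.com --policy=policy_id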

IAM roles and permissions

VPC Service Controls perimeters and Identity and Access Management (IAM) roles work together. VPC Service Controls enforces security boundaries around the service, while IAM governs who can access resources within the perimeter.

Even when access is within a protected perimeter, users or service accounts still require the necessary IAM permissions on both the Conversational Analytics API service and the underlying data sources (such as BigQuery datasets or Looker instances) to perform actions. Combining VPC Service Controls with proper IAM management is essential for robust security of your Conversational Analytics API workflows.