Handle quota errors by calling ML.GENERATE_EMBEDDING iteratively

This tutorial shows you how to use the BigQuery bqutil.procedure.bqml_generate_embeddings public stored procedure to iterate through calls to the ML.GENERATE_EMBEDDING function. Calling the function iteratively lets you handle any retryable errors that occur when calls exceed the quotas and limits that apply to the function.
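
For context, a direct call to the ML.GENERATE_EMBEDDING function processes all of the input rows in a single statement; rows that exceed the function's quotas come back with a retryable error in the status column, and you would have to retry them yourself. The following sketch shows what such a direct call might look like for the model and table used later in this tutorial (the content column alias and the flatten_json_output option follow the function's documented usage):

    SELECT *
    FROM ML.GENERATE_EMBEDDING(
      MODEL `target_dataset.embedding_model`,
      (
        -- Alias the text column to "content" and pass through the key column.
        SELECT body AS content, filename
        FROM `bigquery-public-data.bbc_news.fulltext`
      ),
      STRUCT(TRUE AS flatten_json_output)
    );

The stored procedure used in this tutorial iterates through such calls so that rows that return a retryable error can be retried automatically.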

To review the source code for the bqutil.procedure.bqml_generate_embeddings stored procedure in GitHub, see bqml_generate_embeddings.sqlx. For more information about the stored procedure parameters and usage, see the README file.

This tutorial guides you through the following tasks:

  • Creating a remote model over a text-embedding-005 model.
  • Iterating through calls to the ML.GENERATE_EMBEDDING function by using the bqutil.procedure.bqml_generate_embeddings stored procedure with the remote model and the bigquery-public-data.bbc_news.fulltext public data table.

Required permissions

To run this tutorial, you need the following Identity and Access Management (IAM) roles:

  • Create and use BigQuery datasets, connections, and models: BigQuery Admin (roles/bigquery.admin).
  • Grant permissions to the connection's service account: Project IAM Admin (roles/resourcemanager.projectIamAdmin).

These predefined roles contain the permissions required to perform the tasks in this document. The tasks require the following permissions:

  • Create a dataset: bigquery.datasets.create
  • Create, delegate, and use a connection: bigquery.connections.*
  • Set the default connection: bigquery.config.*
  • Set service account permissions: resourcemanager.projects.getIamPolicy and resourcemanager.projects.setIamPolicy
  • Create a model and run inference:
    • bigquery.jobs.create
    • bigquery.models.create
    • bigquery.models.getData
    • bigquery.models.updateData
    • bigquery.models.updateMetadata

You might also be able to get these permissions with custom roles or other predefined roles.

Costs

In this document, you use the following billable components of Google Cloud:

  • BigQuery ML: You incur costs for the data that you process in BigQuery.
  • Vertex AI: You incur costs for calls to the Vertex AI model.

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

For more information about BigQuery pricing, see BigQuery pricing.

For more information about Vertex AI pricing, see Vertex AI pricing.

Before you begin

  1. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  2. Make sure that billing is enabled for your Google Cloud project.

  3. Enable the BigQuery, BigQuery Connection, and Vertex AI APIs.

    Enable the APIs

Create a dataset

Create a BigQuery dataset to store the remote model and the table of generated embeddings:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to the BigQuery page

  2. In the Explorer pane, click your project name.

  3. Click View actions > Create dataset.

  4. On the Create dataset page, do the following:

    1. For Dataset ID, enter target_dataset.

    2. For Location type, select Multi-region, and then select US (multiple regions in United States).

    3. Leave the remaining default settings as they are, and click Create dataset.
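
If you prefer to work in SQL rather than the console, you can create an equivalent dataset with a CREATE SCHEMA statement. This is a sketch that assumes the same US multi-region location as the steps above:

    -- Create the target_dataset dataset in the US multi-region.
    CREATE SCHEMA IF NOT EXISTS `target_dataset`
      OPTIONS (location = 'US');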

Create the text embedding generation model

Create a remote model that represents a hosted Vertex AI text-embedding-005 model:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, run the following statement:

    CREATE OR REPLACE MODEL `target_dataset.embedding_model`
      REMOTE WITH CONNECTION DEFAULT
      OPTIONS (ENDPOINT = 'text-embedding-005');

    The query takes several seconds to complete, after which the embedding_model model appears in the target_dataset dataset in the Explorer pane. Because the query uses a CREATE MODEL statement to create a model, there are no query results.
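
Optionally, you can confirm from SQL that the remote model was created by querying the dataset's INFORMATION_SCHEMA.MODELS view instead of checking the Explorer pane. This is a sketch and assumes the view's standard columns:

    -- List the model created above; expect one row for embedding_model.
    SELECT model_name, model_type, creation_time
    FROM target_dataset.INFORMATION_SCHEMA.MODELS
    WHERE model_name = 'embedding_model';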

Run the stored procedure

Run the bqutil.procedure.bqml_generate_embeddings stored procedure, which iterates through calls to the ML.GENERATE_EMBEDDING function using the target_dataset.embedding_model model and the bigquery-public-data.bbc_news.fulltext public data table:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, run the following statement:

    CALL `bqutil.procedure.bqml_generate_embeddings`(
        "bigquery-public-data.bbc_news.fulltext",            -- source table
        "PROJECT_ID.target_dataset.news_body_embeddings",  -- destination table
        "PROJECT_ID.target_dataset.embedding_model",       -- model
        "body",                                              -- content column
        ["filename"],                                        -- key columns
        '{}'                                                 -- optional arguments encoded as a JSON string
    );

    Replace PROJECT_ID with the ID of the project that you are using for this tutorial.

    The stored procedure creates a target_dataset.news_body_embeddings table to contain the output of the ML.GENERATE_EMBEDDING function.

  3. When the query is finished running, confirm that there are no rows in the target_dataset.news_body_embeddings table that contain a retryable error. In the query editor, run the following statement:

    SELECT *
    FROM `target_dataset.news_body_embeddings`
    WHERE ml_generate_embedding_status LIKE '%A retryable error occurred%';

    If no rows contain a retryable error, the query returns the message No data to display.
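
Once the check above returns no rows, you can spot-check the generated embeddings. The following sketch assumes the flattened ML.GENERATE_EMBEDDING output columns, including ml_generate_embedding_result as an array of floating-point values; adjust the column names if your destination table differs:

    -- Inspect a few rows: the key column, the status, and the embedding length.
    SELECT
      filename,
      ml_generate_embedding_status,
      ARRAY_LENGTH(ml_generate_embedding_result) AS embedding_dimensions
    FROM `target_dataset.news_body_embeddings`
    LIMIT 5;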

Clean up

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.