Troubleshoot

This document lists the visible errors captured and surfaced by the Data Agent Kit extension for VS Code. These errors are typically shown as notifications (pop-ups) or as status bar warnings in the IDE. Use this guide to identify and resolve issues.

If you can't identify the source of an issue after reviewing the possible errors listed here, try signing out and signing back in.

Authentication and gcloud CLI errors

These errors impact the foundational connectivity of the extension.

Error message

Possible cause

Remediation

Google Cloud CLI is required... but was not found.

The gcloud binary is not in the system's path.

Install the Google Cloud CLI and restart VS Code.

You must be logged in to both the Google Cloud CLI and ADC...

gcloud auth login or gcloud auth application-default login has not been run.

Click the status bar or notification to trigger the login flow in the terminal.

Status bar shows a $(warning) icon next to your account name.

The logged-in extension account differs from the active gcloud CLI account.

Run gcloud auth login and gcloud auth application-default login with the same account that you used to sign in to Google Cloud.

Status bar shows a $(warning) icon.

Extension settings such as project ID, region, or billing project ID don't match gcloud CLI settings.

Click the warning icon in the project status bar to synchronize settings between the extension and the gcloud CLI. Alternatively, manually update the configurations in either tool to ensure they match.

Cannot sync settings: gcloud CLI or ADC account does not match extension account.

Attempting to sync settings while accounts are inconsistent.

Align accounts between the gcloud CLI and the extension. Ensure that they are using the same user account, project, region, and billing project.

Cannot sync settings: Google Cloud CLI is not installed.

Attempting to sync settings while the gcloud CLI isn't installed.

Install the gcloud CLI.
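As a sketch, the following terminal commands can help you check and align the gcloud CLI with the account the extension uses (alice@example.com is a placeholder for your account):

```shell
# Show which accounts the gcloud CLI knows about and which is active.
gcloud auth list

# Log in to the gcloud CLI as the same account the extension uses.
gcloud auth login alice@example.com

# Refresh Application Default Credentials; choose the same account in the browser.
gcloud auth application-default login

# Verify that the active account, project, and region match the extension settings.
gcloud config list
```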

Explorer and API fetching errors

The following are general errors encountered when browsing cloud resources in the Google Cloud Data Agent Kit menu tree.

Error message

Possible cause

Remediation

Failed to load ${itemKind}: ${error.message}

The server can't authorize your request (HTTP 403 Forbidden status code), the server can't find the resources (HTTP 404 Not Found status code), or the API isn't enabled.

Verify Identity and Access Management (IAM) permissions; ensure the relevant API, such as BigQuery, is enabled in settings.

No Google Cloud project selected.

A feature was triggered without an active project.

Use the project picker in the status bar to select a project.

Invalid table name format: ${tableName}

Incorrect naming convention (missing project ID or dataset ID).

Use the fully qualified name: project.dataset.table.
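For example, a fully qualified table reference always includes the project ID, dataset ID, and table name. A sketch using the bq CLI (my-project, my_dataset, and my_table are placeholders):

```shell
# Query with the fully qualified project.dataset.table name.
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) FROM `my-project.my_dataset.my_table`'
```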

BigQuery and Dataform errors

The following errors are specific to SQL transformations and Dataform project management.

Error message

Possible cause

Remediation

Dataform CLI is not installed. Dataform features will not work.

The Dataform npm package is not installed globally.

Click Install Dataform CLI in the dialog to automate installation.

Failed to compile project in folder...: ${stderr}

Syntax errors in .sqlx files or invalid workflow_settings.yaml.

Check the Dataform Compiled Query view for specific syntax errors.

Failed to run Dataform project: ${message}

BigQuery job execution failure.

Review the BigQuery job history for detailed execution logs.

No content found for saved query: ${label}

The Knowledge Catalog or BigQuery entry exists but has no SQL content.

Verify the query content by running it in the Google Cloud console.
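If the automated installation dialog doesn't appear, a manual sketch of installing the Dataform CLI and checking for compile errors (run from the Dataform project folder; requires Node.js and npm):

```shell
# Install the Dataform CLI globally.
npm install -g @dataform/cli

# Compile the project; syntax errors in .sqlx files or an invalid
# workflow_settings.yaml are reported here.
dataform compile
```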

Managed Service for Apache Spark and Spark errors

The following errors are related to cluster management and Spark execution.

Error message

Possible cause

Remediation

Error starting/stopping/restarting cluster ${name}: ${message}

Insufficient IAM permissions or cluster in transition.

Ensure you have the roles/dataproc.editor role; check if the cluster status is already UPDATING.

Error: No cluster name found.

Internal reference error in the tree view.

Refresh the Spark Resource Explorer view.

Failed to redirect to Spark UI: ${message}

Cluster is stopped or Component Gateway is disabled.

Start the cluster and ensure Component Gateway is enabled in the cluster configuration.

Failed to load [clusters/jobs/sessions/session templates]: [message]

API error.

Refresh the Spark Resource Explorer view.

Failed to start Jupyter process / Pip install failed.

Local Jupyter server startup failed or installation of required packages failed.

Open a new Spark Notebook with the remote kernel.

No sample notebooks found for the selected kernel type.

Sample notebooks for the selected kernel type (local or remote) are not available.

Not applicable

This system does not have Windows Long Path support enabled. Please enable long paths and restart VS Code to work with Spark Notebooks on remote kernels. Learn More

Appears for Windows users when long path support is not enabled on the machine.

Enable long paths by referring to the link in the notification.

google-cloud-jupyter-config and jupyter_server are required to connect to remote kernels on Managed Service for Apache Spark. Install them?

Appears when a user tries to use a remote kernel for the first time.

Install dependencies by clicking the notification and selecting Yes in the confirmation dialog.

Requires package installation

Shown in the kernel picker description next to Remote Kernel if Jupyter dependencies are missing.

Install dependencies by clicking the notification and selecting Yes in the confirmation dialog.

Install dependencies for remote Spark kernels. Require google-cloud-jupyter-config and jupyter_server

Shown in the kernel picker description when a user runs an existing notebook on a remote kernel.

Install dependencies by clicking the notification and selecting Yes in the confirmation dialog.
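If the notification-based installation fails, a manual sketch of the dependency and long-path fixes (run pip in the Python environment that VS Code uses; the registry command requires an elevated Windows prompt):

```shell
# Install the Jupyter dependencies needed for remote Spark kernels.
pip install google-cloud-jupyter-config jupyter_server

# On Windows only: enable long path support, then restart VS Code.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" \
  /v LongPathsEnabled /t REG_DWORD /d 1 /f
```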

Cloud Storage errors

The following errors are surfaced while managing buckets and objects.

Error message

Possible cause

Remediation

A folder with the name... already exists.

Naming collision in the bucket.

Choose a unique folder name.

Upload/Download failed: ${message}

Permission denied or network timeout.

Verify Storage Object Admin or Creator roles.

Failed to create bucket: ${message}

Name already taken globally or invalid characters.

Follow Cloud Storage bucket naming rules.
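For example, bucket names must be globally unique, lowercase, and 3-63 characters long. A sketch with the gcloud CLI (the bucket name and location are placeholders):

```shell
# Create a bucket; the name must be globally unique and use only
# lowercase letters, numbers, dashes, underscores, and dots.
gcloud storage buckets create gs://my-unique-bucket-name \
  --location=us-central1
```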

Generic interface errors

These errors indicate UI-level failures.

Error message

Possible cause

Remediation

No workspace folder open.

User is in No Folder mode in VS Code.

Open a folder using File > Open Folder.

No active notebook editor found.

You are running Spark or SQL commands without an open notebook.

Open a .ipynb or .sql file.

Cannot insert item: unsupported resource type.

Attempting to drag-and-drop a resource that has no SQL string representation.

Only tables, views, and columns can be inserted into editors.

Database errors

The following errors are returned when querying Google Cloud databases.

AlloyDB

Error message

Possible cause

Remediation

postgresql error: role "[USER_EMAIL]" does not exist

The user logged into the extension has not been added as an IAM user to the instance.

See Manage IAM authentication to add the user as an IAM user.

postgresql error: permission denied for table [TABLE_NAME]

The IAM user does not have permission for the object.

See Grant appropriate database permissions to IAM users to grant permission to the IAM user for the object.
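As a sketch of both remediations (cluster name, region, table name, and account are placeholders; the GRANT must be run by a user with sufficient database privileges):

```shell
# Add the extension account as an IAM-based database user.
gcloud alloydb users create alice@example.com \
  --cluster=my-cluster --region=us-central1 --type=IAM_BASED

# Then, connected as an administrative user, grant access to the object.
psql -h [INSTANCE_IP] -U postgres -c \
  'GRANT SELECT ON my_table TO "alice@example.com";'
```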

Cloud SQL for MySQL

Error message

Possible cause

Remediation

Invalid request: Some of your SQL statements failed to execute. Details: Error 1045 (28000): Access denied for user '[USER]'@'localhost' (using password: YES).

The user logged into the extension has not been added as an IAM user to the instance.

See IAM authentication to add the user as an IAM user for the instance.

Invalid request: Some of your SQL statements failed to execute. Details: Error 1044 (42000): Access denied for user '[USER]'@'%' to database '[DB_NAME]'.

The IAM user does not have permission to access the database.

See Grant database privileges to an individual IAM user or service account to grant privileges to IAM users for the database.

Invalid request: The instance doesn't allow using ExecuteSql to access this instance. You can allow it by patching the instance with {settings: { dataApiAccess: "ALLOW_DATA_API" }}.

Data API access is not enabled.

Enable Data API access from the query settings or see Execute SQL statements using the Cloud SQL Data API to enable it from the Google Cloud console or gcloud CLI.

Invalid request: IAM authentication is not enabled for the instance [PROJECT_ID]:[INSTANCE_NAME].

The IAM authentication flag is not enabled.

Enable the IAM authentication flag from the query settings or see IAM authentication to enable it from the Google Cloud console or gcloud CLI.
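As a sketch of the two instance-level fixes from a terminal (the instance name and PROJECT_ID are placeholders; the dataApiAccess setting is taken from the error message itself):

```shell
# Enable the IAM authentication database flag.
# Note: --database-flags replaces all flags, so include any flags already set.
gcloud sql instances patch my-instance \
  --database-flags=cloudsql_iam_authentication=on

# Allow Data API access by patching the instance settings through the
# Cloud SQL Admin REST API, as the error message suggests.
curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"settings": {"dataApiAccess": "ALLOW_DATA_API"}}' \
  "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/my-instance"
```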

Cloud SQL for PostgreSQL

Error message

Possible cause

Remediation

Invalid request: The instance doesn't allow using ExecuteSql to access this instance. You can allow it by patching the instance with {settings: { dataApiAccess: "ALLOW_DATA_API" }}.

The Data API access is not enabled.

Enable Data API access from the query settings or see Execute SQL statements using the Cloud SQL Data API to enable it from the Google Cloud console or gcloud CLI.

Invalid request: IAM authentication is not enabled for the instance [PROJECT_ID]:[INSTANCE_NAME].

The IAM authentication flag is not enabled.

Enable the IAM authentication flag from the query settings or see IAM authentication to enable it from the Google Cloud console or gcloud CLI.

Invalid request: Some of your SQL statements failed to execute. Details: pq: password authentication failed for user "[USER_EMAIL]".

The user logged into the extension has not been added as an IAM user to the instance.

See IAM authentication to add the user as an IAM user for the instance.

Execution failed. All statements are aborted. Details: pq: permission denied for table [TABLE_NAME]

The IAM user does not have permission to access the object.

See Manage users with IAM database authentication to grant privileges to IAM users for the database.

Spanner

Error message

Possible cause

Remediation

Result set too large. Result sets larger than 10.00M can only be yielded through the streaming API.

Result sizes larger than 10.00 MiB are not supported.

See executeSql.
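A minimal sketch of using the streaming endpoint directly with the Spanner REST API (project, instance, database, and table names are placeholders; the first call returns a session resource whose name you substitute for [SESSION_NAME]):

```shell
# 1. Create a session; the "name" field in the response is the session resource.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://spanner.googleapis.com/v1/projects/my-project/instances/my-instance/databases/my-db/sessions"

# 2. Run the query through executeStreamingSql, which yields the result set
#    in chunks instead of one bounded response.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"sql": "SELECT * FROM my_table"}' \
  "https://spanner.googleapis.com/v1/[SESSION_NAME]:executeStreamingSql"
```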

What's next