This page shows you how to configure a scheduled export job for a Looker (Google Cloud core) instance. Scheduled exports run every 24 hours, beginning at a start time that you set. The export is saved to a Cloud Storage bucket. Scheduled exports don't create any downtime for the Looker (Google Cloud core) instance.
A scheduled export includes all data, configurations, and content from an instance, with the following exceptions:
- BigQuery data isn't exported.
- Secret Manager data isn't exported.
- Looker Studio reports aren't exported.
- Google Cloud console settings, which must be configured in the Google Cloud console for each instance, aren't exported. However, settings in the Looker Admin panel within the Looker (Google Cloud core) application are exported.
- OAuth tokens that are stored in an instance aren't exported. If OAuth is used for individual database connections, then after import each user must log in again and refresh their tokens for any database connections that use OAuth.
- Looker application API keys aren't exported. All users lose access to their existing API keys, because API keys aren't part of the imported package.
If you export and import into a Looker (Google Cloud core) instance that uses a different networking setup than the instance that created the export, you may need to configure your connections to external services differently.
If an export from an instance in one Google Cloud project is moved to an instance in a different project, you must set IAM permissions in the new project to give users access to the instance.
The scheduled export process takes minutes to hours, depending on the size of the instance. While a scheduled export process is running, import operations and one-time exports aren't available.
After you configure a scheduled export job, you can set up a script that imports the exported data into a backup instance on a daily basis, as shown in the sketch that follows.
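The following is a minimal sketch of such a script. The backup instance name, project, region, and bucket values are placeholders, and the import flag name (--source-gcs-uri) is an assumption rather than confirmed syntax; check the import documentation for the exact command before using it.

#!/usr/bin/env bash
# Sketch: import the scheduled export artifact into a backup instance.
# The --source-gcs-uri flag name is an assumption; verify it against the
# gcloud looker instances import reference before use.
set -euo pipefail

BACKUP_INSTANCE="backup-instance"           # hypothetical backup instance name
PROJECT_ID="my-project"                     # hypothetical project ID
REGION="us-central1"                        # hypothetical region
EXPORT_URI="gs://my-export-bucket/exports"  # must match the scheduled export URI

# Import the export artifact into the backup instance.
gcloud looker instances import "${BACKUP_INSTANCE}" \
    --project="${PROJECT_ID}" \
    --region="${REGION}" \
    --source-gcs-uri="${EXPORT_URI}"

You could run a script like this from cron or Cloud Scheduler, offset from the export start time so that the daily export finishes before the import begins.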
Required role
To get the permissions that you need to create a scheduled export job for Looker (Google Cloud core), ask your administrator to grant you the following IAM roles on the project in which the instance was created:
- Looker Admin (roles/looker.admin)
For more information about granting roles, see Manage access to projects, folders, and organizations.
You might also be able to get the required permissions through custom roles or other predefined roles.
Before you begin
- Create a Cloud Storage bucket to store the export artifact.
- Create a customer-managed encryption key (CMEK) in Cloud Key Management Service. This key is required to encrypt the data for the scheduled export, even if your instance does not use CMEK. The CMEK key must be in the same region as your Looker (Google Cloud core) instance, or it must be a global key.
- Identify the service account for your Looker (Google Cloud core) instance.
- Grant the following required Identity and Access Management (IAM) roles and permissions to the instance's service account (see the sketch after this list):
  - Grant the roles/storage.objectAdmin role on the Cloud Storage bucket that stores your export.
  - Grant the roles/cloudkms.cryptoKeyEncrypterDecrypter role on the CMEK key that you use to encrypt your data.
  - Alternatively, you can create a custom role with the following permissions:
    - storage.objects.create
    - storage.objects.delete
    - cloudkms.cryptoKeyVersions.useToEncrypt
    - cloudkms.cryptoKeyVersions.useToDecrypt
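For example, the following is a minimal sketch of the IAM grants using gcloud. All values (bucket, key ring, key, location, and service account email) are placeholders; substitute the resources and the service account for your own instance.

# Placeholder values; replace them with your own bucket, key, and the
# service account email of your Looker (Google Cloud core) instance.
BUCKET="gs://my-export-bucket"
KMS_KEY="my-export-key"
KMS_KEYRING="my-keyring"
KMS_LOCATION="us-central1"
LOOKER_SA="looker-instance-sa@example.iam.gserviceaccount.com"

# Allow the instance's service account to create and delete export objects
# in the Cloud Storage bucket.
gcloud storage buckets add-iam-policy-binding "${BUCKET}" \
    --member="serviceAccount:${LOOKER_SA}" \
    --role="roles/storage.objectAdmin"

# Allow the instance's service account to encrypt and decrypt with the CMEK key.
gcloud kms keys add-iam-policy-binding "${KMS_KEY}" \
    --keyring="${KMS_KEYRING}" \
    --location="${KMS_LOCATION}" \
    --member="serviceAccount:${LOOKER_SA}" \
    --role="roles/cloudkms.cryptoKeyEncrypterDecrypter"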
 
Configure a scheduled export job
You can configure a scheduled export job using the Google Cloud console or gcloud.
console
In the Google Cloud console, go to the Looker (Google Cloud core) Instances page.
Click the name of the instance for which you want to schedule an export job.
Click the Export tab.
Expand the Schedule export job section and enter the following information:
- Start time: The time of day in local time (HH:MM) to perform the daily export.
- Cloud Storage bucket: The URI of your Cloud Storage bucket, for example: gs://bucket-name/export-path.
- Cloud KMS key: Select the Cloud Key Management Service key that will be used to encrypt the instance data.
Click Set schedule.
gcloud
- To configure a periodic export, run the following gcloud command:
gcloud looker instances update INSTANCE_NAME \
    --project=PROJECT_ID \
    --region=REGION \
    --periodic-export-gcs-uri="gs://BUCKET_NAME/EXPORT_PATH" \
    --periodic-export-kms-key="KMS_KEY_NAME" \
    --periodic-export-start-time="TIME"
Replace the following:
- INSTANCE_NAME: the name of your Looker (Google Cloud core) instance.
- PROJECT_ID: your Google Cloud project ID.
- REGION: the region where your instance is located.
- BUCKET_NAME: the name of your Cloud Storage bucket.
- EXPORT_PATH: the path within the bucket for the export artifact.
- KMS_KEY_NAME: the full resource name of your CMEK key.
- TIME: the time of day in UTC (HH:MM) to perform the daily export.
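For example, a hypothetical invocation that schedules a daily export at 03:00 for an instance named my-instance; all names and paths are placeholders:

gcloud looker instances update my-instance \
    --project=my-project \
    --region=us-central1 \
    --periodic-export-gcs-uri="gs://my-export-bucket/exports" \
    --periodic-export-kms-key="projects/my-project/locations/us-central1/keyRings/my-keyring/cryptoKeys/my-export-key" \
    --periodic-export-start-time="03:00"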
Verify the export configuration
After you set a schedule, you can view the scheduled export job configuration on the Instance details page.
In the Google Cloud console, go to the Looker (Google Cloud core) Instances page.
Click the name of your instance.
On the Instance details page, view the Export schedule configuration, which includes the scheduled start time each day in local time. The start time includes a link to the instance's Export page.
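You can also check the configuration from the command line by describing the instance. This is a sketch with placeholder values, and it assumes the periodic export settings appear in the describe output:

gcloud looker instances describe my-instance \
    --project=my-project \
    --region=us-central1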
Stop a periodic export
To stop a periodic export from occurring, clear the configuration fields.
console
In the Google Cloud console, go to the Instances page.
Click the name of the instance.
Click the Export tab.
In the Set schedule for periodic export section, clear the values for the Cloud Storage bucket, Customer-managed encryption key, and Time fields.
Click Clear schedule.
gcloud
- To stop a periodic export, run the following gcloud command:
gcloud looker instances update INSTANCE_NAME \
    --project=PROJECT_ID \
    --region=REGION \
    --clear-periodic-export-config
Replace the following:
- INSTANCE_NAME: the name of your Looker (Google Cloud core) instance.
- PROJECT_ID: your Google Cloud project ID.
- REGION: the region where your instance is located.
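For example, using the same placeholder instance and project values as in the earlier sketches:

gcloud looker instances update my-instance \
    --project=my-project \
    --region=us-central1 \
    --clear-periodic-export-config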
What's next
- Learn how to import Looker (Google Cloud core) instance data from Cloud Storage.
 - Troubleshoot export or import errors.