Tool: create_transfer_config
Create a transfer configuration.
To create a transfer configuration, do the following:

- Provide the `required_fields`.
- Specify how often you want your transfer to run by specifying `schedule_options`.
- Provide the `optional_fields`.
- If you want to use a service account to create this transfer, provide a `service_account_name`. Check that you have valid credentials by calling `check_valid_creds`:
  - If you do not have valid credentials, do the following:
    - Find your `client_id` and `data_source_scopes` from your data source definition.
    - Authorize your data source by navigating to the following link: https://bigquery.cloud.google.com/datatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=CLIENT_ID&scope=DATA_SOURCE_1%20DATA_SOURCE_2
    - Provide the `version_info`.
  - If you have valid credentials, then `version_info` is not required.
The following sample demonstrates how to use curl to invoke the create_transfer_config MCP tool.
Curl request:

```shell
curl --location 'https://bigquerydatatransfer.googleapis.com/mcp' \
  --header 'content-type: application/json' \
  --header 'accept: application/json, text/event-stream' \
  --data '{
    "method": "tools/call",
    "params": {
      "name": "create_transfer_config",
      "arguments": {
        // provide these details according to the tool'\''s MCP specification
      }
    },
    "jsonrpc": "2.0",
    "id": 1
  }'
```
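For concreteness, the placeholder arguments can be filled in as in the following sketch, which builds the JSON-RPC payload for a scheduled-query transfer. The project ID, location, dataset, display name, and query are illustrative values, not from the tool specification; the argument names follow the CreateTransferConfigRequest fields described below.

```shell
#!/bin/sh
# Sketch only: the project, dataset, display name, and query values are
# illustrative. The envelope follows the JSON-RPC 2.0 shape of the MCP call.
PAYLOAD='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_transfer_config",
    "arguments": {
      "projectId": "my-project",
      "location": "us",
      "dataSource": "scheduled_query",
      "displayName": "daily-rollup",
      "destinationDatasetId": "analytics",
      "params": { "query": "SELECT CURRENT_DATE() AS d" }
    }
  }
}'
printf '%s\n' "$PAYLOAD"
# To send it (requires valid credentials):
# curl -s 'https://bigquerydatatransfer.googleapis.com/mcp' \
#   --header 'content-type: application/json' \
#   --header 'accept: application/json, text/event-stream' \
#   --data "$PAYLOAD"
```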
Input Schema
Request for creating a transfer configuration.
The only supported data sources are:

- Campaign Manager (`data_source_id`: `dcm_dt`)
- Cloud Storage (`data_source_id`: `google_cloud_storage`)
- Comparison Shopping Service (CSS) Center (`data_source_id`: `css_center`)
- Dataset Copies (`data_source_id`: `cross_region_copy`)
- Display & Video 360 (`data_source_id`: `displayvideo`)
- Google Ad Manager (`data_source_id`: `dfp_dt`)
- Google Ads (`data_source_id`: `google_ads`)
- Google Analytics 4 (`data_source_id`: `ga4`)
- Google Merchant Center (`data_source_id`: `merchant_center`)
- Google Play (`data_source_id`: `play`)
- Scheduled Queries (`data_source_id`: `scheduled_query`)
- Search Ads 360 (`data_source_id`: `search_ads`)
- YouTube Channel (`data_source_id`: `youtube_channel`)
- YouTube Content Owner (`data_source_id`: `youtube_content_owner`)
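Because the tool accepts only the data source IDs listed above, a client can pre-validate its input before calling the endpoint. This helper is hypothetical, not part of the API:

```shell
#!/bin/sh
# Hypothetical client-side check (not part of the API): verify that a
# data_source_id is one of the values supported by create_transfer_config.
is_supported_data_source() {
  case "$1" in
    dcm_dt|google_cloud_storage|css_center|cross_region_copy|displayvideo) echo "supported" ;;
    dfp_dt|google_ads|ga4|merchant_center|play|scheduled_query) echo "supported" ;;
    search_ads|youtube_channel|youtube_content_owner) echo "supported" ;;
    *) echo "unsupported" ;;
  esac
}
is_supported_data_source google_ads   # prints "supported"
is_supported_data_source amazon_s3    # prints "unsupported"
```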
CreateTransferConfigRequest
| JSON representation |
|---|
{ "dataSource": enum, "projectId": string, "location": string, "displayName": string, "destinationDatasetId": string, "params": { object (Struct) }, "scheduleOptions": { object (ScheduleOptionsV2) }, "notificationPubsubTopic": string, "emailPreferences": { object (EmailPreferences) }, // Union field authorization can be only one of the following: "versionInfo": string, "serviceAccountName": string } |
| Fields | |
|---|---|
dataSource |
Required. Data source. |
projectId |
Required. Project ID or project number of the transfer config. |
location |
Required. Location of the transfer config. If the specified location and the location of the destination BigQuery dataset do not match, the request fails. |
displayName |
Required. Display name of the transfer config. |
destinationDatasetId |
Required. Destination dataset ID of the transfer config where data will be loaded. |
params |
Required. Data source parameters for the transfer config. |
scheduleOptions |
Optional. Schedule options for the transfer config. If not specified, the transfer config will be created with its default schedule defined in the data source definition. |
notificationPubsubTopic |
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project_id}/topics/{topic_id} |
emailPreferences |
Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config. |
Union field authorization. Authorization for the transfer config. authorization can be only one of the following: |
|
versionInfo |
This is required only if new credentials are needed, as indicated by check_valid_creds. To obtain the version_info, authorize via: https://bigquery.cloud.google.com/datatransfer/oauthz/auth?redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=version_info&client_id=client_id&scope=data_source_scopes
Note that this should not be set when serviceAccountName is used to create the transfer config. |
serviceAccountName |
Optional service account email. If this field is set, the transfer config will be created with this service account's credentials. It requires that the requesting user calling this API has permissions to act as this service account. |
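The authorization union means a request may carry versionInfo or serviceAccountName, but not both. A minimal client-side guard, hypothetical and not part of the API, might look like:

```shell
#!/bin/sh
# Hypothetical guard (not part of the API): the `authorization` union permits
# at most one of versionInfo / serviceAccountName in a request body.
check_authorization() {
  body="$1"
  has_vi=no; has_sa=no
  case "$body" in *'"versionInfo"'*)        has_vi=yes ;; esac
  case "$body" in *'"serviceAccountName"'*) has_sa=yes ;; esac
  if [ "$has_vi" = yes ] && [ "$has_sa" = yes ]; then
    echo "invalid: set only one of versionInfo or serviceAccountName"
    return 1
  fi
  echo "ok"
}
check_authorization '{"serviceAccountName": "sa@my-project.iam.gserviceaccount.com"}'   # prints "ok"
```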
Struct
| JSON representation |
|---|
{ "fields": { string: value, ... } } |
| Fields | |
|---|---|
fields |
Unordered map of dynamically typed values. An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }. |
FieldsEntry
| JSON representation |
|---|
{ "key": string, "value": value } |
| Fields | |
|---|---|
key |
|
value |
|
Value
| JSON representation |
|---|
{ // Union field kind can be only one of the following: "nullValue": null, "numberValue": number, "stringValue": string, "boolValue": boolean, "structValue": { object (Struct) }, "listValue": { object (ListValue) } // End of list of possible types for union field kind. } |
| Fields | |
|---|---|
Union field kind. The kind of value. kind can be only one of the following: |
|
nullValue |
Represents a JSON null value. |
numberValue |
Represents a JSON number. Must not be NaN or Infinity. |
stringValue |
Represents a JSON string. |
boolValue |
Represents a JSON boolean (true or false). |
structValue |
Represents a JSON object. |
listValue |
Represents a JSON array. |
ListValue
| JSON representation |
|---|
{ "values": [ value ] } |
| Fields | |
|---|---|
values[] |
Repeated field of dynamically typed values. |
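Because `params` is a Struct, in JSON it is written as a plain object of key/value pairs. The sketch below shows one for a Cloud Storage transfer; the parameter names (`data_path_template`, `destination_table_name_template`, `file_format`) are the commonly documented Cloud Storage transfer parameters, and the bucket path and table name are illustrative:

```shell
#!/bin/sh
# Sketch: a `params` Struct serializes as an ordinary JSON object.
# Parameter names follow the commonly documented Cloud Storage transfer
# parameters; the bucket path and table name are illustrative.
PARAMS='{
  "data_path_template": "gs://my-bucket/exports/*.csv",
  "destination_table_name_template": "daily_events",
  "file_format": "CSV"
}'
printf '%s\n' "$PARAMS"
```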
ScheduleOptionsV2
| JSON representation |
|---|
{ // Union field schedule can be only one of the following: "timeBasedSchedule": { object (TimeBasedSchedule) }, "manualSchedule": { object (ManualSchedule) }, "eventDrivenSchedule": { object (EventDrivenSchedule) } // End of list of possible types for union field schedule. } |
| Fields | |
|---|---|
Union field schedule. Data transfer schedules. schedule can be only one of the following: |
|
timeBasedSchedule |
Time based transfer schedule options. This is the default schedule option. |
manualSchedule |
Manual transfer schedule. If set, the transfer run will not be auto-scheduled by the system, unless the client invokes StartManualTransferRuns. This is equivalent to disable_auto_scheduling = true. |
eventDrivenSchedule |
Event driven transfer schedule options. If set, the transfer will be scheduled upon event arrival. |
TimeBasedSchedule
| JSON representation |
|---|
{ "schedule": string, "startTime": string, "endTime": string } |
| Fields | |
|---|---|
schedule |
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source. |
startTime |
Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
endTime |
Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
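Putting the TimeBasedSchedule fields together, a scheduleOptionsV2 value might be written as follows; the schedule string and the start/end window are illustrative:

```shell
#!/bin/sh
# Illustrative scheduleOptionsV2 value: a time-based schedule that runs
# every 24 hours between two RFC 3339 instants. Values are examples only.
SCHEDULE_OPTIONS='{
  "timeBasedSchedule": {
    "schedule": "every 24 hours",
    "startTime": "2025-01-01T00:00:00Z",
    "endTime": "2025-12-31T00:00:00Z"
  }
}'
printf '%s\n' "$SCHEDULE_OPTIONS"
```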
Timestamp
| JSON representation |
|---|
{ "seconds": string, "nanos": integer } |
| Fields | |
|---|---|
seconds |
Represents seconds of UTC time since Unix epoch 1970-01-01T00:00:00Z. Must be between -62135596800 and 253402300799 inclusive (which corresponds to 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z). |
nanos |
Non-negative fractions of a second at nanosecond resolution. This field is the nanosecond portion of the duration, not an alternative to seconds. Negative second values with fractions must still have non-negative nanos values that count forward in time. Must be between 0 and 999,999,999 inclusive. |
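Since the seconds field is plain Unix time, an RFC 3339 instant can be converted to it on the command line. This sketch assumes GNU date (the -d flag); BSD/macOS date needs `-j -f` instead:

```shell
#!/bin/sh
# Convert an RFC 3339 instant to the Timestamp `seconds` value.
# Assumes GNU date (-d); on BSD/macOS use `date -j -f` instead.
rfc3339_to_seconds() {
  date -u -d "$1" +%s
}
rfc3339_to_seconds "2014-10-02T15:01:23Z"   # prints 1412262083
```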
EventDrivenSchedule
| JSON representation |
|---|
{ // Union field eventStream can be only one of the following: "pubsubSubscription": string // End of list of possible types for union field eventStream. } |
| Fields | |
|---|---|
Union field eventStream. The event stream which specifies the Event-driven transfer options. Event-driven transfers listen to an event stream to transfer data. eventStream can be only one of the following: |
|
pubsubSubscription |
Pub/Sub subscription name used to receive events. Only the Google Cloud Storage data source supports this option. Format: projects/{project}/subscriptions/{subscription} |
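The subscription name must match projects/{project}/subscriptions/{subscription}. A simple client-side format check, hypothetical and not part of the API, could use a shell pattern:

```shell
#!/bin/sh
# Hypothetical format check for the pubsubSubscription field:
# projects/{project}/subscriptions/{subscription}
is_valid_subscription() {
  case "$1" in
    projects/*/subscriptions/*) echo "valid" ;;
    *)                          echo "invalid" ;;
  esac
}
is_valid_subscription "projects/my-project/subscriptions/gcs-events"   # prints "valid"
is_valid_subscription "my-project/gcs-events"                          # prints "invalid"
```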
EmailPreferences
| JSON representation |
|---|
{ "enableFailureEmail": boolean } |
| Fields | |
|---|---|
enableFailureEmail |
If true, email notifications will be sent on transfer run failures. |
Output Schema
Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account.
TransferConfig
| JSON representation |
|---|
{ "name": string, "displayName": string, "dataSourceId": string, "params": { object (Struct) }, "schedule": string, "scheduleOptions": { object (ScheduleOptions) }, "scheduleOptionsV2": { object (ScheduleOptionsV2) }, "dataRefreshWindowDays": integer, "disabled": boolean, "updateTime": string, "nextRunTime": string, "state": enum (TransferState), "userId": string, "datasetRegion": string, "notificationPubsubTopic": string, "emailPreferences": { object (EmailPreferences) }, "encryptionConfiguration": { object (EncryptionConfiguration) }, "error": { object (Status) }, "managedTableType": enum, // Union field destination can be only one of the following: "destinationDatasetId": string, // End of list of possible types for union field destination. "ownerInfo": { object (UserInfo) } } |
| Fields | |
|---|---|
name |
Identifier. The resource name of the transfer config. Transfer config names have the form either projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, even though it is not guaranteed or required. The name is ignored when creating a transfer config. |
displayName |
User specified display name for the data transfer. |
dataSourceId |
Data source ID. This cannot be changed once data transfer is created. The full list of available data source IDs can be returned through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list |
params |
Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq |
schedule |
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source. |
scheduleOptions |
Options customizing the data transfer schedule. |
scheduleOptionsV2 |
Options customizing different types of data transfer schedule. This field replaces "schedule" and "schedule_options" fields. ScheduleOptionsV2 cannot be used together with ScheduleOptions/Schedule. |
dataRefreshWindowDays |
The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value. |
disabled |
Whether this config is disabled. When set to true, no runs will be scheduled for this transfer config. |
updateTime |
Output only. Data transfer modification time. Ignored by server on input. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
nextRunTime |
Output only. Next time when data transfer will run. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
state |
Output only. State of the most recently updated transfer run. |
userId |
Deprecated. Unique ID of the user on whose behalf transfer is done. |
datasetRegion |
Output only. Region in which BigQuery dataset is located. |
notificationPubsubTopic |
Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a pubsub topic is: projects/{project_id}/topics/{topic_id} |
emailPreferences |
Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config. |
encryptionConfiguration |
The encryption configuration part. Currently, it is only used for the optional KMS key name. The BigQuery service account of your project must be granted permissions to use the key. Read methods will return the key name applied in effect. Write methods will apply the key if it is present, or otherwise try to apply project default keys if it is absent. |
error |
Output only. Error code with detailed information about the reason for the latest config failure. |
managedTableType |
The classification of the destination table. |
Union field destination. The destination of the transfer config. destination can be only one of the following: |
|
destinationDatasetId |
The BigQuery target dataset id. |
ownerInfo |
Output only. Information about the user whose credentials are used to transfer data. Populated only for |
Struct
| JSON representation |
|---|
{ "fields": { string: value, ... } } |
| Fields | |
|---|---|
fields |
Unordered map of dynamically typed values. An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }. |
FieldsEntry
| JSON representation |
|---|
{ "key": string, "value": value } |
| Fields | |
|---|---|
key |
|
value |
|
Value
| JSON representation |
|---|
{ // Union field kind can be only one of the following: "nullValue": null, "numberValue": number, "stringValue": string, "boolValue": boolean, "structValue": { object (Struct) }, "listValue": { object (ListValue) } // End of list of possible types for union field kind. } |
| Fields | |
|---|---|
Union field kind. The kind of value. kind can be only one of the following: |
|
nullValue |
Represents a JSON null value. |
numberValue |
Represents a JSON number. Must not be NaN or Infinity. |
stringValue |
Represents a JSON string. |
boolValue |
Represents a JSON boolean (true or false). |
structValue |
Represents a JSON object. |
listValue |
Represents a JSON array. |
ListValue
| JSON representation |
|---|
{ "values": [ value ] } |
| Fields | |
|---|---|
values[] |
Repeated field of dynamically typed values. |
ScheduleOptions
| JSON representation |
|---|
{ "disableAutoScheduling": boolean, "startTime": string, "endTime": string } |
| Fields | |
|---|---|
disableAutoScheduling |
If true, automatic scheduling of data transfer runs for this configuration will be disabled. The runs can be started on ad-hoc basis using StartManualTransferRuns API. When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored. |
startTime |
Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
endTime |
Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. The time when a data transfer can be triggered manually is not limited by this option. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
Timestamp
| JSON representation |
|---|
{ "seconds": string, "nanos": integer } |
| Fields | |
|---|---|
seconds |
Represents seconds of UTC time since Unix epoch 1970-01-01T00:00:00Z. Must be between -62135596800 and 253402300799 inclusive (which corresponds to 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z). |
nanos |
Non-negative fractions of a second at nanosecond resolution. This field is the nanosecond portion of the duration, not an alternative to seconds. Negative second values with fractions must still have non-negative nanos values that count forward in time. Must be between 0 and 999,999,999 inclusive. |
ScheduleOptionsV2
| JSON representation |
|---|
{ // Union field schedule can be only one of the following: "timeBasedSchedule": { object (TimeBasedSchedule) }, "manualSchedule": { object (ManualSchedule) }, "eventDrivenSchedule": { object (EventDrivenSchedule) } // End of list of possible types for union field schedule. } |
| Fields | |
|---|---|
Union field schedule. Data transfer schedules. schedule can be only one of the following: |
|
timeBasedSchedule |
Time based transfer schedule options. This is the default schedule option. |
manualSchedule |
Manual transfer schedule. If set, the transfer run will not be auto-scheduled by the system, unless the client invokes StartManualTransferRuns. This is equivalent to disable_auto_scheduling = true. |
eventDrivenSchedule |
Event driven transfer schedule options. If set, the transfer will be scheduled upon event arrival. |
TimeBasedSchedule
| JSON representation |
|---|
{ "schedule": string, "startTime": string, "endTime": string } |
| Fields | |
|---|---|
schedule |
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. NOTE: The minimum interval time between recurring transfers depends on the data source; refer to the documentation for your data source. |
startTime |
Specifies time to start scheduling transfer runs. The first run will be scheduled at or after the start time according to a recurrence pattern defined in the schedule string. The start time can be changed at any moment. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
endTime |
Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time. The end time can be changed at any moment. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
EventDrivenSchedule
| JSON representation |
|---|
{ // Union field eventStream can be only one of the following: "pubsubSubscription": string // End of list of possible types for union field eventStream. } |
| Fields | |
|---|---|
Union field eventStream. The event stream which specifies the Event-driven transfer options. Event-driven transfers listen to an event stream to transfer data. eventStream can be only one of the following: |
|
pubsubSubscription |
Pub/Sub subscription name used to receive events. Only the Google Cloud Storage data source supports this option. Format: projects/{project}/subscriptions/{subscription} |
EmailPreferences
| JSON representation |
|---|
{ "enableFailureEmail": boolean } |
| Fields | |
|---|---|
enableFailureEmail |
If true, email notifications will be sent on transfer run failures. |
UserInfo
| JSON representation |
|---|
{ "email": string } |
| Fields | |
|---|---|
email |
E-mail address of the user. |
EncryptionConfiguration
| JSON representation |
|---|
{ "kmsKeyName": string } |
| Fields | |
|---|---|
kmsKeyName |
The name of the KMS key used for encrypting BigQuery data. |
StringValue
| JSON representation |
|---|
{ "value": string } |
| Fields | |
|---|---|
value |
The string value. |
Status
| JSON representation |
|---|
{ "code": integer, "message": string, "details": [ { "@type": string, field1: ..., ... } ] } |
| Fields | |
|---|---|
code |
The status code, which should be an enum value of google.rpc.Code. |
message |
A developer-facing error message, which should be in English. Any user-facing error message should be localized and sent in the google.rpc.Status.details field, or localized by the client. |
details[] |
A list of messages that carry the error details. There is a common set of message types for APIs to use. An object containing fields of an arbitrary type. An additional field "@type" contains a URI identifying the type. Example: { "id": 1234, "@type": "types.example.com/standard/id" }. |
Any
| JSON representation |
|---|
{ "typeUrl": string, "value": string } |
| Fields | |
|---|---|
typeUrl |
Identifies the type of the serialized Protobuf message with a URI reference consisting of a prefix ending in a slash and the fully-qualified type name. Example: type.googleapis.com/google.protobuf.StringValue. This string must contain at least one "/" character. The prefix is arbitrary, and Protobuf implementations are expected to simply strip off everything up to and including the last "/" to obtain the fully-qualified type name. All type URL strings must be legal URI references with the additional restriction (for the text format) that the content of the reference must consist only of alphanumeric characters, percent-encoded escapes, and a small set of additional punctuation characters. |
value |
Holds a Protobuf serialization of the type described by type_url. A base64-encoded string. |
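Since the value field carries base64 of the serialized message bytes, the encoding can be demonstrated directly in the shell. The payload below is arbitrary text, not a real serialized Protobuf message:

```shell
#!/bin/sh
# The Any `value` field is base64 of the serialized message bytes.
# This encodes arbitrary bytes only to show the encoding; a real payload
# would be a serialized Protobuf message matching typeUrl.
printf 'example-bytes' | base64   # prints ZXhhbXBsZS1ieXRlcw==
```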
Tool Annotations
Destructive Hint: ❌ | Idempotent Hint: ❌ | Read Only Hint: ❌ | Open World Hint: ❌