Migrating a workload from another platform to Cloud SQL for SQL Server often involves using the Google Cloud console to import data from a SQL dump file that you export from your previous environment.
This tutorial shows you how to create the Google Cloud resources that you need and then import a SQL database to a Cloud SQL for SQL Server instance. The tutorial demonstrates best practices when migrating to Cloud SQL for SQL Server, including the use of a Virtual Private Cloud (VPC) network with private services access and enabling private IP for your Cloud SQL instance.
As you work through the steps, retain the default values for settings unless otherwise specified.
Objectives
- Download a sample SQL dump file.
- Create a new Virtual Private Cloud network with private services access.
- Create a Cloud Storage bucket and upload a SQL dump file to it.
- Create a Cloud SQL for SQL Server instance configured for private IP.
- Create a destination database.
- Import from the dump file to a new database.
- Verify that the database was successfully imported by viewing the structure and running a query.
Costs
In this document, you use the following billable components of Google Cloud:

- Cloud SQL
- Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator.
When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

  Roles required to select or create a project

  - Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
  - Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.

- If you're using an existing project for this guide, verify that you have the permissions required to complete this guide. If you created a new project, then you already have the required permissions.
- Verify that billing is enabled for your Google Cloud project.
- Enable the Cloud SQL, Cloud SQL Admin, Compute Engine, and Cloud Storage APIs.

  Roles required to enable APIs

  To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
Required roles
To get the permissions that you need to complete this tutorial, ask your administrator to grant you the following IAM roles on your project:
- Cloud SQL Admin (roles/cloudsql.admin)
- Storage Admin (roles/storage.admin)
- Compute Network Admin (roles/compute.networkAdmin)
For more information about granting roles, see Manage access to projects, folders, and organizations.
You might also be able to get the required permissions through custom roles or other predefined roles.
Obtain a sample database dump file
For this tutorial, you'll use a small sample database that contains country codes and world capitals.
Download the SQL file for the countries database to your local
environment: countries-sqlserver.sql.
Create a network with private services access
Before you create the rest of your resources, create a VPC network to run your services on. Private services access lets you restrict access to your Cloud SQL database by establishing a private connection between your external network and Cloud SQL for SQL Server using internal IPv4 addresses.
- In the Google Cloud console, go to the VPC networks page.
- Click Create VPC network.
- In the name field, enter tutorial-network.
- For Subnet creation mode, select Custom.
- In the subnet name field, enter tutorial-subnet.
- Select a region near you.
- For IPv4 range, enter 10.0.0.0/24.
- For Private Google Access, select On.
- Click Done.
- At the bottom of the page, click Create.
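The 10.0.0.0/24 range you entered is an RFC 1918 private range that gives the subnet 256 addresses. As a quick standalone sanity check (illustrative only, not part of the console workflow), Python's standard ipaddress module can confirm the size and privacy of the range:

```python
import ipaddress

# The subnet range used for tutorial-subnet.
subnet = ipaddress.ip_network("10.0.0.0/24")

print(subnet.num_addresses)   # 256 total addresses in a /24
print(subnet.is_private)      # True: 10.0.0.0/8 is an RFC 1918 private block
print(subnet[0], subnet[-1])  # first and last address in the range
```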
After the VPC network creation process finishes, you can configure private services access for the network.
- On the VPC networks screen, click tutorial-network.
- In the menu bar for tutorial-network, click Private services access.
- Click Allocated IP ranges for services.
- Click Allocate IP range.
- For Name, enter tutorial-range.
- For IP address range, select Custom.
- In the Range field, enter 192.168.0.0/20.
- Click Allocate.
- In the submenu, click Private connections to services.
- Click Create connection.
- In the Assigned allocation drop-down, select tutorial-range.
- Click Connect. In a minute or two, the Google Cloud console displays a message letting you know that you have successfully created a private connection.
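The allocated range for private services access must not overlap your subnet range, or the allocation fails. The console validates this for you, but as an illustrative standalone sketch, the same check can be made with Python's ipaddress module:

```python
import ipaddress

subnet = ipaddress.ip_network("10.0.0.0/24")       # tutorial-subnet
services = ipaddress.ip_network("192.168.0.0/20")  # tutorial-range

# The two ranges are disjoint, so the allocation is valid.
print(subnet.overlaps(services))  # False
print(services.num_addresses)     # 4096 addresses reserved for services
```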
Create a Cloud Storage bucket
Next, create a Cloud Storage bucket to store the SQL dump file that you downloaded earlier. The Cloud SQL import tool expects the dump file to be in a bucket. A Cloud Storage bucket must have a globally unique name.
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- Click Create.
- For the bucket name, create a globally unique name consisting of lowercase letters, numbers, and hyphens. You might want to use a random string generator, such as random.org/strings to generate the name. Note the name that you choose.
- Click Continue.
- Under Location type, select Region. This is the lowest-cost option.
- Choose the same region you chose for your subnet. Locating your Google Cloud resources in the same region reduces latency, improves speed, lowers data transfer costs, and simplifies networking.
- Click Continue.
- Click Create.
- If prompted with a dialog, leave Enforce public access prevention on this bucket selected, and click Confirm.
The Bucket details page for the new bucket opens with the Objects pane selected.
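If you prefer not to use an online generator, a short script can produce a compliant name. This is an illustrative sketch: the tutorial-bucket prefix is arbitrary, and a random suffix only makes a collision unlikely, not impossible.

```python
import random
import string

def make_bucket_name(prefix="tutorial-bucket"):
    """Generate a likely-unique bucket name from lowercase letters and digits."""
    suffix = "".join(random.choices(string.ascii_lowercase + string.digits, k=12))
    return f"{prefix}-{suffix}"

name = make_bucket_name()
print(name)
# Bucket names must be 3-63 characters of lowercase letters, digits, and hyphens.
assert 3 <= len(name) <= 63
```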
Upload your dump file to the bucket
Upload to your bucket the sample SQL dump file that you downloaded earlier.
- On the Objects tab, click Upload and then click Upload files.
- Navigate to and select the countries-sqlserver.sql file.
- Click Open. Cloud Storage uploads the dump file to the bucket.
Create a Cloud SQL instance
Create a Cloud SQL instance in the Google Cloud console using the following settings. For all other settings, keep the default values. Although you don't need to enable private IP for the import process, using private IP for a production workload is a best practice.
- Go to the Cloud SQL Instances page in the Google Cloud console.
- Click Create Instance.
- Click Choose SQL Server.
- For Choose a Cloud SQL edition, choose Enterprise.
- For Edition preset, select Sandbox.
- For Instance ID, enter tutorial-instance.
- Choose and enter a password for the default user account and save it for future use.
- Choose the same region that you chose for your subnet and bucket.
- For Zonal availability, select Single zone.
- Expand Show configuration options.
- Expand Connections.
- Clear Public IP.
- Select Private IP.
- From the Network drop-down, select tutorial-network. This places the new Cloud SQL instance in the private network that you created earlier.
- Click Create instance, and then wait until the instance initializes and starts. The initialization process can take more than five minutes.
Create a destination database
The import workflow requires you to select a destination database to import to, so you need to create an empty database.
- In the SQL navigation menu, click Databases.
- Click Create database.
- For Database Name, type countries.
- Click Create.
Import from the dump file
Now you're ready to import the countries database using the
countries-sqlserver.sql dump file that you uploaded to your Cloud Storage
bucket.
- In the SQL navigation menu, click Overview.
- On the overview page, click Import.
- Under File format, select SQL.
- Under Select source file, choose Select file from Google Cloud Storage.
- Click Browse.
- Expand the storage bucket that you created earlier.
- Click countries-sqlserver.sql.
- Click Select.
- In the Destination section, click the Database drop-down and then select countries.
- Click Import.
When the import process is complete and the countries database has been
imported to Cloud SQL for SQL Server, a success message is displayed.
Validate the imported data in Cloud SQL for SQL Server
After the import operation is complete, you can verify that the database was imported by connecting to it using Cloud SQL Studio, inspecting the list of tables, and running a test query against the data.
Authenticate to Cloud SQL Studio
With the user account that you created earlier, connect to the new database using Cloud SQL Studio.
- In the SQL navigation menu, click Cloud SQL Studio. A login dialog is displayed.
- In the Database drop-down, choose countries.
- Select Built-in database authentication.
- In the User drop-down, select sqlserver.
- In the Password field, enter the password you specified when you created the instance.
- Click Authenticate. Cloud SQL Studio opens.
View and query the tables
- In the Explorer pane, examine the countries database and confirm that the database has two tables: capitals and country_codes.
- Click Untitled Query to open the query editor.
- Paste the following code into the query editor:

      SELECT [capitals].[country_capital], [country_codes].[country_name]
      FROM [capitals]
      JOIN [country_codes] ON [capitals].[alpha_2_code] = [country_codes].[alpha_2_code]
      ORDER BY [capitals].[country_capital];

- Click Run.
The results pane displays an alphabetical list of world capitals and their countries.
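The query performs an inner join of the two tables on their shared alpha_2_code column, then sorts by capital. To illustrate the join logic outside the database, here's a small Python sketch using made-up sample rows (the real table contents come from the dump file):

```python
# Hypothetical sample rows mirroring the two imported tables.
capitals = [
    {"alpha_2_code": "FR", "country_capital": "Paris"},
    {"alpha_2_code": "JP", "country_capital": "Tokyo"},
]
country_codes = [
    {"alpha_2_code": "JP", "country_name": "Japan"},
    {"alpha_2_code": "FR", "country_name": "France"},
]

# Inner join on alpha_2_code, then sort by capital, like the SQL query.
names = {row["alpha_2_code"]: row["country_name"] for row in country_codes}
result = sorted(
    (cap["country_capital"], names[cap["alpha_2_code"]])
    for cap in capitals
    if cap["alpha_2_code"] in names
)
print(result)  # [('Paris', 'France'), ('Tokyo', 'Japan')]
```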
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, do one of the following:
- Delete the project that contains the resources.
- Keep the project and delete the individual resources.
Delete the project
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
Delete individual resources
If you want to keep the project but avoid incurring charges, delete the Cloud SQL instance, Cloud Storage bucket, and VPC network that you created during the tutorial.
Delete the Cloud SQL instance
First, disable deletion protection and then delete the tutorial Cloud SQL instance:
- In the SQL navigation menu, click Overview.
- Click Edit.
- Expand the Data Protection section.
- In Instance deletion protection, deselect all options.
- Click Save. When the operation is complete, Delete is selectable.
- Click Delete. A dialog appears.
- In the Instance ID field, enter tutorial-instance.
- Click Delete.
Delete the Cloud Storage bucket
Next, delete the storage bucket and its contents.
- In the main Google Cloud console navigation menu, go to Cloud Storage > Buckets.
- Select the box next to the name of the bucket you created earlier.
- Click Delete.
- In the dialog, confirm deletion by typing DELETE, and then click Delete.
Delete the VPC network
Now that you've deleted the resources in your VPC network, you are ready to delete the network. Before you can delete the VPC network, you must delete the peering connection that was created automatically when you created the VPC network.
- In the main navigation menu, go to VPC networks > VPC network peering.
- Select the box next to the peering connection with the VPC network tutorial-network.
- Click Delete.
- In the dialog, confirm by clicking Delete.
Now that the peering connection has been removed, you can delete the VPC network.
- In the VPC Network menu, click VPC networks.
- Click tutorial-network to open the details page.
- Click Delete VPC network.
- In the dialog, confirm deletion by typing tutorial-network, and then click Delete.
What's next
- Best practices for importing and exporting data.
- Export and import using SQL dump files.
- VPC networks
- Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.