Deployment

This page guides you through the Cortex Framework deployment steps for your enterprise data.

Before you begin

Ensure you have completed the prerequisites outlined on the Prerequisites page before continuing with the deployment.

Prior to deploying Cortex Framework, relevant SAP ERP tables must be replicated into BigQuery. For more information, see the SAP ERP data source page.

Deployment

With your environment prepared, you are ready to execute the deployment. Use the Cortex Framework uv scripts to build and push your data assets to your Google Cloud project, transforming your local configurations into a live, scalable data architecture.

Configuration

Before deployment, the data foundation and data product layers are configured through a centralized, configuration-as-code approach. Cortex Framework uses a config/config.yaml file to configure the deployment. This file contains the essential mapping between your source data and the target Google Cloud environment. Proper configuration ensures that Cortex Framework knows exactly where to find your raw data and where to materialize the resulting layers.

For a successful Cortex Framework deployment, set the following mandatory values:

  • YOUR_BUILD_PROJECT_ID: The project where the deployment orchestration and uv scripts are executed.
  • YOUR_SOURCE_PROJECT_ID: The project containing your raw source data.
  • YOUR_TARGET_PROJECT_ID: The destination project where the processed data products and BigQuery datasets will reside.
  • YOUR_REPO_PROJECT_ID: The project hosting the Dataform repository and deployment artifacts.

For detailed information on the configuration file, see Deployment configuration.
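As a minimal sketch, the four mandatory values map onto config/config.yaml roughly as follows. The exact key names are assumptions based on the placeholder names above; check the configuration file shipped with the framework for the authoritative schema, and replace the placeholder values with your own project IDs:

```yaml
# config/config.yaml — illustrative sketch only; key names may differ
# from the template shipped with Cortex Framework.
build_project_id: "YOUR_BUILD_PROJECT_ID"    # runs deployment orchestration and uv scripts
source_project_id: "YOUR_SOURCE_PROJECT_ID"  # contains the raw source data
target_project_id: "YOUR_TARGET_PROJECT_ID"  # receives data products and BigQuery datasets
repo_project_id: "YOUR_REPO_PROJECT_ID"      # hosts the Dataform repository and artifacts
```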

Configure IAM roles

Configure the required Identity and Access Management (IAM) permissions to deploy Cortex Framework content within the required Google Cloud projects.

Roles for build project

To get the permissions that you need to build Cortex Framework, ask your administrator to grant you the BigQuery Job User (roles/bigquery.jobUser) IAM role on your build project. For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.

Roles for source project

To get the permissions that you need to deploy Cortex Framework, ask your administrator to grant you the BigQuery Data Viewer (roles/bigquery.dataViewer) IAM role on your source project. For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.

Roles for target projects

To get the permissions that you need to deploy Cortex Framework, ask your administrator to grant you the following IAM roles on your target projects:

For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.
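As a sketch, the role grants for the build and source projects can be applied with the gcloud CLI. The project IDs and the member email are placeholders; the roles for the target projects depend on the list your administrator provides, so they are not shown here:

```shell
# Placeholder member — replace with the principal deploying Cortex Framework.
MEMBER="user:deployer@example.com"

# Build project: permission to run BigQuery jobs during the build.
gcloud projects add-iam-policy-binding YOUR_BUILD_PROJECT_ID \
  --member="$MEMBER" --role="roles/bigquery.jobUser"

# Source project: permission to read the raw source data.
gcloud projects add-iam-policy-binding YOUR_SOURCE_PROJECT_ID \
  --member="$MEMBER" --role="roles/bigquery.dataViewer"
```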

Execute deployment

Run the following command to trigger the Cortex Framework deployment. This process will perform the following actions:

  • Verify that all prerequisites are complete.
  • Connect to the raw dataset, and based on the schema information, build and compile the .sqlx scripts.
  • Create the Dataform repository and workspace, then sync the compiled artifacts with the repository.

uv run cortex-build-and-deploy --config "config/config.yaml"

Command line options

The uv run command line interface (CLI) supports these actions:

  • uv run cortex-build: Executes compilation of the Dataform workspace, translating YAML configs and SQLX/JS templates into deployable Dataform assets. For more information, see CLI Cortex build.
  • uv run cortex-deploy: Pushes the compiled assets to the target Dataform workspace. For more information, see CLI Cortex deploy.
  • uv run cortex-build-and-deploy: Combines the compilation and pushing of the built assets into the Dataform workspace. For more information, see CLI Cortex build and deploy.
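For example, the combined command can be split into its two stages, which is useful when you want to inspect the compiled assets before pushing them. This assumes the individual commands accept the same --config flag as the combined command shown earlier:

```shell
# Stage 1: compile YAML configs and SQLX/JS templates into Dataform assets.
uv run cortex-build --config "config/config.yaml"

# Stage 2: after reviewing the compiled output, push it to the Dataform workspace.
uv run cortex-deploy --config "config/config.yaml"
```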

Execute the data foundation and data products pipeline

After the deployment is complete, the final step is to execute all actions in the Dataform repository. By executing the pipeline within the Dataform workspace, you initiate the transformation of raw data into structured, AI-ready data products, ready for immediate business use. Follow these steps:

  1. Go to Dataform.
  2. Make sure you are in the correct project.
  3. Click the created repository name.
  4. Click the workspace name.
  5. Click Start execution.
  6. Click Execute actions.
  7. Select All actions. You can also select only the steps you are interested in.
  8. Click Start execution.

Verify the success of the deployment:

  1. Click Executions.
  2. Verify that the execution completed successfully.
  3. Click the execution link for more details.

You can now review the BigQuery data assets deployed in the datasets configured in the config file, for example CORTEX_SAP_DATAFOUNDATION and CORTEX_SAP_DATAPRODUCTS.
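As a quick command-line check, you can list the datasets in the target project with the bq CLI. The project ID is a placeholder, and the dataset names follow the example above:

```shell
# List BigQuery datasets in the target project to confirm the deployment
# created the expected datasets (for example, CORTEX_SAP_DATAFOUNDATION).
bq ls --project_id=YOUR_TARGET_PROJECT_ID
```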