BigLake documentation

BigLake is a storage engine that unifies data warehouses and data lakes. It lets you access a single copy of data stored in open formats such as Apache Iceberg, Apache Parquet, and ORC with fine-grained security, without moving or duplicating the data.



Documentation resources

Find quickstarts and guides, review key references, and get help with common issues.
Explore self-paced training, use cases, reference architectures, and code samples with examples of how to use and connect Google Cloud services.
Training and tutorials

Learn how to use BigLake metastore with Apache Spark and Apache Iceberg on Cloud Storage.

30 minutes · Introductory · Free
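The catalog wiring that tutorial walks through can be sketched as a set of Spark properties. This is a sketch only: the catalog name, project, location, and bucket below are hypothetical placeholders, and the `catalog-impl` class name follows the Iceberg custom-catalog pattern described in the BigLake metastore guide (treat it as an assumption, not a verified value).

```python
# Sketch only: Spark properties for an Iceberg catalog backed by BigLake
# metastore. All concrete values (project, location, bucket, catalog name)
# are hypothetical placeholders.
CATALOG = "blms"  # hypothetical Spark catalog name

spark_conf = {
    f"spark.sql.catalog.{CATALOG}": "org.apache.iceberg.spark.SparkCatalog",
    # Assumed catalog implementation class for BigLake metastore.
    f"spark.sql.catalog.{CATALOG}.catalog-impl": "org.apache.iceberg.gcp.biglake.BigLakeCatalog",
    f"spark.sql.catalog.{CATALOG}.gcp_project": "my-project",              # placeholder
    f"spark.sql.catalog.{CATALOG}.gcp_location": "us-central1",            # placeholder
    f"spark.sql.catalog.{CATALOG}.warehouse": "gs://my-bucket/warehouse",  # placeholder
}

# On a real cluster these would be applied when building the session, e.g.:
#   builder = SparkSession.builder.appName("iceberg-biglake")
#   for key, value in spark_conf.items():
#       builder = builder.config(key, value)
```

Once the catalog is registered, Iceberg tables created through it on Cloud Storage are visible to any engine that reads the same metastore.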

Use cases

Use BigLake to unify your data silos and enable consistent security and performance across different analytics engines.


Implement row- and column-level security for data stored in open formats in Cloud Storage.
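As an illustration of the row-level piece, BigQuery's `CREATE ROW ACCESS POLICY` DDL can restrict which rows of a BigLake table a principal can read. Every identifier below (dataset, table, policy, group, column) is a hypothetical placeholder, and the statement is shown as a string rather than executed:

```python
# Sketch: row-level security on a BigLake table using BigQuery's
# CREATE ROW ACCESS POLICY DDL. All identifiers are hypothetical placeholders.
ROW_POLICY_DDL = """
CREATE ROW ACCESS POLICY us_rows_only
ON mydataset.biglake_orders
GRANT TO ("group:us-analysts@example.com")
FILTER USING (region = "US")
"""

# Column-level security is configured separately, via policy tags attached
# to table columns, not through this DDL statement.

# With credentials in place, the statement could be run with the
# google-cloud-bigquery client, e.g.:
#   from google.cloud import bigquery
#   bigquery.Client().query(ROW_POLICY_DDL).result()
```

Because the policy is enforced by BigQuery rather than by the file layer, the same filtered view applies to every query engine that reads the table through BigLake.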

Code samples

Explore Anthos with our demo deployment on Google Cloud, including Bank of Anthos.

Terraform · Partner


Sample HTTP-based web app with microservices that simulates a bank's payment processing network.

Financial


HTTP-based e-commerce app with microservices.
