BigLake documentation
BigLake is a storage engine that unifies data warehouses and lakes. It lets you apply fine-grained security to open formats such as Apache Iceberg, Apache Parquet, and ORC through a single copy of data, without moving or duplicating it.
Documentation resources
Introduction to BigLake
BigLake Iceberg tables
Related resources
BigLake metastore with Spark, Iceberg, and Cloud Storage
Learn how to use BigLake metastore with Apache Spark and Apache Iceberg on Cloud Storage.
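As a minimal sketch of what that setup involves: registering BigLake metastore as an Iceberg catalog in Spark comes down to a handful of catalog properties. The property names below follow the Iceberg BigLake catalog plugin; the project, location, catalog, and bucket values are placeholders, not real resources.

```python
# Sketch: Spark properties that register BigLake metastore as an Iceberg
# catalog. The catalog name "blms" is arbitrary; all gcp_* / warehouse
# values are hypothetical placeholders.
CATALOG = "blms"

spark_conf = {
    f"spark.sql.catalog.{CATALOG}": "org.apache.iceberg.spark.SparkCatalog",
    f"spark.sql.catalog.{CATALOG}.catalog-impl": "org.apache.iceberg.gcp.biglake.BigLakeCatalog",
    f"spark.sql.catalog.{CATALOG}.gcp_project": "my-project",      # placeholder project ID
    f"spark.sql.catalog.{CATALOG}.gcp_location": "us-central1",    # placeholder region
    f"spark.sql.catalog.{CATALOG}.blms_catalog": "my_catalog",     # placeholder metastore catalog
    f"spark.sql.catalog.{CATALOG}.warehouse": "gs://my-bucket/wh", # placeholder GCS path
}

# With pyspark and the Iceberg + BigLake jars on the classpath, these
# properties would be applied when building the session:
#   builder = SparkSession.builder.appName("biglake-iceberg")
#   for key, value in spark_conf.items():
#       builder = builder.config(key, value)
```

Tables created through this catalog (for example, `CREATE TABLE blms.demo.events (...) USING iceberg`) are then tracked in BigLake metastore with their data files in the Cloud Storage warehouse path.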
Unify data warehouses and lakes
Use BigLake to unify your data silos and enable consistent security and performance across different analytics engines.
Enable fine-grained access control
Implement row- and column-level security for data stored in open formats in Cloud Storage.
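For a sense of what row-level security looks like in practice: BigQuery expresses it as `CREATE ROW ACCESS POLICY` DDL on the table. The helper below only renders the statement; the table, group, and filter predicate are hypothetical placeholders.

```python
# Sketch: render BigQuery's CREATE ROW ACCESS POLICY DDL, which restricts
# which rows a principal can query. All argument values in the example
# call are placeholders.
def row_access_policy_ddl(policy: str, table: str, grantee: str, predicate: str) -> str:
    """Render a CREATE ROW ACCESS POLICY statement for the given table."""
    return (
        f"CREATE ROW ACCESS POLICY {policy}\n"
        f"ON `{table}`\n"
        f"GRANT TO ('{grantee}')\n"
        f"FILTER USING ({predicate})"
    )

ddl = row_access_policy_ddl(
    policy="apac_only",
    table="my-project.sales.biglake_orders",  # placeholder table
    grantee="group:apac-sales@example.com",   # placeholder principal
    predicate="region = 'APAC'",              # placeholder filter
)
print(ddl)
```

The rendered statement would be submitted like any other query (for example, via `google.cloud.bigquery.Client.query`); after it runs, members of the granted group see only rows matching the filter predicate.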