Last updated: May 14, 2026


Snowflake Open Catalog

Snowflake Open Catalog is Snowflake’s managed catalog service built on Apache Polaris — the open-source Iceberg REST Catalog implementation that Snowflake co-created with Dremio and donated to the Apache Software Foundation.

Open Catalog provides a vendor-neutral, multi-engine Iceberg REST Catalog that enables any Iceberg-compatible engine to discover and access Iceberg tables managed within a Snowflake-hosted Polaris instance — without requiring a Snowflake account or Snowflake-specific tooling.

Origin: The Polaris Connection

Apache Polaris was co-created by Dremio and Snowflake as the reference implementation of the Iceberg REST Catalog specification, then donated to the Apache Software Foundation for vendor-neutral governance. Both companies then independently productized managed versions:

- Snowflake productized it as Snowflake Open Catalog.
- Dremio productized it as Dremio Open Catalog.

Both are built on the same Apache Polaris codebase and implement the same Iceberg REST Catalog specification, making them interoperable from the perspective of client engines.
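Because both services speak the same REST protocol, a client engine's connection properties are identical apart from the endpoint. A minimal sketch of this interoperability (the URIs and credential values below are hypothetical placeholders, not real endpoints):

```python
# Hypothetical endpoints for illustration only -- substitute your own.
SNOWFLAKE_URI = "https://my-account.snowflakecomputing.com/polaris/api/catalog"
DREMIO_URI = "https://my-org.example.com/polaris/api/catalog"  # assumed shape

def rest_catalog_properties(uri: str, credential: str, warehouse: str) -> dict:
    """Build a PyIceberg-style property map for an Iceberg REST catalog."""
    return {
        "type": "rest",
        "uri": uri,
        "credential": credential,
        "warehouse": warehouse,
    }

snowflake_props = rest_catalog_properties(SNOWFLAKE_URI, "client-id:client-secret", "my-warehouse")
dremio_props = rest_catalog_properties(DREMIO_URI, "client-id:client-secret", "my-warehouse")

# Only the endpoint differs; every other property a client engine needs is the same.
assert {k: v for k, v in snowflake_props.items() if k != "uri"} == \
       {k: v for k, v in dremio_props.items() if k != "uri"}
```

From a client's point of view, switching between the two catalogs is a one-line configuration change.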

What Snowflake Open Catalog Provides

Multi-Engine Catalog Access

Any engine that supports the Iceberg REST Catalog protocol can connect to Snowflake Open Catalog, including Apache Spark, Apache Flink, Trino, PyIceberg, and Dremio.

Namespace and Table Management

# PyIceberg: connect to Snowflake Open Catalog
from pyiceberg.catalog import load_catalog
from pyiceberg.schema import Schema
from pyiceberg.types import LongType, NestedField, StringType

catalog = load_catalog(
    "snowflake_open_catalog",
    **{
        "type": "rest",
        "uri": "https://my-account.snowflakecomputing.com/polaris/api/catalog",
        "credential": "client-id:client-secret",
        "warehouse": "my-warehouse",
    }
)

# Create namespace and table
catalog.create_namespace("analytics")
catalog.create_table(
    identifier=("analytics", "orders"),
    schema=Schema(  # example schema for illustration
        NestedField(field_id=1, name="order_id", field_type=LongType(), required=True),
        NestedField(field_id=2, name="customer", field_type=StringType(), required=False),
    ),
    location="s3://my-bucket/warehouse/analytics/orders/",
)

Fine-Grained Access Control

Open Catalog supports the full Polaris RBAC model:

- Principals (service identities) that authenticate with client credentials
- Principal roles, which are granted to principals
- Catalog roles, which hold privileges on securables (catalogs, namespaces, tables, and views)
- Grants of catalog roles to principal roles, linking identities to privileges
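The key idea in the Polaris RBAC model is a two-level role chain: a principal resolves to principal roles, which resolve to catalog roles, which carry the actual privileges. A minimal sketch of that resolution logic (the role names, privilege names, and data structures are illustrative, not the real Polaris API):

```python
# Toy model of the Polaris-style RBAC chain:
# principal -> principal role(s) -> catalog role(s) -> privileges on securables.
principal_roles = {"etl_service": ["data_engineer"]}       # principal -> principal roles
catalog_roles = {"data_engineer": ["analytics_writer"]}    # principal role -> catalog roles
grants = {
    "analytics_writer": {
        ("analytics", "TABLE_READ_DATA"),
        ("analytics", "TABLE_WRITE_DATA"),
    }
}

def privileges_for(principal: str) -> set:
    """Resolve a principal's effective privileges by walking the role chain."""
    out = set()
    for prole in principal_roles.get(principal, []):
        for crole in catalog_roles.get(prole, []):
            out |= grants.get(crole, set())
    return out

print(privileges_for("etl_service"))
```

In the real service, these grants are administered through the Polaris management API rather than in-memory dictionaries, but the resolution order is the same.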

Multi-Catalog Architecture

Open Catalog supports running multiple catalogs within a single Polaris server, enabling tenant isolation and separation of concerns between different data domains.
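One way to picture this isolation: each catalog has its own storage location and its own set of namespaces, and a request routed through one catalog cannot resolve objects that belong to another. A sketch under those assumptions (catalog names, buckets, and the resolver are hypothetical):

```python
# Sketch of tenant isolation via multiple catalogs in one Polaris server.
catalogs = {
    "sales_catalog":   {"base_location": "s3://sales-bucket/warehouse/",
                        "namespaces": {"crm", "orders"}},
    "finance_catalog": {"base_location": "s3://finance-bucket/warehouse/",
                        "namespaces": {"ledger"}},
}

def resolve(catalog: str, namespace: str, table: str) -> str:
    """Return a table's storage path, enforcing per-catalog namespace boundaries."""
    cat = catalogs[catalog]
    if namespace not in cat["namespaces"]:
        raise PermissionError(f"{namespace} is not visible in {catalog}")
    return f"{cat['base_location']}{namespace}/{table}/"

print(resolve("sales_catalog", "orders", "line_items"))
# A finance engine cannot reach sales data through its own catalog:
# resolve("finance_catalog", "orders", "line_items") raises PermissionError
```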

Snowflake Open Catalog vs. Dremio Open Catalog

Both are built on Apache Polaris and implement the same REST spec. Key differences are in how they are deployed and what they integrate with:

| Aspect | Snowflake Open Catalog | Dremio Open Catalog |
| --- | --- | --- |
| Built on | Apache Polaris | Apache Polaris |
| Integration | Snowflake ecosystem | Dremio Agentic Lakehouse |
| AI Semantic Layer | No | Yes (Dremio AI Semantic Layer) |
| AI Agent | No | Yes (Dremio AI Agent) |
| Query engine included | Snowflake SQL | Dremio Intelligent Query Engine |
| Standalone | Yes | Integrated (part of Dremio) |
| Best for | Snowflake-centric multi-engine | AI analytics, Agentic Lakehouse |

Using Snowflake Open Catalog with Apache Spark

# Spark: connect to Snowflake Open Catalog via the Iceberg REST catalog
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions") \
    .config("spark.sql.catalog.open_catalog", "org.apache.iceberg.spark.SparkCatalog") \
    .config("spark.sql.catalog.open_catalog.type", "rest") \
    .config("spark.sql.catalog.open_catalog.uri",
            "https://my-account.snowflakecomputing.com/polaris/api/catalog") \
    .config("spark.sql.catalog.open_catalog.credential", "client-id:client-secret") \
    .config("spark.sql.catalog.open_catalog.warehouse", "my-warehouse") \
    .getOrCreate()

spark.sql("SHOW NAMESPACES IN open_catalog").show()
spark.sql("SELECT * FROM open_catalog.analytics.orders LIMIT 10").show()

The Shared Polaris Ecosystem

Because Snowflake Open Catalog and Dremio Open Catalog share the same Apache Polaris foundation and REST Catalog API, a table registered in one can be read by engines connected to the other — as long as the underlying object storage is accessible. This multi-catalog, multi-engine portability is the realization of the open lakehouse vision that led Dremio and Snowflake to co-create Polaris in the first place.

📚 Go Deeper on Apache Iceberg

Alex Merced has authored three hands-on books covering Apache Iceberg, the Agentic Lakehouse, and modern data architecture. Pick up a copy to master the full ecosystem.
