Snowflake plugs PostgreSQL into its AI Data Cloud

Snowflake Introduces PostgreSQL Database-as-a-Service in AI Data Cloud

Snowflake is adding a PostgreSQL database-as-a-service to its AI Data Cloud, integrating transactional workloads with analytics and AI under a unified governance framework. The goal is to let organizations use their operational data for application development, AI agents, and business performance analysis without the cost and complexity of managing multiple vendors and data pipelines.

Christian Kleinerman, Snowflake’s EVP of product, explained the significance of this development: “If you want to build an app on data stored in Snowflake, traditionally, you would need a relational OLTP [online transaction processing] database to manage that data. This often requires breaking out of the Snowflake environment. Our PostgreSQL service aims to create a secure boundary, ensuring that any applications or agents developed within it keep their data within the compliance and regulatory perimeter of Snowflake.”

A standout feature of the service is full compatibility with open-source PostgreSQL, which lets organizations migrate existing applications to Snowflake without code changes, streamlining adoption for development teams.

The PostgreSQL service utilizes pg_lake, a collection of open-source extensions that let developers and data engineers read and write Apache Iceberg tables directly from PostgreSQL. Because Iceberg is an open table format, users can point their preferred analytics engines at the data in place, eliminating the need for extraction and movement. Iceberg is also widely supported across the cloud and data platform ecosystem, including by Snowflake, Google, and AWS.
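To make the pattern concrete, here is a minimal sketch of what working with an Iceberg-backed table from PostgreSQL might look like. This is illustrative only: the `USING iceberg` clause, the `location` option, and all table and bucket names are assumptions for the example, not confirmed pg_lake syntax; consult the pg_lake documentation for the actual interface.

```python
import textwrap


def iceberg_table_ddl(table: str, location: str) -> str:
    """Build a CREATE TABLE statement targeting an Iceberg table.

    The `USING iceberg` access method and the `location` storage option
    are hypothetical stand-ins for the extension's real interface.
    """
    return textwrap.dedent(f"""\
        CREATE TABLE {table} (
            order_id   bigint,
            amount     numeric,
            created_at timestamptz
        ) USING iceberg
        WITH (location = '{location}');""")


# An application would send this DDL over a normal PostgreSQL connection;
# analytics engines could then query the same Iceberg table in place.
ddl = iceberg_table_ddl("orders", "s3://analytics-bucket/orders")
print(ddl)
```

The point of the design is that the table lives in an open format in object storage, so the transactional database and the analytics engines share one copy of the data rather than syncing via pipelines.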

By integrating these functionalities, Snowflake aims to eliminate the costly data movement typically required between transactional and analytical systems. Previously, Snowflake had introduced a transactional capability known as Unistore, announced in 2022 but only made generally available in late 2024. Despite its low-latency reads and writes, Unistore garnered limited interest from customers, prompting Snowflake to explore PostgreSQL-compatible options, ultimately leading to the acquisition of Crunchy Data, a provider specializing in PostgreSQL services.

According to IDC research director Devin Pratt, this strategic move positions Snowflake to extend its offerings beyond analytics into a managed OLTP solution. The integration of online transaction processing (OLTP) with online analytical processing (OLAP) within a single environment facilitates the development of agentic AI and real-time streaming capabilities. This is crucial, as agents require continuous access to both analytical insights and live transactional data, thereby minimizing delays between data generation and analysis.

Pratt also noted that Snowflake is not alone in this endeavor; the trend of combining operational databases with analytics to support real-time workflows is becoming increasingly common among vendors. For instance, following its acquisition of Neon, which specializes in serverless PostgreSQL architecture, Snowflake’s competitor Databricks has launched its own service, Lakebase.

By consolidating OLTP and OLAP capabilities within the same platform, organizations can significantly reduce ETL processes and data duplication, while also applying consistent governance and observability across both transactional and analytical workloads. “The value lies in creating a more unified operational and analytical stack, complete with consistent management and security,” Pratt concluded.
