Databricks rolls out Lakebase Postgres and Agent Bricks for AI-era apps

Databricks has unveiled a significant addition to its lakehouse architecture with the introduction of Lakebase, a Postgres database layer. The new layer is designed to serve AI applications and agents, letting them work with operational data and run analytics on it seamlessly within the Databricks ecosystem. Additionally, the company has launched Agent Bricks, a tool aimed at streamlining the development of automated AI agents.

Insights from the Data + AI Summit

The announcements were made during the annual Data + AI Summit held in San Francisco, where Databricks showcased its evolving product roadmap. Co-founder and CEO Ali Ghodsi emphasized the company’s commitment to helping enterprises develop AI applications that leverage proprietary data effectively. He stated, “With Lakebase, we’re creating a new category in the database market: a modern Postgres database, deeply integrated with the lakehouse and today’s development stacks.” Ghodsi highlighted the urgency for Fortune 500 companies to transition from outdated systems to solutions that meet the demands of the AI era.

Lakebase, powered by technology from Neon, is a fully managed, Postgres-compatible database that supports data loading and transformation from more than 300 sources. Databricks points out that operational databases constitute a market exceeding $100 billion, yet are often hindered by legacy architectures that complicate management and incur high costs. The company asserts that modern applications require fast, reliable data to keep pace with AI-driven operations.
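
Because Lakebase is Postgres-compatible, standard Postgres tooling should be able to talk to it. The minimal sketch below uses the psycopg2 driver; the hostname, database, credentials, and the orders table are hypothetical placeholders rather than real Lakebase defaults.

```python
# Minimal sketch: connect to a Postgres-compatible Lakebase instance with a
# standard Postgres driver. All connection details below are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="your-lakebase-instance.example.com",  # placeholder endpoint
    dbname="app_db",
    user="app_user",
    password="********",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # An ordinary SQL query against operational data.
    cur.execute(
        "SELECT order_id, status FROM orders WHERE status = %s",
        ("pending",),
    )
    for order_id, status in cur.fetchall():
        print(order_id, status)

conn.close()
```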

Convergence of Operational and Analytical Systems

To address the need for real-time decision-making, Databricks is focusing on the convergence of operational and analytical systems. Lakebase is designed to store operational data in cost-effective data lakes, featuring continuous autoscaling of compute resources to accommodate agent workloads efficiently.

William Blair analyst Jason Ader provided insights into the three-chapter framework presented at the summit:

  • Chapter 1: Establishing the lakehouse as a foundational architecture.
  • Chapter 2: Integrating AI into the platform to enhance access to data insights.
  • Chapter 3: Introducing Lakebase, a transactional layer that supports operational use cases within the lakehouse.

Ader believes this strategic positioning allows Databricks to unify the entire data stack, encompassing analytical, AI, and transactional workloads. The company’s expanding platform strategy aligns well with current enterprise demands for architectural simplicity and AI enablement.

Agent Bricks and Managed Iceberg Tables

The newly introduced Agent Bricks feature simplifies the creation of customized AI agents, empowering business users to develop their own solutions without extensive technical knowledge. The tool automates the generation of synthetic data, sparing businesses the trial-and-error approach traditionally associated with AI development.

Furthermore, Databricks has launched Managed Iceberg Tables in Public Preview with support for the Apache Iceberg REST Catalog API. This allows external engines such as Apache Spark, Flink, and Kafka to interact with tables governed by Unity Catalog, Databricks' centralized metadata and governance layer. Managed Iceberg Tables also promise automatic performance optimizations for cost-efficient storage and rapid query execution.
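
As a rough illustration of what the Iceberg REST Catalog API enables, the sketch below configures an external PySpark session (with the Iceberg Spark runtime assumed to be on the classpath) to read a Unity Catalog-governed table. The REST endpoint path, token, catalog name, and table name are hypothetical placeholders; the exact endpoint and required properties should be taken from the Databricks documentation.

```python
# Minimal sketch: read a Unity Catalog-governed Iceberg table from an
# external Spark engine via the Iceberg REST Catalog API. URI, token, and
# names below are placeholders, not real Databricks values.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("read-uc-iceberg")
    # Register an Iceberg catalog that talks to a REST Catalog endpoint.
    .config("spark.sql.catalog.uc", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.uc.type", "rest")
    .config("spark.sql.catalog.uc.uri", "https://<workspace-url>/api/iceberg")  # placeholder endpoint
    .config("spark.sql.catalog.uc.token", "<personal-access-token>")            # placeholder credential
    .config("spark.sql.catalog.uc.warehouse", "main")                           # assumed UC catalog name
    .getOrCreate()
)

# Query the governed table from outside Databricks.
spark.sql("SELECT * FROM uc.sales.orders LIMIT 10").show()
```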

Informatica, a partner for Managed Iceberg Tables and Lakebase, is set to enhance its offerings to accelerate the adoption of AI agents and GenAI built on Databricks' Mosaic AI suite. New capabilities include:

  • Mosaic AI connectors for Cloud Application Integration (CAI): Rapidly deployable AI agents that integrate enterprise data with Mosaic AI via a no-code interface.
  • GenAI Recipes for CAI: Pre-configured templates designed to expedite GenAI application development and deployment.

Innovative Partnerships and Future Directions

Databricks has also announced two pivotal partnerships: an extended collaboration with Microsoft to deepen integration across Azure AI Foundry and a new alliance with Google Cloud, enabling the use of Gemini 2.5 models directly within Databricks. This integration allows customers to run Gemini models on their enterprise data while ensuring built-in governance and streamlined billing.
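
As a rough sketch of how such an integration might be exercised, assuming Gemini 2.5 is exposed as a Databricks model serving endpoint with the usual OpenAI-compatible interface, a client call could look like the following; the workspace URL, token, and endpoint name are hypothetical placeholders.

```python
# Minimal sketch: call a (hypothetical) Gemini 2.5 serving endpoint on
# Databricks through the OpenAI-compatible client. Names are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="<databricks-personal-access-token>",          # placeholder
    base_url="https://<workspace-url>/serving-endpoints",  # placeholder
)

response = client.chat.completions.create(
    model="databricks-gemini-2-5-pro",  # hypothetical endpoint name
    messages=[
        {"role": "user", "content": "Summarize yesterday's order anomalies."}
    ],
)
print(response.choices[0].message.content)
```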

In a bid to address the talent gap in AI, Databricks has committed $100 million towards AI education, providing students and aspiring data professionals access to its Free Edition and various training resources.

Looking ahead, Databricks is set to release Lakeflow Designer, a no-code ETL capability that empowers non-technical users to create production data pipelines through a user-friendly interface. Ghodsi remarked, “Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster.”

Additionally, Databricks One will allow business users to engage with AI/BI dashboards and interact with data in natural language, marking a significant step towards democratizing data intelligence within organizations.

As Databricks continues to expand its capabilities, it anticipates reaching $3.7 billion in annualized revenue by July, representing year-on-year growth of 50 percent. The company also plans to hire 3,000 new employees in 2025, reflecting its ambitious growth trajectory.

Lakebase is currently available in Public Preview, with further enhancements on the horizon. For those interested in learning more, a detailed blog post titled “What is a Lakebase?” is available on the Databricks website.
