Xata built a product analytics warehouse entirely on Postgres using materialized views and pg_cron, eliminating the need for separate OLAP infrastructure.

Eliminate your OLAP system and run analytics on vanilla Postgres, cutting infrastructure cost and operational complexity for mid-market teams with moderate analytical workloads.
Signal analysis
Here at Lead AI Dot Dev, we tracked Xata's decision to build their product analytics warehouse on vanilla Postgres instead of adopting a separate OLAP database like ClickHouse or Snowflake. This isn't a minor optimization - it's a fundamental statement about what modern applications actually need. The stack relies on materialized views for pre-computed aggregations, pg_cron for scheduled refreshes, and copy-on-write branching for isolated data environments.
For builders, this matters because it collapses your data infrastructure. Instead of maintaining Postgres for transactional workloads and a separate warehouse for analytics, you handle both with a single system. Materialized views act as your aggregation layer - they compute expensive joins and group-bys once, then serve the results as tables. pg_cron triggers refreshes on schedules you control. The result is lower operational complexity and fewer data sync points where things break.
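As a minimal sketch of that pattern - the `events` table, its columns, and the job name are placeholders for illustration, not details from Xata's writeup:

```sql
-- Enable pg_cron once per database
-- (it must also be listed in shared_preload_libraries)
CREATE EXTENSION IF NOT EXISTS pg_cron;

-- Pre-compute an expensive aggregation once, serve it as a table
CREATE MATERIALIZED VIEW daily_active_users AS
SELECT date_trunc('day', occurred_at) AS day,
       count(DISTINCT user_id)        AS active_users
FROM events
GROUP BY 1;

-- Refresh the snapshot at the top of every hour
SELECT cron.schedule(
    'refresh-dau',
    '0 * * * *',
    $$REFRESH MATERIALIZED VIEW daily_active_users$$
);
```

Dashboards then query `daily_active_users` like any ordinary table; the expensive scan over `events` only runs once per refresh.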
The copy-on-write branching feature is particularly relevant for Xata's use case. Developers can spin up isolated data branches for testing and development without copying the entire dataset. This reduces the blast radius of experimental changes and keeps your main analytics fresh while engineering teams iterate on transformations.
Most teams reach for dedicated OLAP systems because they assume Postgres can't handle analytical queries at scale. Xata's approach challenges that assumption - if your analytical workload fits in a reasonable time window (minutes to hours between refreshes rather than seconds), materialized views give you most of the benefits of a data warehouse without the infrastructure tax.
The key constraint is freshness. Materialized views are snapshots, not live queries. If your analytics dashboard needs real-time updates, this architecture requires more frequent refresh cycles, which increases compute costs. But for most product analytics use cases - daily metrics, weekly cohort analysis, monthly trends - hourly or six-hourly refreshes are sufficient.
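One knob worth knowing when tuning that refresh cadence (the view name here is hypothetical): the plain `REFRESH` form briefly blocks reads, while the `CONCURRENTLY` variant keeps the view queryable during the refresh, at the cost of requiring a unique index and doing more work per run.

```sql
-- CONCURRENTLY keeps dashboard reads unblocked during refresh,
-- but requires a unique index on the materialized view
CREATE UNIQUE INDEX IF NOT EXISTS daily_metrics_day_idx
    ON daily_metrics (day);

-- Six-hourly cadence: minute 0 of hours 0, 6, 12, and 18
SELECT cron.schedule(
    'refresh-daily-metrics',
    '0 */6 * * *',
    $$REFRESH MATERIALIZED VIEW CONCURRENTLY daily_metrics$$
);
```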
Your data modeling needs to shift. Without a proper dimensional model and pre-computed aggregations, you'll still hit performance walls. This approach works best when you think ahead about which metrics matter most and materialize those views explicitly. It's less flexible than an OLAP system but more maintainable than a sprawling Postgres schema with complex queries.
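Thinking ahead in this way usually means materializing a fact-style rollup keyed by the dimensions you actually filter on. A hedged sketch, with placeholder table and column names:

```sql
-- One pre-aggregated row per (day, event_name) pair
CREATE MATERIALIZED VIEW event_daily_rollup AS
SELECT date_trunc('day', occurred_at) AS day,
       event_name,
       count(*)                       AS event_count,
       count(DISTINCT user_id)        AS unique_users
FROM events
GROUP BY 1, 2;

-- Dashboards hit the small rollup, never the raw events table
SELECT day, event_count
FROM event_daily_rollup
WHERE event_name = 'signup'
ORDER BY day DESC
LIMIT 30;
```

The trade-off is exactly the one described above: any metric you didn't materialize in advance falls back to scanning raw tables.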
Eliminating a separate OLAP system cuts infrastructure costs, but not always in the way you'd expect. You're not removing compute - materialized view refreshes still require CPU cycles. You're removing complexity: no data pipeline orchestration, no schema mapping between transactional and analytical layers, no separate credentials and monitoring. One database to secure, patch, and understand.
For teams already on managed Postgres platforms like AWS RDS, Supabase, or Neon, this is a clean win. You stay within your current tool ecosystem. For teams running data warehouses that cost thousands monthly, the savings depend on your scale. Small workloads benefit most; very large analytical datasets may still need specialized systems. The sweet spot is mid-market teams with tight budgets and moderate analytics needs.
The operational shift is subtle but important. Instead of data engineers building pipelines, application engineers can own analytics logic directly. Materialized views live next to your tables. pg_cron schedules live in the same database configuration. This democratizes analytics ownership but requires developers comfortable with SQL and database design.

Thank you for listening. - Lead AI Dot Dev