Enterprise Integration · 8 min read · Written by Tomáš Mikeš

Microsoft Fabric one year into enterprise: what works, what doesn't

Eight months with Microsoft Fabric on the Magistra DWH project for 200+ pharmacies. Where Fabric genuinely adds value over Databricks/Snowflake, where we complain, and whether you would start today.

Microsoft Fabric · Azure · Data warehouse · Enterprise

When Microsoft launched Fabric in November 2023, it promised a unified data platform — OneLake as single storage, Lakehouse + Data Warehouse + Power BI + Data Factory in one product. For clients weighing a data platform, that was a serious pitch: replacing the Databricks + Snowflake + Fivetran + Power BI stack with a single ecosystem.

For Magistra (a network of 200+ pharmacies, e-shop, external systems) we've been running Fabric for 8 months. Here is an honest review — what fits and what doesn't.

What works well

OneLake and data mesh from day one

OneLake is the strongest Fabric feature. No duplicate storage for Lakehouse vs. Warehouse vs. Power BI — data is stored once, in one place (Delta Lake format), accessed via shortcuts. For enterprises with multiple teams this removes an entire class of pain points.

Example: Magistra has a pharmacy team, an e-shop team, and a finance team. Each needs its own calculations over the same source data. On Databricks you'd have 3 workspaces, 3 storage accounts, and some kind of sync between them. On Fabric you have one OneLake, three workspaces, and shortcuts between them. Data isn't copied.
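To make the shortcut idea concrete, here is a toy model (plain Python, not the Fabric API — all class names and paths are illustrative): one physical table in the lake, three workspaces that hold pointers to it rather than copies.

```python
# Toy model of OneLake shortcuts: workspaces hold references to one
# stored table, not copies. Names and structure are illustrative only.

class OneLake:
    def __init__(self):
        self.tables = {}          # physical storage: path -> rows

    def write(self, path, rows):
        self.tables[path] = rows  # data lands once

    def read(self, path):
        return self.tables[path]

class Workspace:
    def __init__(self, name, lake):
        self.name, self.lake = name, lake
        self.shortcuts = {}       # local name -> OneLake path

    def add_shortcut(self, local_name, path):
        self.shortcuts[local_name] = path   # a pointer, not a copy

    def query(self, local_name):
        return self.lake.read(self.shortcuts[local_name])

lake = OneLake()
lake.write("sales/transactions", [{"pharmacy": 1, "amount": 120}])

pharmacy = Workspace("pharmacy", lake)
eshop = Workspace("e-shop", lake)
finance = Workspace("finance", lake)
for ws in (pharmacy, eshop, finance):
    ws.add_shortcut("transactions", "sales/transactions")

# A new write is immediately visible in all three workspaces,
# and storage still holds exactly one copy of the table.
lake.write("sales/transactions", [{"pharmacy": 1, "amount": 120},
                                  {"pharmacy": 2, "amount": 80}])
assert all(len(ws.query("transactions")) == 2
           for ws in (pharmacy, eshop, finance))
assert len(lake.tables) == 1
```

The point of the sketch: the "sync" step you'd build between Databricks workspaces simply has no equivalent here, because there is nothing to sync.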

Power BI integration without friction

When data lands in OneLake in Delta format, Power BI can use DirectLake mode — it reads Delta files directly without ETL into a tabular model. Refresh is instant because there's nothing to refresh. For enterprise dashboards over billions of rows this cuts infrastructure cost by 40-50% versus the classic import/refresh model.

Data Factory (Fabric flavour) is reasonable

Fabric ships its own Data Factory (separate from Azure Data Factory — yes, naming is confusing). It's an orchestration engine with drag-and-drop plus SQL/PySpark notebooks. Covers 80% of ETL scenarios. For complex transforms use Spark notebooks.

Licensing model is simple (in theory)

You buy capacity (F2, F4, F8...), not per-user, per-pipeline, per-warehouse. Capacity is shared across workloads. For an organisation with a clear budget that's much simpler than Databricks DBUs + Snowflake credits + Power BI per-user.
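A back-of-envelope way to reason about the capacity model, using only the per-tier figures quoted later in this article (F4 ≈ €1 500/mo, F8 ≈ €3 000/mo). The per-CU rate is derived from those numbers, not an official Microsoft price — real pay-as-you-go rates vary by region and over time.

```python
# F-SKU sizing sketch. EUR_PER_CU_MONTH is an assumption derived from
# this article's figures (€1,500 / 4 CU), not an official price list.

EUR_PER_CU_MONTH = 375

def monthly_cost(tier: str) -> int:
    cu = int(tier.lstrip("F"))   # the F-SKU number equals its capacity units
    return cu * EUR_PER_CU_MONTH

for tier in ("F2", "F4", "F8", "F16"):
    print(tier, monthly_cost(tier), "EUR/mo")   # F8 -> 3000 EUR/mo
```

The useful property is the linearity: every step up the F-ladder doubles both CU and cost, so sizing mistakes are expensive in round numbers rather than in opaque credit-burn.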

What does NOT work well

Capacity is underspecified — and expensive

“Capacity is shared across workloads” sounds great until you need to predict which F-tier to buy. F4 (~€1 500/mo) is the minimum for anything meaningful. F8 (~€3 000/mo) is a realistic minimum for a production DWH with daily refreshes. F16+ is for enterprise scale.

Problem: capacity throttling is unpredictable. A DAX query in Power BI consumes the same capacity units as a Spark job — one “colleague from finance” with a slow query can throttle the whole platform. CU monitoring is critical, but the built-in dashboards are primitive.
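A minimal simulation of why this happens: every workload draws from one shared CU budget, so a single expensive query can starve everything scheduled after it. The numbers and admission logic are illustrative, not real Fabric CU accounting (which smooths usage over time windows).

```python
# Shared-capacity sketch: workloads are admitted in order until the
# CU budget for the window is exhausted. Costs are made-up examples.

def run_window(cu_budget, workloads):
    """Admit workloads in order until the CU budget runs out."""
    admitted, throttled, used = [], [], 0
    for name, cost in workloads:
        if used + cost <= cu_budget:
            admitted.append(name)
            used += cost
        else:
            throttled.append(name)
    return admitted, throttled

workloads = [
    ("finance-dax-report", 7),   # one slow ad-hoc query...
    ("daily-spark-etl", 3),      # ...and the production ETL pays for it
    ("eshop-dashboard", 2),
]
ok, blocked = run_window(cu_budget=8, workloads=workloads)
assert ok == ["finance-dax-report"]
assert blocked == ["daily-spark-etl", "eshop-dashboard"]
```

On a per-workload platform (Databricks clusters, Snowflake warehouses) the slow query would only hurt its own compute pool; on shared capacity it hurts everyone.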

CI/CD — still a problem

Fabric workspaces don't have first-class Git integration like Databricks Repos. You have Fabric deployment pipelines (Dev → Test → Prod) but:

  • Notebooks are export/import, not Git-first
  • Parameter rewriting between environments is manual
  • PR review of Power BI reports? Doesn't work — review has to happen in a live workspace

In Databricks Repos you have a full Git workflow. Fabric still doesn't in 2026. If your team is used to software-engineering practices (code review, branch protection, PR checks), this will bite.
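The "parameter rewriting between environments is manual" point in practice means scripting it yourself. Below is a hedged sketch of such a script in Python (we actually deploy via PowerShell, as noted later; the notebook JSON shape follows the standard ipynb convention of a cell tagged `parameters`, and all keys, lakehouse names, and values are hypothetical).

```python
# Sketch: rewrite the tagged parameter cell of a notebook before
# importing it into the target workspace. ENV_PARAMS values and the
# lakehouse names are hypothetical, not from a real deployment.

import json

ENV_PARAMS = {
    "dev":  {"lakehouse": "lh_magistra_dev",  "schedule": "hourly"},
    "prod": {"lakehouse": "lh_magistra_prod", "schedule": "daily"},
}

def rewrite_parameters(notebook: dict, target_env: str) -> dict:
    """Replace the 'parameters'-tagged cell's source with target-env values."""
    params = ENV_PARAMS[target_env]
    for cell in notebook["cells"]:
        if "parameters" in cell.get("metadata", {}).get("tags", []):
            cell["source"] = [f"{k} = {json.dumps(v)}\n"
                              for k, v in params.items()]
    return notebook

nb = {"cells": [
    {"metadata": {"tags": ["parameters"]},
     "source": ['lakehouse = "lh_magistra_dev"\n', 'schedule = "hourly"\n']},
    {"metadata": {}, "source": ["df = spark.read.table(lakehouse)\n"]},
]}
out = rewrite_parameters(nb, "prod")
assert 'lakehouse = "lh_magistra_prod"\n' in out["cells"][0]["source"]
```

In a Git-first platform this step is a branch merge plus CI variables; here it's bespoke glue code you own and maintain.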

Product maturity — still in motion

Microsoft pushes new features every month. Sounds good, but also means:

  • 6-month-old documentation may point at UI that no longer exists
  • Best practices from blog posts are stale because the feature changed
  • GA (general availability) status gets delayed. Something you use might still be preview — no SLA, potential breaking changes

Databricks and Snowflake are more mature products. Fabric is in its teenage years — lots of growth, not everything lines up.

SQL endpoint performance below expectations

Fabric Warehouse (SQL endpoint) still trails Databricks SQL and Snowflake on latency and throughput. For ad-hoc analytical queries (“show me sales by category for last month”) it's fine. For high-concurrency BI with hundreds of parallel users… not enough benchmarks yet.

Where Fabric beats Databricks/Snowflake

  • Microsoft-shop client. If they already have Entra ID, Azure, Power BI, M365 — Fabric fits the ecosystem. Procurement is straightforward.
  • Team without deep data-engineering skills. Fabric is approachable for BI analysts. Databricks wants a data-engineer orientation.
  • Budget-conscious startups / mid-market. F4 starts at ~€1 500/mo — cheaper than a production Databricks + Snowflake.
  • Power BI first-class. If you already invested in Power BI reports, Fabric is continuity, not rebuild.

Where to avoid it

  • Team needs full Git-first workflow. Databricks Repos is better.
  • Extreme query performance requirements. Snowflake remains the benchmark.
  • Multi-cloud. Fabric is Azure-only. If you need AWS/GCP too, skip.
  • Heavy real-time streaming. Fabric real-time analytics exists, but Databricks Structured Streaming is more mature.

What Magistra taught us

After 8 months of operation:

  • F8 capacity suffices for a 200+ pharmacy DWH with ~4 TB of data and daily refreshes
  • Throttling issues twice in 8 months, both resolved by rescheduling daily runs to off-peak hours
  • Data Factory pipelines are stable, but CI/CD is still friction (currently deploying via PowerShell scripts)
  • Power BI DirectLake means end users see fresh data within 15 minutes of refresh — on Databricks + Power BI import mode this would be 1-2 hours
  • Total cost ~42% below the Databricks + Snowflake + Fivetran alternative we originally quoted

Recommendation

If you're in the Microsoft stack, you're starting a data platform, and your team is more BI-oriented than data-engineering — Fabric is a legitimate choice today. In 2024 I would have said wait; in 2026 there are enough case studies and the product is mature enough.

If you're on a different stack, need top-tier performance, or want a full Git workflow — stick with Databricks + Snowflake. Not a worse decision, just a different trade-off.

Working on something similar?

Book a 30-minute technical call. No sales process — direct architectural feedback.
