Open-source Supply Chain analytics on Microsoft Fabric: scalable medallion architecture (Bronze-Silver-Gold), automated CSV ingestion, Delta Lake transformations, semantic modeling with DAX & RLS, and interactive Power BI reports. Contributions that refine pipelines, models, and dashboards are welcome.
This project follows the medallion architecture, a multi-tiered approach in which each zone progressively refines data quality and readiness:
- Bronze: Raw landing zone for unprocessed CSVs.
- Silver: Cleansed and enriched data, standardized and validated. A single PySpark notebook is used to perform scalable transformation and enrichment in this layer.
- Gold: Consumption-ready, high-value fact and dimension tables, powering semantic models and reporting.
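The Silver-layer work described above can be sketched in plain Python. This is a minimal illustration of the cleansing and deduplication semantics, not the project's actual PySpark notebook, and the column names (`order_id`, `region`, `quantity`) are hypothetical:

```python
# Minimal sketch of the Silver-layer cleansing step: standardize fields
# and deduplicate raw Bronze rows before they are written as Delta tables.
# Column names are assumptions; the real project does this in PySpark.

def to_silver(bronze_rows):
    """Standardize fields and drop duplicate or invalid rows."""
    seen = set()
    silver = []
    for row in bronze_rows:
        order_id = str(row.get("order_id", "")).strip()
        if not order_id or order_id in seen:
            continue  # drop rows missing the business key or already seen
        seen.add(order_id)
        silver.append({
            "order_id": order_id,
            "region": str(row.get("region", "")).strip().upper(),
            "quantity": int(row.get("quantity", 0)),
        })
    return silver

raw = [
    {"order_id": " 1001 ", "region": "emea", "quantity": "5"},
    {"order_id": "1001", "region": "EMEA", "quantity": "5"},   # duplicate key
    {"order_id": "", "region": "apac", "quantity": "2"},       # missing key
]
print(to_silver(raw))  # → [{'order_id': '1001', 'region': 'EMEA', 'quantity': 5}]
```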
The Supply Chain Project demonstrates a medallion-based analytics solution on Microsoft Fabric:
- Bronze (Raw Landing Zone): The pipeline performs metadata-driven filtering and timestamp comparison using `Lookup`, `GetMetadata`, and `Filter` activities to ingest only new or changed raw files.
- Silver (Cleansed & Enriched): Notebooks transform copied data into Delta format, including cleansing, deduplication, and enrichment.
- Gold (Analytics & Reporting): SQL scripts create or refresh staging views that mirror Lakehouse Delta tables, and a stored procedure performs UPSERT logic into the Data Warehouse.
- Semantic Model: A Power BI semantic model is automatically refreshed at the end of the pipeline using the `PBISemanticModelRefresh` activity.
- Power BI Reporting: Final `.pbix` reports provide interactive dashboards sourced from the refreshed semantic model.
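The Bronze ingestion check above can be illustrated with a small Python sketch. The real pipeline uses `Lookup`, `GetMetadata`, and `Filter` activities in Fabric; this version only demonstrates the underlying timestamp comparison, using temporary files as stand-ins:

```python
# Sketch of the Bronze incremental-ingestion check: compare each file's
# modification time against the last recorded pipeline run and keep only
# new or changed CSVs. Illustrative only; the actual project does this
# with Fabric pipeline activities, not local filesystem calls.
import os
import tempfile
import time

def files_to_ingest(folder, last_run_ts):
    """Return CSV file names modified after the last pipeline run."""
    picked = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if name.endswith(".csv") and os.path.getmtime(path) > last_run_ts:
            picked.append(name)
    return picked

# Demo with temporary files: one backdated "old" file, one fresh "new" file.
with tempfile.TemporaryDirectory() as folder:
    old = os.path.join(folder, "orders_old.csv")
    new = os.path.join(folder, "orders_new.csv")
    for p in (old, new):
        with open(p, "w") as f:
            f.write("order_id,region\n")
    cutoff = time.time() - 3600                 # pretend last run was 1h ago
    os.utime(old, (cutoff - 60, cutoff - 60))   # backdate the old file
    result = files_to_ingest(folder, cutoff)

print(result)  # → ['orders_new.csv']
```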
- Microsoft Fabric workspace
- Fabric Data Warehouse endpoint
- Power BI Desktop (for `.pbix` files)
- Clone the repository

  ```bash
  git clone https://github.com/your-org/SupplyChain.git
  cd SupplyChain/fabric-artifacts
  ```

- Configure environment variables
  - Populate `.env` in `core/` and `orchestration/` with Fabric credentials and connection strings.
- Deploy Data Layers

  ```bash
  cd core/LH_SupplyChain && python deploy_lakehouse.py
  cd ../WH_SupplyChain && python deploy_warehouse.py
  cd ../SM_SupplyChain && tabular-editor deploy
  ```
- Run Pipelines

  ```bash
  cd ../../orchestration/PL_SupplyChain
  az datafactory pipeline create-run \
    --factory-name MyFabricFactory \
    --resource-group MyRG \
    --pipeline-name LoadSupplyChainData
  ```

  This pipeline handles:
  - Incremental ingestion of targeted files based on modification timestamp
  - Notebook-driven transformation into Delta Lake tables
  - Warehouse upserts via SQL stored procedures
  - View generation for staging tables
  - Logging into the `IngestionLogs` table for traceability
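The traceability step can be sketched as follows. An in-memory SQLite database stands in for the Fabric Warehouse here, and the `IngestionLogs` column names are assumptions, not the project's actual schema:

```python
# Sketch of pipeline traceability: after each file is processed, write a
# row to an IngestionLogs table. SQLite is used purely for illustration;
# the real pipeline logs to a table in the Fabric Data Warehouse.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IngestionLogs (
        file_name   TEXT,
        status      TEXT,
        row_count   INTEGER,
        ingested_at TEXT
    )
""")

def log_ingestion(conn, file_name, status, row_count):
    """Append one traceability record for an ingested file."""
    conn.execute(
        "INSERT INTO IngestionLogs VALUES (?, ?, ?, ?)",
        (file_name, status, row_count,
         datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

log_ingestion(conn, "orders_2024_06.csv", "Succeeded", 1250)
log_ingestion(conn, "shipments_2024_06.csv", "Failed", 0)

rows = conn.execute(
    "SELECT file_name, status FROM IngestionLogs ORDER BY file_name"
).fetchall()
print(rows)
```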
- Launch Notebooks

  ```bash
  cd ../NB_SupplyChain
  jupyter lab
  ```

- Open Power BI Report
  - Open `delivery/SupplyChain/SupplyChainReport.pbix` in Power BI Desktop and verify that the semantic model has been refreshed.
```
fabric-artifacts/
├── core/
│   ├── LH_SupplyChain     # Lakehouse JSON definitions (Bronze & Silver zones)
│   ├── WH_SupplyChain     # Warehouse DDL and upsert scripts (Gold zone)
│   └── SM_SupplyChain     # Semantic model definitions and refresh settings
├── orchestration/
│   ├── PL_SupplyChain     # Pipelines including ingestion, transforms, semantic refresh
│   └── NB_SupplyChain     # Jupyter notebooks for cleansing, enrichment, profiling
└── delivery/
    └── SupplyChain        # Power BI .pbix report and related assets
```
Assets: screenshots and GIFs live in `docs/assets/`.
- Ingest & Land: Trigger the `PL_SupplyChain` pipeline to pull new or updated CSVs into Bronze using file metadata.
- Transform & Enrich: Notebooks in `NB_SupplyChain` standardize the data and produce Silver Delta tables.
- Load to Warehouse: The pipeline executes `CREATE VIEW` statements and stored procedures to upsert Silver data into Gold tables.
- Refresh Semantic Model: The pipeline triggers a Power BI dataset refresh automatically.
- View Reports: Open the Power BI report (`delivery/SupplyChain/SupplyChainReport.pbix`), which uses the refreshed semantic layer.
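The Load to Warehouse step above performs MERGE-style UPSERT logic in a SQL stored procedure. A dictionary-based Python sketch of the same semantics, with hypothetical keys and columns:

```python
# Sketch of the Gold-layer UPSERT: merge Silver rows into the Gold table
# by business key, updating matched rows and inserting unmatched ones.
# The project does this with a SQL stored procedure in the Warehouse;
# this version only illustrates the merge semantics.

def upsert(gold, silver_rows, key="order_id"):
    """Update rows with matching keys, insert the rest."""
    merged = {row[key]: row for row in gold}
    for row in silver_rows:
        merged[row[key]] = row          # insert or overwrite by key
    return sorted(merged.values(), key=lambda r: r[key])

gold = [{"order_id": "1001", "quantity": 5}]
silver = [
    {"order_id": "1001", "quantity": 7},   # matched key → update
    {"order_id": "1002", "quantity": 3},   # new key → insert
]
print(upsert(gold, silver))
# → [{'order_id': '1001', 'quantity': 7}, {'order_id': '1002', 'quantity': 3}]
```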
We welcome contributions! Please follow these steps:
- Fork the repository.
- Create a feature branch: `git checkout -b feature/YourFeature`
- Add or update artifacts under `fabric-artifacts/`.
- Commit your changes: `git add -A && git commit -m "feat: Describe your change"`
- Push and open a Pull Request.
This project is licensed under the MIT License.