Urban Apparel: Automated Ingestion for the “Summer Splash” Campaign

Project Overview
This project builds an automated data ingestion pipeline in Microsoft Fabric for Urban Apparel's digital marketing campaign. The goal was to eliminate manual data handling and make the data reliably available for analysis. The dataset contains daily advertising metrics for Urban Apparel's "Summer Splash" campaign across multiple platforms.
Business Problem
Urban Apparel's marketing team is currently hampered by a manual data process for their "Summer Splash 2025" campaign. Each morning, a team member must manually download a CSV file of the previous day's performance data, a process that is both slow and prone to human error. This operational bottleneck delays the analyst team's ability to generate timely, data-driven insights. Our mandate is to eliminate this inefficiency.
Project Objective
- To architect and deploy a robust, fully automated process that loads the daily campaign performance CSV into a Microsoft Fabric Lakehouse without any manual intervention.
- Successful Pipeline Execution: The deployed pipeline must run end-to-end without errors.
- Data Integrity: A table containing the summer_splash_campaign data must be created in the Lakehouse, with all records and columns mirroring the source file precisely.
- Verified Automation: The pipeline must be configured with an active daily schedule trigger, set to run each morning according to the client's reporting cadence.
Project Design
- Built a Lakehouse to ingest data from Azure Storage.
- Built a pipeline using the Copy Data activity to automate the data movement.
- Automating the data movement reduced manual effort and increased efficiency by making timely insights possible.
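The design above can be sketched as a Fabric/Data Factory-style pipeline definition: one Copy activity moving the CSV from Azure Storage into a Lakehouse table, plus a daily schedule trigger. This is an illustrative sketch only; the pipeline, dataset, and trigger names, the exact type strings, and the 6:00 AM run time are assumptions, not the deployed configuration.

```python
import json

# Illustrative Copy Data pipeline definition (ADF/Fabric JSON style).
# All names and type strings below are hypothetical placeholders.
pipeline = {
    "name": "pl_summer_splash_ingest",
    "properties": {
        "activities": [
            {
                "name": "CopyCampaignCsvToLakehouse",
                "type": "Copy",
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "LakehouseTableSink"},
                },
                "inputs": [{"referenceName": "ds_azure_storage_campaign_csv"}],
                "outputs": [{"referenceName": "ds_lakehouse_summer_splash"}],
            }
        ]
    },
}

# Daily schedule trigger so the load finishes before the analysts start work.
trigger = {
    "name": "tr_daily_morning",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "schedule": {"hours": [6], "minutes": [0]},
            }
        },
        "pipelines": [
            {"pipelineReference": {"referenceName": "pl_summer_splash_ingest"}}
        ],
    },
}

print(json.dumps(pipeline, indent=2))
```

In the Fabric UI these objects are created through the pipeline designer and the Schedule settings rather than raw JSON, but the JSON view is a convenient way to document what the Copy activity and trigger amount to.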
Business Problem Solved
By automating the daily data ingestion from Azure Storage to the Lakehouse, we eliminated the operational bottleneck of manually downloading the previous day's CSV file, a process that was slow and prone to error.
This enables the analyst team to generate timely, data-driven insights, increasing the team's overall efficiency.


