Microsoft Fabric - Ingesting 5GB into a Bronze Lakehouse using Data Factory
Tutorial
In this video Ed Freeman continues the Microsoft Fabric End-to-End demo series, showing how we can quickly ingest ~5GB of data from an unauthenticated HTTP data source into OneLake using Data Factory in Microsoft Fabric. We'll see the distinction between Tables and Files in a Fabric Lakehouse, and look at how we can preview data in the Lakehouse explorer.
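The Copy data activity does all of this through the pipeline UI, with no code required. Purely as an illustrative sketch of what the activity is doing under the hood, the snippet below scripts a similar HTTP-to-Files copy from a Fabric notebook with a Lakehouse attached; the source URL and file names are placeholders, and it assumes the attached Lakehouse's Files area is mounted at /lakehouse/default/Files.

```python
# Illustrative only: the video uses a Copy data activity rather than code.
# Assumes a Fabric notebook with a Lakehouse attached, which exposes the
# Lakehouse Files area at /lakehouse/default/Files. The URL and file name
# below are placeholders, not the dataset used in the video.
from pathlib import Path

import requests

SOURCE_URL = "https://example.com/data/source_file.parquet"  # hypothetical unauthenticated HTTP source
DESTINATION = Path("/lakehouse/default/Files/raw/source_file.parquet")  # lands in Files, not Tables

DESTINATION.parent.mkdir(parents=True, exist_ok=True)

# Stream the response so a multi-GB file never has to fit in memory at once.
with requests.get(SOURCE_URL, stream=True, timeout=60) as response:
    response.raise_for_status()
    with open(DESTINATION, "wb") as target:
        for chunk in response.iter_content(chunk_size=8 * 1024 * 1024):
            target.write(chunk)
```

Anything landed this way sits under Files as raw, unmanaged files; the Tables section of the Lakehouse explorer only shows data registered as Delta tables.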
The video contains the following chapters:
- 00:00 Introduction
- 00:15 Dataset recap
- 01:25 Workspace and pipeline artifacts
- 01:57 Pipeline UI layout
- 02:21 Copy data activity options
- 03:07 Configure Copy data activity source
- 05:00 Configure Copy data activity destination
- 06:21 Add dynamic content for destination Lakehouse filepath
- 08:21 Copy data activity additional settings
- 09:00 Manually trigger pipeline
- 09:21 Alternative parameterized pipeline
- 11:38 Reviewing pipeline run details
- 12:10 Default workspace artifacts
- 13:04 Viewing Lakehouse Files
- 13:46 Roundup and outro
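For the chapters on dynamic content and the parameterized pipeline (06:21 and 09:21), the gist is that the destination file path in the Copy data activity is built from pipeline parameters and expressions rather than hard-coded. A rough Python equivalent of that idea, with a hypothetical folder layout and parameter names, might look like this:

```python
# Not the pipeline itself: a sketch of the idea behind building the Copy data
# activity's destination path from parameters instead of hard-coding it.
# The folder layout and parameter names here are hypothetical.
from datetime import datetime, timezone


def destination_path(dataset_name: str, file_name: str, run_time: datetime | None = None) -> str:
    """Build a Files-relative destination path, e.g. raw/trips/2024/03/trips.parquet."""
    run_time = run_time or datetime.now(timezone.utc)
    return f"raw/{dataset_name}/{run_time:%Y/%m}/{file_name}"


# Roughly the same shape as a pipeline dynamic content expression such as:
#   @concat('raw/', pipeline().parameters.datasetName, '/',
#           formatDateTime(utcnow(), 'yyyy/MM'), '/', pipeline().parameters.fileName)
print(destination_path("trips", "trips.parquet"))
```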
Useful links:
Microsoft Fabric End to End Demo Series:
- Part 1 - Lakehouse & Medallion Architecture
- Part 2 - Plan and Architect a Data Project
- Part 3 - Ingest Data
- Part 4 - Creating a shortcut to ADLS Gen2 in Fabric
- Part 5 - Local OneLake Tools
- Part 6 - Role of the Silver Layer in the Medallion Architecture
- Part 7 - Processing Bronze to Silver using Fabric Notebooks
- Part 8 - Good Notebook Development Practices
From Descriptive to Predictive Analytics with Microsoft Fabric:
- Part 1 - Overview
- Part 2 - Data Validation with Great Expectations
- Part 3 - Testing Notebooks
- Part 4 - Task Flows
- Part 5 - Observability
Microsoft Fabric First Impressions
Decision Maker's Guide to Microsoft Fabric
and find all the rest of our content here.