Data Lakehouse with PySpark — High Level Architecture & DW Model

--


In this post we discuss the high-level system architecture and the Data Warehousing model. We will use AWS cloud services to design the Data Lakehouse, and the processing power of Apache Spark to load and transform the data.
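To make the idea concrete, here is a minimal sketch of that pattern: reading raw files from S3 with PySpark and persisting them as a Delta table in the lakehouse. This is not the series' actual code; it assumes the delta-spark package and the hadoop-aws (s3a) connector are available, and the bucket name and paths are hypothetical placeholders.

```python
# Minimal sketch: read raw CSV files landed in S3 and persist them as a
# Delta table in the lakehouse. Bucket, paths, and schema are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("lakehouse-bronze-load")
    # Enable Delta Lake (requires the delta-spark package on the classpath)
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw data from the landing zone (hypothetical S3 path)
raw_df = (
    spark.read
    .option("header", "true")
    .csv("s3a://my-lakehouse-bucket/landing/orders/")
)

# Write it as a Delta table into the bronze layer of the lakehouse
(
    raw_df.write
    .format("delta")
    .mode("overwrite")
    .save("s3a://my-lakehouse-bucket/bronze/orders/")
)
```

From there, Spark can read the Delta table back for the downstream transformations that build the warehouse model, which is what the rest of the series walks through.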

This series requires a basic understanding of Python, Apache Spark, Delta Lake, and AWS.

By the end of the series, you will be confident enough to design a Data Lakehouse on any cloud architecture.

Check out the complete discussion on YouTube:

Make sure to Like and Subscribe.

Follow us on YouTube: https://youtube.com/@easewithdata

If you are new to Data Warehousing, check out — https://youtube.com/playlist?list=PL2IsFZBGM_IE-EvpN9gaZZukj-ysFudag
