Data Lakehouse with PySpark — High Level Architecture & DW Model


Today we are going to discuss the High-Level System Architecture and the Data Warehousing model. We will use AWS Cloud services to design the Data Lakehouse and the processing power of Spark to load and manipulate the data.

This series requires a basic understanding of Python, Apache Spark, Delta Lake, and AWS.

By the end of the series, you will be confident enough to design a Data Lakehouse on any cloud architecture.

Check out the complete discussion on YouTube:

Make sure to Like and Subscribe.

If you are new to Data Warehousing, check out —

⚒️ Senior Data Engineer with 10+ YOE | 📽️ YouTube channel: https://www.youtube.com/@easewithdata | 📞 TopMate : https://topmate.io/subham_khandelwal
