Docker Lab · Advanced · 4 hours
Build a Streaming Pipeline with Kafka
Set up Kafka brokers, create producers and consumers, and build a real-time data pipeline for event processing.
Part of Data Engineering (Week 7)
What You'll Build
A real-time event processing pipeline with Kafka producers generating e-commerce events, consumers processing and aggregating data, and a PostgreSQL sink for analytics queries.
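To give a feel for the producer side of the pipeline, here is a minimal sketch of an event generator and Kafka producer. The event fields, topic name (`ecommerce-events`), and choice of the `kafka-python` client library are illustrative assumptions, not the lab's official code.

```python
import json
import random
import time

# Hypothetical event types -- illustrative, not from the lab spec.
EVENT_TYPES = ["page_view", "add_to_cart", "purchase"]

def make_event(user_id: int) -> dict:
    """Build a single synthetic e-commerce event."""
    return {
        "user_id": user_id,
        "event_type": random.choice(EVENT_TYPES),
        "amount": round(random.uniform(5, 200), 2),
        "ts": time.time(),
    }

def serialize(event: dict) -> bytes:
    """Kafka message values are bytes; JSON-encode the event."""
    return json.dumps(event).encode("utf-8")

if __name__ == "__main__":
    # Requires a running broker and `pip install kafka-python`.
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    for i in range(100):
        producer.send("ecommerce-events", serialize(make_event(user_id=i % 10)))
    producer.flush()  # block until all buffered messages are sent
```

Keeping event construction and serialization in plain functions, separate from the producer wiring, makes them easy to unit-test without a broker.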
Tools Used
Apache Kafka · Python · Docker Compose · PostgreSQL
Skills Practiced
Event streaming · Kafka producers/consumers · Data pipeline architecture
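The consumer side of such a pipeline typically deserializes events and aggregates them before writing to the sink. A minimal sketch, assuming the `kafka-python` client, a JSON event format, and hypothetical topic and database names:

```python
import json
from collections import defaultdict

def aggregate_purchases(events: list[dict]) -> dict:
    """Sum purchase amounts per user -- the kind of rollup a consumer
    might periodically write to the PostgreSQL sink."""
    totals: dict = defaultdict(float)
    for e in events:
        if e["event_type"] == "purchase":
            totals[e["user_id"]] += e["amount"]
    return dict(totals)

if __name__ == "__main__":
    # Requires a running broker and `pip install kafka-python psycopg2-binary`.
    # Topic, DSN, and table names below are assumptions.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "ecommerce-events",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
        group_id="analytics",
    )
    batch = []
    for msg in consumer:
        batch.append(msg.value)
        if len(batch) >= 100:
            print(aggregate_purchases(batch))  # in the lab: INSERT into PostgreSQL
            batch.clear()
```

Aggregating in a pure function again keeps the business logic testable independently of Kafka and the database.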
Prerequisites
- Intermediate Python
- Familiarity with Docker Compose
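For orientation, the broker and sink can be stood up locally with a compose file along these lines. This is a sketch only: the image choices (a KRaft single-node Bitnami Kafka, plus PostgreSQL), ports, and credentials are assumptions, not the lab's official configuration.

```yaml
# Minimal single-node sketch -- images, env vars, and credentials are assumptions.
services:
  kafka:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"
    environment:
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka:9093
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER

  postgres:
    image: postgres:16
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_PASSWORD=example
      - POSTGRES_DB=analytics
```

With this running, `docker compose up -d` brings up both services, and the producer/consumer scripts can point at `localhost:9092` and `localhost:5432`.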
Why This Matters in Real Jobs
Kafka is the backbone of real-time data infrastructure at companies like LinkedIn, Uber, and Netflix. Data engineering roles increasingly require hands-on streaming experience.
Access This Lab
This lab is part of the Data Engineering course. Enrol to get access to all labs, projects, and career support.