FastAPI Event-Driven Development With Kafka, Zookeeper and Docker Compose: A Step-by-Step Guide, Part 3

Ahmed Nafies
7 min read · Apr 23

Digging deeper into Kafka

In part 2, we created a simple FastAPI application that integrates with Apache Kafka to produce messages whenever an operation occurs. We used Docker and Docker Compose to containerise the application, produced a test message, and confirmed it arrived using the console consumer. In this tutorial, we will dive deeper into Kafka to understand how it works behind the scenes, and in particular how it provides data persistence and event-ordering guarantees that make it well-suited for applications requiring fast, reliable, and scalable data processing and messaging.

Photo of a Franz Kafka head by Ross Sokolovski

What kind of problems does Kafka solve in the real world?

Here are some real-world scenarios where Kafka is essential:

  1. Real-time data processing is crucial in industries like finance, healthcare, and transportation. Kafka enables this by acting as a messaging system for quick decision-making.
  2. In e-commerce, Kafka processes and stores vast amounts of data for personalised recommendations and promotions.
  3. Social media platforms utilise Kafka for real-time analysis of user-generated content and ad optimisation.
  4. In IoT, Kafka collects and processes sensor data for monitoring and controlling devices.
  5. Fraud detection systems process transactional data in real time through Kafka, helping to identify and prevent suspicious activity.

In general, Kafka is well-suited for any application that requires fast, reliable, and scalable data processing and messaging. One of Kafka's most useful features is its combination of data persistence and event-ordering guarantees: messages are written durably to the topic's log, and messages that share a key always land on the same partition, where they are read in the order they were produced.
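To make the ordering guarantee concrete, here is a minimal sketch of a keyed producer. It assumes aiokafka as the client library, a broker reachable at localhost:9092, and a topic named orders; these names are illustrative and not taken from the earlier parts, so adjust them to your own setup.

```python
import asyncio
import json

from aiokafka import AIOKafkaProducer

# Illustrative settings: adjust to match your docker-compose setup.
BOOTSTRAP_SERVERS = "localhost:9092"
TOPIC = "orders"


async def produce_keyed_events() -> None:
    producer = AIOKafkaProducer(
        bootstrap_servers=BOOTSTRAP_SERVERS,
        # Serialize keys and values so plain Python objects can be passed below.
        key_serializer=lambda k: k.encode("utf-8"),
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    await producer.start()
    try:
        # All three events share the key "order-42", so Kafka writes them to
        # the same partition and consumers will read them in this exact order.
        for event in ("order_created", "order_fulfilled", "payment_captured"):
            await producer.send_and_wait(TOPIC, key="order-42", value={"event": event})
    finally:
        await producer.stop()


if __name__ == "__main__":
    asyncio.run(produce_keyed_events())
```

Persistence comes from the fact that these events stay in the topic's log on disk (subject to the retention settings) even after they have been consumed, so new consumers can replay them later.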

Let's take e-commerce as an example

A customer creates an order, a merchant fulfils it, the order is captured, and then the customer receives it, finds a problem, and requests a refund. Each of these steps is an event, and the order in which they happened matters.
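Because Kafka persists every one of these events, a consumer can replay the full history of an order at any time. The sketch below is a minimal, illustrative consumer, again assuming aiokafka, a local broker, and an orders topic; `auto_offset_reset="earliest"` tells it to start from the beginning of the retained log rather than only reading new messages.

```python
import asyncio
import json

from aiokafka import AIOKafkaConsumer

# Illustrative settings: adjust to match your docker-compose setup.
BOOTSTRAP_SERVERS = "localhost:9092"
TOPIC = "orders"


async def replay_order_history() -> None:
    consumer = AIOKafkaConsumer(
        TOPIC,
        bootstrap_servers=BOOTSTRAP_SERVERS,
        group_id="order-history-replay",
        # Start from the oldest retained message instead of only new ones,
        # which is what lets us rebuild an order's timeline after the fact.
        auto_offset_reset="earliest",
        key_deserializer=lambda k: k.decode("utf-8") if k else None,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    await consumer.start()
    try:
        async for msg in consumer:
            # Events that share a key arrive in the order they were produced,
            # so this prints each order's lifecycle in sequence.
            print(f"partition={msg.partition} offset={msg.offset} "
                  f"key={msg.key} event={msg.value}")
    finally:
        await consumer.stop()


if __name__ == "__main__":
    asyncio.run(replay_order_history())
```

The loop runs until interrupted; in a real service you would stop the consumer on shutdown, but for exploring the log this is enough to watch an order's events come back in the order they were written.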
