Supercharging Live Alerts With Redis And Kafka

Published On: 20 March 2025

Optimizing Live Alerts with Kafka and Redis

Introduction

Real-time systems demand both speed and reliability, especially when dealing with high-frequency sensor data. This was the challenge we faced while implementing live alerts. Initially, the system relied on a database-centric approach: data was stored, and a cron job ran every 5 minutes to evaluate alert conditions. This introduced significant delays, and the growing volume of IoT data made the process sluggish and unscalable. To address these issues, we revamped the system with Kafka and Redis, achieving near-instantaneous processing and alerting capabilities.

 

Challenges with the Initial Setup

  1. Delayed Alerts: Cron jobs executed every 5 minutes introduced inherent delays, making the system incapable of real-time alerts.
  2. High Database Load: With rows of sensor data continuously flooding the database, querying and processing became a bottleneck.
  3. Scalability Issues: As the data volume grew, the time to process alerts increased proportionally, leading to performance degradation.

 

Transitioning to Kafka and Redis

To overcome these bottlenecks, the system was re-engineered with a focus on real-time data streaming and processing. Kafka was introduced to handle high-throughput, distributed message streaming, and Redis was chosen for its speed and versatility in maintaining a replica-like data store.

  • Kafka for Message Streaming:
    • Sensor data was streamed in real-time to Kafka topics, ensuring a continuous flow of data without delays.
    • Kafka’s partitioning and fault tolerance allowed the system to scale efficiently.
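To make the streaming step concrete, here is a minimal sketch of the producer side, assuming a kafka-python client, a local broker, and a hypothetical sensor-readings topic (the broker address, topic, and field names are illustrative, not the exact ones used in our pipeline):

import json
import time

from kafka import KafkaProducer  # pip install kafka-python

# Hypothetical broker address and topic name, for illustration only.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_reading(sensor_id: str, value: float) -> None:
    """Stream a single sensor reading to Kafka the moment it arrives."""
    producer.send(
        "sensor-readings",
        {"sensor_id": sensor_id, "value": value, "ts": time.time()},
    )

publish_reading("temp_01", 42.7)
producer.flush()  # ensure buffered messages are actually delivered

Because the producer only appends to a Kafka topic, ingestion stays fast regardless of how many consumers later evaluate the data.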

 

  • Redis as a Real-Time Data Store:
    • Redis was utilized to create a database-like replica of sensor data.
    • Redis Pub/Sub ensured that any changes or updates made by users were instantly published and synced.
    • Redis Streams enabled efficient handling of event queues for processing and alerting.

 

Implementing Redis for Real-Time Processing

The core of the revamped system was Redis, extensively used to solve several key challenges:

  • Database-Like Replica:
    • Because users could change their data at any time, it was important to maintain an almost identical, database-like structure in Redis using hash sets (HSET). A Pub/Sub system captured those changes instantaneously, reflected them, and stored them permanently in Redis.
    • Hashes sharing a common key prefix were queried dynamically through RedisJSON to retrieve the relevant sensor data, ensuring efficient reads across multiple data points.

 

  • Pub/Sub for Data Updates:
    • A publish/subscribe mechanism ensured that any update or change made by users was immediately reflected in Redis.
    • These updates were captured and permanently stored in the database through a dedicated listener service.

 

  • Condition Evaluation for Alerts:
    • Kafka messages were polled live to check predefined alert conditions.
    • If conditions were met, alert notifications were pushed to Redis Streams for email processing.
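A minimal sketch of that evaluation loop, assuming a kafka-python consumer, a hypothetical sensor-readings topic, and alert thresholds already replicated into Redis hashes (the key layout and field names are illustrative):

import json

import redis
from kafka import KafkaConsumer

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

consumer = KafkaConsumer(
    "sensor-readings",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    reading = message.value  # e.g. {"sensor_id": "temp_01", "value": 42.7, "ts": ...}

    # Alert configuration replicated from the database into a hash (hypothetical key layout).
    config = r.hgetall(f"alertconfig:{reading['sensor_id']}")
    if not config:
        continue  # no alert configured for this sensor

    if reading["value"] > float(config["threshold"]):
        # Condition met: push the alert onto a Redis Stream for email processing.
        r.xadd("stream:email-alerts", {
            "sensor_id": reading["sensor_id"],
            "value": reading["value"],
            "threshold": config["threshold"],
        })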

 

Walkthrough of the Process

1) Enabling the redis-stack server

  • First, we need a redis-stack server, which adds modules such as RedisJSON and RediSearch on top of core Redis.
 

 

  • To check whether the redis-stack server is installed and its modules are actually loaded, query the server from the terminal or from code, as in the sketch below.
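The original terminal snapshot is not reproduced here, so below is a minimal check using the redis-py client: it pings the server and lists the loaded modules (redis-stack ships RediSearch, RedisJSON, and others), so an empty module list means you are talking to plain Redis:

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

print(r.ping())         # True if the server is reachable
print(r.module_list())  # modules loaded by the server; empty on a plain Redis install

The same information is available from the terminal with redis-cli MODULE LIST.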
 

 

 

2) Django signals to publish changes to Redis

  • Django side: a signal handler (for example, post_save) publishes each relevant model change to a Redis Pub/Sub channel, as sketched below.
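A minimal sketch of the Django side, assuming a hypothetical AlertConfig model, fields, and channel name (the actual model and channel in the project will differ):

import json

import redis
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import AlertConfig  # hypothetical model, for illustration

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

@receiver(post_save, sender=AlertConfig)
def publish_alert_config_change(sender, instance, created, **kwargs):
    """Publish every saved change to a Pub/Sub channel the moment it happens."""
    payload = {
        "id": instance.pk,
        "sensor_id": instance.sensor_id,
        "threshold": instance.threshold,
        "created": created,
    }
    r.publish("alert-config-updates", json.dumps(payload))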
 
  • A listener service subscribes to this channel and continuously consumes the messages, then permanently stores each one in a Redis hash key (see the sketch after the next bullet).

 

   
  • The messages are stored under hash keys that share a common prefix. This matters because the Redis search index used for querying is defined over keys with a common prefix, so only consistently prefixed keys remain queryable. A sketch of the listener is shown below.
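A minimal sketch of the listener service, assuming the same hypothetical channel as above and a shared alertconfig: key prefix (the prefix and field names are illustrative):

import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

pubsub = r.pubsub()
pubsub.subscribe("alert-config-updates")

for message in pubsub.listen():
    if message["type"] != "message":
        continue  # skip subscribe confirmations

    data = json.loads(message["data"])

    # Store each record under a common prefix so it falls inside the search index defined below.
    r.hset(f"alertconfig:{data['sensor_id']}", mapping={
        "sensor_id": data["sensor_id"],
        "threshold": data["threshold"],
    })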

 

  • Kafka consumers are set up and the streamed messages are consumed. An example of how to set up a Redis index and query it effectively is shown below.
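A minimal sketch of the index setup with redis-py, using a hash-based index over the hypothetical alertconfig: prefix from the listener above (the index name and schema are illustrative):

import redis
from redis.commands.search.field import NumericField, TagField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# FT.CREATE: index every hash whose key starts with the common prefix.
r.ft("idx:alertconfig").create_index(
    (
        TagField("sensor_id"),
        NumericField("threshold"),
    ),
    definition=IndexDefinition(prefix=["alertconfig:"], index_type=IndexType.HASH),
)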

 

 

 

An index is created with a name of our choosing (effectively a copy of a typical database index).

  • An example of how to query the index:
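A minimal sketch of querying that index, again assuming the hypothetical idx:alertconfig index and schema from above:

import redis
from redis.commands.search.query import Query

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# FT.SEARCH: fetch the configuration for one sensor via the tag field.
result = r.ft("idx:alertconfig").search(Query("@sensor_id:{temp_01}"))

for doc in result.docs:
    print(doc.id, doc.threshold)  # e.g. alertconfig:temp_01 40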
 

 

  • When an alert is raised, an alert entry is written to the notification stream and the database stream. An example follows.
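A minimal sketch of raising an alert, assuming two hypothetical streams: one read by the email/notification worker and one read by a persistence worker that writes the alert back to the database:

import time

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def raise_alert(sensor_id: str, value: float, threshold: float) -> None:
    """Record the alert on both streams so notification and persistence stay decoupled."""
    entry = {
        "sensor_id": sensor_id,
        "value": value,
        "threshold": threshold,
        "raised_at": time.time(),
    }
    r.xadd("stream:email-alerts", entry)  # picked up by the email worker
    r.xadd("stream:db-alerts", entry)     # picked up by the persistence worker

raise_alert("temp_01", 42.7, 40.0)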
 

 

Architectural Diagram

System Architecture

 

Conclusion

By shifting from a database-cron system to a Kafka-Redis architecture, the live alert system achieved real-time processing, scalability, and reliability. Leveraging RedisJSON for querying, Pub/Sub for updates, and Streams for notifications proved pivotal to the system’s success. This solution not only addressed the immediate performance issues but also laid a robust foundation for future scalability and enhancements.

 


That’s all for this blog
