- An event-driven architecture built on Kafka, MongoDB, and PostgreSQL handles data management, providing real-time change tracking and auditing of customer data.
- A PostgreSQL trigger on the customer table fires on INSERT, UPDATE, and DELETE operations and publishes each change through the LISTEN/NOTIFY mechanism.
- A Spring Boot listener, CustomerChangeListener, picks up these database notifications and sends structured events to Apache Kafka through KafkaProducerService.
- A Kafka topic named customer_events carries the customer change events; KafkaProducerService publishes to it and KafkaConsumerService listens on it.
- Events received by KafkaConsumerService are stored in a MongoDB collection called customer_history, which captures details about changes for auditing.
- The MongoDB customer_history collection serves as the repository of historical customer changes, recording who made the change, what was altered, when it occurred, and why (a possible document shape is sketched after this list).
- A project structure must be established, and the Maven pom.xml file updated with dependencies for Spring Boot, PostgreSQL, MongoDB, and Kafka.
- Application properties need to be configured to connect to PostgreSQL, MongoDB, and the Kafka broker (a sample configuration appears after this list).
- The main application class, CustomerTrackingApplication.java, bootstraps and runs the service (shown after this list).
- CustomerController.java exposes CRUD endpoints for customer data, triggering the underlying database actions and, through the pipeline above, the Kafka notifications (a sketch follows this list).
- CustomerService.java contains the business logic for managing customer data and interacts with PostgreSQL and Kafka (sketched below).
- A history table and trigger must be created in PostgreSQL to log all changes to the customer table (see the trigger sketch after this list).
- CustomerChangeListener.java listens for PostgreSQL notifications and forwards the relevant data to Kafka (see the listener sketch below).
- KafkaProducerService and KafkaConsumerService handle publishing and consuming the customer change messages, keeping the history in MongoDB accurate (both are sketched below).
- All changes (insertions, updates, deletions) are stored in the customer_history collection in MongoDB.
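The sketches below illustrate one plausible shape for each of the pieces listed above; class and topic names from the list are kept, while hosts, credentials, channel names, and field names are assumptions. First, a sample application.properties for the PostgreSQL, MongoDB, and Kafka connections:

```properties
# PostgreSQL (customer table and trigger) -- values are placeholders
spring.datasource.url=jdbc:postgresql://localhost:5432/customerdb
spring.datasource.username=app
spring.datasource.password=secret

# MongoDB (customer_history collection) -- values are placeholders
spring.data.mongodb.uri=mongodb://localhost:27017/customer_tracking

# Kafka broker and consumer group -- values are placeholders
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=customer-history
```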
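CustomerTrackingApplication.java can stay minimal; @EnableScheduling is included only because the CustomerChangeListener sketch further down polls for notifications on a schedule, and is an assumption:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableScheduling // needed for the scheduled polling in the CustomerChangeListener sketch
public class CustomerTrackingApplication {

    public static void main(String[] args) {
        SpringApplication.run(CustomerTrackingApplication.class, args);
    }
}
```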
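A minimal sketch of CustomerController; the /api/customers paths and the Customer entity are assumptions based on the description above:

```java
import org.springframework.web.bind.annotation.*;

import java.util.List;

@RestController
@RequestMapping("/api/customers")
public class CustomerController {

    private final CustomerService customerService;

    public CustomerController(CustomerService customerService) {
        this.customerService = customerService;
    }

    @GetMapping
    public List<Customer> findAll() {
        return customerService.findAll();
    }

    @PostMapping
    public Customer create(@RequestBody Customer customer) {
        return customerService.create(customer);
    }

    @PutMapping("/{id}")
    public Customer update(@PathVariable Long id, @RequestBody Customer customer) {
        return customerService.update(id, customer);
    }

    @DeleteMapping("/{id}")
    public void delete(@PathVariable Long id) {
        customerService.delete(id);
    }
}
```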
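CustomerService could look roughly like this, assuming a hypothetical Spring Data JPA CustomerRepository; in this sketch the Kafka side is driven indirectly by the database trigger rather than by the service calling the producer itself:

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import java.util.List;

@Service
public class CustomerService {

    // Hypothetical Spring Data JPA repository over the PostgreSQL customer table.
    private final CustomerRepository repository;

    public CustomerService(CustomerRepository repository) {
        this.repository = repository;
    }

    public List<Customer> findAll() {
        return repository.findAll();
    }

    @Transactional
    public Customer create(Customer customer) {
        // Saving through JPA fires the PostgreSQL trigger, which drives the
        // LISTEN/NOTIFY -> Kafka -> MongoDB pipeline described above.
        return repository.save(customer);
    }

    @Transactional
    public Customer update(Long id, Customer customer) {
        customer.setId(id); // assumes a Customer entity with an id field
        return repository.save(customer);
    }

    @Transactional
    public void delete(Long id) {
        repository.deleteById(id);
    }
}
```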
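The history table DDL is not shown here, but the following sketch illustrates how the change-capture trigger and its NOTIFY function might be installed at startup; the customer_changes channel name and the use of a CommandLineRunner are assumptions (a migration tool such as Flyway would serve the same purpose):

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class CustomerTriggerInitializer implements CommandLineRunner {

    private final JdbcTemplate jdbc;

    public CustomerTriggerInitializer(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    @Override
    public void run(String... args) {
        // Function that serializes the affected row and publishes it on the
        // "customer_changes" channel via pg_notify.
        jdbc.execute("""
                CREATE OR REPLACE FUNCTION notify_customer_change() RETURNS trigger AS $$
                BEGIN
                  PERFORM pg_notify(
                    'customer_changes',
                    json_build_object(
                      'operation', TG_OP,
                      'data', row_to_json(COALESCE(NEW, OLD))
                    )::text
                  );
                  RETURN COALESCE(NEW, OLD);
                END;
                $$ LANGUAGE plpgsql;
                """);

        // Row-level trigger covering INSERT, UPDATE and DELETE on the customer table.
        jdbc.execute("DROP TRIGGER IF EXISTS customer_change_trigger ON customer");
        jdbc.execute("""
                CREATE TRIGGER customer_change_trigger
                AFTER INSERT OR UPDATE OR DELETE ON customer
                FOR EACH ROW EXECUTE FUNCTION notify_customer_change()
                """);
    }
}
```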
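CustomerChangeListener might then LISTEN on that channel over a dedicated JDBC connection and hand each notification payload to KafkaProducerService; the polling interval and connection details are assumptions:

```java
import jakarta.annotation.PostConstruct;
import org.postgresql.PGConnection;
import org.postgresql.PGNotification;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

@Component
public class CustomerChangeListener {

    private final KafkaProducerService producer;
    private Connection connection;

    public CustomerChangeListener(KafkaProducerService producer) {
        this.producer = producer;
    }

    @PostConstruct
    void startListening() throws SQLException {
        // Dedicated connection that issues LISTEN on the channel used by the trigger.
        // URL and credentials are placeholders.
        connection = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/customerdb", "app", "secret");
        try (Statement statement = connection.createStatement()) {
            statement.execute("LISTEN customer_changes");
        }
    }

    // Periodically drain pending notifications and forward each JSON payload to Kafka.
    @Scheduled(fixedDelay = 1000)
    void poll() throws SQLException {
        PGConnection pgConnection = connection.unwrap(PGConnection.class);
        PGNotification[] notifications = pgConnection.getNotifications();
        if (notifications == null) {
            return;
        }
        for (PGNotification notification : notifications) {
            producer.send(notification.getParameter());
        }
    }
}
```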
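KafkaProducerService can be a thin wrapper around KafkaTemplate that publishes each change event to the customer_events topic; the plain String payload type is an assumption:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private static final String TOPIC = "customer_events";

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publish one JSON change event to the customer_events topic.
    public void send(String changeEventJson) {
        kafkaTemplate.send(TOPIC, changeEventJson);
    }
}
```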
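KafkaConsumerService then writes each event it receives into the customer_history collection; storing the raw JSON as a BSON Document with an added receivedAt timestamp is one possible approach, and the consumer group id is an assumption:

```java
import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

import java.time.Instant;

@Service
public class KafkaConsumerService {

    private final MongoTemplate mongoTemplate;

    public KafkaConsumerService(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Consume change events from customer_events and persist them in the
    // customer_history collection for auditing.
    @KafkaListener(topics = "customer_events", groupId = "customer-history")
    public void consume(String changeEventJson) {
        Document event = Document.parse(changeEventJson);
        event.put("receivedAt", Instant.now().toString());
        mongoTemplate.insert(event, "customer_history");
    }
}
```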
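Finally, one possible document shape for customer_history entries, covering who made the change, what was altered, when, and why; every field name here is an assumption:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

import java.time.Instant;

@Document(collection = "customer_history")
public class CustomerHistory {

    @Id
    private String id;
    private Long customerId;     // which customer record was affected
    private String operation;    // INSERT, UPDATE or DELETE
    private String changedBy;    // who made the change
    private String changeReason; // rationale supplied with the change
    private Object before;       // row state before the change, if any
    private Object after;        // row state after the change, if any
    private Instant changedAt;   // when the change occurred

    // Getters and setters omitted for brevity.
}
```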