I'm currently developing a system for generating verification codes. The system authenticates a user, sends them a code, and stores it locally; at some later point it receives the code back from the user, after which the code is spent and deleted from the system.
The original proposal was to build this system on an SQL database, but this use case just screams Kafka. Since these codes are very short-lived (they expire in 30 seconds, so they must be spent almost immediately), it seems natural to have producers and consumers that create and consume events holding these codes.
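To make the lifecycle concrete, here's roughly what I have in mind (a minimal in-memory sketch; the class and method names are just for illustration, and in the real system the entries would travel as Kafka events rather than sit in a local map):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the code lifecycle: issue -> store -> spend-once-or-expire.
class VerificationCodeStore {
    private static final Duration TTL = Duration.ofSeconds(30); // codes expire in 30 seconds

    private record Entry(String code, Instant issuedAt) {}

    private final Map<String, Entry> byUser = new ConcurrentHashMap<>();

    // Issue a fresh code for an already-authenticated user.
    String issue(String userId) {
        String code = UUID.randomUUID().toString().substring(0, 8);
        byUser.put(userId, new Entry(code, Instant.now()));
        return code;
    }

    // Spend a code: valid only once, and only within the TTL.
    // The entry is removed either way once an attempt is made;
    // a failed attempt is exactly what would need an audit-log entry.
    boolean spend(String userId, String code) {
        Entry e = byUser.remove(userId);
        if (e == null) return false; // unknown user or already spent
        boolean fresh = Duration.between(e.issuedAt(), Instant.now()).compareTo(TTL) <= 0;
        return fresh && e.code().equals(code); // expired or mismatched codes fail
    }
}
```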
However, the issue is that all events must be logged: upon code consumption or expiration, something must be written to the audit log, and any attempt to use an invalid code must be logged as well. It seems like a waste to run a fully fledged database just to host a single AUDIT_LOG table.
Since we're using Spring Boot, we'd like some plug-and-play solution that can be easily interfaced with. I've seen suggestions to use Kafka itself to store audit logs, but we need long-term persistence and Kafka doesn't seem ideal for that. Using a NoSQL DB was also suggested, and I've seen some promising append-only solutions.
Do you folks have anything to recommend here?
"original proposal was to create this system using an SQL database"
Use both? Kafka Connect can write from Kafka topics to JDBC databases. That way you don't need to write JDBC clients in your Java code, and non-Java clients can also send data to the same Kafka topic, since the Kafka protocol is a generic binary protocol over TCP, not specific to your database.
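As a rough sketch, a JDBC sink for the audit topic could look like this (assuming the Confluent JDBC sink connector is installed; the topic name, table handling, and connection details are placeholders to adapt):

```properties
name=audit-log-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=audit-log
connection.url=jdbc:postgresql://localhost:5432/audit
connection.user=...
connection.password=...
insert.mode=insert
auto.create=true
```

With `auto.create=true` the connector creates the target table from the record schema, so your application only ever produces audit events to the topic.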
If that SQL database doesn't work out, you can still consume the same Kafka topic into a different system (Elasticsearch works great for log analysis, as do HDFS or S3 for larger datasets). Since you mention long-term persistence, MinIO is a self-hosted, S3-compatible option.