I'm working on a daemon, run in a container, that continuously grabs live data and feeds it to redis to be consumed by other parties. I've been intending to attach a REST API that would provide info on the status of the redis connection and allow the daemon to be instructed to publish to other redis instances as well.
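For context, the core of the daemon is just a loop like the following (the data source, channel name, and connection details are placeholders for my actual setup):

```python
import json
import time

import redis  # redis-py client


def fetch_live_data() -> dict:
    """Hypothetical stand-in for whatever actually grabs the live data."""
    return {"ts": time.time(), "value": 42}


def publish_loop(client: redis.Redis, channel: str = "live-data") -> None:
    """Continuously grab data and publish it for other consumers."""
    while True:
        client.publish(channel, json.dumps(fetch_live_data()))
        time.sleep(1.0)  # poll interval is arbitrary here


if __name__ == "__main__":
    publish_loop(redis.Redis(host="localhost", port=6379))
```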
After getting a basic API running with the Python framework falcon, I realised that this daemon now needs to do two things at the same time, so I will need several threads or some other asynchronous mechanism.
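The API side is currently little more than a status endpoint, roughly like this (using falcon 3.x; the route and the response shape are just what I have in mind, not anything final):

```python
import falcon  # falcon 3.x exposes falcon.App (a WSGI callable)
import redis


class StatusResource:
    """Reports whether the daemon's redis connection is alive."""

    def __init__(self, client: redis.Redis):
        self._client = client

    def on_get(self, req, resp):
        try:
            self._client.ping()  # cheap round-trip to check the connection
            resp.media = {"redis": "connected"}
        except redis.RedisError:
            resp.status = falcon.HTTP_503
            resp.media = {"redis": "down"}


def create_app(client: redis.Redis) -> falcon.App:
    app = falcon.App()
    app.add_route("/status", StatusResource(client))
    return app
```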
This raises two questions:
1) Is this combination of a continuous daemon and a REST API a common or sensible design? It seems valuable to me to be able to interact with the daemon over the network, to allow stronger integration tests etc., but maybe this is overkill and more trouble than it's worth? The alternative, I guess, is to configure it via a config file and monitor its health based on the output data.
2) If I were to go ahead with this design, which is the more sensible mechanism for achieving the dual behaviour: threading or asyncio? Maybe there is no major difference, but I'd love to hear a suggestion or two. A threading sketch of what I have in mind follows below.
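For concreteness, the threading version would look something like this, reusing create_app() and publish_loop() from the sketches above (wsgiref is just for illustration; a real deployment would presumably sit behind gunicorn or similar):

```python
import threading
from wsgiref.simple_server import make_server

import redis

# Reuses create_app() and publish_loop() from the earlier sketches.


def main() -> None:
    client = redis.Redis(host="localhost", port=6379)

    # Serve the falcon app from a daemon thread so it dies with the process.
    httpd = make_server("", 8000, create_app(client))
    threading.Thread(target=httpd.serve_forever, daemon=True).start()

    # The publish loop occupies the main thread.
    publish_loop(client)


if __name__ == "__main__":
    main()
```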
Many thanks, Kerzane.