I am running my Django app in a load-balanced Elastic Beanstalk environment. I want to add a Celery daemon process to do the following things:
Now, I want to know: is it the right approach to deploy Celery on the same servers that Django runs on, using Amazon SQS? If yes, how do I set that up?
And can running multiple servers on Elastic Beanstalk cause duplicate tasks because of celery beat?
It doesn't matter where you start Celery: on the same server or on a separate one, both approaches are valid. What does matter is what you use as the Celery backend. If all the Celery instances share one Redis instance or database, there is no chance that tasks will be duplicated, but if every Celery instance has its own backend, it will be chaos and disaster.
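For reference, here is a minimal sketch of what a shared-broker setup could look like. It assumes a Django project package named `proj`, Amazon SQS as the broker with credentials coming from the EC2 instance role, and a placeholder region and queue prefix; adjust those to your own environment.

```python
# proj/celery.py -- a minimal sketch, assuming a Django project package
# named "proj", SQS as the broker, and instance-role AWS credentials.
# The region and queue prefix below are placeholders.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

app = Celery("proj")

# Every Elastic Beanstalk instance points at the same shared SQS broker,
# so all workers consume from one queue rather than each keeping its own.
app.conf.broker_url = "sqs://"  # empty credentials -> use the instance role
app.conf.broker_transport_options = {
    "region": "us-east-1",          # assumed region
    "queue_name_prefix": "myapp-",  # assumed prefix to namespace the queues
}

# Pick up any CELERY_* settings from Django settings and register tasks.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

Each instance can then start a worker with `celery -A proj worker -l info`; because they all share the same broker, a given task message is delivered to only one worker.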