3

I am thinking of creating two applications. One of them (App 1) will be in Django (DRF); the other (App 2) might also be Django, but could be a more lightweight framework (maybe Flask, or plain Django, or a different language altogether, depending on how efficient the script turns out to be in Python). Either way, the idea is that App 2 needs to be standalone.

I am thinking of creating a connection between them that works in both directions.

App 1 (the main Django app) would send a request to App 2 to start a calculation (which would take quite a long time). The response to this request would just confirm that App 2 received the request and is now doing the calculation in the background, so that App 1 is not left waiting for it to finish. When App 2 finishes, it would send a request to App 1 to inform it that the calculation is done and the shared database has been updated. After that, App 1 tells the front end that everything is ready (database updated, App 2 finished calculating).

What would be the best way to enable this back-and-forth communication?

I read about Celery with Redis, but should I implement it in App 1 or in App 2 in this case? As far as I understand, I could create a task queue in App 1, and when App 1 sends the first request to App 2 (like calling an external API), the request is moved to the task queue and the response is delivered later, when App 2 finishes the calculation. Is that the "go to" approach?

I also started reading about WebSockets (Django Channels) and I'm wondering if they would be a better fit for this case.

  • @gnat I see, I will try to follow that pattern in my next questions. Could you please point out specifically what made my question wrong, so I can improve on it? I feel I got the answers I needed when asking this question, though, so I wonder how I should have phrased it differently. Thanks!
    – Alex T
    Commented Jun 11, 2021 at 10:17

2 Answers

6

The idea that App 2 makes a request back to App 1 is a common architecture to get web services to work together. Usually, this would work like this:

  • App 1 makes a request to App 2
    • the request contains a URL on App 1 to be called upon completion
  • App 2 responds to the request and begins background work
  • eventually, App 2 requests the URL provided by App 1's initial request
    • this request can either contain necessary status data, or App 2 might have an API from which App 1 can retrieve necessary data

Such a design is common in many authorization or payment flows on the web, for example, where multiple parties must work together. However, there's a lot of back-and-forth here, which might be inefficient.
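
For illustration, here is a minimal sketch of that callback pattern, assuming App 2 is a small Flask service and App 1 includes a callback URL in its request; the endpoint path and payload fields (task_id, callback_url) are made up for the example:

    # app2.py - sketch of App 2 accepting a job and calling back when done
    import threading
    import requests
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def run_calculation(task_id, callback_url):
        # placeholder for the long-running work and the shared-database update
        result = sum(i * i for i in range(10_000_000))
        # when finished, notify App 1 via the callback URL it provided
        requests.post(callback_url, json={"task_id": task_id, "result": result})

    @app.route("/calculate", methods=["POST"])
    def calculate():
        payload = request.get_json()
        threading.Thread(
            target=run_calculation,
            args=(payload["task_id"], payload["callback_url"]),
            daemon=True,
        ).start()
        # respond immediately so App 1 is not blocked waiting for the result
        return jsonify({"status": "accepted"}), 202

App 1 would then expose the callback endpoint and, once it is hit, push the "everything is ready" notification to the front end.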

A completely different approach uses message queues.

  • App 1 publishes a message on a topic
    • this happens asynchronously, so there is no initial response by App 2
    • each message has an ID
  • App 2 is listening on that topic and will eventually process the message
  • App 2 publishes a new message with results
    • the results message will have a correlation-ID that matches the ID of the request
  • App 1 is listening for results and will eventually receive a results message with the desired correlation ID

Such message queue based architectures are especially interesting in high-throughput scenarios, or in an enterprise context where you want to be able to easily add more listeners for events, instead of HTTP-style 1:1 communication.
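
As a rough sketch of what the App 1 side of this could look like with RabbitMQ and the pika library (queue names and the message payload are placeholders):

    # app1_publisher.py - publish a task and wait for a matching result
    import uuid
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="calculation.requests", durable=True)
    channel.queue_declare(queue="calculation.results", durable=True)

    correlation_id = str(uuid.uuid4())
    channel.basic_publish(
        exchange="",
        routing_key="calculation.requests",
        body=b'{"dataset_id": 42}',
        properties=pika.BasicProperties(correlation_id=correlation_id),
    )

    # consume results and match on the correlation ID set above
    def on_result(ch, method, properties, body):
        if properties.correlation_id == correlation_id:
            print("result received:", body)
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="calculation.results", on_message_callback=on_result)
    channel.start_consuming()

In a real Django app you would run the consuming part in a separate process or management command rather than blocking a request handler.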

Celery is a Python tool that uses such message queues, but simplifies them. With Celery, the Celery worker takes the role of App 2 – you must start a Celery worker server. This lets App 1 easily perform some function call as a background task. This is similar to using async functions or to starting a thread, but the Celery worker doesn't have to run on the same server.

So if you just want asynchronous background tasks, Celery is a good fit. If you already have an App 2, Celery won't help. Instead, you should implement one of the HTTP-request or message queue based approaches.
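
If the background-task route is enough, a minimal Celery sketch could look like this (assuming a Redis broker on localhost; the broker URL and the task body are just examples):

    # tasks.py - the Celery worker plays the role of "App 2"
    from celery import Celery

    app = Celery(
        "tasks",
        broker="redis://localhost:6379/0",
        backend="redis://localhost:6379/1",
    )

    @app.task
    def long_calculation(dataset_id):
        # placeholder for the slow work and the shared-database update
        total = sum(i * i for i in range(10_000_000))
        return {"dataset_id": dataset_id, "total": total}

App 1 then calls long_calculation.delay(42) from a view and returns immediately, while a worker started with "celery -A tasks worker" picks the task up.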

  • Thanks! In the case where I already have App 2, should I implement the message queue (if going that route) in App 1? I am kind of familiar with Celery when it comes to background tasks within one app, but I'm wondering how it works when you actually send requests to another app so that it acts as a background task service for App 1.
    – Alex T
    Commented Jun 11, 2021 at 10:24
  • @AlexT The message queue is usually provided by a message broker, which would be a third service. Both apps would connect to the broker to listen to topics and to send messages. Celery won't help here – you're kind of implementing your own Celery-style system.
    – amon
    Commented Jun 11, 2021 at 11:03
  • Oh, so Celery/django-rq implemented in App 1 wouldn't suffice to handle queueing requests to external apps that take a long time to respond?
    – Alex T
    Commented Jun 11, 2021 at 13:33
  • @AlexT I don't really have experience with Celery or Django, and I don't want to say something wrong. But it seems that even Celery requires that an external broker is running. Also, Celery only sends tasks to a celery worker, and doesn't assist with sending messages to App 2.
    – amon
    Commented Jun 11, 2021 at 14:14
  • @AlexT Uh, it would usually be better if the apps only communicate via one medium: either HTTP requests, a message queue, or a shared database. App 1 knows that the task is still pending if the callback URL hasn't been invoked by App 2. App 1 might then keep track of pending tasks in its own database tables. A more microservice-y way might be for App 2 to have an endpoint where info about task status can be requested.
    – amon
    Commented Jun 16, 2021 at 12:57
3

I would do it with a queue in both directions. App1 -> queue -> App2 -> queue -> App1. I would use RabbitMQ and have multiple instances of App2 if the load increased.

The problem you could have is with notifying the front end, but I guess you could have some sort of polling or push notifications when the operation finishes.
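
A sketch of the App 2 side of that two-queue design, again with RabbitMQ via pika; the queue names are illustrative:

    # app2_worker.py - consume tasks from one queue, publish results to another
    import json
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="process", durable=True)
    channel.queue_declare(queue="processed", durable=True)

    def handle_task(ch, method, properties, body):
        task = json.loads(body)
        # ... run the long calculation and update the shared database here ...
        result = {"task_id": task.get("task_id"), "status": "done"}
        ch.basic_publish(
            exchange="",
            routing_key="processed",
            body=json.dumps(result).encode(),
            properties=pika.BasicProperties(correlation_id=properties.correlation_id),
        )
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="process", on_message_callback=handle_task)
    channel.start_consuming()

App 1 runs the mirror image: it publishes to "process" and consumes "processed", which also makes it easy to add more App 2 workers later.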

  • Why is there a need to use queues in both apps?
    – Alex T
    Commented Jun 14, 2021 at 14:46
  • So the apps do not know about each other. Basically, App 1 sends tasks to the "process" queue and consumes the "processed" queue. App 2 consumes the "process" queue and sends to the "processed" queue. You can then have multiple producers and consumers.
    – Blaž Mrak
    Commented Jun 16, 2021 at 19:33
