
I'm running a Django application on Google Cloud Run using the default WSGI-based setup (e.g., Gunicorn, or runserver for local dev).

To avoid blocking during long-running operations like third-party API calls, I'm planning to use Google Cloud Tasks.

Current design:

A request comes in to a Django endpoint (e.g., a webhook or ping from an external service)

Instead of processing the request inline, I enqueue a Cloud Task

That task posts to another Django endpoint within the same service, which performs the third-party API call using data passed in the task payload (roughly sketched below)

This means:

I'm not offloading the work to a separate Cloud Run service

The fetch logic is still part of the same Django service/container, just decoupled by the task
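
Concretely, the enqueue step I have in mind looks roughly like this (the project, queue, region, and target URL are placeholders, not my real values):

```python
# Sketch of enqueueing a Cloud Task that targets a route on the same service.
# Requires the google-cloud-tasks package; all names below are placeholders.
import json

from google.cloud import tasks_v2


def enqueue_fetch_task(payload: dict) -> None:
    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path("my-project", "europe-west1", "fetch-queue")

    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            # Another route on the *same* Cloud Run service, not a separate one.
            "url": "https://my-service-abc123.a.run.app/tasks/fetch/",
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(payload).encode(),
        }
    }
    client.create_task(request={"parent": parent, "task": task})
```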

My question:

Does this setup allow the third-party call to be effectively asynchronous (i.e., non-blocking to the original request), despite using WSGI and not ASGI or Celery?

When searching around, I mostly see articles and examples where Cloud Tasks are used to call a separate Cloud Run service, not another internal route in the same app.

Is this internal invocation pattern valid and scalable under WSGI, or are there caveats I should be aware of?

asked Aug 5 at 13:37

2 Answers


It depends a little on what you are trying not to block here. The task will take the same amount of time regardless of how it is initiated, so the client will not be able to make use of the result until it is complete in any case.

What I think you want is a way for the client to get a response to its request quickly rather than waiting for the task to complete, because the client does not care about the result and just wants to initiate the process. In other words, you want the long-running process not to block the response to the initial request. If so, the pattern you describe should work fine.

This pattern worked well for us on App Engine, where long-running tasks would be triggered from the original client request. The original request would receive a response, including a task identifier, as soon as the task was created. Cloud Tasks then makes a request to the long-running endpoint within the same application, which records its progress in Datastore. Cloud Tasks is agnostic about what it calls, whether that is part of the original application or not, so there is no need for an additional application or service. The client can optionally then poll an endpoint for status updates.
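
As a rough sketch of that flow in Django terms (the view names and the status-tracking helpers are illustrative, not a specific implementation):

```python
# Illustrative Django views for the "respond fast, work later" pattern.
# create_task_record, call_third_party_api, mark_task_complete and
# lookup_status are hypothetical helpers standing in for your own code.
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST


@csrf_exempt
@require_POST
def webhook(request):
    # Respond immediately; the slow work happens in the task handler below.
    task_id = create_task_record()
    enqueue_fetch_task({"task_id": task_id})  # create the Cloud Task (see the question's sketch)
    return JsonResponse({"task_id": task_id}, status=202)


@csrf_exempt
@require_POST
def fetch_handler(request):
    # Called by Cloud Tasks; runs the long third-party call and records progress.
    payload = json.loads(request.body)
    result = call_third_party_api(payload)
    mark_task_complete(payload["task_id"], result)
    # Any 2xx response tells Cloud Tasks the task succeeded; anything else is retried.
    return JsonResponse({"status": "done"})


def task_status(request, task_id):
    # Optional endpoint the client can poll for progress.
    return JsonResponse({"task_id": task_id, "status": lookup_status(task_id)})
```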

In terms of server scaling, running ASGI may allow an instance to handle more requests concurrently (particularly if they are blocked on a third-party API), which can reduce costs. But even with WSGI, Cloud Run will simply create more instances to handle more requests. To put it another way, if you were using an ASGI server and returning the result of the long-running operation in the first request, the client would still have to wait a long time for that first response; you just might be able to handle more clients per instance.

answered Aug 7 at 23:39

3 Comments

Hi there, that's really helpful, and yes, ensuring that the client receives a quick response to their request is exactly what I need (in this particular case my Django application is responding to a ping request from the third party, so I need to return a rapid 200 before the subsequent processing). I'm interested to know how you secured the endpoint that Cloud Tasks called? It seems necessary to expose the DRF POST endpoint publicly for Cloud Tasks (I'm using token validation for all other endpoints), so I'm just using an OIDC token currently, but it would be good to know if there's an alternative.
App Engine is a bit more integrated with Cloud Tasks, so it was possible to just accept Cloud Tasks requests as authenticated through configuration. I think the OIDC token is a good way to do this. If you use the Python google-cloud-tasks client, I think you can just specify a service account email - e.g. cloud.google.com/tasks/docs/samples/… (a rough verification sketch follows below these comments).
Yes, I ended up using that documentation and the cloud-tasks client seems to work well. The API integration is now running smoothly - thanks for the advice!
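
For reference, verifying the OIDC token that Cloud Tasks attaches could look roughly like this on the receiving Django endpoint (the expected audience value and the way it is wired into the view are assumptions):

```python
# Sketch of OIDC token verification for the Cloud Tasks endpoint, using
# google-auth. The audience value is a placeholder for the handler's URL.
from google.auth.transport import requests as google_requests
from google.oauth2 import id_token


def verify_cloud_tasks_token(request, expected_audience):
    auth_header = request.headers.get("Authorization", "")
    if not auth_header.startswith("Bearer "):
        return None
    token = auth_header.split(" ", 1)[1]
    try:
        # Checks the signature, expiry and audience of the token that Cloud Tasks
        # signs with the service account named in the task's oidc_token field.
        claims = id_token.verify_oauth2_token(
            token, google_requests.Request(), audience=expected_audience
        )
    except ValueError:
        return None
    # The email claim is the service account that created the token; the view
    # can reject anything that is not the expected account.
    return claims.get("email")
```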

To confirm whether that setup is valid or feasible, it may be best to consult a Google Cloud sales specialist. They can offer personalized advice and technical recommendations tailored to your application's needs, from identifying suitable use cases to helping you manage future workload costs effectively.

answered Aug 6 at 19:51

