By default, workers call an endpoint to process jobs one at a time. However, it is easy to achieve concurrency by using the concurrency option. This optional field accepts any positive number as a concurrency factor, i.e. how many calls to an endpoint can be performed in parallel.

There is no upper limit on the concurrency factor other than the practical limits of the system running the proxy, Redis™, or the endpoint, so adjust it as needed.
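To make the concurrency factor concrete, here is an illustrative sketch (not the proxy's actual implementation): with a factor of N, at most N endpoint calls are in flight at any moment, and each worker slot pulls the next job as soon as its current call finishes.

```typescript
// Illustrative only: process a list of jobs with at most `concurrency`
// handler calls running in parallel. The `handler` stands in for an
// HTTP call to the worker's endpoint.
async function processWithConcurrency<T>(
  jobs: T[],
  concurrency: number,
  handler: (job: T) => Promise<void>,
): Promise<void> {
  const queue = [...jobs];
  // Start `concurrency` loops; each pulls jobs until the queue is drained.
  const slots = Array.from({ length: concurrency }, async () => {
    while (queue.length > 0) {
      const job = queue.shift()!;
      await handler(job);
    }
  });
  await Promise.all(slots);
}
```

With `concurrency: 1` this degenerates to the default one-at-a-time behavior described above.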

We can expand our previous interface with a concurrency field:

interface WorkerMetadata {
  opts?: {
    concurrency?: number;
    // .. more options
  };
  // .. more options
}

As before, we can simply POST this object to the /workers endpoint to register a new worker or update an existing one:

curl --location 'http://localhost:8080/workers' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer 1234' \
--data '{
    "queue": "my-test-queue",
    "opts": {
      "concurrency": 50
    },
    "endpoint": {
        "url": "",
        "method": "post"
    }
}'
