BullMQ Proxy

Concurrency

By default, workers call the endpoint to process jobs one at a time. However, it is easy to achieve concurrency by using the concurrency option. This optional field accepts any positive number as a concurrency factor, i.e. how many calls to the endpoint can be performed in parallel.

There is no upper limit on the concurrency factor other than the practical limits of the system running the proxy, Redis™, or the endpoint; adjust it as needed.

We can expand our previous interface with a concurrency field:

interface WorkerMetadata {
  opts?: {
    concurrency?: number;
    // ... more options
  };
  // ... more options
}

As before, we can simply POST this object to the /workers endpoint to register a new worker or update an existing one:

curl --location 'http://localhost:8080/workers' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer 1234' \
--data '{
    "queue": "my-test-queue",
    "opts": {
      "concurrency": 50
    },
    "endpoint": {
        "url": "http://mydomain.dev",
        "method": "post"
    }
}'
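The same registration can be done programmatically. The following is a minimal TypeScript sketch that builds and sends the payload from the curl example above; the proxy URL, token, queue name, and endpoint URL are the illustrative values used earlier, and the buildWorkerPayload/registerWorker helpers are our own, not part of the proxy's API:

// Shape of the worker registration payload (the subset of fields used above).
interface WorkerRegistration {
  queue: string;
  opts?: { concurrency?: number };
  endpoint: { url: string; method: string };
}

// Build the same payload as the curl example.
function buildWorkerPayload(
  queue: string,
  concurrency: number,
  url: string
): WorkerRegistration {
  return {
    queue,
    opts: { concurrency },
    endpoint: { url, method: "post" },
  };
}

// Register (or update) a worker via the proxy's /workers endpoint.
async function registerWorker(
  proxyUrl: string,
  token: string,
  worker: WorkerRegistration
): Promise<Response> {
  const res = await fetch(`${proxyUrl}/workers`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify(worker),
  });
  if (!res.ok) {
    throw new Error(`Failed to register worker: ${res.status}`);
  }
  return res;
}

// Example usage, mirroring the curl call:
// await registerWorker(
//   "http://localhost:8080",
//   "1234",
//   buildWorkerPayload("my-test-queue", 50, "http://mydomain.dev")
// );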
