Background Jobs for NodeJS

BullMQ is a fast and robust background job processing library for Redis™

myQueue.ts
import { Queue } from "bullmq";

const queue = new Queue("Paint");

// "delay" is a job option (third argument), not job data:
// this job will be processed 30 seconds after it is added.
await queue.add("cars", { color: "blue" }, { delay: 30000 });
myWorker.ts
import { Worker } from 'bullmq';

const worker = new Worker('Paint', async job => {
  if (job.name === 'cars') {
    // paintCar is your own function doing the actual work.
    await paintCar(job.data.color);
  }
}, { concurrency: 100 });

Performance

  • Robust and reliable
  • Run jobs in parallel
  • Add jobs in bulk
  • Fast, 10k jobs per second
  • Scale horizontally

Loaded with functionality

  • Delayed jobs
  • Retry failed jobs
  • Job priorities
  • Rate-limited jobs
  • Job progress, logs and events

Advanced features

  • Flows
  • Repeatable jobs
  • Global events
  • Queue metrics (see the sketch below)
  • Job history (coming soon)
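
For instance, queue metrics are collected by workers and can be read back from the queue. A minimal sketch (the metrics option settings and the two-week retention value are assumptions for illustration):

metrics.ts
import { MetricsTime, Queue, Worker } from "bullmq";

// Collect metrics on the worker; the retention window is an example value.
const worker = new Worker("Paint", async job => job.data, {
  metrics: { maxDataPoints: MetricsTime.ONE_WEEK * 2 },
});

// Read the collected data points for completed jobs from the queue.
const queue = new Queue("Paint");
const completedMetrics = await queue.getMetrics("completed");
console.log(completedMetrics);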

    What is BullMQ?

    BullMQ is a lightweight, robust, and fast NodeJS library for creating background jobs and sending messages using queues. BullMQ is designed to be easy to use, but also powerful and highly configurable. It is backed by Redis, which makes it easy to scale horizontally and process jobs across multiple servers.
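
    The snippets on this page omit connection settings for brevity. As a minimal sketch, producers and workers running on different servers can share the same Redis instance through the connection option (the host and port below are placeholder values):

    connection.ts
    import { Queue, Worker } from "bullmq";

    // Placeholder Redis location; point this at your own Redis server.
    const connection = { host: "localhost", port: 6379 };

    // Producers and workers coordinate through this shared Redis instance.
    const queue = new Queue("Paint", { connection });
    const worker = new Worker("Paint", async job => job.data, { connection });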

    BullMQ is a rewrite of the popular Bull library by the same authors, with a new API, a more modern codebase written in TypeScript, and many new features and performance improvements.

    BullMQ has been successfully used in production to implement video transcoding, image processing, email sending, and many other types of background jobs.

    Open source under the generous MIT License, without any artificial limitations on the number of workers, concurrency, or the like.

    Why Message Queues?

    Decouple components

    Message queues are a great way to decouple your application components. They are also a great way to scale your application by distributing the load across multiple workers.

    Increase reliability

    Increase reliability by adding retries and delays to your jobs that communicate with other services or components.

    Offload long running tasks

    Run long running tasks in the background, and let your users continue using your application while the task is running in your pool of workers.
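
    For example, a long-running job can report how far along it is, so a frontend can display progress while the work happens in a worker. A minimal sketch (the "Transcode" queue and the transcodeChunk function are hypothetical):

    progress.ts
    import { Worker } from "bullmq";

    const worker = new Worker("Transcode", async job => {
      // transcodeChunk is a hypothetical function processing one piece of the task.
      for (let step = 1; step <= 10; step++) {
        await transcodeChunk(job.data.file, step);
        // Emits a "progress" event that QueueEvents listeners can pick up.
        await job.updateProgress(step * 10);
      }
    });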

    Schedule jobs

    Jobs can be delayed, so that they are run at a specific point in time, or you can run a job at a given interval.
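
    As a sketch, a job can be scheduled for a specific time by turning the target timestamp into a delay (the queue name, job name, and date below are only examples):

    delayed.ts
    import { Queue } from "bullmq";

    const queue = new Queue("Reports");

    // Convert an example target time into a delay in milliseconds from now.
    const runAt = new Date("2030-01-01T03:00:00Z").getTime();

    await queue.add(
      "monthly-report",
      { month: "January" },
      { delay: runAt - Date.now() },
    );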

    Divide tasks into smaller ones

    Breaking a large task into smaller, independent jobs can make it easier to scale your application as well as make it more reliable in case one of the jobs fails.
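
    One way to do this is to enqueue all the pieces in a single call with addBulk; the image-resizing split below is only an illustration:

    bulk.ts
    import { Queue } from "bullmq";

    const queue = new Queue("Resize");

    // Hypothetical example: split one large batch into many small, independent jobs.
    const files = ["a.png", "b.png", "c.png"];

    await queue.addBulk(
      files.map(file => ({ name: "resize", data: { file } })),
    );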

    Conform to existing APIs

    Limit the speed at which your application consumes external APIs by adding a queue in between and enabling a rate limit.

    Code Examples

    Check out these code snippets to get a feel for how BullMQ works. You can also check out the documentation for more information and examples.

    events.ts
    import { QueueEvents } from "bullmq";

    const queueEvents = new QueueEvents("Paint");

    // QueueEvents listeners receive a single object describing the event.
    queueEvents.on("completed", ({ jobId, returnvalue }) => {
      console.log(`Job ${jobId} completed, returned ${returnvalue}`);
    });

    queueEvents.on("failed", ({ jobId, failedReason }) => {
      console.log(`Job ${jobId} failed: ${failedReason}`);
    });

    queueEvents.on(
      "progress",
      ({ jobId, data }: { jobId: string; data: number }) => {
        console.log(`Job ${jobId} progress: ${data}`);
      }
    );
    repeatable.ts
    import { Queue } from 'bullmq';

    const queue = new Queue('Paint');

    // Repeat job once every day at 3:15 (am).
    // The pattern uses seconds-first cron syntax: second minute hour day month weekday.
    await queue.add(
      'submarine',
      { color: 'yellow' },
      {
        repeat: {
          pattern: '0 15 3 * * *',
        },
      },
    );
    ratelimit.ts
    import { Worker } from 'bullmq';

    // Process at most 10 jobs per second (1000 ms).
    const worker =
      new Worker('Paint', async job => paintCar(job), {
        limiter: {
          max: 10,
          duration: 1000,
        },
      });
    retry.ts
    import { Queue } from 'bullmq';

    const queue = new Queue('foo');

    // Retry a failed job up to 3 attempts in total,
    // with exponential backoff starting at 1 second.
    await queue.add(
      'Paint',
      { color: 'pink' },
      {
        attempts: 3,
        backoff: {
          type: 'exponential',
          delay: 1000,
        },
      },
    );
    flow.ts
    import { FlowProducer } from "bullmq";

    const flowProducer = new FlowProducer();

    // The parent job is processed only after all of its children have completed.
    const flow = await flowProducer.add({
      name: "Renovate Car",
      queueName: "renovate",
      children: [
        { name: "paint", data: "ceiling", queueName: "steps" },
        { name: "engine", data: "pistons", queueName: "steps" },
        { name: "wheels", data: "Michelin", queueName: "steps" },
      ],
    });

    Dashboard

    As your queues grow and your system becomes larger, it is important to have a good overview of what is going on. You can use BullMQ with Taskforce.sh to gain better insight into your queues and to get powerful debugging and monitoring tools for managing your jobs.

    By subscribing to Taskforce.sh you are supporting the future development of BullMQ.