Sequentially queue processing #1446
This is exactly the default behavior. Jobs are processed one by one, in the same order they were added to the queue.
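To illustrate what "one by one, in order" means here, the following is a plain-Node sketch (no Redis, no Bull API) of a concurrency-1 worker loop: each job's promise must settle before the next job starts. The function and job values are illustrative, not part of Bull.

```javascript
// Sketch of strict FIFO, one-at-a-time processing: awaiting inside the
// loop guarantees the next job starts only after the previous finished.
async function processSequentially(jobs, processor) {
  const results = [];
  for (const job of jobs) {
    // Each iteration blocks on the previous job's completion.
    results.push(await processor(job));
  }
  return results;
}

// Usage: three jobs finish in the order they were added.
processSequentially([1, 2, 3], async (n) => n * 10).then((out) => {
  console.log(out); // out is [10, 20, 30], in insertion order
});
```

A Bull worker registered with `queue.process(name, 1, handler)` behaves analogously as long as the handler's promise only resolves when the work is truly done.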
@stansv, let me provide more details.

Queue instance:

```js
const options = {
  redis: api.redis,
  defaultJobOptions: {
    attempts: 1,
    timeout: 1000 * 60 * 5,
    lifo: false,
    removeOnComplete: true,
    removeOnFail: true
  }
};
const queue = new bull(api.queues.puppeteer, options);
```

Adding a named task:

```js
await queue.add(api.analyzers.cdp, { analyze: {} });
```

Processing:

```js
queue.process(api.analyzers.cdp, 1, _onCdp);
```

What if I set different names for tasks in one queue? What is the default processing behavior?
You should create the queue and register the job processor outside the Express handler function. Only call `queue.add` inside the handler.
Named processors are just the same as defining a single processor without a name and putting a switch block inside to select a job-specific handler; if no matching processor is found, the next job will simply fail.
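The equivalence described above can be sketched in plain Node: named processors behave like one anonymous processor that switches on the job's name. The processor names, job shapes, and error message below are illustrative stand-ins, not Bull's actual internals.

```javascript
// Hypothetical named processors, keyed by job name.
const processors = {
  cdp: async (job) => `cdp:${job.data.url}`,
  screenshot: async (job) => `shot:${job.data.url}`,
};

// A single anonymous processor that dispatches by name, mirroring the
// described behavior: an unmatched job name causes the job to fail.
async function dispatch(job) {
  const processor = processors[job.name];
  if (!processor) {
    throw new Error(`Missing process handler for job type ${job.name}`);
  }
  return processor(job);
}
```

So registering `queue.process('cdp', handler)` and `queue.process('screenshot', handler)` is functionally the same as registering one `dispatch`-style processor.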
yes, will try, thank you |
Did you get it to work? I see that bull starts new jobs without finishing the previous one...
No =( |
Yes, it is strange that the queue can't be sequential...
It is actually possible to limit how many jobs are processed in parallel, but this requires the Pro version and the groups functionality (worst case, just have 1 group with max concurrency 1): https://docs.bullmq.io/bullmq-pro/groups/concurrency
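Following the linked BullMQ Pro docs, a rough sketch of the "one group, concurrency 1" workaround might look like the following. This assumes the `@taskforcesh/bullmq-pro` package (a paid add-on) with its `QueuePro`/`WorkerPro` classes; the queue name, connection details, group id, and job payloads are all placeholders, and it needs a running Redis server.

```javascript
// Sketch only: serialize all jobs by putting them in a single group
// whose concurrency is capped at 1 (per the BullMQ Pro groups docs).
const { QueuePro, WorkerPro } = require('@taskforcesh/bullmq-pro');

const connection = { host: '127.0.0.1', port: 6379 };

async function main() {
  const queue = new QueuePro('sequential', { connection });

  // All jobs share one group id, so at most one is processed at a time.
  await queue.add('task', { n: 1 }, { group: { id: 'only-group' } });
  await queue.add('task', { n: 2 }, { group: { id: 'only-group' } });

  const worker = new WorkerPro(
    'sequential',
    async (job) => {
      // process job.data here, one job at a time
    },
    { group: { concurrency: 1 }, connection }
  );
}
```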
I have the same problem using Nest.js.
Same problem here. Jobs run in parallel within the queue. How is this a queue in any way, if it just executes jobs by chance?
@trsh running jobs in parallel is a feature, not a problem. I already told you that you can limit concurrency by limiting the number of workers and their concurrency factors.
In fact, you can even process jobs manually if you want to have full control: https://github.com/OptimalBits/bull/blob/develop/PATTERNS.md#manually-fetching-jobs |
@manast can you point me in the direction of how I can do that? I did not find a thing, no match for it. P.S. Features should have an on/off switch, no?
Almost got happy about |
You are probably not implementing the processor correctly. I suggest you that you start using BullMQ instead, there is more documentation than Bull (https://docs.bullmq.io) and plenty of tutorials (https://blog.taskforce.sh) |
@manast it's not an option for the time being. Just tell people who see this issue how to limit concurrency:

```ts
queue.process('process', 1, async (job: Job<EventJobData>) => {
  await something.to.compleye();
  return Promise.resolve();
});
```
@trsh If you read the documentation, it is explained in several places. Your example should work, but I guess the reason you think the jobs are executing in parallel is that `something.to.compleye()` is not implemented correctly. The best thing would be for you to write a test case that does not hold, and then we can take it from there.
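The kind of test case suggested above can be sketched in plain Node without Redis: run jobs through a concurrency-1 loop while tracking how many are active at once, then assert the count never exceeds 1. The helper name and trace object are invented for this illustration.

```javascript
// Sketch: a concurrency-1 worker loop instrumented to record the
// maximum number of simultaneously active jobs.
async function runWithConcurrencyOne(jobs, processor, trace) {
  for (const job of jobs) {
    trace.active += 1;
    trace.maxActive = Math.max(trace.maxActive, trace.active);
    await processor(job); // next job cannot start until this resolves
    trace.active -= 1;
  }
}
```

If a processor forgets to `await` its real work (resolving immediately instead), the loop above still looks sequential, but the actual work overlaps. That is the failure mode being suggested: the handler's promise resolves before the job is truly done.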
Ok I will. My |
You have to set up the … Just set the … (see lines 51 to 60 in 60fa88f).
Not sure it would work because if you set |
Hi guys, we recommend using BullMQ, as all new features will be implemented there; for sequential execution we have a PR. Now that we have also added a global concurrency feature in BullMQ, it's possible for us to bring this new functionality.
Is it possible to consume one queue (single node) sequentially? Which `Bull.QueueOptions` enable this behavior? For example, I add 10 tasks at the same time and want to process them one by one, without delay and without parallelism.