[ ENHANCEMENT ] Automatically randomize queue #41
There are a couple of considerations here:
I don't think either of these limitations is likely to change in the near future—they're pretty fundamental to the queue's design, and would require major overhauls to re-tool.
We can probably try to assign priority by normalizing the time and manipulating the millisecond part of it (that gives us 1000 slots, which should be decent). And since the OH queue has ELK, the original entry time will be kept in the log system and can be exported normally in the full queue log (if a few seconds of inaccuracy matters at all).
The main issue with that is that the timestamp is derived from the ID. We can't freely manipulate the timestamp; it's baked in. (Changing the ID is a non-starter for a ton of reasons; it's the unique identifier of a queue entry.)
This idea is now getting even crazier: for entries added to the queue in the first x seconds, randomize the millisecond portion of their timestamp with a fixed seed after each derivation of the timestamp. This would happen every time the queue entries endpoint is requested. Not sure how feasible this is, but a fixed seed (which could be the queue opening timestamp, for example) ensures that the random millisecond generated is the same each time, while eliminating the need to store more state. This would also allow for automatic randomisation as a side effect.
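A minimal sketch of the idea above, in Python (the queue's actual language and data model aren't shown in this thread, so the function name, the `RANDOMIZE_WINDOW_S` constant, and the millisecond-timestamp representation are all assumptions). Instead of seeding a PRNG per request, it hashes (queue opening timestamp, entry ID) so the derived milliseconds are a pure function of existing state:

```python
import hashlib

# Hypothetical "first x seconds" window during which ordering is randomized.
RANDOMIZE_WINDOW_S = 30

def display_timestamp(entry_id: str, entry_ts_ms: int, queue_open_ts_ms: int) -> int:
    """Return the timestamp used for sort order.

    For entries that arrived within the opening window, the millisecond
    portion is replaced with a value derived from (queue open time, entry id),
    so every request to the entries endpoint yields the same shuffled order
    without storing any extra state. Entries outside the window keep their
    real timestamp. The entry's ID (and hence its true timestamp) is never
    modified, so the original time of entry survives in the logs.
    """
    if entry_ts_ms - queue_open_ts_ms >= RANDOMIZE_WINDOW_S * 1000:
        return entry_ts_ms  # outside the window: keep the real timestamp
    digest = hashlib.sha256(f"{queue_open_ts_ms}:{entry_id}".encode()).digest()
    millis = int.from_bytes(digest[:2], "big") % 1000  # the 1000 available slots
    return (entry_ts_ms // 1000) * 1000 + millis
```

One caveat with any scheme that only has 1000 slots: two entries in the same window can collide on the same millisecond, so a deterministic tiebreaker (e.g. the entry ID itself) would still be needed for a total order.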
Would be nice if there were an option to automatically randomize the queue order during the first x seconds after the queue opens, for students with the same priority. In our testing with 482 staff, the current system completely messes up priority.