I'm tempted to close this as wontfix. Hashes are u64s, so even storing 10k hashes takes ~100kB.
We currently only hash circuits small enough to be inserted on the queue, so the # of circuits doesn't grow that quickly.
For example, running the barenco_tof_10 example for 100 s, the hash table grows to ~3.2k circuits, while peak memory consumption rises to 390MB.
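A quick sketch of the arithmetic above (the helper name is made up for illustration; a real `HashSet<u64>` adds some per-entry overhead on top of the raw 8 bytes, so actual usage is somewhat higher):

```rust
// Back-of-envelope size of a table of u64 circuit hashes:
// raw payload only, ignoring HashSet bucket/metadata overhead.
fn approx_hash_table_bytes(num_hashes: usize) -> usize {
    num_hashes * std::mem::size_of::<u64>()
}

fn main() {
    // 10k hashes -> 80_000 bytes of raw data, i.e. on the order of
    // ~100kB once container overhead is included -- negligible next
    // to the ~390MB peak seen in the barenco_tof_10 run.
    println!("{}", approx_hash_table_bytes(10_000)); // 80000
}
```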
We had a PR that did this (#133 e33fc6a), but had to revert it in #144 as it was hurting optimisation performance.
Done right, it should not have any impact on optimisation.