Limitation on the number of events for hepevt #1356
Comments
Probably this: DD4hep/DDG4/plugins/Geant4EventReaderHepEvt.cpp, lines 183 to 184 in 665f7c4
Hi @BrieucF, did this answer your question? Would you like to increase this limit?
Hi Andre, I am afraid I do not fully understand the meaning of the two comments you quoted... If that limit is just there to stay within what is deemed reasonable, then yes, I would like to increase it. Some beam-induced background simulations provide a lot of particles even after some filtering, but the simulation can still be run in a decent amount of time. If there is a deeper reason for this limit, then we will have to find workarounds.
Coverity static analysis complained that we read a value from a file and, without checking it, used it as a loop boundary, so someone put in a limit that seemed reasonable at the time. If you think a larger number is also reasonable, we can increase the limit. Regardless of the limit, the time and memory needed to simulate one event should be kept in mind.
Indeed, time and memory should be kept in mind, but providing a "sane" input file is somewhat the user's responsibility. I also do not see such a limit for e.g. the HepMC3 reader. So yes, if a limit has to be kept for hepevt, it would be nice to increase it to e.g. 50M. Otherwise, I am also happy if we just remove it.
A PR to increase the limit is welcome! HepMC3 does its own validation, or it is at least outside our purview.
Here it is: #1359
Hi, what was the reason for this limitation to 1M particles with the hepevt format?
DD4hep/DDG4/plugins/Geant4EventReaderHepEvt.cpp, lines 186 to 189 in 665f7c4