potential memory leak #270
Did it start happening with the new changes in the explorer introduced for the upcoming HF?
No, it's been going on ever since I've been running xmrchain, honestly. I just haven't really been concerned, because restarting it "fixes" it and it otherwise seems to run fine, so I didn't think it was critical. It's been running since I made that first post above, and once again it's at 75% memory.
Can you try running it bare bones, without any extra options?
Well, if I kill the API then some stupid services that don't understand decentralization will stop working, so I'll disable everything except that.
Ok. I will try to run the explorer, simulate high traffic, and see how it goes.
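One crude way to simulate traffic locally is a loop of HTTP requests against the explorer's port. This is just a sketch: the URL assumes a local instance on port 8888 (the port used elsewhere in this thread), and hitting pages that do real work (block/tx lookups) would stress it more than the front page.

```shell
# Crude traffic generator: fire repeated GET requests at a URL.
# simulate_traffic <url> <number-of-requests>
simulate_traffic() {
  url="$1"; requests="$2"
  i=0
  while [ "$i" -lt "$requests" ]; do
    # -s: quiet, -o /dev/null: discard body; we only care about load.
    curl -s -o /dev/null "$url"
    i=$((i + 1))
  done
}

# Example (assumes a local explorer is listening on 8888):
# simulate_traffic "http://127.0.0.1:8888/" 1000
```

Watching the explorer's memory while this runs, versus while it idles, should show whether the growth is traffic-driven.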
Well, I ran it with only the API enabled, and it's currently at 57% RAM.
Still a lot. Could you share the settings/options that you use for the explorer?
This is what I normally run:

```
./xmrblocks -p 8888
```

Can I revert my settings to the ones above, or do you still need me to run it minimal?
You can run it normally. I will try to investigate what's happening with and without those settings. |
Another possible memory-leak-related issue: #308
xmrchain runs on a box with 64 GB RAM. Using top, I can see that the mainnet explorer (the one that gets the most traffic) ends up at 74% memory utilization over time. I occasionally restart it because that amount seems absurd given how clean everything looks and feels in the explorer. After a restart, it's back at < 5% or so.
Any thoughts on how I could help hunt this down? Perhaps right around a HF is not the time for this, but I just wanted to let you know in case I haven't already.
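One low-effort way to start hunting this down is to log the process's resident set size (RSS) at a fixed interval, so you can later see whether memory grows steadily or in bursts tied to traffic. A minimal sketch (the PID, interval, and sample count passed in are placeholders, not anything from the explorer itself):

```shell
# Append "unix-timestamp rss-in-kb" lines for a process at a fixed interval.
# log_rss <pid> <interval-seconds> <sample-count>
log_rss() {
  pid="$1"; interval="$2"; count="$3"
  i=0
  while [ "$i" -lt "$count" ]; do
    # ps -o rss= prints the resident set size in KB without a header.
    rss_kb=$(ps -o rss= -p "$pid" | tr -d ' ')
    printf '%s %s\n' "$(date +%s)" "$rss_kb"
    i=$((i + 1))
    [ "$i" -lt "$count" ] && sleep "$interval"
  done
}

# Example: sample this shell's own RSS three times, one second apart.
log_rss "$$" 1 3
```

Redirecting the output to a file over a day or two of normal traffic would give a growth curve that is much easier to reason about than occasional top readings.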