Upgraded to Pro, upload still stuck #84
Also, speaking of running out of space: any chance Hashbase could host a ~60GB dat archive? That is beyond the usual 10GB Pro limit.

Right now dat://datpedia.org is just 3.5GB, because it's only Simple English Wikipedia. I'm currently processing the real Wikipedia (full article text, downsized images), which will probably come to ~60GB. It would be cool to keep that dat online without running an EC2 instance continuously, which is kind of expensive. (Instead, my plan is to use AWS each time a new Wikipedia dump comes out to download it, process it, and update the dat. Between updates, someone still needs to seed the dat.)
We have some code that's supposed to account for that specific situation, but it may not trigger immediately after the upgrade. It's a job (see here) that runs every 5 minutes; right here it reconfigures our swarm behavior based on your quota usage. Unfortunately the admin mode doesn't show upload percentage for other users (todo), so I can't check whether it's kicked in yet. Can you check again and let me know? If it still hasn't, I'll restart the server and see if that solves it (and then we'll know we have a bug).

Regarding the 60GB: we could probably make that happen, but could it wait a couple of months? Tara and I are heads down getting Beaker 0.8 in good shape, so we're a bit resource-starved for Hashbase tasks.
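For readers following along, a periodic quota-check job like the one described could be sketched roughly as below. All names, thresholds, and data shapes here are hypothetical illustrations, not Hashbase's actual code:

```javascript
// Sketch of a periodic quota-check job (hypothetical names and shapes,
// not Hashbase's actual implementation).

// Decide whether to keep accepting uploads for a user,
// based on how much of their quota is used.
function computeSwarmBehavior (bytesUsed, quotaBytes) {
  if (bytesUsed >= quotaBytes) {
    return 'paused' // over quota: stop accepting new data
  }
  return 'active'   // under quota: seed and accept uploads
}

// Run the check on an interval, e.g. every 5 minutes.
function startQuotaJob (getUsers, applyBehavior, intervalMs = 5 * 60 * 1000) {
  return setInterval(async () => {
    for (const user of await getUsers()) {
      applyBehavior(user, computeSwarmBehavior(user.bytesUsed, user.quotaBytes))
    }
  }, intervalMs)
}
```

An interval-based job like this explains the reported symptom: after an upgrade, the new quota only takes effect on the next tick, so an upload can stay paused for up to five minutes.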
running
The 99% bug is something else -- the progress calculation includes all of history, and some dat clients (like the CLI) don't save history, so the upload can never hit 100%. This is especially obnoxious because Hashbase doesn't read the metadata until it hits 100%, so your archive stays untitled. I dedicated two afternoons to fixing the progress calculation to exclude history, and found it extremely difficult to do efficiently. (The entire process collapsed under load.) I didn't have that problem testing on dev, so it may have been some kind of edge case that was eating cycles, but I ran out of time to spend on it.
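Conceptually, a history-excluding progress calculation would only count the blocks referenced by the latest version of the archive's files, ignoring historical blocks entirely. A minimal sketch, assuming simplified data shapes (plain arrays and Sets, not the actual hypercore API):

```javascript
// Sketch of a progress calculation that excludes history
// (hypothetical data shapes, not the real hypercore/hyperdrive API).

// `latestBlocks`: indexes of blocks referenced by the current
// version of the archive's files.
// `downloaded`: a Set of block indexes the client has locally.
function currentProgress (latestBlocks, downloaded) {
  if (latestBlocks.length === 0) return 1
  let have = 0
  for (const block of latestBlocks) {
    if (downloaded.has(block)) have++
  }
  return have / latestBlocks.length
}

// A client that skipped history can still reach 100%:
// blocks 0-1 are historical and were never downloaded.
currentProgress([2, 3, 4], new Set([2, 3, 4])) // → 1
```

The naive per-block scan is O(n) per archive on every check, which hints at why doing this efficiently at scale is exactly the part that proved difficult.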
a bit of an edge case... i uploaded my dat for dat://dcpos.ch
this filled up the rest of my 100MB, so the upload progress stopped. i upgraded to Pro
however, the upload progress is still stuck:
dat archive in question: dat://36abda5bff5a816b4e69f45e41c6778ebb3c34e45bc6a5e5758e2f047559884b
alternate URL for the same dat: dat://dcpos.ch
alternate URL for the same dat (incomplete upload): dat://index-dcposch.hashbase.io
archive size: ~97MB