This repository has been archived by the owner on Jan 27, 2024. It is now read-only.
It would be nice to be able to specify a desired part size and have the downloader look up the file size and do the calculation "filesize / partsize = parts".
That would make it easier to download files of various sizes without having to change the number of parts each time.
I.e. if you download 100 MB it doesn't make sense to have 40 parts, but with a 50 GB file a larger number of parts does make sense.
Right now, with the cloudflarerscraper it takes so long to start up all the downloaders that before you're halfway through starting all 40, the first parts are already done. This is not optimal.
Maybe an optimal number exists, and we could just pass "auto" for the number of parts to have it calculated automatically, based on the observation that each downloader session usually tops out at around 299 KB/s.
Edit: After some calculations, I can see that it might be difficult to find an optimal part size automatically without knowing the available bandwidth. But being able to supply a part size would still be very useful. And maybe the number of parts could also be limited by setting a maximum on the command line.
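The requested calculation could be sketched roughly as follows. This is a hypothetical illustration, not code from the project: the function name, signature, and the optional `max_parts` cap are all assumptions based on the description above (ceiling division so the last partial part is counted, plus the command-line cap suggested in the edit).

```python
import math
from typing import Optional

def compute_parts(file_size: int, part_size: int,
                  max_parts: Optional[int] = None) -> int:
    """Number of parts so that each part is at most part_size bytes.

    Hypothetical helper: ceiling division covers the final partial part,
    and max_parts (if given) caps the count as suggested in the issue.
    """
    parts = math.ceil(file_size / part_size)
    if max_parts is not None:
        parts = min(parts, max_parts)
    return max(parts, 1)  # always download at least one part

# A 100 MB file with 10 MB parts needs only 10 parts, not 40.
print(compute_parts(100 * 1024**2, 10 * 1024**2))              # 10
# A 50 GB file with 10 MB parts would need 5120, capped here at 64.
print(compute_parts(50 * 1024**3, 10 * 1024**2, max_parts=64)) # 64
```

With a scheme like this, only the part size (and optionally a cap) would need to be configured once, and small and large files would both get a sensible part count automatically.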