Download (for video segments) randomly stops and does not resume/retry #213
I've noticed this happening a lot lately.
Video segment download will stop at some point and never seem to recover.
Restarting it usually fixes it.

Comments
Can you please use the
I've tried
That's definitely odd, to say the least, but I'm not sure I can do anything about it. If it were a deadlock, at least I would know there's an issue in the code. YouTube causing a permanent 403 on a fragment for that specific attempt is not something I can control.
I assume the reason it works fine when I create a new task is that it will have entirely different auth info / query parameters / session, etc. I was thinking maybe we could do something similar within the same process if it keeps failing for too long? No idea how difficult that would be, though; just an idea.
I'm not sure it would, actually. The main factor that would change anything would be creating a new HTTP client, I think, since all other info used is constant. There is no different auth info or query params in a newly started instance. Thinking about it, maybe re-loading the cookies from file and replacing the cookie jar might do something. But it also might not. I'll consider it, though.
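For what it's worth, here is a minimal sketch of what re-loading cookies and swapping the jar could look like, assuming a Go net/http client and a Netscape-format cookies.txt file; the package and function names are hypothetical illustrations, not the project's actual code:

```go
package segments // hypothetical package name

import (
	"bufio"
	"net/http"
	"net/http/cookiejar"
	"net/url"
	"os"
	"strings"
)

// loadCookieFile parses a Netscape-format cookies.txt into http.Cookie values.
// Only the fields needed for request headers are kept.
func loadCookieFile(path string) ([]*http.Cookie, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var cookies []*http.Cookie
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue // skip comments and blank lines
		}
		fields := strings.Split(line, "\t")
		if len(fields) < 7 {
			continue // not a valid cookie line
		}
		cookies = append(cookies, &http.Cookie{
			Domain: fields[0],
			Path:   fields[2],
			Name:   fields[5],
			Value:  fields[6],
		})
	}
	return cookies, sc.Err()
}

// refreshCookieJar builds a brand new jar from the cookie file and replaces
// the client's jar, so later segment requests carry the re-loaded cookies.
func refreshCookieJar(client *http.Client, cookieFile string, site *url.URL) error {
	cookies, err := loadCookieFile(cookieFile)
	if err != nil {
		return err
	}
	jar, err := cookiejar.New(nil)
	if err != nil {
		return err
	}
	jar.SetCookies(site, cookies)
	client.Jar = jar
	return nil
}
```

Whether that actually changes YouTube's behavior on a stuck fragment is another question, but it is cheap to try.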
Thanks. For what it's worth, I usually download anonymously.
Another idea: we could probably add some exponential backoff on these segment retries.
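To sketch the idea (hypothetical names, not the actual retry code), exponential backoff on a per-segment fetch might look like this in Go:

```go
package segments // hypothetical package name

import (
	"fmt"
	"time"
)

// downloadWithBackoff retries a single segment fetch with exponentially
// growing waits (1s, 2s, 4s, ...) capped at maxDelay. fetch is whatever
// function actually performs the HTTP request for one segment.
func downloadWithBackoff(fetch func() error, maxAttempts int, maxDelay time.Duration) error {
	delay := time.Second
	var lastErr error
	for attempt := 1; attempt <= maxAttempts; attempt++ {
		if lastErr = fetch(); lastErr == nil {
			return nil
		}
		time.Sleep(delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	return fmt.Errorf("segment failed after %d attempts: %w", maxAttempts, lastErr)
}
```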
Nah, it already waits something like 15 seconds minimum. It doesn't grab new URLs unless you see the "Retrieving URLS..." message.
It did say "attempting to retrieve a new download URL" every second in the screenshot I posted above. Does this not count?
Nope, since the actual function that does the retrieval checks how long it has been since it last grabbed them.
Then I'm confused. It says it retried 10 times from 18:30:47 to 18:30:58. If it's not making new HTTP requests, what exactly was it retrying? I assume you mean the actual function responsible for making the HTTP request won't do it if the attempt is too close to the previous one, i.e. it doesn't actually retry 10 times, it's just a misleading log message?
Nothing. It's just a debug message that fires whenever it thinks it should try, before it calls the actual function that tries. That's why it appears even if it doesn't end up making a full attempt.
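In other words, something along these lines (a rough sketch with hypothetical names, not the project's code): the debug message is printed by the caller, while the refresh function itself bails out unless enough time has passed since the last real fetch.

```go
package segments // hypothetical package name

import (
	"sync"
	"time"
)

// urlRefresher rate-limits how often fresh download URLs are actually
// requested, regardless of how often callers ask for a refresh.
type urlRefresher struct {
	mu        sync.Mutex
	lastFetch time.Time
	minGap    time.Duration // e.g. 15 * time.Second
}

// maybeRefresh returns true only when it actually fetched new URLs. Calls
// made too soon after the previous fetch return false without touching the
// network, which is why a "retrying" log line can fire far more often than
// real attempts happen.
func (r *urlRefresher) maybeRefresh(fetch func() error) (bool, error) {
	r.mu.Lock()
	defer r.mu.Unlock()

	if time.Since(r.lastFetch) < r.minGap {
		return false, nil // too soon since the last grab; skip this attempt
	}
	if err := fetch(); err != nil {
		return false, err
	}
	r.lastFetch = time.Now()
	return true, nil
}
```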