I am running the repo on my local GPU (a Tesla T4). The other options generally take around 2 GB of GPU RAM, but the appearance option alone takes 11 GB, even though it uses the same loaded model. Can someone clarify this, or am I missing something? I used the same code as the Hugging Face Space app.
If it's the same image, memory usage should be comparable for all tasks other than dewarping (which uses 256×256 resolution input). Could you please try running the inference.py script to see whether you get the same issue?
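Not the repo's own tooling, just a way to narrow this down: a minimal plain-PyTorch sketch for comparing peak GPU memory per task. `run_fn` here is a placeholder for whatever per-task call you make (e.g. the function you invoke in inference.py for "appearance" vs. the other options); the names are assumptions, not part of the repo's API.

```python
import torch

def report_peak_memory(task_name, run_fn, *args, **kwargs):
    """Run one inference call and print its peak allocated GPU memory.

    `run_fn` is a hypothetical stand-in for whatever entry point you
    call per task (e.g. in inference.py); pass the task's arguments through.
    """
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()
    with torch.no_grad():
        out = run_fn(*args, **kwargs)
    torch.cuda.synchronize()
    peak_gb = torch.cuda.max_memory_allocated() / 1024 ** 3
    print(f"{task_name}: peak GPU memory {peak_gb:.2f} GB")
    return out
```

One thing to keep in mind when comparing numbers: nvidia-smi reports the memory reserved by the CUDA context and PyTorch's caching allocator, which can be noticeably higher than the allocated peak above, so the 11 GB figure may partly be cached/reserved memory rather than what the appearance task actually needs.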
@ZZZHANG-jx