Make sure --free_gpu_mem still works when using CKPT-based diffuser model #2367
This PR attempts to fix the `--free_gpu_mem` option, which stopped working for the CKPT-based diffuser model after #1583. I noticed that, after #1583, GPU memory usage did not decrease after generating an image even when `--free_gpu_mem` was enabled. It turns out the option was not being propagated into the `Generator` instance, so generation always ran without the memory-saving procedure (see the sketch below).
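A minimal sketch of the general pattern this fix restores, assuming a flag that is forwarded into the generator; the class and attribute names here (`SimpleGenerator`, `free_gpu_mem`) are illustrative, not InvokeAI's actual internals:

```python
# Illustrative only: shows a flag being propagated into a generator so the
# post-generation cleanup branch can actually run.
import torch


class SimpleGenerator:
    def __init__(self, model, free_gpu_mem: bool = False):
        self.model = model
        # The bug was essentially that this flag never reached the generator,
        # so the cleanup branch below never executed.
        self.free_gpu_mem = free_gpu_mem

    def generate(self, *args, **kwargs):
        result = self.model(*args, **kwargs)
        if self.free_gpu_mem and torch.cuda.is_available():
            # Memory-saving procedure: move the model back to CPU and release
            # cached allocations so the GPU memory is freed after generation.
            self.model.to("cpu")
            torch.cuda.empty_cache()
        return result
```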
This PR is also related to #2326. Initially, I was trying to make `--free_gpu_mem` work with the 🤗 diffusers model as well. In the process, I noticed that InvokeAI raises an exception when `--free_gpu_mem` is enabled. I quickly fixed it by ignoring the exception and producing a warning message in the user's console.
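A sketch of that workaround, assuming a best-effort cleanup helper; the function name and message are hypothetical, not the code actually changed in this PR:

```python
# Illustrative only: if the memory-saving step raises (e.g. the diffusers
# pipeline does not support the same cleanup path), warn and continue instead
# of crashing the generation.
import warnings

import torch


def maybe_free_gpu_mem(model):
    """Best-effort GPU memory release; emit a warning instead of raising."""
    try:
        model.to("cpu")
        torch.cuda.empty_cache()
    except Exception as exc:
        warnings.warn(f"--free_gpu_mem: could not release GPU memory ({exc})")
```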