default to "auto" dtype
ArthurZucker committed Nov 25, 2024
1 parent 11cc229 commit b606208
Showing 1 changed file with 2 additions and 2 deletions.
src/transformers/modeling_utils.py (2 additions, 2 deletions)

@@ -3273,7 +3273,7 @@ def from_pretrained(
                 `device_map`) is redundant and will not provide any benefit in regards to CPU memory usage. However,
                 this should still be enabled if you are passing in a `device_map`.
                 </Tip>
-            torch_dtype (`str` or `torch.dtype`, *optional*):
+            torch_dtype (`str` or `torch.dtype`, *optional*, defaults to `"auto"`):
                 Override the default `torch.dtype` and load the model under a specific `dtype`. The different options
                 are:

@@ -3407,7 +3407,7 @@ def from_pretrained(
         from_pipeline = kwargs.pop("_from_pipeline", None)
         from_auto_class = kwargs.pop("_from_auto", False)
         _fast_init = kwargs.pop("_fast_init", True)
-        torch_dtype = kwargs.pop("torch_dtype", None)
+        torch_dtype = kwargs.pop("torch_dtype", "auto")
         low_cpu_mem_usage = kwargs.pop("low_cpu_mem_usage", None)
         device_map = kwargs.pop("device_map", None)
         max_memory = kwargs.pop("max_memory", None)
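With this change, `from_pretrained` falls back to `torch_dtype="auto"` when the caller does not pass a dtype, so weights are loaded in the dtype recorded in the checkpoint's config (or inferred from the saved weights) instead of being instantiated in `torch.float32`. A minimal sketch of the resulting behavior; the model id below is illustrative, and the printed dtype depends on what that checkpoint actually stores:

    import torch
    from transformers import AutoModelForCausalLM

    # With the new default, omitting torch_dtype behaves like torch_dtype="auto":
    # the dtype stored in the checkpoint's config (or inferred from the weights)
    # is used instead of torch.float32.
    model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
    print(model.dtype)  # e.g. torch.bfloat16 if that is what the checkpoint stores

    # The previous behavior can still be requested explicitly:
    model_fp32 = AutoModelForCausalLM.from_pretrained(
        "mistralai/Mistral-7B-v0.1",
        torch_dtype=torch.float32,
    )
    print(model_fp32.dtype)  # torch.float32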
