remove float16 restriction on bnb plugin #46
Signed-off-by: Yu Chin Fabian Lim <[email protected]>
fabianlim committed Jul 2, 2024
1 parent abc06e8 commit b485807
Showing 1 changed file with 0 additions and 3 deletions.
@@ -173,9 +173,6 @@ def augmentation(
 
         # some assertions
         assert peft_config is not None, "need peft_config to install PEFT adapters"
-        assert (
-            model.dtype == torch.float16 or train_args.fp16
-        ), "need to run in fp16 mixed precision or load model in fp16"
 
         # requires a custom prepare because the stock one in peft will introduce
         # extraneous casting
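For context, the deleted assertion was the only thing forcing fp16 here: it rejected any model that was not loaded in torch.float16 and not running under fp16 mixed precision. The sketch below reproduces that guard in isolation to show which configurations it used to block; the helper name check_fp16_only and the TrainArgs stand-in are illustrative assumptions, not part of this repository.

import torch
from dataclasses import dataclass

@dataclass
class TrainArgs:
    # stand-in for the fp16/bf16 flags on transformers.TrainingArguments
    fp16: bool = False
    bf16: bool = False

def check_fp16_only(model_dtype: torch.dtype, train_args: TrainArgs) -> None:
    # the exact assertion removed by this commit: anything other than fp16
    # weights or fp16 mixed precision failed here
    assert (
        model_dtype == torch.float16 or train_args.fp16
    ), "need to run in fp16 mixed precision or load model in fp16"

# before this commit, a bfloat16 setup would have been rejected:
try:
    check_fp16_only(torch.bfloat16, TrainArgs(bf16=True))
except AssertionError as err:
    print("previously rejected:", err)

# after this commit, augmentation() performs no dtype check at this point,
# so bf16 (or other precision) configurations are no longer blocked here.

With the guard gone, the dtype used to load the quantized model is left to the caller and to downstream code; this sketch only illustrates the behavior change, not the plugin's full augmentation() flow.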
