
Commit

tested
huseinzol05 committed Jan 2, 2025
1 parent ee40705 commit cc0cbca
Showing 3 changed files with 516 additions and 0 deletions.
24 changes: 24 additions & 0 deletions session/small-malaysian-reasoning/3b.sh
@@ -0,0 +1,24 @@
WANDB_PROJECT="lora-embedding-256-llama3.2-3b-small-malaysian-reasoning" \
CUDA_VISIBLE_DEVICES="2" \
TORCH_DISTRIBUTED_DEBUG="info" \
torchrun --nproc_per_node 1 \
-m train \
--model_name_or_path unsloth/Llama-3.2-3B-Instruct \
--per_device_train_batch_size 2 \
--gradient_accumulation_steps 6 \
--output_dir lora-embedding-256-llama3.2-3b-small-malaysian-reasoning \
--bf16 --do_train --do_eval false --num_train_epochs 5 \
--train_file packing-4k \
--logging_steps 1 \
--learning_rate 2e-5 \
--weight_decay 0.01 \
--block_size 24576 \
--save_steps 100 \
--save_total_limit 3 \
--gradient_checkpointing true \
--neftune_noise_alpha 5.0 \
--torch_dtype bfloat16 \
--rank 256 \
--ddp_find_unused_parameters false \
--dataloader_num_workers 3 \
--dataloader_prefetch_factor 4
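
A quick sanity check on the config above (a back-of-envelope sketch, reading values straight off the flags; nothing here is measured): with a single process, each optimizer step sees `per_device_train_batch_size × gradient_accumulation_steps` packed sequences.

```bash
# Effective batch per optimizer step, from the flags above:
#   2 (per_device_train_batch_size) x 6 (gradient_accumulation_steps) x 1 (nproc_per_node)
echo $(( 2 * 6 * 1 ))    # 12 packed sequences per optimizer step

# Each packed sample holds at most --block_size tokens, so per step:
echo $(( 12 * 24576 ))   # 294912 tokens upper bound (assumes fully packed samples)
```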
10 changes: 10 additions & 0 deletions session/small-malaysian-reasoning/README.md
@@ -0,0 +1,10 @@
# Small Malaysian Reasoning

## how to

1. Install the necessary libraries:

```bash
pip3 install transformers==4.47.0 accelerate==1.1.1
pip3 install git+https://github.com/mesolitica/ml-cross-entropy-lora-embedding
```
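
With the libraries installed, the `3b.sh` script added in this commit launches the run. A minimal sketch of how to start it, assuming the `train` module and the `packing-4k` train file sit in the same folder (as the script's `-m train` and `--train_file` flags imply):

```bash
cd session/small-malaysian-reasoning
bash 3b.sh
```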
[The diff for the third changed file (482 additions) did not load.]
