Releases: tunib-ai/parallelformers
v1.2.7
v1.2.4
Fix GPU over-allocation issue by adding deallocation before the forward pass
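The note above describes the fix only at a high level. The snippet below is a hypothetical illustration of the general pattern (releasing cached GPU memory before running a forward pass), not parallelformers' actual internal code; the function name and arguments are made up for the example.

```python
import torch

def forward_with_deallocation(model, inputs):
    # Hypothetical illustration only: free cached GPU memory first so the
    # forward pass does not over-allocate on top of stale buffers.
    torch.cuda.empty_cache()
    with torch.no_grad():  # inference-only, so gradients are not needed
        return model(**inputs)
```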
v1.2.3
Remove redundant operations
v1.2.2
v1.2
v1.1
v1.0.1
v1.0
- Parallelformers, which is based on Megatron-LM, is designed to make model parallelization easier.
- You can parallelize various models from HuggingFace Transformers across multiple GPUs with a single line of code (see the usage sketch below).
- Currently, Parallelformers supports inference only. Training features are NOT included.
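Below is a minimal usage sketch of the one-line parallelization described above, based on the library's `parallelize()` entry point. The model name, `num_gpus` value, and generation settings are illustrative assumptions, not prescribed defaults.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from parallelformers import parallelize

# Load any supported HuggingFace Transformers model on CPU first;
# parallelize() distributes it across the GPUs for you.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")  # example model
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-1.3B")

# The single line that splits the model across 2 GPUs for fp16 inference.
parallelize(model, num_gpus=2, fp16=True, verbose="detail")

# Inference-only: run generation as usual with the parallelized model.
inputs = tokenizer("Parallelformers is", return_tensors="pt")
outputs = model.generate(**inputs, max_length=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```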