Merge pull request #246 from aws-samples/KeitaW-patch-1
Fix typo in 15.gpt-neox README
verdimrc authored Apr 11, 2024
2 parents a7676b6 + cbcae5d commit e71f3f8
Showing 1 changed file with 1 addition and 1 deletion: 3.test_cases/15.gpt-neox/README.md
@@ -1,6 +1,6 @@
# Pythia GPT-NeoX Test Case <!-- omit in toc -->

- GPT-NeoX is an [EleutherAI](https://www.eleuther.ai)'s library for training large-scale language models on GPUs. This framework is based on NVIDIA's Megatron Language Model](https://github.com/NVIDIA/Megatron-LM) and has been augmented with techniques from [DeepSpeed](https://www.deepspeed.ai as well as some novel optimizations. This test case illustrates how to train [Pythia](https://arxiv.org/abs/2304.01373) model using GPT-Neox.
+ GPT-NeoX is an [EleutherAI](https://www.eleuther.ai)'s library for training large-scale language models on GPUs. This framework is based on [NVIDIA's Megatron Language Model](https://github.com/NVIDIA/Megatron-LM) and has been augmented with techniques from [DeepSpeed](https://www.deepspeed.ai) as well as some novel optimizations. This test case illustrates how to train [Pythia](https://arxiv.org/abs/2304.01373) model using GPT-Neox.

## 1. Preparation

