PhoGPT: Generative Pre-training for Vietnamese (2023)
Updated Nov 12, 2024 - Python
An autoregressive language model like ChatGPT.
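Autoregressive generation means sampling one token at a time and feeding each sampled token back in as context. As a minimal, hedged sketch of that decoding loop (using a toy bigram table as a stand-in for a real Transformer's next-token distribution; all names here are illustrative):

```python
import random

random.seed(0)

# Toy bigram "model": next-token distribution conditioned on the last
# token only. A real GPT conditions on the whole context via stacked
# Transformer blocks; this table just makes the loop runnable.
BIGRAM = {
    "<s>": [("the", 0.6), ("a", 0.4)],
    "the": [("cat", 0.5), ("dog", 0.5)],
    "a":   [("cat", 0.5), ("dog", 0.5)],
    "cat": [("sat", 1.0)],
    "dog": [("sat", 1.0)],
    "sat": [("</s>", 1.0)],
}

def sample_next(token):
    """Sample the next token from the conditional distribution."""
    choices, weights = zip(*BIGRAM[token])
    return random.choices(choices, weights=weights, k=1)[0]

def generate(max_len=10):
    """Autoregressive decoding: append each sample to the context."""
    tokens = ["<s>"]
    while tokens[-1] != "</s>" and len(tokens) < max_len:
        tokens.append(sample_next(tokens[-1]))
    # Strip the start/end markers from the returned sequence.
    return tokens[1:-1] if tokens[-1] == "</s>" else tokens[1:]

print(" ".join(generate()))
```

Swapping the lookup table for a trained network's softmax output gives the same loop used by GPT-style models at inference time.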
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support Transformer-model education and the reverse engineering of GPT models from scratch.
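tiktoken is OpenAI's byte-pair-encoding (BPE) tokenizer. As a self-contained toy illustration of the BPE idea it implements (not tiktoken's actual API), this sketch repeatedly merges the most frequent adjacent symbol pair:

```python
from collections import Counter

def bpe_merges(word, num_merges):
    """Greedy BPE merges on a single word (toy version).

    Each round replaces the most frequent adjacent symbol pair with a
    merged symbol -- the core idea behind GPT-style tokenizers, which
    learn such merges from a large corpus rather than one word.
    """
    symbols = list(word)
    for _ in range(num_merges):
        pairs = Counter(zip(symbols, symbols[1:]))
        if not pairs:
            break
        (a, b), count = pairs.most_common(1)[0]
        if count < 2:
            break  # no pair repeats; nothing worth merging
        merged, out, i = a + b, [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(merged)  # apply the merge
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

print(bpe_merges("banana", 2))  # the pair ("a", "n") is merged first
```

Real tokenizers store the learned merge rules and apply them to arbitrary text, mapping each resulting symbol to an integer token id.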
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
A simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
An industrial project on NLP for finance applications.
An implementation of a GPT-2 variant.
PyTorch implementation of GPT from scratch
Generative Pre-trained Transformer (GPT-1)
I built a GPT model from scratch to generate text
Repository for all things Natural Language Processing
A Generatively Pretrained Transformer that generates Shakespeare-esque quotes
Repository for personal experiments