Greek-Transformer-Model-Punctuation-Prediction

The paper for this model is available at:

This repository stores starter code for the Greek Transformer Punctuation Prediction Model, available on Hugging Face at: https://huggingface.co/Andrianos/bert-base-greek-punctuation-prediction-finetuned

Within this repository you can find a Python notebook that initializes the model and runs example predictions; a minimal usage sketch is also shown below. Feel free to suggest new functionality for the demo file!
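
As a quick orientation before opening the notebook, the snippet below is a minimal sketch of loading the checkpoint directly with the Hugging Face transformers library. It assumes the model is a token-classification model (the usual setup for punctuation prediction); the label names printed come from the model's own config, and the example sentence is only illustrative. The notebook in this repository remains the reference demo.

```python
# Minimal sketch (assumption: the checkpoint is a token-classification model).
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_name = "Andrianos/bert-base-greek-punctuation-prediction-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# A token-classification pipeline tags each (sub)word with a punctuation label
# taken from the model's config.
punct_tagger = pipeline("token-classification", model=model, tokenizer=tokenizer)

# Illustrative unpunctuated Greek input ("good morning how are you").
text = "καλημέρα τι κάνεις"
for prediction in punct_tagger(text):
    print(prediction["word"], prediction["entity"], round(prediction["score"], 3))
```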