PhD student @ Hacettepe University
- TS Corpus
- WonderLand
- http://tanersezer.com
Pinned
- ts_tokenizer (Public): TS Tokenizer is a hybrid (lexicon-based and rule-based) tokenizer designed specifically for Turkish texts. Python · 2 stars
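As a rough illustration of the hybrid idea the description mentions (this is not ts_tokenizer's actual API; every name below is hypothetical), a lexicon pass can accept known Turkish tokens intact, with rule-based regex splitting as the fallback for everything else:

```python
import re

# Hypothetical mini-lexicon of tokens that should never be split.
# A real lexicon-based tokenizer would load a large Turkish word list.
LEXICON = {"Dr.", "İstanbul'da", "vb.", "T.C."}

# Rule-based fallback: keep apostrophe-suffixed words together,
# otherwise separate word characters from punctuation.
PUNCT_SPLIT = re.compile(r"\w+['\u2019]\w+|\w+|[^\w\s]")

def hybrid_tokenize(text: str) -> list[str]:
    """Lexicon lookup first; regex-based splitting as the fallback."""
    tokens = []
    for chunk in text.split():
        if chunk in LEXICON:
            tokens.append(chunk)                       # lexicon hit: keep intact
        else:
            tokens.extend(PUNCT_SPLIT.findall(chunk))  # rule-based fallback
    return tokens

print(hybrid_tokenize("Dr. Sezer İstanbul'da, değil mi?"))
# ['Dr.', 'Sezer', "İstanbul'da", ',', 'değil', 'mi', '?']
```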