A PyTorch reimplementation of the ACL 2020 paper "A Novel Cascade Binary Tagging Framework for Relational Triple Extraction" (CasRel). The original code was written in Keras.
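The core idea of the cascade framework is: an encoder first tags subject start/end positions with binary taggers, then relation-specific binary taggers mark object start/end positions conditioned on each detected subject. Below is a minimal sketch of those tagging heads; a toy embedding stands in for BERT, and all layer names and sizes are illustrative, not taken from this repo's code.

```python
import torch
import torch.nn as nn

class CasRelSketch(nn.Module):
    """Sketch of the cascade binary tagging heads (illustrative only)."""

    def __init__(self, vocab_size=100, hidden=32, num_rels=4):
        super().__init__()
        self.encoder = nn.Embedding(vocab_size, hidden)  # stand-in for BERT
        # Subject tagger: one start score and one end score per token.
        self.sub_start = nn.Linear(hidden, 1)
        self.sub_end = nn.Linear(hidden, 1)
        # Object taggers: start/end scores per token, per relation.
        self.obj_start = nn.Linear(hidden, num_rels)
        self.obj_end = nn.Linear(hidden, num_rels)

    def forward(self, token_ids, sub_span):
        h = self.encoder(token_ids)                   # (B, L, H)
        sub_start = torch.sigmoid(self.sub_start(h))  # (B, L, 1)
        sub_end = torch.sigmoid(self.sub_end(h))
        # Condition on a detected subject by adding its averaged
        # representation to every token, as in the paper's framework.
        b_idx = torch.arange(h.size(0))
        sub_repr = (h[b_idx, sub_span[:, 0]] + h[b_idx, sub_span[:, 1]]) / 2
        h_cond = h + sub_repr.unsqueeze(1)
        obj_start = torch.sigmoid(self.obj_start(h_cond))  # (B, L, R)
        obj_end = torch.sigmoid(self.obj_end(h_cond))
        return sub_start, sub_end, obj_start, obj_end

model = CasRelSketch()
tokens = torch.randint(0, 100, (2, 10))    # batch of 2, sequence length 10
subjects = torch.tensor([[1, 2], [3, 5]])  # one (start, end) per sample
ss, se, os_, oe = model(tokens, subjects)
print(ss.shape, os_.shape)  # torch.Size([2, 10, 1]) torch.Size([2, 10, 4])
```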
Requirements
- keras-bert
- tensorflow-gpu
- transformers
Dataset
- CMED: the CHIP-2020 Chinese medical text entity and relation extraction task
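For reference, the original CasRel code reads JSON records with a `text` field and a `triple_list` of (subject, relation, object) triples; the record below is a hypothetical example in that layout, so check this repo's data loader for the exact schema it expects.

```python
import json

# Hypothetical sample in the original CasRel data layout; the field
# names "text" and "triple_list" are assumptions, not verified here.
sample = json.dumps(
    {
        "text": "Aspirin relieves headache.",
        "triple_list": [["Aspirin", "treats", "headache"]],
    },
    ensure_ascii=False,
)

record = json.loads(sample)
# De-duplicate triples as tuples for set-based evaluation later.
triples = {tuple(t) for t in record["triple_list"]}
print(len(triples))  # 1
```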
Get the pre-trained Chinese BERT model
- Download the vocab.txt of BERT-wwm
- Cache the pre-trained BERT weights by running:

```python
from transformers import BertModel

model = BertModel.from_pretrained("hfl/chinese-bert-wwm")
```
P.S. I use chinese-bert-wwm here; you can choose other pre-trained models in the same way.
P.P.S. The BERT cache is usually located at /home/<your user name>/.cache/torch/transformers.
Train the model
```shell
python train.py
```
Test the model
```shell
python test.py
```
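At test time, CasRel-style decoding thresholds the per-token start/end probabilities and pairs each start above the threshold with the nearest end at or after it. The sketch below shows this for one tagger; the 0.5 threshold is an assumption, so see test.py for the values this repo actually uses.

```python
def decode_spans(start_probs, end_probs, threshold=0.5):
    """Pair each start above threshold with the nearest end not
    before it (CasRel-style decoding; threshold is an assumption)."""
    spans = []
    ends = [i for i, p in enumerate(end_probs) if p > threshold]
    for s, p in enumerate(start_probs):
        if p <= threshold:
            continue
        for e in ends:
            if e >= s:  # nearest end at or after the start
                spans.append((s, e))
                break
    return spans

start = [0.1, 0.9, 0.2, 0.8, 0.1]
end = [0.1, 0.2, 0.7, 0.1, 0.6]
print(decode_spans(start, end))  # [(1, 2), (3, 4)]
```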
Results

| Model          | Test F1 |
| -------------- | ------- |
| CasRel-keras   | 45.59   |
| CasRel-pytorch | 47.59   |
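The F1 scores above are the standard micro-averaged scores over exact-match (subject, relation, object) triples. A minimal sketch of that metric, assuming exact string matching (check test.py for the repo's exact scoring):

```python
def triple_f1(pred, gold):
    """Micro precision/recall/F1 over sets of (subj, rel, obj) triples.
    A sketch assuming exact-match scoring."""
    pred, gold = set(pred), set(gold)
    correct = len(pred & gold)
    p = correct / len(pred) if pred else 0.0
    r = correct / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

pred = [("drugA", "treats", "flu"), ("drugA", "causes", "rash")]
gold = [("drugA", "treats", "flu"), ("drugB", "treats", "cold")]
print(triple_f1(pred, gold))  # (0.5, 0.5, 0.5)
```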