Replies: 3 comments 1 reply
-
Hi, right now I purposely abstract that away because I have to pass some case-specific args to the tokenizers, so allowing custom args could result in explanations not working. Can I ask what you are trying to achieve?
-
Hi Charles,
I tried the following on the tokenizer:
I'm using BERT with a binary classification model I have trained, and Transformers Interpret to check the reasoning behind the prediction. Any suggestions?
-
Here is what we ended up doing when running into the same error:

```python
from functools import partial

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
```

There may be a better way, but this worked for now.
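The rest of the snippet above appears to have been cut off, so the actual use of `partial` is missing. The general pattern it suggests (an assumption on my part, not the poster's exact code) is to pre-bind the keyword arguments you can't pass through the library, so that later calls made without those arguments still get them. A self-contained sketch of that pattern, using a stand-in `encode` function rather than a real tokenizer:

```python
from functools import partial

# Stand-in for a tokenizer method that accepts kwargs the calling
# library never exposes (hypothetical, for illustration only).
def encode(text, truncation=False, max_length=None):
    tokens = text.split()
    if truncation and max_length is not None:
        tokens = tokens[:max_length]
    return tokens

# Pre-bind the kwargs once; code that later calls encode(text) with
# no extra arguments now gets truncation applied for free.
encode = partial(encode, truncation=True, max_length=5)

print(encode("one two three four five six seven"))
# → ['one', 'two', 'three', 'four', 'five']
```

Against a real Hugging Face tokenizer the same idea would look like rebinding its encoding method, e.g. `tokenizer.encode = partial(tokenizer.encode, truncation=True)` — adjust the method and kwargs to your setup, since I can't confirm which call the poster wrapped.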
-
Hello! I can't figure out how to pass any args to BertTokenizer. Is it true that there is no way to set truncation, special tokens, etc.?