Releases: luozhouyang/transformers-keras
Release v0.4.1
Updates:
- Rename argument of `from_pretrained` from `model_params` to `override_params`
- Add `BertForSentenceEmbedding` model for embedding extraction
- Add `BertForTokenClassification` and `AlbertForTokenClassification` models for token classification tasks
- Add Conditional Random Field based models `BertCRFForTokenClassification` and `AlbertCRFForTokenClassification` for token classification tasks
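The renamed `override_params` argument lets callers override individual entries of the pretrained configuration. The following is a minimal sketch of that merging idea only — a standalone toy function, not the library's actual `from_pretrained` implementation; the parameter names inside the dictionaries are illustrative.

```python
# Toy sketch of the `override_params` idea: user-supplied overrides are
# merged on top of the defaults loaded from a pretrained checkpoint.
# This is NOT transformers-keras source code; names are illustrative.
def load_config(default_params, override_params=None):
    params = dict(default_params)      # start from the pretrained defaults
    params.update(override_params or {})  # user overrides win
    return params

config = load_config(
    {"hidden_size": 768, "num_layers": 12},
    override_params={"num_layers": 6},
)
```

Here `num_layers` is overridden while `hidden_size` keeps its pretrained default.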
Release v0.4.0
Updates:
- Simplify pretrained model loading
- Add downstream models, like `BertForQuestionAnswering` and `BertForSequenceClassification`
Release v0.3.1
Updates:
- Convert arguments of `Bert.call` to `List`
Release v0.3.0
Changes:
- Refactor modeling to simplify adapting pretrained weights
- Add new signature to export model in SavedModel format
Release v0.2.6
Updates:
- Pass arguments as a tuple in `call` when subclassing `keras.Model`, which makes it simpler to export the model in SavedModel format
Release v0.2.5
Updates:
- Convert `hidden_states` and `attention_weights` of each encoder layer from `List` to `Tensor`
- Update `adapters` to allow arbitrary model names when loading pretrained checkpoints
Release v0.2.4
Updates:
- Expand arguments of the `call` method instead of passing them as a tuple
Release v0.2.3
Updates:
- Add support for skipping loading weights from a checkpoint
- Add tokenizers
Release v0.2.2
Updates:
- Optionally return encoder states and attention weights; both options default to `False`
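The pattern behind this change can be sketched with a toy forward function that grows its return value based on opt-in flags, both off by default. This is an illustration of the pattern only — the function and flag names here are assumptions, not the library's real signatures.

```python
# Toy sketch of opt-in extra outputs, both disabled by default.
# NOT transformers-keras code; names are illustrative.
def encode(inputs, return_states=False, return_attention_weights=False):
    outputs = sum(inputs)            # stand-in for the real forward pass
    extras = []
    if return_states:
        extras.append([outputs])     # stand-in for per-layer hidden states
    if return_attention_weights:
        extras.append([[1.0]])       # stand-in for attention weights
    # Plain output by default; a tuple only when extras were requested.
    return (outputs, *extras) if extras else outputs
```

Keeping both flags `False` by default preserves the simple single-output return for existing callers, while opting in changes the return value to a tuple.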
Release v0.2.1
Updates:
- Refactor the adapter for more flexible use