Text summarization task #1079

Just adding to @jbischof's answer here: if you are looking for summarisation, you can alternatively use BART, an encoder-decoder model (instead of GPT-2). We do have a BART preset available for summarisation, fine-tuned on the CNN + Daily Mail dataset. You can use it like this:

import keras_nlp

model = keras_nlp.models.BartSeq2SeqLM.from_preset("bart_large_en_cnn")
model.compile(sampler="greedy")  # the sampler is set via the `sampler` keyword
model.generate("your text goes here", max_length=100)

I'm not sure what the quality of the summaries will be; it depends on your dataset and whether it lies in the same domain as CNN + Daily Mail. If the generated summaries aren't good, you can always fine-tune the model.
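One practical note on `max_length`: it caps the generated output, but BART also has a fixed encoder context, so very long inputs get truncated. A minimal, hypothetical pre-processing helper (plain Python, not part of keras_nlp) that splits a long document into word-based chunks you could then pass to `model.generate()` one at a time:

```python
def chunk_text(text, max_words=400):
    """Split `text` into chunks of at most `max_words` whitespace-separated
    words, so each chunk fits comfortably in the model's encoder context.
    The 400-word default is an assumption; tune it for your model/preset."""
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]
```

You would then summarise each chunk separately (e.g. `[model.generate(c, max_length=100) for c in chunk_text(doc)]`) and, if needed, concatenate or re-summarise the partial summaries.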

Answer selected by mattdangerw