# Fix parameter (facebookresearch#3045)
Summary:
# Before submitting

- [ ] Was this discussed/approved via a GitHub issue? (not required for typos or doc improvements)
- [x] Did you read the [contributor guideline](https://github.com/pytorch/fairseq/blob/master/CONTRIBUTING.md)?
- [x] Did you make sure to update the docs?
- [x] Did you write any new necessary tests?

## What does this PR do?

`src_lengths` is not actually required by `TransformerEncoder.forward`; it is an unused dummy argument, so this PR gives it a default value of `None`.

Further changes may be needed to fix the same issue in classes such as `Transformer`, `FairseqEncoderDecoderModel`, `BARTModel`, etc.
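
A minimal before/after sketch of an affected call site (assuming `encoder` is an already-constructed `TransformerEncoder`; setup is omitted and the token IDs are illustrative):

```python
import torch

# Assumes `encoder` is an already-constructed TransformerEncoder instance.
src_tokens = torch.tensor([[4, 5, 6, 2]])  # one tokenized sentence (illustrative IDs)

# Before this patch, callers had to pass src_lengths even though the
# encoder derives its padding mask from src_tokens and never reads it:
out = encoder(src_tokens, src_lengths=torch.tensor([4]))

# After this patch, src_lengths defaults to None and can simply be omitted:
out = encoder(src_tokens)
```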

## Did you have fun?
Make sure you had fun coding 🙃

Pull Request resolved: facebookresearch#3045

Reviewed By: ngoyal2707

Differential Revision: D25632992

Pulled By: myleott

fbshipit-source-id: 762d595144b611e1a6c236248d7001049afed0ab
xu-song authored and facebook-github-bot committed Dec 18, 2020
Parent: edc321e · Commit: a041e1a
1 changed file: fairseq/models/transformer.py (2 additions, 2 deletions)
```diff
@@ -402,7 +402,7 @@ def forward_embedding(
     def forward(
         self,
         src_tokens,
-        src_lengths,
+        src_lengths: Optional[torch.Tensor] = None,
         return_all_hiddens: bool = False,
         token_embeddings: Optional[torch.Tensor] = None,
     ):
@@ -418,7 +418,7 @@ def forward(
                 default `None` will recompute embeddings
 
         Returns:
-            namedtuple:
+            dict:
                 - **encoder_out** (Tensor): the last encoder layer's output of
                   shape `(src_len, batch, embed_dim)`
                 - **encoder_padding_mask** (ByteTensor): the positions of
```
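
The second hunk corrects the documented return type from `namedtuple` to `dict`. A minimal sketch of consuming that dict, based only on the keys named in the docstring above (the exact container layout may differ across fairseq versions):

```python
# Keys taken from the docstring in the diff above; treat the exact
# shapes/containers as an assumption, not a guarantee.
encoder_out = encoder(src_tokens)

# Per the docstring: last layer output of shape (src_len, batch, embed_dim);
# some fairseq versions wrap these values in lists.
hidden = encoder_out["encoder_out"]
padding = encoder_out["encoder_padding_mask"]  # positions of padding elements
```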
