add docstring example for compute_loss_func (#35020)
secrettoad authored Dec 2, 2024
1 parent 3129967 commit f0dec87
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions src/transformers/trainer.py
@@ -360,8 +360,7 @@ class Trainer:
             inner layers, dropout probabilities etc).
         compute_loss_func (`Callable`, *optional*):
             A function that accepts the raw model outputs, labels, and the number of items in the entire accumulated
-            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, here is one using
-            the loss function from `transformers`
+            batch (batch_size * gradient_accumulation_steps) and returns the loss. For example, see the default [loss function](https://github.com/huggingface/transformers/blob/052e652d6d53c2b26ffde87e039b723949a53493/src/transformers/trainer.py#L3618) used by [`Trainer`].
         compute_metrics (`Callable[[EvalPrediction], Dict]`, *optional*):
             The function that will be used to compute metrics at evaluation. Must take a [`EvalPrediction`] and return
             a dictionary string to metric values. *Note* When passing TrainingArgs with `batch_eval_metrics` set to
