fix typo in GradScaler docstring (pytorch#42236)
Summary:
Closes pytorch#42226.

Pull Request resolved: pytorch#42236

Reviewed By: albanD

Differential Revision: D22817980

Pulled By: ngimel

fbshipit-source-id: 4326fe028dba1dbeed454edc4e4d4fffa56f51d6
mcarilli authored and facebook-github-bot committed Jul 29, 2020
1 parent 79cfd85 commit 7cdf786
Showing 1 changed file with 1 addition and 1 deletion.
torch/cuda/amp/grad_scaler.py (1 addition, 1 deletion)
@@ -93,7 +93,7 @@ class GradScaler(object):
     Arguments:
         init_scale (float, optional, default=2.**16): Initial scale factor.
         growth_factor (float, optional, default=2.0): Factor by which the scale is multiplied during
-            :meth:`update` if no inf/NaN gradients occur for ``growth_factor`` consecutive iterations.
+            :meth:`update` if no inf/NaN gradients occur for ``growth_interval`` consecutive iterations.
         backoff_factor (float, optional, default=0.5): Factor by which the scale is multiplied during
             :meth:`update` if inf/NaN gradients occur in an iteration.
         growth_interval (int, optional, default=2000): Number of consecutive iterations without inf/NaN gradients
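
For context, the docstring being corrected describes the knobs of PyTorch's dynamic loss scaling. A minimal sketch of the usual GradScaler training loop these arguments govern (model, optimizer, loss_fn, and data_loader are assumed placeholders):

import torch

# Defaults shown explicitly; they match the docstring above.
scaler = torch.cuda.amp.GradScaler(
    init_scale=2.**16,     # starting loss scale
    growth_factor=2.0,     # scale *= 2.0 after growth_interval clean steps
    backoff_factor=0.5,    # scale *= 0.5 when inf/NaN gradients appear
    growth_interval=2000,  # clean iterations required before the scale grows
)

for inputs, targets in data_loader:
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = loss_fn(model(inputs), targets)
    scaler.scale(loss).backward()  # backward pass on the scaled loss
    scaler.step(optimizer)         # unscales grads; skips the step on inf/NaN
    scaler.update()                # applies the growth/backoff rules above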
