Fix require_grad typo
Fix require_grad typos (should be requires_grad).
Before the fix, the code didn't raise any errors, but it also didn't do what it was supposed to: assigning pe.require_grad silently creates a new Python attribute on the tensor instead of setting the requires_grad autograd flag.

Also see pytorch/benchmark#1771
kit1980 authored Jul 18, 2023
1 parent d10dc4f commit 6ea25a7
Showing 1 changed file with 1 addition and 1 deletion.
bert_pytorch/model/embedding/position.py (1 addition, 1 deletion)
@@ -10,7 +10,7 @@ def __init__(self, d_model, max_len=512):

         # Compute the positional encodings once in log space.
         pe = torch.zeros(max_len, d_model).float()
-        pe.require_grad = False
+        pe.requires_grad = False

         position = torch.arange(0, max_len).float().unsqueeze(1)
         div_term = (torch.arange(0, d_model, 2).float() * -(math.log(10000.0) / d_model)).exp()
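
For reference, a minimal standalone sketch (not part of this commit) of why the misspelling goes unnoticed: torch.Tensor accepts arbitrary attribute assignment, so the typo just attaches a new attribute to the tensor object instead of touching the autograd flag.

import torch

# A leaf tensor that starts out tracking gradients.
pe = torch.zeros(4, 8, requires_grad=True)

# Typo: this silently creates a new Python attribute named "require_grad"
# on the tensor object; the autograd flag is left unchanged.
pe.require_grad = False
print(pe.requires_grad)  # True -- gradients are still being tracked

# Correct spelling: this actually disables gradient tracking on the leaf tensor.
pe.requires_grad = False
print(pe.requires_grad)  # False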
