
Commit 6ea25a7

Fix require_grad typo
Fix require_grad typos (should be requires_grad). Before the fix, the code raised no errors but did not do what it was supposed to do. Also see pytorch/benchmark#1771
1 parent d10dc4f commit 6ea25a7

File tree

1 file changed

+1
-1
lines changed


bert_pytorch/model/embedding/position.py

+1 -1

@@ -10,7 +10,7 @@ def __init__(self, d_model, max_len=512):

         # Compute the positional encodings once in log space.
         pe = torch.zeros(max_len, d_model).float()
-        pe.require_grad = False
+        pe.requires_grad = False

         position = torch.arange(0, max_len).float().unsqueeze(1)
         div_term = (torch.arange(0, d_model, 2).float() * -(math.log(10000.0) / d_model)).exp()
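Why the typo was silent: `torch.Tensor` instances accept arbitrary Python attribute assignment, so `pe.require_grad = False` simply creates a new attribute named `require_grad` and leaves the real autograd flag, `requires_grad`, untouched. A minimal sketch of the failure mode (tensor names here are illustrative, not from the patched file):

```python
import torch

t = torch.ones(3, requires_grad=True)

t.require_grad = False        # typo: attaches a plain attribute; autograd is unaffected
assert t.requires_grad        # gradients are still being tracked

t.requires_grad = False       # the real flag (assignable on leaf tensors)
assert not t.requires_grad    # now the tensor is actually excluded from autograd
```

In the patched file the intent is to keep the positional-encoding buffer out of the autograd graph, which only the correctly spelled `requires_grad` accomplishes.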
