Commit d3f9c83

Fixed positional encoding not used in Demo Transformer (#20099)
1 parent 29f70e7

File tree

src/lightning/pytorch/demos/transformer.py

Lines changed: 1 addition & 1 deletion
@@ -88,7 +88,7 @@ def forward(self, x: Tensor) -> Tensor:
         # TODO: Could make this a `nn.Parameter` with `requires_grad=False`
         self.pe = self._init_pos_encoding(device=x.device)
 
-        x + self.pe[: x.size(0), :]
+        x = x + self.pe[: x.size(0), :]
         return self.dropout(x)
 
     def _init_pos_encoding(self, device: torch.device) -> Tensor:
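
Before this change, the removed line evaluated `x + self.pe[: x.size(0), :]` but never assigned the result, so the positional encoding was silently discarded and `forward` returned only the dropout of the raw input. Below is a minimal sketch of the surrounding module for illustration: the class name, constructor, and the sinusoidal body of `_init_pos_encoding` are assumptions reconstructed from the hunk context, not copied from the repository.

import math
from typing import Optional

import torch
from torch import Tensor, nn


class PositionalEncoding(nn.Module):
    # Sketch only: everything outside `forward` and the `_init_pos_encoding`
    # signature is an assumption based on the diff context above.

    def __init__(self, dim: int, dropout: float = 0.1, max_len: int = 5000) -> None:
        super().__init__()
        self.dim = dim
        self.max_len = max_len
        self.dropout = nn.Dropout(p=dropout)
        self.pe: Optional[Tensor] = None

    def forward(self, x: Tensor) -> Tensor:
        if self.pe is None:
            # TODO: Could make this a `nn.Parameter` with `requires_grad=False`
            self.pe = self._init_pos_encoding(device=x.device)

        # The fix from #20099: before it, this expression was evaluated and
        # discarded, so `x` flowed through without positional information.
        x = x + self.pe[: x.size(0), :]
        return self.dropout(x)

    def _init_pos_encoding(self, device: torch.device) -> Tensor:
        # Standard sinusoidal encoding, shaped (max_len, 1, dim) so it
        # broadcasts over the batch dimension of seq-first inputs.
        pe = torch.zeros(self.max_len, self.dim, device=device)
        position = torch.arange(0, self.max_len, dtype=torch.float, device=device).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, self.dim, 2, dtype=torch.float, device=device) * (-math.log(10000.0) / self.dim)
        )
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        return pe.unsqueeze(1)


# Quick check: with the assignment in place, the output differs from the
# plain input, confirming the encoding is actually applied.
enc = PositionalEncoding(dim=16)
enc.eval()  # disable dropout for a deterministic comparison
x = torch.zeros(10, 2, 16)  # (seq_len, batch, dim)
assert not torch.equal(enc(x), x)

The in-diff TODO notes that `pe` could instead be stored as an `nn.Parameter` with `requires_grad=False`, which would let the tensor move with the module's `.to(device)` calls rather than being lazily rebuilt on the device of the first input.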
