selfattention block: Remove the fc linear layer if it is not used
Signed-off-by: John Zielke <[email protected]>
johnzielke committed Feb 4, 2025
1 parent 8dcb9dc commit 892edc6
Showing 1 changed file with 5 additions and 1 deletion.
monai/networks/blocks/selfattention.py (5 additions, 1 deletion)
@@ -106,7 +106,11 @@ def __init__(
 
         self.num_heads = num_heads
         self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size
-        self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        self.out_proj: nn.Module
+        if include_fc:
+            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        else:
+            self.out_proj = nn.Identity()
 
         self.qkv: Union[nn.Linear, nn.Identity]
         self.to_q: Union[nn.Linear, nn.Identity]
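
A minimal sketch of the effect of this change, assuming the SABlock constructor accepts the include_fc flag referenced in the patched file and leaving its other arguments at their defaults; this is an illustration of the patched behaviour, not part of the commit itself:

import torch
from monai.networks.blocks.selfattention import SABlock

# Default behaviour: the output projection is a Linear layer.
block_with_fc = SABlock(hidden_size=128, num_heads=4, include_fc=True)
print(isinstance(block_with_fc.out_proj, torch.nn.Linear))    # True

# With include_fc=False, out_proj is now an Identity, so the block no
# longer registers (or checkpoints) a Linear layer it never uses.
block_no_fc = SABlock(hidden_size=128, num_heads=4, include_fc=False)
print(isinstance(block_no_fc.out_proj, torch.nn.Identity))    # True

# The forward pass is unaffected; with the default dim_head the output
# keeps the input shape (batch, sequence, hidden_size).
x = torch.randn(2, 16, 128)
print(block_no_fc(x).shape)    # torch.Size([2, 16, 128])

Before this fix the Linear layer was created unconditionally, so a block built with include_fc=False still allocated weights it never applied; switching the unused projection to nn.Identity drops those dead parameters while leaving the forward path unchanged.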
