Update TwoWayTransformer Docs. (#16161)

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
Co-authored-by: Ultralytics Assistant <135830346+UltralyticsAssistant@users.noreply.github.com>
Jason Guo 3 months ago committed by fcakyon
parent 67e1b4f38e
commit 8e94bb2bdc
      ultralytics/models/sam/modules/transformer.py

@@ -61,7 +61,6 @@ class TwoWayTransformer(nn.Module):
     Attributes:
         depth (int): Number of layers in the transformer.
         embedding_dim (int): Channel dimension for input embeddings.
-        embedding_dim (int): Channel dimension for input embeddings.
         num_heads (int): Number of heads for multihead attention.
         mlp_dim (int): Internal channel dimension for the MLP block.
         layers (nn.ModuleList): List of TwoWayAttentionBlock layers.
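After the change, the Attributes block lists `embedding_dim` exactly once. A minimal stub sketching the corrected docstring (the class name and attribute lines come from the hunk above; the real implementation lives in ultralytics/models/sam/modules/transformer.py):

```python
import inspect

# Hypothetical stand-in for the real class; only the docstring matters here.
class TwoWayTransformer:
    """
    Attributes:
        depth (int): Number of layers in the transformer.
        embedding_dim (int): Channel dimension for input embeddings.
        num_heads (int): Number of heads for multihead attention.
        mlp_dim (int): Internal channel dimension for the MLP block.
        layers (nn.ModuleList): List of TwoWayAttentionBlock layers.
    """

# Sanity check: the previously duplicated line now appears exactly once.
doc = inspect.getdoc(TwoWayTransformer)
print(doc.count("embedding_dim (int): Channel dimension for input embeddings."))
```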
