Update TwoWayTransformer Docs. (#16161)

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
Co-authored-by: Ultralytics Assistant <135830346+UltralyticsAssistant@users.noreply.github.com>
Jason Guo committed via GitHub
parent 9850172707
commit 463ca1a804
1 changed file:
    ultralytics/models/sam/modules/transformer.py

@@ -61,7 +61,6 @@ class TwoWayTransformer(nn.Module):
     Attributes:
         depth (int): Number of layers in the transformer.
         embedding_dim (int): Channel dimension for input embeddings.
-        embedding_dim (int): Channel dimension for input embeddings.
         num_heads (int): Number of heads for multihead attention.
         mlp_dim (int): Internal channel dimension for the MLP block.
         layers (nn.ModuleList): List of TwoWayAttentionBlock layers.
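
The deleted line was an accidental duplicate of the embedding_dim entry in the class docstring. For context, a minimal sketch (not part of this commit) of how the documented attributes map onto the TwoWayTransformer constructor; the argument values are the defaults SAM typically uses and are shown for illustration only:

    # Sketch assuming the constructor signature in
    # ultralytics/models/sam/modules/transformer.py; values are illustrative.
    from ultralytics.models.sam.modules.transformer import TwoWayTransformer

    transformer = TwoWayTransformer(
        depth=2,            # number of TwoWayAttentionBlock layers in transformer.layers
        embedding_dim=256,  # channel dimension of the input embeddings
        num_heads=8,        # heads per multihead attention layer
        mlp_dim=2048,       # internal channel dimension of each MLP block
    )
    print(len(transformer.layers))  # 2: one TwoWayAttentionBlock per unit of depth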
