Merge pull request #25861 from Abdurrahheem:ash/torch-attention-export-fix-4x
Support for Unflatten operation required by Attention layer - 4.x #25861
### Pull Request Readiness Checklist
All test data and models for this PR are located in [#1190](https://github.com/opencv/opencv_extra/pull/1190).
This PR fixes an issue raised when importing a batched vanilla `Attention` layer from `PyTorch` via ONNX. Currently, the batched version of the `Attention` layer in PyTorch [has an unflatten operation inside](e3b3431c42/torch/nn/functional.py (L5500C17-L5500C31)). The `unflatten` operation causes an issue in the `reshape` layer (see `Reshape_2` in the graph below) due to incorrect output of the `slice` layer. This PR specifically fixes the `slice` and `concat` layers to handle the `unflatten` operation.
<img width="673" alt="image" src="https://github.com/opencv/opencv/assets/44877829/5b612b31-657a-47f1-83a4-0ac35a950abd">
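For reference, below is a minimal sketch of the kind of export that hits this code path, assuming the "vanilla attention" in question is `torch.nn.MultiheadAttention` traced with a batched, `batch_first` input; the module definition, shapes, and the `attention.onnx` file name are illustrative choices, not part of this PR.

```python
# Minimal repro sketch: export a batched PyTorch attention module to ONNX and
# load it with OpenCV DNN. Assumes torch and opencv-python are installed;
# shapes and file name are arbitrary.
import torch
import torch.nn as nn
import cv2

embed_dim, num_heads, batch, seq = 8, 2, 2, 4

class Attention(nn.Module):
    def __init__(self):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x):
        out, _ = self.attn(x, x, x)  # self-attention: query = key = value
        return out

model = Attention().eval()
x = torch.randn(batch, seq, embed_dim)  # batched input exercises the unflatten path
torch.onnx.export(model, (x,), "attention.onnx",
                  input_names=["x"], output_names=["y"])

# Before this fix, the importer could fail on the Reshape fed by Slice/Concat.
net = cv2.dnn.readNetFromONNX("attention.onnx")
net.setInput(x.numpy(), "x")
print(net.forward().shape)  # expected (2, 4, 8) == (batch, seq, embed_dim)
```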
See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request
- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on a code under GPL or another license that is incompatible with OpenCV
- [x] The PR is proposed to the proper branch
- [x] There is a reference to the original bug report and related work
- [x] There is accuracy test, performance test and test data in opencv_extra repository, if applicable
      Patch to opencv_extra has the same branch name.
- [x] The feature is well documented and sample code can be built with the project CMake