[Quant][Inductor] expand quantization conv-binary(-unary) pattern fusion inside inductor #138051
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/138051

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit d1ed386 with merge base e4ad028:

BROKEN TRUNK - The following job failed but was present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
Thanks for the PR. Please add the corresponding UT for these patterns.
Thanks for your suggestion; I have added the corresponding UT for these patterns.
Looks good to me. I think we should further enhance `_qconv2d_add_cpu_test_helper2` by also swapping the inputs of add.
```python
x1 = self.conv1(x)
tmp = self.add_fn(x1, x2)
if self.use_relu:
    tmp = self.relu(tmp)
```
How about further adding a flag to swap the inputs of add?
Added, thanks for your advice!
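For context, here is a minimal sketch of what such a test module might look like. The `ConvAddReLU` class and its `swap_inputs` flag are illustrative assumptions, not the actual `_qconv2d_add_cpu_test_helper2` helper from the test suite:

```python
# Illustrative sketch (not the actual test helper): a module whose forward
# matches the conv-add(-relu) pattern, with a hypothetical swap_inputs flag
# to exercise both operand orders of add.
import torch
import torch.nn as nn

class ConvAddReLU(nn.Module):
    def __init__(self, add_fn=torch.add, use_relu=False, swap_inputs=False):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 6, kernel_size=3, padding=1)
        self.add_fn = add_fn
        self.use_relu = use_relu
        self.swap_inputs = swap_inputs
        self.relu = nn.ReLU()

    def forward(self, x, x2):
        x1 = self.conv1(x)
        # Swapping the operands covers both fusion patterns:
        # add(conv_out, extra) and add(extra, conv_out).
        if self.swap_inputs:
            tmp = self.add_fn(x2, x1)
        else:
            tmp = self.add_fn(x1, x2)
        if self.use_relu:
            tmp = self.relu(tmp)
        return tmp
```

With `swap_inputs=False` the forward matches Pattern 1 (`add(conv_out, extra)`); with `swap_inputs=True` it matches Pattern 2 (`add(extra, conv_out)`).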
Small nits on the code comment; otherwise LGTM.
```
        X
       / \
 Conv(X)  extra input
```
A bit confusing here: the "extra input" doesn't depend on "X", right? Why add a link here?
Thanks for your comments! Yes, you are right; it was my mistake, and I have fixed it.
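For reference, the corrected diagram (as it appears in Pattern 1 of the merged commit message below) no longer draws an edge from X to the extra input:

```
Conv(X)   extra input
       \   /
        Add
```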
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…ion inside inductor (#138051)

### Summary
Expand quantization conv-binary(-unary) pattern fusion inside inductor to support the following two patterns:

Pattern 1:
```
Conv(X)   extra input
       \   /
        Add
         |
   Optional(relu)
         |
         Y
```

Pattern 2:
```
extra input   Conv(X)
          \   /
           Add
            |
     Optional(relu)
            |
            Y
```

Pull Request resolved: #138051
Approved by: https://github.com/leslie-fang-intel, https://github.com/jansel, https://github.com/jgong5
Stack from ghstack (oldest at bottom):
Summary

Expand quantization conv-binary(-unary) pattern fusion inside inductor to support the following two patterns:

Pattern 1:
```
Conv(X)   extra input
       \   /
        Add
         |
   Optional(relu)
         |
         Y
```

Pattern 2:
```
extra input   Conv(X)
          \   /
           Add
            |
     Optional(relu)
            |
            Y
```
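For illustration, here is a minimal sketch (an assumption, not taken from the PR's code) of float modules whose forward graphs correspond to the two patterns; in the actual flow these are quantized via the PT2E quantization flow before Inductor lowers and fuses them:

```python
import torch
import torch.nn as nn

class Pattern1(nn.Module):
    """Y = relu(add(Conv(X), extra input)): conv output is the first add operand."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, kernel_size=3, padding=1)
        self.relu = nn.ReLU()  # the relu is optional in the fused pattern

    def forward(self, x, extra):
        return self.relu(self.conv(x) + extra)

class Pattern2(nn.Module):
    """Y = relu(add(extra input, Conv(X))): conv output is the second add operand."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 3, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x, extra):
        return self.relu(extra + self.conv(x))
```

Supporting both operand orders matters because `add` is commutative but the FX graph records a fixed argument order, so the fusion pass must match the conv output in either position.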
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @chauhang @aakhundov