[PIR][oneDNN] Add matmul_activation_fuse_pass #62901
Conversation
Your PR was submitted successfully. Thank you for your contribution to the open-source project!
Force-pushed from e26289f to ff633b9.
Hi @onecatcn, the CI for this PR has passed. Could you please ask the corresponding owner to review it? Thx~
return true;
});
pat.RequireNativeCall([&](const paddle::drr::MatchContext &match_ctx) {
  auto result_gelu = match_ctx.Attr<bool>("approximate");
Does the pattern not match when approximate is false?
I split the two Gelu cases into separate patterns. This is a newly added MatmulGeluTanhPattern, which represents the pattern that can be fused when approximate is true. When approximate is false, the case is handled together with MatmulActivationFusePattern and goes into the matmul + gelu_erf fusion.
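For context, a minimal sketch of the constraint described here, reusing the RequireNativeCall / MatchContext API visible in the quoted diff (the PR's actual code may differ):

// Sketch: constraint for MatmulGeluTanhPattern. Only match when the gelu
// op's "approximate" attribute is true; the approximate == false case is
// left to MatmulActivationFusePattern and fused as matmul + gelu_erf.
pat.RequireNativeCall([&](const paddle::drr::MatchContext &match_ctx) {
  return match_ctx.Attr<bool>("approximate");
});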
Force-pushed from 5776f37 to e4cd1f1.
pir::RewritePatternSet ps(context);
// std::vector<bool> bool_set = {false, true};
int benefit_idx = 1;
for (auto act_op : act_ops) {
Suggested change:
- for (auto act_op : act_ops) {
+ for (const auto& act_op : act_ops) {
Also, there may be an issue with how benefit_idx is set. From the code logic, the patterns added from top to bottom receive an increasing benefit_idx, and a pattern with a larger benefit_idx is expected to be applied with higher priority. Taking MatmulClipFusePattern and FusedMatmulClipFusePattern as an example, we expect MatmulClipFusePattern to be applied before FusedMatmulClipFusePattern, so MatmulClipFusePattern's benefit_idx should actually be the larger one.
As I understand it, this pass is currently not the first of the matmul-related passes, which means the chance of encountering fused_matmul afterwards is fairly high. In that case, wouldn't it be better to match fused_matmul with higher priority?
That said, if each individual pattern is matched independently across the whole graph, then the native matmul + act pattern is indeed likely to be matched more times.
The current approach is also acceptable; it is not a big issue.
OK, then I will leave it unchanged for now. If the pass order changes later, or if we do need to prioritize the non-fused_matmul pattern, I can update it at any time~
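To make the ordering concrete, here is a rough sketch of the registration being discussed; paddle::drr::Create and the op names come from the quoted diff, while the exact constructor argument list is an assumption:

pir::RewritePatternSet ps(context);
int benefit_idx = 1;

// Added first, so it receives the smaller benefit value.
ps.Add(paddle::drr::Create<MatmulClipFusePattern>(
    context,
    paddle::dialect::MatmulOp::name(),               // source op to match
    paddle::onednn::dialect::FusedMatmulOp::name(),  // fused op to emit
    benefit_idx++));

// Added later, so it gets a larger benefit value and is preferred by the
// rewrite driver when both patterns match, i.e. fused_matmul + clip is
// matched ahead of plain matmul + clip, which is the order agreed on above.
ps.Add(paddle::drr::Create<FusedMatmulClipFusePattern>(
    context,
    paddle::onednn::dialect::FusedMatmulOp::name(),  // source is already fused
    paddle::onednn::dialect::FusedMatmulOp::name(),
    benefit_idx++));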
paddle::dialect::MatmulOp::name(),
paddle::onednn::dialect::FusedMatmulOp::name(),
benefit_idx++));
for (auto act_op : act_ops) {
Suggested change:
- for (auto act_op : act_ops) {
+ for (const auto& act_op : act_ops) {
LGTM
Hi @wanghuancoder, @XiaoguangHu01, could you please help review/approve this PR? Thx~
PR types
New features
PR changes
Others
Description
Based on the new pass mechanism, this PR adds the pass "matmul_activation_fuse_pass" for PIR.
The new pass is the same as "matmul_activation_mkldnn_fuse_pass" in "/paddle/fluid/framework/ir/mkldnn/matmul_activation_mkldnn_fuse_pass.cc".
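For readers unfamiliar with the DRR-based pass mechanism, below is an illustrative sketch (not the PR's actual code) of the shape such a pattern takes: match a matmul whose output feeds an activation and replace the pair with a single oneDNN fused matmul op. Attribute forwarding and the fused op's optional inputs are omitted here and must be handled in the real pattern.

// Illustrative sketch only: a DRR pattern that fuses matmul + relu into the
// oneDNN fused matmul op. The real MatmulActivationFusePattern additionally
// forwards matmul attributes and records the fused activation kind.
class MatmulReluFuseSketch : public paddle::drr::DrrPatternBase {
 public:
  std::string name() const override { return "MatmulReluFuseSketch"; }

  void operator()(paddle::drr::DrrPatternContext *ctx) const override {
    // Source pattern: matmul whose result feeds a relu.
    paddle::drr::SourcePattern pat = ctx->SourcePattern();
    const auto &matmul = pat.Op(paddle::dialect::MatmulOp::name());
    const auto &relu = pat.Op(paddle::dialect::ReluOp::name());
    pat.Tensor("matmul_out") = matmul(pat.Tensor("x"), pat.Tensor("y"));
    pat.Tensor("act_out") = relu(pat.Tensor("matmul_out"));

    // Result pattern: a single fused matmul from the oneDNN dialect
    // (attributes and optional inputs are omitted in this sketch).
    paddle::drr::ResultPattern res = pat.ResultPattern();
    const auto &fused_matmul =
        res.Op(paddle::onednn::dialect::FusedMatmulOp::name());
    res.Tensor("act_out") = fused_matmul(res.Tensor("x"), res.Tensor("y"));
  }
};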