[Auto Parallel] Add spmd rule No.9 for group_norm and group_norm_grad ops. #72946
Conversation
Your PR was submitted successfully. Thank you for your contribution to this open-source project!
VLOG(4) << "shape size of x: 3~5"; | ||
VLOG(4) << "Einsum Notation: " << x_axes << "," << scale_axes << "," | ||
<< bias_axes << "-->" << out_axes << "," << mean_axes << "," | ||
<< variance_axes; | ||
VLOG(4) << "X" | ||
<< " shape: [" << str_join(x_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(x_dist_attr_src.dims_mapping()) | ||
<< "] " | ||
<< "dst_dims_mapping: [" << str_join(x_dims_mapping) << "]"; | ||
VLOG(4) << "Scale" | ||
<< " shape: [" << str_join(scale_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(scale_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(scale_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Bias" | ||
<< " shape: [" << str_join(bias_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(bias_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(bias_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Out dims mapping: [" << str_join(out_dist_attr.dims_mapping()) | ||
<< "]"; | ||
VLOG(4) << "Mean dims mapping: [" << str_join(mean_dist_attr.dims_mapping()) | ||
<< "]"; | ||
VLOG(4) << "Variance dims mapping: [" | ||
<< str_join(variance_dist_attr.dims_mapping()) << "]"; | ||
VLOG(4) << std::endl; |
Shall we use the macro LOG_SPMD_INPUT or LOG_SPMD_OUTPUT to simplify the log code?
Thanks, I will use it to simplify the log code in the next commit.
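For context, here is a minimal, self-contained sketch of what such a macro-based simplification could look like. This is not Paddle's actual utils.h: the real LOG_SPMD_INPUT / LOG_SPMD_OUTPUT definitions may differ, and MockDistAttr, the local str_join, the VLOG stand-in, and the `<name>_shape` / `<name>_dims_mapping` / `<name>_dist_attr_dst` naming convention are assumptions made purely for illustration.

```cpp
// Minimal, standalone sketch (NOT Paddle's real macros) of how a
// LOG_SPMD_INPUT-style macro can collapse the repeated VLOG(4) blocks above.
#include <cstdint>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Stand-in for a dist attr that only exposes dims_mapping().
struct MockDistAttr {
  std::vector<int64_t> dims_mapping_;
  const std::vector<int64_t>& dims_mapping() const { return dims_mapping_; }
};

// Stand-in for the str_join helper used in the hunks above.
template <typename T>
std::string str_join(const std::vector<T>& v) {
  std::ostringstream os;
  for (size_t i = 0; i < v.size(); ++i) {
    if (i) os << ",";
    os << v[i];
  }
  return os.str();
}

// VLOG(4) replaced with std::cout so the sketch builds standalone.
#define VLOG(level) std::cout

// One line per input tensor: expects <name>_shape, <name>_dims_mapping and
// <name>_dist_attr_dst to be in scope, mirroring the hand-written logs.
#define LOG_SPMD_INPUT(name)                                             \
  VLOG(4) << #name << " shape: [" << str_join(name##_shape) << "] "     \
          << "src_dims_mapping: [" << str_join(name##_dims_mapping)     \
          << "] dst_dims_mapping: ["                                     \
          << str_join(name##_dist_attr_dst.dims_mapping()) << "]\n"

// One line per output tensor: only the inferred dims_mapping is printed.
#define LOG_SPMD_OUTPUT(name)                                            \
  VLOG(4) << #name << " dims mapping: ["                                 \
          << str_join(name##_dist_attr.dims_mapping()) << "]\n"

int main() {
  // Example data for the `scale` input and `out` output of group_norm.
  std::vector<int64_t> scale_shape = {64};
  std::vector<int64_t> scale_dims_mapping = {-1};
  MockDistAttr scale_dist_attr_dst{{-1}};
  MockDistAttr out_dist_attr{{0, -1, -1, -1}};

  LOG_SPMD_INPUT(scale);   // replaces the multi-line VLOG(4) << "Scale" ... block
  LOG_SPMD_OUTPUT(out);    // replaces VLOG(4) << "Out dims mapping: ..."
  return 0;
}
```

The token pasting (`name##_...`) keeps each call site to a single line while still printing the same shape and dims_mapping fields as the hand-written blocks above.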
VLOG(4) << "Scale" | ||
<< " shape: [" << str_join(scale_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(scale_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(scale_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Bias" | ||
<< " shape: [" << str_join(bias_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(bias_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(bias_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Y" | ||
<< " shape: [" << str_join(y_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(y_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" << str_join(y_dist_attr_dst.dims_mapping()) | ||
<< "]"; | ||
VLOG(4) << "Mean" | ||
<< " shape: [" << str_join(mean_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(mean_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(mean_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Variance" | ||
<< " shape: [" << str_join(variance_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(variance_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(variance_dist_attr_dst.dims_mapping()) << "]"; | ||
VLOG(4) << "Y_grad" | ||
<< " shape: [" << str_join(y_grad_shape) << "] " | ||
<< "src_dims_mapping: [" << str_join(y_grad_dims_mapping) << "] " | ||
<< "dst_dims_mapping: [" | ||
<< str_join(y_grad_dist_attr_dst.dims_mapping()) << "]"; | ||
|
||
VLOG(4) << "X_grad dims mapping: [" | ||
<< str_join(x_grad_dist_attr.dims_mapping()) << "]"; | ||
VLOG(4) << "Scale_grad dims mapping: [" | ||
<< str_join(scale_grad_dist_attr.dims_mapping()) << "]"; | ||
VLOG(4) << "Bias_grad dims mapping: [" | ||
<< str_join(bias_grad_dist_attr.dims_mapping()) << "]"; | ||
VLOG(4) << std::endl; |
Shall we use the macro LOG_SPMD_INPUT or LOG_SPMD_OUTPUT to simplify the log code?
Thanks, I will use it to simplify the log code in the next commit.
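Building on the mock macros sketched earlier in this thread, the grad rule's logging could then hypothetically shrink to a handful of call sites. These lines are illustrative only (they assume the same `<name>_shape` / `<name>_dims_mapping` / `<name>_dist_attr_dst` naming convention as the hunk above and are not the code that was actually committed):

```cpp
// Hypothetical call sites for the group_norm_grad rule, reusing the
// LOG_SPMD_INPUT / LOG_SPMD_OUTPUT sketch above; each line stands in for one
// of the multi-line VLOG(4) blocks shown in this hunk.
LOG_SPMD_INPUT(scale);
LOG_SPMD_INPUT(bias);
LOG_SPMD_INPUT(y);
LOG_SPMD_INPUT(mean);
LOG_SPMD_INPUT(variance);
LOG_SPMD_INPUT(y_grad);
LOG_SPMD_OUTPUT(x_grad);
LOG_SPMD_OUTPUT(scale_grad);
LOG_SPMD_OUTPUT(bias_grad);
```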
Sorry to inform you that f82d50d's CIs have passed for more than 7 days. To prevent PR conflicts, you need to re-run all CIs manually.
@Glencsa Every review comment needs a reply.
/re-run all-failed
LGTM
… ops. (PaddlePaddle#72946)

* add unary ops which have spmd_rule but not add in yaml file.
* Add spmd_rule for group_norm ops.
* Add spmd_rule for group_norm ops.
* add CI test for group_norm.
* add CI test for group_norm.
* fix bug.
* fix bug(PD_REGISTER_SPMD_RULE not surport string)
* PD_REGISTER_SPMD_RULE need less than 5?
* fix bug.
* fix bug.
* fix bug.
* fix bug.
* fix bug.
* fix bug.
* add partial status.
* fix bug.
* fix bug.
PR Category
Auto Parallel
PR Types
New features
Description
Add the SPMD derivation rules for the group_norm and group_norm_grad ops (Auto Parallel SPMD rule No.9), along with CI tests for group_norm.