Download the pre-trained model basnet.pth from GoogleDrive or Baidu (extraction code: 6phq) and put it into the directory 'saved_models/basnet_bsi/'.
Change to the 'BASNet' directory, then run training with python basnet_train.py
or inference with python basnet_test.py.
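The setup steps above can be sketched as the following shell session (the directory layout matches the paths stated above; the download itself is manual, so the actual training/inference commands are shown as comments):

```shell
# Create the directory the code expects for the pre-trained weights
mkdir -p saved_models/basnet_bsi

# Manually download basnet.pth (GoogleDrive or Baidu, code: 6phq) and place it at:
#   saved_models/basnet_bsi/basnet.pth

# From inside the BASNet directory:
# python basnet_train.py   # training
# python basnet_test.py    # inference on the test images
```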
We also provide the predicted saliency maps (GoogleDrive, Baidu) for the datasets SOD, ECSSD, DUT-OMRON, PASCAL-S, HKU-IS, and DUTS-TE.
Architecture
Quantitative Comparison
Qualitative Comparison
Citation
@article{DBLP:journals/corr/abs-2101-04704,
  author        = {Xuebin Qin and
                   Deng{-}Ping Fan and
                   Chenyang Huang and
                   Cyril Diagne and
                   Zichen Zhang and
                   Adri{\`{a}} Cabeza Sant'Anna and
                   Albert Su{\`{a}}rez and
                   Martin J{\"{a}}gersand and
                   Ling Shao},
  title         = {Boundary-Aware Segmentation Network for Mobile and Web Applications},
  journal       = {CoRR},
  volume        = {abs/2101.04704},
  year          = {2021},
  url           = {https://arxiv.org/abs/2101.04704},
  archivePrefix = {arXiv},
}
Citation
@InProceedings{Qin_2019_CVPR,
  author    = {Qin, Xuebin and Zhang, Zichen and Huang, Chenyang and Gao, Chao and Dehghan, Masood and Jagersand, Martin},
  title     = {BASNet: Boundary-Aware Salient Object Detection},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2019}
}
About
Code for the CVPR 2019 paper "BASNet: Boundary-Aware Salient Object Detection".