Unofficial implementation of the paper "CBAM: Convolutional Block Attention Module"
Introduction
This repository is a PyTorch re-implementation of the paper:
Woo S., Park J., Lee J.-Y., and Kweon I. S. CBAM: Convolutional Block Attention Module. ECCV 2018.
Structure
The overview of CBAM. The module has two sequential sub-modules:
channel and spatial. The intermediate feature map is adaptively refined through
our module (CBAM) at every convolutional block of deep networks.
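The sequential channel-then-spatial refinement described above can be sketched in PyTorch as follows. This is a minimal illustration of the CBAM structure, not this repository's actual code; the class names, the reduction ratio of 16, and the 7x7 spatial kernel follow the paper's defaults but are assumptions here.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel sub-module: a shared MLP over global average- and max-pooled features."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """Spatial sub-module: a conv over concatenated channel-wise avg/max maps."""

    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Applies channel attention, then spatial attention, to a feature map."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))
```

Because the module preserves the feature-map shape, it can be dropped after any convolutional block (e.g. after each residual block of a ResNet) without changing the surrounding architecture.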
Requirements
Python3
PyTorch 0.4.1
tensorboardX (optional)
torchnet
pretrainedmodels (optional)
Results
We tested four models on ImageNet-1K. Both training and validation images are scaled so that the shorter side is 256. Training augmentation uses only mirroring (horizontal flip) and RandomResizedCrop; at validation time we take a 224x224 center crop.
ImageNet-1K

| Models        | Validation (Top-1) | Validation (Top-5) |
| ------------- | ------------------ | ------------------ |
| ResNet50      | 74.26              | 91.91              |
| ResNet50-CBAM | 75.45              | 92.55              |