In this paper, we propose to generalize the traditional Sign and PReLU functions to RSign and RPReLU, which enable explicit learning of distribution reshaping and shifting at near-zero extra cost. By adding simple learnable biases, ReActNet achieves 69.4% top-1 accuracy on the ImageNet dataset with both weights and activations binary, a near ResNet-level accuracy.
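As a rough illustration of what RSign and RPReLU compute, here is a minimal NumPy sketch. The parameter values (alpha, beta, gamma, zeta) are placeholders chosen for the example; in the released PyTorch code these are learnable per-channel parameters, not fixed floats.

```python
import numpy as np

def rsign(x, alpha=0.1):
    """RSign: Sign with a learnable threshold shift alpha.
    Binarizes activations to +1/-1 around x = alpha instead of x = 0."""
    return np.where(x > alpha, 1.0, -1.0)

def rprelu(x, beta=0.25, gamma=0.1, zeta=-0.2):
    """RPReLU: PReLU generalized with learnable shifts.
    Shifts the input by gamma, applies a PReLU with negative-side slope
    beta, then shifts the output by zeta."""
    shifted = x - gamma
    return np.where(shifted > 0, shifted, beta * shifted) + zeta

x = np.array([-1.0, 0.0, 0.5, 2.0])
print(rsign(x))   # binarized activations around the shifted threshold
print(rprelu(x))  # shifted-and-reshaped PReLU response
```

Because the only extra operations are elementwise shifts (biases), the additional compute over plain Sign/PReLU is negligible, which is what makes the near-zero extra cost claim plausible.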
Citation
If you find our code useful for your research, please consider citing:
@inproceedings{liu2020reactnet,
  title={ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions},
  author={Liu, Zechun and Shen, Zhiqiang and Savvides, Marios and Cheng, Kwang-Ting},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2020}
}
Run
1. Requirements:
Python 3, PyTorch 1.4.0, torchvision 0.5.0
2. Data:
Download the ImageNet dataset
3. Steps to run:
(1) Step 1: binarizing activations
Change directory to ./resnet/1_step1/ or ./mobilenet/1_step1/
Run bash run.sh
(2) Step 2: binarizing weights + activations
Change directory to ./resnet/2_step2/ or ./mobilenet/2_step2/