EcoFormer: Energy-Saving Attention with Linear Complexity

This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity" (NeurIPS 2022 Spotlight).
News
08/10/2022. We have released the source code. Issues are welcome!
15/09/2022. EcoFormer is accepted by NeurIPS 2022! 🔥🔥🔥
A Gentle Introduction
We present EcoFormer, a novel energy-saving attention mechanism with linear complexity that saves the vast majority of multiplications in attention from a new binarization perspective. More details can be found in our paper.
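For intuition, the core idea can be sketched in a few lines of PyTorch. The sketch below is illustrative only: it binarizes queries and keys with a fixed random projection (a hypothetical stand-in for the learned kernelized hash functions in the paper) and then applies softmax-free linear attention, so the dominant matrix products involve only binary codes. See the source code for the actual implementation.

```python
import torch

def binary_hash(x, proj):
    # Hash features to b-bit codes in {0, 1} via the sign of a projection.
    # NOTE: a fixed random projection is a hypothetical stand-in for the
    # learned kernelized hash functions used in EcoFormer.
    return (torch.sign(x @ proj) + 1) / 2

def eco_attention_sketch(q, k, v, proj):
    """q, k: (N, d); v: (N, dv); proj: (d, b). Returns (N, dv)."""
    phi_q = binary_hash(q, proj)               # (N, b), binary codes
    phi_k = binary_hash(k, proj)               # (N, b), binary codes
    # Linear attention: associate (phi_k^T v) first, giving O(N) cost
    # instead of the O(N^2) cost of forming the full attention matrix.
    kv = phi_k.t() @ v                         # (b, dv)
    z = phi_q @ phi_k.sum(dim=0).unsqueeze(1)  # (N, 1) normalizer
    # Because phi_q and phi_k are binary, these matrix products reduce
    # to additions in a multiplication-free implementation.
    return (phi_q @ kv) / z.clamp(min=1e-6)

q, k, v = torch.randn(128, 64), torch.randn(128, 64), torch.randn(128, 64)
proj = torch.randn(64, 16)
out = eco_attention_sketch(q, k, v, proj)      # (128, 64)
```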
Installation
Requirements
Python ≥ 3.8
PyTorch 1.10.1
CUDA 11.1
Torchvision 0.11.2
PyTorch Image Models (timm) 0.4.9
MMCV 1.3.8
Einops 0.4.1
SciPy 1.8.0
Instructions
Use Anaconda to create the running environment for the project; one possible setup is sketched below.
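The exact setup commands shipped with the repository may differ; the following is only a plausible sequence that matches the version list above, with the environment name ecoformer as a hypothetical placeholder:

```bash
# Hypothetical environment name; package versions follow the list above.
conda create -n ecoformer python=3.8 -y
conda activate ecoformer
# For CUDA 11.1 builds of PyTorch, see https://download.pytorch.org/whl/torch_stable.html
pip install torch==1.10.1 torchvision==0.11.2
pip install timm==0.4.9 mmcv==1.3.8 einops==0.4.1 scipy==1.8.0
```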
If you find EcoFormer useful in your research, please consider citing the following paper:
```bibtex
@inproceedings{liu2022ecoformer,
  title     = {EcoFormer: Energy-Saving Attention with Linear Complexity},
  author    = {Liu, Jing and Pan, Zizheng and He, Haoyu and Cai, Jianfei and Zhuang, Bohan},
  booktitle = {NeurIPS},
  year      = {2022}
}
```
License
This repository is released under the Apache 2.0 license as found in the LICENSE file.
Acknowledgement
This repository is built upon PVT and Twins. We thank the authors for open-sourcing their code.