Accelerating Diffusion Transformer via Increment-Calibrated Caching with Channel-Aware Singular Value Decomposition
This is the official implementation of the CVPR 2025 paper "Accelerating Diffusion Transformer via Increment-Calibrated Caching with Channel-Aware Singular Value Decomposition".
Quick Start
The following takes increment-calibrated caching for DiT as an example.
Setup
Download and setup the repo:
git clone https://github.com/ccccczzy/icc.git
cd icc/DiT
Create the environment and install required packages:
conda env create -f environment.yml
conda activate DiT
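At a high level, increment-calibrated caching reuses a block's cached output across diffusion steps and corrects it with a cheap low-rank increment, where the low-rank operator comes from a channel-aware SVD of the block's weights. Below is a minimal NumPy sketch of that core idea; it is not the repository's implementation, and the function names and the per-channel scaling heuristic are illustrative assumptions.

```python
import numpy as np

def channel_aware_lowrank(W, channel_scale, rank):
    """Rank-`rank` approximation of W from a channel-aware SVD.

    Illustrative sketch: scale the input channels of W before the SVD so
    that important channels dominate the decomposition, then fold the
    scaling back out of the truncated factors.
    """
    S = np.diag(channel_scale)                      # per-channel importance (assumed given)
    U, s, Vt = np.linalg.svd(W @ S, full_matrices=False)
    # Truncate to `rank` components and undo the channel scaling.
    Wr = (U[:, :rank] * s[:rank]) @ Vt[:rank] @ np.diag(1.0 / channel_scale)
    return Wr

def calibrated_cache_step(Wr, x_prev, y_prev, x_curr):
    """Reuse the cached output y_prev = W @ x_prev and calibrate it with a
    cheap low-rank increment instead of recomputing the full projection."""
    return y_prev + Wr @ (x_curr - x_prev)
```

When `rank` equals the full dimension, the calibrated result recovers the exact recomputation; with a small `rank`, each cached step costs only a low-rank matrix-vector product, which is where the speedup comes from.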
Citation
If you find this work useful, please cite:
@misc{chen2025icc,
title={Accelerating Diffusion Transformer via Increment-Calibrated Caching with Channel-Aware Singular Value Decomposition},
author={Zhiyuan Chen and Keyi Li and Yifan Jia and Le Ye and Yufei Ma},
year={2025},
eprint={2505.05829},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2505.05829},
}
Acknowledgments
This codebase borrows from DiT, PixArt-alpha, and ADM. Thanks to the authors for their wonderful work and codebases!