Implementation of "Learning Diverse Image Colorization", CVPR 2017.
The code is tested with TensorFlow v1.0.1 and Python 2.7. It additionally needs numpy, scipy,
scikit-learn and caffe-r1.0 (Caffe is only required for the Zhang et al. colorization network).
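If the Python packages are not already available, they can be installed with pip. The commands below are a minimal sketch assuming a Python 2.7 environment with pip; Caffe is not on PyPI and must be built separately from the official Caffe sources.
pip install tensorflow==1.0.1   # or tensorflow-gpu==1.0.1 for a CUDA-enabled build
pip install numpy scipy scikit-learn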
Fetch the data by running
bash get_data.sh
Fetch the Zhang et al. colorization network (used for MDN features) by running
bash get_zhang_colorization.sh
Execute run_lfw.sh to first train the VAE+MDN and then generate results for LFW
bash run_lfw.sh
Execute run_demo.sh to get diverse colorizations for any image; this model is trained on ImageNet
bash run_demo.sh
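If your test image is in color, converting it to grayscale first (e.g. with ImageMagick) makes the input unambiguous; the filenames below are placeholders, and where run_demo.sh reads its input, and at what resolution, should be checked in the script itself.
convert my_photo.jpg -colorspace Gray my_photo_gray.jpg   # placeholder filenames for illustration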
If you use this code, please cite:
@inproceedings{DeshpandeLDColor17,
  author = {Aditya Deshpande and Jiajun Lu and Mao-Chuang Yeh and Min Jin Chong and David Forsyth},
  title = {Learning Diverse Image Colorization},
  booktitle = {Computer Vision and Pattern Recognition},
  url = {https://arxiv.org/abs/1612.01958},
  year = {2017}
}
Some examples of diverse colorizations on the LFW, LSUN Church and ImageNet-Val datasets.
Some examples of diverse colorizations for images in the wild (model trained on ImageNet).