For other training objectives, set the --mode option to baseline, da, mt, or sla+sd.
For other augmentations, set the --aug option to one of the function names in augmentations.py.
Large-scale datasets. We empirically found that summing (instead of averaging) the losses across self-supervised transformations improves accuracy on large-scale datasets such as ImageNet or iNaturalist. To enable this behavior, use the --with-large-loss option.
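To illustrate the difference between the two reductions, here is a minimal NumPy sketch (not the repository's implementation; all function names are illustrative). It computes the cross-entropy over the M*C joint labels for each of the M transformations, then either averages or sums them; summation simply scales the loss, and hence the gradients, by M:

```python
import numpy as np

def cross_entropy(logits, label):
    # Numerically stable log-softmax cross-entropy for a single example.
    z = logits - logits.max()
    return float(np.log(np.exp(z).sum()) - z[label])

def joint_label(t, c, num_classes):
    # Index of the joint label (transformation t, class c) in the M*C space.
    return t * num_classes + c

def sla_ssl_loss(joint_logits, target_class, num_classes, reduction="mean"):
    # joint_logits: (M, M*C); row t is the network output for the input
    # under transformation t. Each transformed copy is trained to predict
    # its own joint label (t, target_class).
    M = joint_logits.shape[0]
    per_t = [
        cross_entropy(joint_logits[t], joint_label(t, target_class, num_classes))
        for t in range(M)
    ]
    # --with-large-loss corresponds to "sum": the loss is M times larger.
    return sum(per_t) if reduction == "sum" else sum(per_t) / M
```

With identical logits, the "sum" reduction equals exactly M times the "mean" reduction, so the flag effectively increases the weight of the self-supervised objective on large-scale datasets.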
Evaluation
You can check the results in the log files stored in the logs/ directory (single_acc for SLA+SI or SLA+SD; agg_acc for SLA+AG). To re-evaluate a trained model, use test.py.
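For context on agg_acc, the following is a rough NumPy sketch of aggregated inference (not the repository's test.py; the layout of joint_logits is an assumption). For each transformation t, it takes the C joint logits pairing t with every class, applies a softmax, and averages the resulting class distributions over all M transformations:

```python
import numpy as np

def aggregated_probs(joint_logits, num_classes):
    # joint_logits: (M, M*C), assuming row t holds the joint logits for the
    # input under transformation t. Returns the aggregated (C,) distribution.
    M = joint_logits.shape[0]
    dists = []
    for t in range(M):
        # Slice out the C logits for joint labels (t, c), c = 0..C-1.
        block = joint_logits[t, t * num_classes:(t + 1) * num_classes]
        e = np.exp(block - block.max())  # numerically stable softmax
        dists.append(e / e.sum())
    return np.mean(dists, axis=0)
```

The predicted class is then the argmax of the aggregated distribution, which is what agg_acc measures, whereas single_acc uses only a single (identity) transformation.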
BibTeX
@inproceedings{lee2020_sla,
title={Self-supervised label augmentation via input transformations},
author={Lee, Hankook and Hwang, Sung Ju and Shin, Jinwoo},
booktitle={International Conference on Machine Learning},
pages={5714--5724},
year={2020},
organization={PMLR}
}
About
Self-supervised Label Augmentation via Input Transformations (ICML 2020)