This is the official repository for our paper **TAP: Targeted Prompting for Task Adaptive Generation of Textual Training Instances for Visual Classification**. We provide the code for reproducing the results on all 8 datasets used in our paper.
## Installation

Our code is built upon the official codebase of the CoOp paper and has been tested in an environment with Python 3.8.8 and PyTorch 2.0.1 compiled with CUDA 11.6.

As a first step, install the Dassl library (under `TAP/`) in your environment by following the instructions here. Then, with your environment activated, install all remaining dependencies by running:

```shell
pip install -r requirements.txt
```
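To confirm that your environment roughly matches the tested configuration (Python 3.8.8, PyTorch 2.0.1 with CUDA 11.6), a minimal check such as the following can help. It is only a sketch: it prints the installed versions and makes no assumptions about the repository itself.

```shell
python - <<'EOF'
# Print the interpreter version and, if installed, the PyTorch/CUDA versions.
import sys
print("python", sys.version.split()[0])
try:
    import torch
    print("torch", torch.__version__, "cuda", torch.version.cuda)
except ImportError:
    print("torch not installed")
EOF
```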
## Datasets

Please download and structure your datasets according to the instructions provided in the official CoOp repository. All 8 datasets should be placed under the `data/` directory.
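As a quick sanity check, the expected layout can be verified with a short shell loop. This is a sketch: the directory names below are an assumption based on the CoOp data-preparation conventions and the dataset names used in this repository, and may differ in your setup.

```shell
# List which of the 8 expected dataset directories are present under data/.
for d in dtd oxford_flowers imagenet_r fgvc_aircraft food101 eurosat ucf101 sun397; do
  if [ -d "data/$d" ]; then
    echo "found:   data/$d"
  else
    echo "missing: data/$d"
  fi
done
```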
## Descriptions

The generic and dataset-specific descriptions for all 8 datasets are provided in the `descriptions/` directory.
## Experiments

### TAP

To reproduce the TAP results for all 8 datasets in Table 1, please run:

```shell
bash scripts/tap.sh <dataset_name>
```

where `<dataset_name>` is one of `dtd`, `oxford_flowers`, `imagenet_r`, `fgvc_aircraft`, `food101`, `eurosat`, `ucf101`, `sun397`.
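To reproduce the full table in one pass, the per-dataset command can be wrapped in a loop. The sketch below is a dry run that only prints each command; remove the `echo` to actually launch the runs sequentially.

```shell
# Dry run: print the TAP command for each of the 8 datasets.
# Remove "echo" to execute the runs one after another.
for d in dtd oxford_flowers imagenet_r fgvc_aircraft food101 eurosat ucf101 sun397; do
  echo bash scripts/tap.sh "$d"
done
```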
### Zero-Shot

Similarly, to obtain zero-shot CLIP results with the single prompt template `a photo of a {category}.`, please run:

```shell
bash scripts/zeroshot.sh <dataset_name>
```

replacing `<dataset_name>` with one of the 8 datasets listed above.
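To collect both the zero-shot baseline and the corresponding TAP result for every dataset in a single pass, the two scripts can be chained per dataset. As above, this is a dry-run sketch that only prints the commands; remove the `echo`s to launch the runs.

```shell
# Dry run: print the zero-shot and TAP commands for each dataset.
for d in dtd oxford_flowers imagenet_r fgvc_aircraft food101 eurosat ucf101 sun397; do
  echo bash scripts/zeroshot.sh "$d"
  echo bash scripts/tap.sh "$d"
done
```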