The CoT and retrieval-augmented CoT results are given in the folder `results/`, where `chain_of_thought_gpt3` indicates the GPT-3 responses.
Steps
For SKR_prompt and SKR_icl, we use the prompts shown in the paper to elicit the self-knowledge on the dev data directly.
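As a rough illustration of this prompt-based elicitation, here is a minimal sketch; the prompt wording and model name are illustrative assumptions, not the exact template from the paper, and the legacy OpenAI completions API is assumed:

```python
# Hedged sketch of SKR_prompt-style self-knowledge elicitation.
# The prompt wording and model name below are assumptions.
import openai

def elicit_self_knowledge(question: str) -> str:
    # Illustrative template: ask the model whether it needs extra information.
    prompt = (
        f"Question: {question}\n"
        "Do you need additional information to answer this question? "
        "Answer with Yes or No.\nAnswer:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",  # assumed GPT-3 variant
        prompt=prompt,
        max_tokens=5,
        temperature=0.0,
    )
    return response["choices"][0]["text"].strip()
```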
For SKR_cls, we train a BERT classifier on the training data to elicit the self-knowledge of the dev data, using lr=2e-5 and epochs=10.
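A minimal sketch of such a classifier using Hugging Face `transformers` is below; the dataset fields, label convention, and toy examples are assumptions, and the repo's actual training script may differ:

```python
# Hedged sketch of the SKR_cls setup: fine-tune BERT to predict a binary
# self-knowledge label with lr=2e-5 over 10 epochs.
from datasets import Dataset
from transformers import (
    BertForSequenceClassification,
    BertTokenizerFast,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Toy stand-in for data built from train_skr.json; field names are assumptions.
train_dataset = Dataset.from_dict({
    "question": ["Who wrote Hamlet?", "What is the tallest building opened in 2023?"],
    "label": [1, 0],  # 1 = model knows the answer, 0 = retrieval needed
})

def tokenize(batch):
    return tokenizer(batch["question"], truncation=True, padding="max_length", max_length=128)

args = TrainingArguments(
    output_dir="skr_cls",
    learning_rate=2e-5,
    num_train_epochs=10,
    per_device_train_batch_size=16,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset.map(tokenize, batched=True),
)
trainer.train()
```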
For SKR_knn, the steps are as follows (a sketch of the kNN lookup follows the list):
1. `cd source/`. To collect the self-knowledge of the training data, run `skr.py`; this produces the `train_skr.json` file.
2. Run `knn.py` to apply the self-knowledge to the dev data; this produces the `dev_skr_knn.json` file.
3. Run `eval_skr.py` to evaluate the results.
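The following is a hedged sketch of what the kNN step computes: each dev question inherits the majority self-knowledge label of its nearest training questions. The encoder, the JSON field names, and the neighbor count are assumptions; the actual `knn.py` may differ:

```python
# Hedged sketch of SKR_knn: embed questions, find nearest training
# neighbors, and transfer their majority self-knowledge label.
# Field names ("question", "self_knowledge") and the encoder are assumptions.
import json
from sentence_transformers import SentenceTransformer
from sklearn.neighbors import NearestNeighbors

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder

with open("train_skr.json") as f:
    train = json.load(f)
with open("dev.json") as f:
    dev = json.load(f)

train_emb = encoder.encode([ex["question"] for ex in train])
dev_emb = encoder.encode([ex["question"] for ex in dev])

knn = NearestNeighbors(n_neighbors=5, metric="cosine").fit(train_emb)
_, idx = knn.kneighbors(dev_emb)

for ex, neighbors in zip(dev, idx):
    labels = [train[i]["self_knowledge"] for i in neighbors]
    # Majority vote over the neighbors' self-knowledge labels.
    ex["self_knowledge"] = max(set(labels), key=labels.count)

with open("dev_skr_knn.json", "w") as f:
    json.dump(dev, f, indent=2)
```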
Citation
@inproceedings{wang-etal-2023-self-knowledge,
title = "Self-Knowledge Guided Retrieval Augmentation for Large Language Models",
author = "Wang, Yile and Li, Peng and Sun, Maosong and Liu, Yang",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2023",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.findings-emnlp.691",
pages = "10303--10315",
}