To run classification tasks using 10 training examples: the task can be sst2 or sicke2b; the method can be pero or pero_abl (without sep token learning); and `<start_idx>` specifies the training split. To reproduce the results from the paper, use 0, 10, 20, 30, 40.
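The launch command itself did not survive in this copy of the README, so the following is only a rough sketch of what an invocation might look like; the script name `run_classification.sh` and the argument order are assumptions, not taken from the repository.

```bash
# Hypothetical invocation sketch -- the script name and argument order are
# assumptions; consult the repository's scripts for the actual interface.
# Runs pero on sst2 with the training split that starts at index 0.
bash run_classification.sh sst2 pero 0
```

Repeating such a command with start indices 0, 10, 20, 30, and 40 would cover the five training splits reported in the paper.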
References

[1] Kumar, Sawan, and Partha Talukdar. "Reordering Examples Helps during Priming-based Few-Shot Learning." Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. 2021.

[2] Shin, Taylor, et al. "Eliciting Knowledge from Language Models Using Automatically Generated Prompts." Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2020.

[3] Petroni, Fabio, et al. "Language Models as Knowledge Bases?" Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019.
Contact
For any clarification, comments, or suggestions, please create an issue or contact sawankumar@iisc.ac.in