[ACL 2021] Source code and pre-trained models for the ACL 2021 long paper "LGESQL: Line Graph Enhanced Text-to-SQL Model with Mixed Local and Non-Local Relations".
@inproceedings{cao-etal-2021-lgesql,
title = "{LGESQL}: Line Graph Enhanced Text-to-{SQL} Model with Mixed Local and Non-Local Relations",
author = "Cao, Ruisheng and
Chen, Lu and
Chen, Zhi and
Zhao, Yanbin and
Zhu, Su and
Yu, Kai",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.198",
doi = "10.18653/v1/2021.acl-long.198",
pages = "2541--2555",
}
Create environment and download dependencies
The following commands are provided in setup.sh.
First, create the conda environment text2sql:
In our experiments, we use torch==1.6.0 and dgl==0.5.3 with CUDA version 10.1.
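A minimal sketch of the environment setup follows; the authoritative commands are in setup.sh, and the Python version and requirements.txt file below are assumptions rather than details confirmed by this README:

```bash
# Minimal sketch of the environment setup; see setup.sh for the authoritative commands.
# Python version and requirements.txt are assumptions.
conda create -n text2sql python=3.6
conda activate text2sql
# torch 1.6.0 and dgl 0.5.3 built against CUDA 10.1, as noted above
pip install torch==1.6.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html
pip install dgl-cu101==0.5.3
pip install -r requirements.txt
```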
We use one GeForce RTX 2080 Ti for GLOVE and base-series pre-trained language model (PLM) experiments, and one Tesla V100-PCIE-32GB for large-series PLM experiments.
Download pre-trained language models from the Hugging Face Model Hub, such as bert-large-whole-word-masking and electra-large-discriminator, into the pretrained_models directory. The vocabulary file for glove.42B.300d is also pulled (please ensure that Git LFS is installed):
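For example, the models can be fetched with Git LFS roughly as follows (the exact Hugging Face repository names and the target directory layout are assumptions; adjust them to the names this project expects):

```bash
# Hedged example of pulling pre-trained models with Git LFS from the Hugging Face Hub.
# Repository names and target paths are assumptions, not the project's confirmed layout.
git lfs install
cd pretrained_models
git clone https://huggingface.co/bert-large-uncased-whole-word-masking
git clone https://huggingface.co/google/electra-large-discriminator
# the glove.42B.300d vocabulary file is fetched similarly (source not shown here)
cd ..
```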
Download and preprocess dataset
Download spider.zip, unzip it, and rename the extracted folder to data.
Merge data/train_spider.json and data/train_others.json into a single training set, data/train.json.
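One possible way to carry out these two steps from the command line (jq is used here purely for illustration; any JSON tool or a short Python script works equally well, and the archive is assumed to extract to a folder named spider):

```bash
# Illustrative data preparation; the spider.zip download link is omitted on purpose.
unzip spider.zip && mv spider data                                             # unzip and rename to data/
jq -s 'add' data/train_spider.json data/train_others.json > data/train.json   # merge the two train sets
```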
Preprocess the train and dev datasets, including input normalization, schema linking, graph construction, and output action generation. (Our preprocessed dataset can be downloaded here.)
./run/run_preprocessing.sh
Training
Train LGESQL models with GLOVE, BERT, and ELECTRA respectively:
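For example (the script names and arguments below are assumptions based on the run/ directory convention; check the scripts shipped with the repository):

```bash
# Hypothetical training invocations; the actual script names and arguments live under run/.
./run/run_lgesql_glove.sh                                       # GLOVE
./run/run_lgesql_plm.sh bert-large-uncased-whole-word-masking   # BERT
./run/run_lgesql_plm.sh electra-large-discriminator             # ELECTRA
```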
Evaluation and submission
Create the directory saved_models, and save the trained model and its configuration (at least model.bin and params.json) into a new sub-directory under saved_models, e.g. saved_models/electra-msde-75.1/.
For evaluation, see run/run_evaluation.sh and run/run_submission.sh (evaluation from scratch) for reference.
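A rough example of evaluating a saved model (the directory-argument convention is an assumption; run/run_evaluation.sh documents the real interface):

```bash
# Rough sketch; the argument convention is an assumption, see run/run_evaluation.sh.
./run/run_evaluation.sh saved_models/electra-msde-75.1/
```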
Model instances and submission scripts are available on CodaLab and Google Drive, including the submitted BERT and ELECTRA models. The code and model for GLOVE are deprecated.
Results
Dev and test exact match accuracy on the official leaderboard, also provided in the results directory:
| model            | dev acc | test acc |
| ---------------- | ------- | -------- |
| LGESQL + GLOVE   | 67.6    | 62.8     |
| LGESQL + BERT    | 74.1    | 68.3     |
| LGESQL + ELECTRA | 75.1    | 72.0     |
Acknowledgements
We would like to thank Tao Yu, Yusen Zhang, and Bo Pang for running evaluations on our submitted models. We are also grateful to the flexible semantic parser TranX, which inspired our work.