Here are some things you can do with this package out of the box.
Train the main model
Download data/tmp_seq_data from Google Drive (4.6 GB)
Run sh train_taco_lm.sh
The script uses default parameters and exports models to models/. Edit the script to configure it differently.
Training generates one directory for the loss logs, plus NUM_EPOCH directories, one per epoch's model.
You will need to add BERT's vocab.txt to each epoch directory before evaluation. See the next section on pre-trained models for details.
The training data is pre-generated and formatted. More details here.
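Copying BERT's vocab.txt into every epoch directory can be scripted. A minimal sketch, assuming the epoch directories match a models/epoch_* naming pattern and that you have a local copy of BERT's vocab file (both paths are assumptions, not fixed by the repo's scripts):

```python
import shutil
from pathlib import Path

def add_vocab_to_epochs(models_dir, vocab_src):
    """Copy BERT's vocab.txt into every epoch_* directory under models_dir."""
    copied = []
    for epoch_dir in sorted(Path(models_dir).glob("epoch_*")):
        dest = epoch_dir / "vocab.txt"
        shutil.copy(str(vocab_src), str(dest))
        copied.append(dest)
    return copied

# Example (hypothetical paths):
# add_vocab_to_epochs("models", "bert-base-uncased/vocab.txt")
```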
Experiments
You can download pre-trained models from models/ at Google Drive (0.4 GB each), or train your own by following the procedure in the previous section.
General Usage
You can do many things with the model by treating it simply as a set of transformer weights that fit exactly into a BERT-base model. Have an ongoing project with BERT? Give it a try!
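Because the checkpoints are a drop-in set of BERT-base weights, loading one with Hugging Face Transformers takes a single call. A sketch, assuming transformers is installed and that models/epoch_2 is one of the downloaded epoch directories (the path is an assumption):

```python
from transformers import BertModel

def load_taco_lm(model_dir):
    """Load a TacoLM epoch directory as a stock BERT-base encoder.

    Because the checkpoint fits BERT-base exactly, no key remapping
    is needed; from_pretrained reads the weights directly.
    """
    return BertModel.from_pretrained(model_dir)

# Example (hypothetical path):
# model = load_taco_lm("models/epoch_2")
```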
Intrinsic Experiments
The intrinsic evaluation relies on pre-formatted data.
Run sh eval_intrinsic.sh
See eval_results/intrinsic.txt for the results.
TimeBank Experiment
By default, this requires the epoch 2 model.
Run sh eval_timebank.sh to produce evaluation results with 3 different seeds; by default they are stored under eval_results.
Run python scripts/eval_timebank.py to interpret the results.
HiEVE Experiment
By default, this requires the epoch 2 model.
Run sh eval_hieve.sh to produce evaluation results under eval_results.
Run python scripts/eval_hieve.py to interpret the results.
@inproceedings{ZNKR20,
author = {Ben Zhou and Qiang Ning and Daniel Khashabi and Dan Roth},
title = {Temporal Common Sense Acquisition with Minimal Supervision},
booktitle = {ACL},
year = {2020},
}
About
Temporal Common Sense Acquisition with Minimal Supervision, ACL'20