declare-lab/dialog-HGAT

Dialog-HGAT

This repository contains the PyTorch implementation of Dialogue Relation Extraction with Document-level Heterogeneous Graph Attention Networks.

Setup

Download GloVe vectors from here and put them into the dataset/ folder.

Next, install the required libraries:

  1. Ensure PyTorch >= 1.5 is installed.
  2. Install the DGL library matching your CUDA version, using one of the commands below:
pip install --pre dgl-cu100 # For CUDA 10.0 build
pip install --pre dgl-cu101 # For CUDA 10.1 build
pip install --pre dgl-cu102 # For CUDA 10.2 build
  3. Install PyTorch Lightning.
  4. Install the remaining dependencies with pip install -r requirements.txt, then run python -m spacy download en_core_web_sm.
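The installation steps above can be collected into one script. This is a minimal sketch, not part of the repository: the dgl_package_for_cuda helper and the CUDA_VERSION / RUN_INSTALL environment variables are assumptions introduced here for illustration.

```shell
#!/usr/bin/env bash
# Sketch of the setup steps above. Assumes pip and python are on PATH.

# Map a CUDA version to the DGL wheel name listed in the steps above.
dgl_package_for_cuda() {
  case "$1" in
    10.0) echo dgl-cu100 ;;
    10.1) echo dgl-cu101 ;;
    10.2) echo dgl-cu102 ;;
    *)    echo "unsupported CUDA version: $1" >&2; return 1 ;;
  esac
}

# Set RUN_INSTALL=1 to actually perform the installation (dry by default
# so the mapping above can be checked without touching the environment).
if [ "${RUN_INSTALL:-0}" = "1" ]; then
  pip install --pre "$(dgl_package_for_cuda "${CUDA_VERSION:-10.2}")"
  pip install pytorch-lightning
  pip install -r requirements.txt
  python -m spacy download en_core_web_sm
fi
```

Running with CUDA_VERSION=10.1 RUN_INSTALL=1 would install the dgl-cu101 build before the remaining dependencies.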

Run code

Training

python main.py

Testing

python main.py --mode test --ckpt_path [your_ckpt_file_path]
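The two commands above can be wrapped in a small dispatcher. This helper is hypothetical (not part of the repository) and is shown as a dry run that prints the command it would launch; the checkpoint path is whatever file your training run produced.

```shell
#!/usr/bin/env bash
# Dry-run wrapper around the training and testing commands above.
# The leading `echo` prints the command instead of running it; drop it to launch.
run_dialog_hgat() {
  case "$1" in
    train) echo python main.py ;;
    test)  echo python main.py --mode test --ckpt_path "$2" ;;
    *)     echo "usage: run_dialog_hgat {train|test <ckpt_path>}" >&2; return 1 ;;
  esac
}
```

For example, run_dialog_hgat test model.ckpt prints the evaluation command with model.ckpt substituted for the checkpoint path.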

Citation

If you find the code helpful in your research, please cite:

@article{chen2020dialogue,
 title={Dialogue relation extraction with document-level heterogeneous graph attention networks},
 author={Chen, Hui and Hong, Pengfei and Han, Wei and Majumder, Navonil and Poria, Soujanya},
 journal={Cognitive Computation},
 year={2022}
}
