
Dual Supervised Learning for Natural Language Understanding and Generation

This paper is published in the Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019).

Full paper: Here, arXiv

Natural language understanding (NLU) and natural language generation (NLG) are both critical research topics in the NLP field. NLU extracts the core semantic meaning from given utterances, while NLG works in the opposite direction, constructing corresponding sentences based on given semantics. However, this dual relationship has not been investigated in the literature. This paper proposes a new learning framework for language understanding and generation built on top of dual supervised learning, providing a way to exploit the duality. Preliminary experiments show that the proposed approach boosts performance on both tasks.
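Dual supervised learning typically couples the two directions through the probabilistic identity p(x)p(y|x) = p(y)p(x|y), where x is an utterance and y its semantic representation. The sketch below shows one common way to turn that identity into a training regularizer added to the supervised NLU and NLG losses; it is a minimal illustration, not necessarily the exact formulation in the paper. The function and argument names are hypothetical, `lambda_dual` is an assumed hyperparameter, and the marginal log-probabilities would come from separately estimated marginal models (e.g., language models).

```python
import torch


def dual_supervised_loss(nlu_loss, nlg_loss,
                         log_p_x, log_p_y,
                         log_p_y_given_x, log_p_x_given_y,
                         lambda_dual=0.01):
    """Combine NLU and NLG losses with a duality regularizer (sketch).

    The probabilistic duality p(x) * p(y|x) = p(y) * p(x|y) is encouraged
    by penalizing the squared gap between the two factorizations in log space.
    All arguments are per-example tensors of log-probabilities or losses.
    """
    duality_gap = (log_p_x + log_p_y_given_x) - (log_p_y + log_p_x_given_y)
    dual_reg = duality_gap.pow(2).mean()
    return nlu_loss + nlg_loss + lambda_dual * dual_reg
```

In practice, `log_p_y_given_x` would be produced by the NLU model, `log_p_x_given_y` by the NLG model, and the regularizer ties their training together so that improvements in one direction inform the other.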


Shang-Yu Su
Researcher in Natural Language Processing