Publications
Zero-shot cross-lingual sequence tagging as Seq2Seq generation for joint intent classification and slot filling
Abstract
The joint intent classification and slot filling task seeks to detect the intent of an utterance and extract its semantic concepts. In the zero-shot cross-lingual setting, a model is trained on a source language and then transferred to other target languages through multi-lingual representations without additional training data. While prior studies show that pre-trained multilingual sequence-to-sequence (Seq2Seq) models can facilitate zero-shot transfer, there is little understanding of how to design the output template for the joint prediction tasks. In this paper, we examine three aspects of the output template: (1) label mapping, (2) task dependency, and (3) word order. Experiments on the MASSIVE dataset, consisting of 51 languages, show that our output template significantly improves the performance of pre-trained cross-lingual language models.
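To make the Seq2Seq framing concrete, the sketch below shows one way to linearize a joint intent-and-slot prediction into a single target string that a Seq2Seq model can generate, and to parse it back. The template format, function names, and example labels here are illustrative assumptions, not the exact template studied in the paper.

```python
# Hypothetical sketch of casting joint intent classification and slot filling
# as Seq2Seq generation: the target sequence encodes the intent label followed
# by each input token paired with its slot label.

def build_target(intent: str, tokens: list[str], slots: list[str]) -> str:
    """Linearize the intent and per-token slot labels into a generation target."""
    tagged = " ".join(f"{tok} [{slot}]" for tok, slot in zip(tokens, slots))
    return f"intent: {intent} ; slots: {tagged}"


def parse_target(target: str) -> tuple[str, list[str], list[str]]:
    """Recover the intent, tokens, and slot labels from a generated target string."""
    intent_part, slot_part = target.split(" ; slots: ")
    intent = intent_part.removeprefix("intent: ")
    tokens, slots = [], []
    for pair in slot_part.split("] "):
        pair = pair.rstrip("]")
        if not pair:
            continue
        tok, slot = pair.rsplit(" [", 1)
        tokens.append(tok)
        slots.append(slot)
    return intent, tokens, slots


if __name__ == "__main__":
    # Example utterance with a made-up intent and slot annotation.
    target = build_target(
        "alarm_set",
        ["wake", "me", "at", "seven", "am"],
        ["O", "O", "O", "time", "time"],
    )
    print(target)          # intent: alarm_set ; slots: wake [O] me [O] ...
    print(parse_target(target))
```

Design choices such as whether the intent precedes the slots (task dependency), how slot names are verbalized (label mapping), and whether the tokens follow the source-language or target-language order (word order) correspond to the three template aspects the abstract lists.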
Metadata
- Type: publication
- Year: 2023
- Publication date: 2023
- Authors: Fei Wang, Kuan-Hao Huang, Anoop Kumar, Aram Galstyan, Greg Ver Steeg, Kai-Wei Chang
- Link: https://www.amazon.science/publications/zero-shot-cross-lingual-sequence-tagging-as-seq2seq-generation-for-joint-intent-classification-and-slot-filling
- Resource link: https://www.amazon.science/publications/zero-shot-cross-lingual-sequence-tagging-as-seq2seq-generation-for-joint-intent-classification-and-slot-filling