Publications

LLM-driven instruction following: Progresses and concerns

Abstract

The progress of natural language processing (NLP) is primarily driven by machine learning that optimizes a system on a large-scale set of task-specific labeled examples. This learning paradigm limits machines' ability to match human capabilities on new tasks, since humans can often solve an unseen task from just a couple of examples accompanied by task instructions. In addition, we may not have the chance to prepare large volumes of task-specific examples for new tasks, because we cannot foresee which task will need to be addressed next or how costly it will be to annotate. Task instructions therefore serve as a novel and promising source of supervision. This tutorial targets researchers and practitioners who are interested in AI and ML technologies for NLP generalization in low-shot scenarios. In particular, we will present a diverse thread of instruction-driven NLP studies that try to answer the following questions: (i) What is a task instruction? (ii) How are datasets created and systems evaluated? (iii) How can task instructions be encoded? (iv) When and why do some instructions work better? (v) What concerns remain in LLM-driven instruction following? We will discuss several lines of frontier research that tackle these challenges and conclude the tutorial by outlining directions for further investigation.

Metadata

publication
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts, 2023
year
2023
publication date
2023/12
authors
Wenpeng Yin, Qinyuan Ye, Pengfei Liu, Xiang Ren, Hinrich Schütze
link
https://aclanthology.org/2023.emnlp-tutorial.4/
resource_link
https://aclanthology.org/2023.emnlp-tutorial.4.pdf
conference
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts
pages
19-25