Publication
EMNLP 2023
Conference paper

DiSTRICT: Dialogue State Tracking with Retriever Driven In-Context Tuning

Abstract

Dialogue State Tracking (DST), a key component of task-oriented conversation systems, represents user intentions by determining the values of pre-defined slots. Existing approaches use hand-crafted templates and additional slot information to prompt large pre-trained language models and elicit slot values from the dialogue history. This requires significant manual effort and domain knowledge to design effective prompts, and limits generalizability to new domains and tasks. In this work, we propose DiSTRICT, a generalizable in-context tuning approach for DST that retrieves highly relevant training examples for a given dialogue to create prompts in an automated manner, without the need for human input. Additionally, our approach matches or exceeds the performance of prior work while using a much smaller model, providing an important advantage for real-world deployments that often have limited resource availability. Experiments on the MultiWOZ benchmark datasets show that DiSTRICT outperforms existing approaches in various zero-shot and few-shot settings.
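The core idea described in the abstract, retrieving relevant training dialogues to build prompts automatically instead of relying on hand-crafted templates, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: TF-IDF cosine similarity stands in for DiSTRICT's trained retriever, and the example dialogues, slot names, and helper functions are hypothetical.

```python
# Minimal sketch of retrieval-driven in-context prompt construction for DST.
# NOT the DiSTRICT implementation: TF-IDF similarity is a stand-in for the
# paper's retriever, and all dialogues/slot names below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical labeled training examples: (dialogue context, slot, value).
train_examples = [
    ("i need a cheap hotel in the north", "hotel-pricerange", "cheap"),
    ("book a table at an italian restaurant for two", "restaurant-food", "italian"),
    ("i want a taxi to the station by 5 pm", "taxi-destination", "station"),
]

def retrieve(test_dialogue, k=2):
    """Return the k training examples most similar to the test dialogue."""
    vectorizer = TfidfVectorizer()
    train_matrix = vectorizer.fit_transform(ctx for ctx, _, _ in train_examples)
    test_vector = vectorizer.transform([test_dialogue])
    similarities = cosine_similarity(test_vector, train_matrix).ravel()
    top_indices = similarities.argsort()[::-1][:k]
    return [train_examples[i] for i in top_indices]

def build_prompt(test_dialogue, slot):
    """Concatenate retrieved examples into an in-context prompt automatically."""
    lines = [
        f"dialogue: {ctx} | slot: {s} | value: {v}"
        for ctx, s, v in retrieve(test_dialogue)
    ]
    lines.append(f"dialogue: {test_dialogue} | slot: {slot} | value:")
    return "\n".join(lines)

# The resulting prompt would be fed to a (smaller) pre-trained language model,
# which generates the slot value for the test dialogue.
print(build_prompt("find me an inexpensive hotel near the centre", "hotel-pricerange"))
```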
