Frugal Prompting for Dialog Models
The use of large language models (LLMs) in natural language processing (NLP) tasks is rapidly increasing, changing how researchers approach problems in the field. To fully utilize these models' abilities, a better understanding of their behavior under different input protocols is required. With LLMs, users can interact directly with the models through a text-based interface to define and solve various tasks. Hence, understanding the conversational abilities of these LLMs, which may not have been specifically trained for dialog modeling, is also important. This study examines different approaches to building dialog systems with LLMs by considering various aspects of the prompt. As part of prompt tuning, we experiment with various ways of providing instructions, exemplars, the current query, and additional context. The research also analyzes which representations of dialog history have the optimal usable-information density. Based on the findings, the paper suggests more compact ways of providing dialog history information while ensuring good performance and reducing the model's inference-API costs. The research contributes to a better understanding of how LLMs can be used effectively for building interactive systems.

In this video, I will talk about the following:

00:00:00 Huge inference costs of LLMs
00:01:57 Related Work
00:04:20 Prompt Ingredients for Dialog Systems
00:07:16 Manual versus Perplexity Prompts
00:10:55 Optimizing the Dialog History Input
00:13:24 Datasets: Multisession Chat (MSC) and Topical Chat (TC)
00:14:56 Models; Prompt Design; Metrics
00:18:54 Input Lengths
00:22:18 Absolute performance analysis
00:26:14 UID Results and Analysis
00:29:38 Varying a in UID

For more details, please see:
Paper: https://drive.google.com/file/d/1lIzXG9oOvOWFGl0wsjvxf-9IF_5t4bwa/view
Slides: https://docs.google.com/presentation/d/1CC2z-KQogw0LpoBYeta5hvQMDxIWYAtn/

Bishal Santra, Sakya Basak, Abhinandan De, Manish Gupta, Pawan Goyal.
Frugal Prompting for Dialog Models. The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023.
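To make the prompt ingredients concrete, here is a minimal sketch of how an instruction, exemplars, a compressed dialog history, and the current query might be assembled into a single prompt. The function names, the prompt layout, and the keep-the-last-k-turns truncation strategy are illustrative assumptions for this sketch, not the paper's exact method (the paper studies several history representations and picks the most cost-effective ones).

```python
def compress_history(history, max_turns=2):
    """Keep only the most recent turns of the dialog history.

    This is one simple stand-in for the paper's idea of a more
    compact history representation: shorter input means fewer
    tokens and lower inference-API cost.
    """
    return history[-max_turns:]


def build_prompt(instruction, exemplars, history, query, max_turns=2):
    """Assemble the four prompt ingredients discussed in the talk:
    instruction, exemplars, (compressed) dialog history, current query."""
    parts = [instruction]
    # Few-shot exemplars, each a small query/response pair.
    for ex in exemplars:
        parts.append(f"User: {ex['query']}\nBot: {ex['response']}")
    # Compressed dialog history instead of the full transcript.
    for speaker, turn in compress_history(history, max_turns):
        parts.append(f"{speaker}: {turn}")
    # Current query, ending with the generation cue for the model.
    parts.append(f"User: {query}\nBot:")
    return "\n\n".join(parts)


history = [
    ("User", "Hi! Do you like hiking?"),
    ("Bot", "I do! Mountain trails are my favorite."),
    ("User", "Any gear tips?"),
    ("Bot", "Good boots and plenty of water."),
]

prompt = build_prompt(
    instruction="You are a friendly chat companion. Reply in one sentence.",
    exemplars=[{"query": "Hello!", "response": "Hey there, how are you?"}],
    history=history,
    query="What trail should I try first?",
)
print(prompt)
```

With `max_turns=2`, the earliest two turns are dropped from the prompt, trading a little context for a shorter (cheaper) input; the frugal-prompting question is how aggressively one can compress the history before response quality degrades.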