Lulu Zhao
  • Doctor of Engineering
  • Beijing University of Posts and Telecommunications

About

  • Publications: 12
  • Reads: 3,050
  • Citations: 168
  • Current institution: Beijing University of Posts and Telecommunications

Publications (12)
Preprint
Existing controllable dialogue generation work focuses on single-attribute control and lacks generalization capability to out-of-distribution multiple-attribute combinations. In this paper, we explore compositional generalization for multi-attribute controllable dialogue generation, where a model can learn from seen attribute values and gene...
Conference Paper
Full-text available
Existing controllable dialogue generation work focuses on single-attribute control and lacks generalization capability to out-of-distribution multiple-attribute combinations. In this paper, we explore compositional generalization for multi-attribute controllable dialogue generation, where a model can learn from seen attribute values and gene...
Conference Paper
Full-text available
Traditional dialogue summarization models rely on a large-scale manually labeled corpus and lack the ability to generalize to new domains, so domain adaptation from a labeled source domain to an unlabeled target domain is important in practical summarization scenarios. However, existing domain adaptation approaches in dialogue summarization generally requ...
Conference Paper
Full-text available
The most advanced abstractive dialogue summarizers lack generalization ability on new domains, and existing research on domain adaptation in summarization generally relies on large-scale pre-training. To explore lightweight fine-tuning methods for domain adaptation of dialogue summarization, in this paper we propose an efficient and gener...
Preprint
Full-text available
The most advanced abstractive dialogue summarizers lack generalization ability on new domains, and existing research on domain adaptation in summarization generally relies on large-scale pre-training. To explore lightweight fine-tuning methods for domain adaptation of dialogue summarization, in this paper we propose an efficient and gener...
Article
Currently, sequence/graph-to-sequence models for abstractive dialogue summarization are being studied extensively. However, previous methods struggle to integrate complex events spanning multiple utterances, and the generated summaries are often filled with incorrect facts. In this study, we first utilize the speaker-aware structure to model the info...
Preprint
Full-text available
Previous dialogue summarization datasets mainly focus on open-domain chitchat dialogues, while summarization datasets for the widely used task-oriented dialogues have not yet been explored. Automatically summarizing such task-oriented dialogues can help a business collect and review needs to improve its service. In addition, previous datasets pay more a...
Conference Paper
Full-text available
Abstractive dialogue summarization suffers from many factual errors, which are caused by salient elements being scattered throughout the multi-speaker information interaction process. In this work, we design a heterogeneous semantic slot graph with slot-level mask cross-attention to enhance the slot features for more factually correct summarization. We also propose a...
Article
Relation extraction has been an active research area in the field of Natural Language Processing (NLP). Past works primarily focused on corpora of formal text, which are inherently non-dialogic. Recently, the dialogue-based relation extraction task, which detects relations among speaker-aware entities scattered across dialogues, has been gradua...
Conference Paper
Full-text available
Recently, increasing attention has been paid to the abstractive dialogue summarization task. Since information is exchanged between at least two interlocutors and key elements about a certain event often span multiple utterances, it is necessary for researchers to explore the inherent relations and structure...
Article
Relation classification is an important semantic processing task in the field of Natural Language Processing (NLP). Past works mainly focused on binary relations within a single sentence. Recently, cross-sentence N-ary relation classification, which detects relations among n entities across multiple sentences, has been attracting growing interest...
