December 2022

1.Prefix prompting
Controllable Natural Language Generation with Contrastive Prefixes
https://arxiv.org/pdf/2202.13257.pdf

2.Response Generation with Context-Aware Prompt Learning
https://arxiv.org/pdf/2111.02643.pdf

3.Planning with Learned Entity Prompts for Abstractive Summarization
https://arxiv.org/pdf/2104.07606.pdf

4.The Power of Scale for Parameter-Efficient Prompt Tuning (a minimal soft-prompt sketch follows the references below)
https://arxiv.org/pdf/2104.08691.pdf

5.XDAI: A Tuning-free Framework for Exploiting Pre-trained Language Models in Knowledge Grounded Dialogue Generation
http://keg.cs.tsinghua.edu.cn/jietang/publications/KDD22-Yu-et-al-XDAI.pdf
References:
1.https://cloud.tencent.com/developer/article/2117074
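
The common thread in papers 1-4 above is prefix/prompt tuning: the pre-trained language model stays frozen, and only a small set of continuous prompt vectors prepended to the input is learned. The sketch below is not taken from any of these papers; it is a minimal illustration assuming HuggingFace Transformers and PyTorch, and the model name ("gpt2"), prompt length, and learning rate are placeholder assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # assumption: stands in for whatever pre-trained LM is actually used
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Freeze every pre-trained weight; only the soft prompt below is trained.
for p in model.parameters():
    p.requires_grad = False

n_prompt_tokens = 20  # illustrative prompt length
embedding_layer = model.get_input_embeddings()
# Learnable prompt vectors, initialized from real token embeddings.
init_ids = torch.randint(0, tokenizer.vocab_size, (n_prompt_tokens,))
soft_prompt = nn.Parameter(embedding_layer(init_ids).detach().clone())

def forward_with_prompt(input_ids):
    """Prepend the soft prompt to the token embeddings and run the frozen LM."""
    tok_embeds = embedding_layer(input_ids)                               # (B, T, D)
    prompt = soft_prompt.unsqueeze(0).expand(input_ids.size(0), -1, -1)   # (B, P, D)
    inputs_embeds = torch.cat([prompt, tok_embeds], dim=1)                # (B, P+T, D)
    return model(inputs_embeds=inputs_embeds)

# Only `soft_prompt` receives gradients during training.
batch = tokenizer(["the patient reports a mild fever"], return_tensors="pt")
out = forward_with_prompt(batch["input_ids"])
print(out.logits.shape)  # (1, n_prompt_tokens + seq_len, vocab_size)
optimizer = torch.optim.Adam([soft_prompt], lr=1e-3)
```

Initializing the prompt from existing vocabulary embeddings rather than random noise is one of the choices discussed in the prompt-tuning paper above; everything else here is simply the generic recipe, not a reproduction of any specific method.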

I. Abstract

In order to make full use of the medical knowledge graph in a medical dialogue system and to solve the problems of missing effective medical knowledge and uncontrollable medical dialogue generation, we propose a sentiment