NLP Highlights

126 - Optimizing Continuous Prompts for Generation, with Lisa Li
Update: 2021-05-24

Description

We invited Lisa Li to talk about her recent work, Prefix-Tuning: Optimizing Continuous Prompts for Generation. Prefix tuning is a lightweight alternative to finetuning: it tunes only a fixed-length, task-specific continuous vector (the prefix) while keeping the pretrained transformer's parameters frozen. We discussed how prefix tuning compares with finetuning and other lightweight alternatives on two generation tasks across various experimental settings, and in which scenarios prefix tuning is preferable.
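For readers who want a concrete picture of the idea, here is a minimal PyTorch sketch. Note that this is an illustration, not Li's actual implementation: the paper prepends prefix activations at every transformer layer and reparameterizes the prefix through an MLP for training stability, whereas the sketch below prepends a learned prefix only at the embedding layer. The `lm`, `prefix_len`, and `hidden_dim` names are hypothetical.

```python
import torch
import torch.nn as nn

class PrefixTuning(nn.Module):
    """Sketch: wrap a frozen language model with a trainable continuous prefix."""

    def __init__(self, lm: nn.Module, prefix_len: int = 10, hidden_dim: int = 768):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        # The only trainable parameters: a fixed-length continuous prefix.
        self.prefix = nn.Parameter(torch.randn(prefix_len, hidden_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, hidden_dim)
        prefix = self.prefix.unsqueeze(0).expand(input_embeds.size(0), -1, -1)
        # Prepend the prefix as "virtual tokens" in front of the real input.
        return self.lm(torch.cat([prefix, input_embeds], dim=1))
```

During training only `self.prefix` receives gradients, so each task needs to store just a small prefix rather than a full copy of the model, which is the efficiency argument discussed in the episode.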

Lisa is a PhD student at Stanford University. Lisa's webpage: https://xiangli1999.github.io/

The hosts for this episode are Pradeep Dasigi and Ana Marasović.

Allen Institute for Artificial Intelligence