Lexically Constrained Decoding of Transformers (2025.02-2025.03) [paper] [code] [slides]

In a class project, we adapted Grid Beam Search (GBS), a constrained decoding algorithm, to impose lexical constraints on transformer language models. GBS was originally applied to recurrent sequence-to-sequence neural machine translation models. We built a pipeline that generates structured outputs from a prompt and a set of lexical constraints, and that supports any pretrained autoregressive language model; we used GPT-2. We then fine-tuned GPT-2 on a corpus of Chekhov's stories. In our subjective evaluation, GBS with the fine-tuned GPT-2 produced more interesting and meaningful domain-specific results than GBS with the base GPT-2.
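To illustrate the core idea, here is a minimal sketch of Grid Beam Search for the simplified case of single-token constraints, using a toy language model in place of GPT-2. The `score_fn` interface and the `toy_lm` vocabulary are illustrative assumptions, not the project's actual code: hypotheses are organized into "banks" by how many constraints they have satisfied, and each step either generates freely (staying in the same bank) or emits an unmet constraint token (advancing one bank).

```python
import math

def grid_beam_search(score_fn, constraints, max_len=6, beam=3, eos="</s>"):
    """Simplified Grid Beam Search (Hokamp & Liu, 2017) for single-token
    lexical constraints. Bank c holds hypotheses that have satisfied c
    constraints. score_fn maps a token prefix to a dict of next-token
    probabilities (a stand-in for an autoregressive LM such as GPT-2)."""
    C = len(constraints)
    # Each hypothesis: (log_prob, tokens, set of satisfied constraint indices)
    banks = {0: [(0.0, [], frozenset())]}
    for _ in range(max_len):
        new_banks = {c: [] for c in range(C + 1)}
        for c, hyps in banks.items():
            for lp, toks, met in hyps:
                if toks and toks[-1] == eos:   # finished hypothesis: carry over
                    new_banks[c].append((lp, toks, met))
                    continue
                probs = score_fn(toks)
                # Free generation: stay in the same bank.
                for tok, p in probs.items():
                    new_banks[c].append((lp + math.log(p), toks + [tok], met))
                # Constraint generation: force an unmet constraint token,
                # advancing the hypothesis to the next bank.
                for i, ctok in enumerate(constraints):
                    if i not in met:
                        p = probs.get(ctok, 1e-10)
                        new_banks[c + 1].append(
                            (lp + math.log(p), toks + [ctok], met | {i}))
        # Prune each bank independently to the beam width.
        banks = {c: sorted(h, key=lambda x: -x[0])[:beam]
                 for c, h in new_banks.items() if h}
    # Return the best hypothesis that satisfies all constraints.
    return max(banks[C], key=lambda x: x[0])[1]

def toy_lm(tokens):
    # Toy "LM": uniform distribution over a tiny vocabulary (assumption).
    vocab = ["the", "cat", "sat", "on", "mat", "</s>"]
    return {t: 1.0 / len(vocab) for t in vocab}

out = grid_beam_search(toy_lm, constraints=["cat", "mat"])
```

Because pruning is done per bank, partially constrained hypotheses are never crowded out by unconstrained ones, which is what guarantees every constraint appears in the final output; in the full project the toy model above is replaced by GPT-2 next-token probabilities.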