Presented at NLP4DH 2025 @ NAACL
Previously popular language modeling techniques (n-gram models, Bengio-style neural models) have re-entered the spotlight of current research. Recent projects have both updated them with now-ubiquitous modeling strategies and used them to supplement their ubiquitous cousin, the pretrained LLM. I employ the latter approach to transfer extreme versions of linguistic style at generation time. This approach yields an efficient and accurate means of transfer that does not rely on (potentially meager) in-weights knowledge or (potentially brittle) prompting.
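The abstract does not specify the combination mechanism, but a common way to let a small style model supplement an LLM at generation time is to interpolate their next-token distributions. Below is a minimal, hypothetical sketch under that assumption: a toy add-alpha-smoothed bigram model stands in for the style model, and a placeholder distribution stands in for the LLM; the names (`train_bigram`, `interpolate`, `lam`) are illustrative, not from the paper.

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count next-token occurrences for each preceding token."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def bigram_dist(counts, prev, vocab, alpha=0.1):
    """Add-alpha smoothed next-token distribution over `vocab`."""
    c = counts.get(prev, Counter())
    total = sum(c.values()) + alpha * len(vocab)
    return {w: (c[w] + alpha) / total for w in vocab}

def interpolate(p_lm, p_style, lam=0.5):
    """Linear interpolation of LLM and style-model distributions.

    `lam` controls how strongly the style model steers generation.
    """
    return {w: (1 - lam) * p_lm[w] + lam * p_style.get(w, 0.0) for w in p_lm}

# Toy "stylistic" corpus; a placeholder uniform distribution plays the LLM.
tokens = "the cat sat on the mat".split()
vocab = sorted(set(tokens))
counts = train_bigram(tokens)
p_style = bigram_dist(counts, "the", vocab)
p_lm = {w: 1.0 / len(vocab) for w in vocab}  # stand-in for LLM logits->probs
mix = interpolate(p_lm, p_style, lam=0.5)
```

In practice the interpolation would run inside the decoding loop over the LLM's softmax output (e.g. via a logits processor), with the style model trained on a small corpus exhibiting the target style.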