Grammar-Forced Translation of Natural Language to Temporal Logic using LLMs

Published in ICML, 2025

excerpt: ‘In this paper, we propose a framework for NL to TL translation called Grammar Forced Translation (GraFT). The framework is based on the observation that previous work solves both the grounding and translation steps by letting a language model iteratively predict tokens from its full vocabulary. GraFT reduces the complexity of both tasks by restricting the set of valid output tokens from the full vocabulary to only a handful at each step. Compared with state-of-the-art translation approaches, GraFT improves end-to-end translation accuracy by 5.49% and out-of-domain translation accuracy by 14.06% on average.’
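The core idea — masking the model's next-token scores so only grammar-valid tokens can be emitted — can be illustrated with a toy sketch. This is not the paper's implementation: the vocabulary, the `valid_next` successor function, and the stand-in scoring function are all invented for illustration; a real system would derive the valid-token set from the temporal-logic grammar's parser state and use an actual LLM's logits.

```python
# Toy sketch of grammar-forced decoding (illustrative only; all names
# here are assumptions, not GraFT's actual implementation).
import math

VOCAB = ["G", "F", "(", ")", "p", "q", "&", "EOS"]

def valid_next(prefix):
    """Toy successor function: which tokens may follow the prefix.
    A real system would derive this from the TL grammar."""
    if not prefix or prefix[-1] in {"G", "F", "(", "&"}:
        return {"G", "F", "(", "p", "q"}   # expect a (sub)formula next
    return {"&", ")", "EOS"}               # after an atom or ')'

def fake_logits(prefix):
    """Stand-in for an LLM's next-token scores."""
    return {tok: -(abs(hash((tuple(prefix), tok))) % 10) for tok in VOCAB}

def decode(max_len=6):
    prefix = []
    for _ in range(max_len):
        scores = fake_logits(prefix)
        allowed = valid_next(prefix)
        # Grammar forcing: mask every token the grammar forbids, so the
        # argmax is taken over only the handful of valid tokens.
        masked = {t: (s if t in allowed else -math.inf)
                  for t, s in scores.items()}
        tok = max(masked, key=masked.get)
        if tok == "EOS":
            break
        prefix.append(tok)
    return prefix

print(" ".join(decode()))
```

By construction, every emitted token is drawn from the grammar-valid set, so the decoded string is well-formed regardless of how the underlying scores are distributed — which is the property the excerpt attributes to restricting the output vocabulary at each step.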

Recommended citation: William English, Dominic Simon, Sumit Jha, and Rickard Ewetz, “Grammar-Forced Translation of Natural Language to Temporal Logic using LLMs”, International Conference on Machine Learning (ICML), 2025. https://icml.cc/virtual/2025/poster/44027