Ask, and it shall be given: Turing completeness of prompting
In this work, we show that prompting is in fact Turing-complete: there exists a finite-size Transformer such that for any computable function, there exists a corresponding prompt following which the Transformer computes that function. Furthermore, we show that even though we use only a single finite-size Transformer, it can still achieve nearly the same complexity bounds as those of the class of all unbounded-size Transformers.
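The quantifier order in this claim (one fixed model, then a prompt per function) can be illustrated with a loose programming analogy, not the paper's actual construction: a single fixed interpreter computes different functions depending on the "prompt" it receives, just as a universal machine runs whatever program it is fed. The `fixed_interpreter` function below is a hypothetical sketch for intuition only.

```python
# Analogy only (not the paper's Transformer construction): one fixed
# interpreter; the prompt encodes which function to compute. This mirrors
# the theorem's quantifier order -- a single finite-size machine is fixed
# first, and a suitable prompt exists for each computable function.

def fixed_interpreter(prompt: str, x: int) -> int:
    """A single fixed function; the prompt selects the function computed."""
    program = compile(prompt, "<prompt>", "eval")
    return eval(program, {"x": x})

# Different prompts make the same fixed interpreter compute different functions.
square = lambda x: fixed_interpreter("x * x", x)
successor = lambda x: fixed_interpreter("x + 1", x)

print(square(7))     # the prompt "x * x" yields squaring
print(successor(7))  # the prompt "x + 1" yields the successor function
```

The point of the analogy is that universality resides in the fixed interpreter plus a variable program, which is the same structural shape as a fixed Transformer plus a variable prompt.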
In this work, we have shown that prompting is in fact Turing-complete: there exists a finite-size Transformer such that for any computable function, there exists a corresponding prompt following which the Transformer computes that function. Furthermore, we have shown that even though we use only a single finite-size Transformer, it can still achieve nearly the same complexity bounds as those of the class of all unbounded-size Transformers. Overall, our result reveals that prompting can enable a single finite-size Transformer to be efficiently universal, which establishes a theoretical underpinning for prompt engineering in practice. Our current work focuses on expressive power rather than learnability.
While we have shown the existence of a Transformer on which prompting is Turing-complete, this does not necessarily imply that a Transformer effectively learns to simulate arbitrary 2-PTMs through CoT steps. Investigating the learnability of such a Transformer is an intriguing direction for future research.