A Linear-time Simulation of Deterministic $d$-Limited Automata
Abstract: A $d$-limited automaton is a Turing machine that uses only the tape cells containing the input word (and the end-markers) and may rewrite the contents of a cell only during the first $d$ visits to it. This model was introduced by T. Hibbard in 1967, who showed that $d$-limited automata recognize exactly the context-free languages for each $d \geq 2$. He also proved that the languages recognizable by deterministic $d$-limited automata form a hierarchy, and Pighizzini and Pisoni later showed that this hierarchy begins (at $d=2$) with the deterministic context-free languages (DCFLs). As is well known, DCFLs are widely used in practice, especially in compilers, since they are recognizable in linear time and have a corresponding subclass of context-free grammars (the LR$(1)$ grammars). In this paper we present a linear-time recognition algorithm for deterministic $d$-limited automata (in the RAM model), which opens an opportunity for their possible practical applications. We also generalize this algorithm to deterministic $d(n)$-limited automata: the extension of deterministic $d$-limited automata in which $d$ is not a constant but a function of the input length $n$.
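To make the model concrete, here is a minimal sketch (not from the paper, and not its linear-time algorithm) of a direct step-by-step simulator for a deterministic $d$-limited automaton, together with a hypothetical $2$-limited automaton for $\{a^n b^n : n \geq 0\}$, a classic DCFL. The simulator tracks how many times each cell has been visited and rejects any transition that tries to rewrite a cell after its first $d$ visits; all state names, tape symbols, and the loop guard are illustrative assumptions.

```python
def run_limited_automaton(delta, start, accept, d, w):
    """Directly simulate a deterministic d-limited automaton.

    delta maps (state, symbol) -> (symbol_to_write, move, next_state),
    with move in {-1, +1}. A missing transition rejects. A rewrite
    (written symbol differs from the one read) is only legal during
    one of the first d visits to that cell.
    """
    tape = ['<'] + list(w) + ['>']      # input between end-markers
    visits = [0] * len(tape)
    state, pos, steps = start, 0, 0
    while state != accept:
        steps += 1
        if steps > 4 * d * len(tape) ** 2:  # crude guard against loops
            return False
        visits[pos] += 1
        key = (state, tape[pos])
        if key not in delta:
            return False                    # undefined move: reject
        write, move, state = delta[key]
        if write != tape[pos] and visits[pos] > d:
            raise ValueError("illegal rewrite on visit %d" % visits[pos])
        tape[pos] = write
        pos += move
    return True

# A 2-limited automaton for a^n b^n (illustrative construction):
# q0 sweeps right to the first unmarked b and rewrites it (1st visit);
# q1 sweeps left to the nearest unmarked a and rewrites it (2nd visit);
# q2 verifies, at the end of the input, that no unmarked a remains.
R, L = +1, -1
delta = {
    ('q0', '<'): ('<', R, 'q0'),
    ('q0', 'a'): ('a', R, 'q0'),
    ('q0', 'X'): ('X', R, 'q0'),
    ('q0', 'Y'): ('Y', R, 'q0'),
    ('q0', 'b'): ('Y', L, 'q1'),  # mark b on its first visit
    ('q1', 'Y'): ('Y', L, 'q1'),
    ('q1', 'X'): ('X', L, 'q1'),
    ('q1', 'a'): ('X', R, 'q0'),  # mark the matching a on its second visit
    ('q0', '>'): ('>', L, 'q2'),  # input exhausted: verify
    ('q2', 'Y'): ('Y', L, 'q2'),
    ('q2', 'X'): ('X', L, 'q2'),
    ('q2', '<'): ('<', R, 'acc'),
}
```

Note that cells may be visited arbitrarily often in read-only mode; the $d$-limit constrains only rewrites, which is what the simulator's `visits` check enforces.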