Automata are abstract mathematical models of machines that perform computations on an input by moving through a series of states or configurations. If the computation of an automaton reaches an accepting configuration, the automaton accepts that input. At each stage of the computation, a transition function determines the next configuration on the basis of a finite portion of the present configuration.

Turing machines are the most general automata. They consist of a finite set of states and an infinite tape that initially contains the input and on which symbols are read and written during the computation. Since Turing machines can leave symbols on their tape at the end of the computation, they can be viewed as computing functions: the partial recursive functions. Despite the simplicity of these automata, any algorithm that can be implemented on a computer can be modeled by some Turing machine.
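The state-and-tape model above can be sketched in a few lines of code. The following is a minimal, illustrative simulator (the function and state names are our own, not standard notation): the transition function maps a (state, symbol) pair to a new state, a symbol to write, and a head movement, exactly as described.

```python
# A minimal single-tape Turing machine simulator (illustrative sketch).
# The tape is a dict from position to symbol; '_' stands for the blank symbol.
def run_tm(transitions, tape_input, start, accept):
    """Run the machine until it reaches the accepting state or gets stuck."""
    tape = dict(enumerate(tape_input))
    state, head = start, 0
    while state != accept:
        symbol = tape.get(head, '_')
        if (state, symbol) not in transitions:
            return None, None            # halted without accepting
        state, write, move = transitions[(state, symbol)]
        tape[head] = write               # write a symbol to the tape
        head += 1 if move == 'R' else -1 # move the head one cell
    cells = sorted(tape)                 # read off the final tape contents
    return state, ''.join(tape[i] for i in cells).strip('_')

# Example machine: scan right, flipping 0s and 1s, and accept at the first blank.
flip = {
    ('scan', '0'): ('scan', '1', 'R'),
    ('scan', '1'): ('scan', '0', 'R'),
    ('scan', '_'): ('done', '_', 'R'),
}
state, output = run_tm(flip, '0110', 'scan', 'done')
```

Because the machine leaves symbols on its tape, it computes a function of its input; here the example machine computes the bitwise complement of a binary string.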

Turing machines are used to characterize the complexity of problems. The complexity of a problem is determined by the efficiency of the best algorithm that solves it. Measures of an algorithm's efficiency are the amount of time or space that a Turing machine requires to implement the algorithm. A computation's time is the number of configurations involved in that computation, and its space is the number of tape positions used.
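These two measures are easy to make concrete: count one unit of time per configuration (i.e., per step), and one unit of space per distinct tape cell visited. The sketch below (an illustrative instrumented simulation, not a standard library) does exactly that for a machine that simply scans right over its input.

```python
# Measuring time and space for one Turing machine computation (a sketch).
# This machine scans right over its input unchanged and halts at the first blank.
transitions = {('q', '0'): ('q', '0', 'R'),
               ('q', '1'): ('q', '1', 'R'),
               ('q', '_'): ('halt', '_', 'R')}
tape = dict(enumerate('10110'))
state, head = 'q', 0
time_used, visited = 0, set()
while state != 'halt':
    visited.add(head)                    # record every tape cell touched
    symbol = tape.get(head, '_')
    state, tape[head], move = transitions[(state, symbol)]
    head += 1 if move == 'R' else -1
    time_used += 1                       # one configuration per step
space_used = len(visited)
```

On an input of length n this machine passes through n + 1 configurations and uses n + 1 tape cells, so here (n = 5) both measures come out to 6.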

Automata theory has close ties to formal language theory, since there is a correspondence between certain families of automata and classes of languages generated by grammar formalisms. A language is accepted by an automaton when the automaton accepts all of the strings in the language and no others. The family of Turing machines accepts exactly the class of languages produced by Type 0 grammars. Turing machines whose space is bounded by the space occupied by the input (linear-bounded automata) accept the class of languages generated by context-sensitive (Type 1) grammars. The most restricted family is that of finite automata, which consist only of a finite number of states and a "read-only" tape containing the input to be read in one direction. Finite automata recognize the class of languages generated by regular (Type 3) grammars. These automata can be given a limited amount of extra power with the addition of certain forms of storage. For example, pushdown automata involve a pushdown store: a sequence in which symbols can be added and removed from only one end, with the effect that the first symbols in are the last ones out. Pushdown automata accept the languages generated by context-free (Type 2) grammars.
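The pushdown store is what lets these automata recognize context-free languages that no finite automaton can. A standard example is { aⁿbⁿ : n ≥ 1 }, which requires unbounded counting. The sketch below (function name and state labels are our own) models the store with a Python list, pushing a marker for each a and popping one for each b:

```python
# A deterministic pushdown automaton sketch for the context-free language
# { a^n b^n : n >= 1 }.  A Python list models the pushdown store:
# append() pushes onto the top, pop() removes from the top (last in, first out).
def accepts_anbn(string):
    stack = []
    state = 'push'                       # phase 1: reading a's
    for ch in string:
        if state == 'push' and ch == 'a':
            stack.append('A')            # push one marker per 'a'
        elif stack and ch == 'b':
            state = 'pop'                # phase 2: reading b's
            stack.pop()                  # pop one marker per 'b'
        else:
            return False                 # e.g. an 'a' after a 'b', or extra b's
    return state == 'pop' and not stack  # accept iff every 'a' was matched
```

A finite automaton cannot do this, since its finitely many states cannot remember an arbitrarily large count of a's; the unbounded store supplies exactly that missing memory.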

Automata theory gave rise to the notion of deterministic computation, and hence of deterministic languages. In a deterministic computation each configuration of the machine has only one possible successor. For some families of automata (e.g., finite automata and Turing machines) deterministic and nondeterministic automata are equivalent. For others (e.g., pushdown automata) there are languages that can be accepted by a nondeterministic automaton of that family but cannot be accepted by any deterministic automaton of that family.
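For finite automata, the equivalence of the two notions can be seen directly: a nondeterministic machine can be simulated deterministically by tracking the *set* of all states it could currently be in (the subset construction, performed on the fly). The sketch below (state names and the particular machine are illustrative) simulates a standard nondeterministic finite automaton that accepts strings over {0,1} whose second-to-last symbol is 1, a machine that "guesses" when the end of the input is near:

```python
# Simulating a nondeterministic finite automaton deterministically by
# tracking the set of states it could be in after each symbol.
# This NFA accepts strings over {0,1} whose second-to-last symbol is 1.
nfa = {
    ('q0', '0'): {'q0'},
    ('q0', '1'): {'q0', 'q1'},   # nondeterministic choice: stay, or guess
    ('q1', '0'): {'q2'},         # one more symbol after the guessed 1
    ('q1', '1'): {'q2'},
}

def accepts(string, start='q0', accepting=frozenset({'q2'})):
    current = {start}            # all states reachable so far
    for ch in string:
        current = set().union(*(nfa.get((s, ch), set()) for s in current))
    return bool(current & accepting)  # accept if any run reaches q2
```

Since the input alphabet and state set are finite, only finitely many such subsets exist, so the simulation is itself a (larger) deterministic finite automaton; for pushdown automata no analogous construction exists, which is why some context-free languages, such as the even-length palindromes, require nondeterminism.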
