List of important publications in theoretical computer science


This is a list of important publications in theoretical computer science, organized by field.
Reasons a particular publication might be regarded as important include creating a new topic, constituting a breakthrough, or significantly influencing the field.

[Computability]

''Cutland's ''Computability: An Introduction to Recursive Function Theory'' (Cambridge)''

The review of this early text by Carl Smith of Purdue University reports that this is a text with an "appropriate blend of intuition and rigor… in the exposition of proofs" that presents "the fundamental results of classical recursion theory ... in a style... accessible to undergraduates with minimal mathematical background". While he states that it "would make an excellent introductory text" for a course for mathematics students, he suggests that an "instructor must be prepared to substantially augment the material… " when it is used with computer science students.

''Decidability of second order theories and automata on infinite trees''

Description: The paper presented the tree automaton, an extension of finite automata to infinite trees. Tree automata have had numerous applications in proofs of correctness of programs.

''Finite automata and their decision problems''

Description: Mathematical treatment of automata, proof of core properties, and definition of non-deterministic finite automaton.
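The nondeterministic model can be illustrated by a small simulation that tracks the set of currently reachable states; the example machine below is hypothetical and for illustration only, not taken from the paper.

```python
def nfa_accepts(transitions, start, accepting, word):
    """transitions: dict mapping (state, symbol) -> set of successor states."""
    current = {start}
    for symbol in word:
        # follow every possible transition from every current state
        current = set().union(*(transitions.get((q, symbol), set()) for q in current))
        if not current:
            return False
    return bool(current & accepting)

# hypothetical NFA accepting binary strings that end in "01"
nfa = {
    ("q0", "0"): {"q0", "q1"},
    ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}
```

The same state-set idea underlies the subset construction that converts any NFA into an equivalent deterministic automaton.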

''Introduction to Automata Theory, Languages, and Computation''

Description: A popular textbook.

''On certain formal properties of grammars''

Description: This article introduced what is now known as the Chomsky hierarchy, a containment hierarchy of classes of formal grammars that generate formal languages.

''On computable numbers, with an application to the Entscheidungsproblem''

Description: This article set the limits of computer science. It defined the Turing machine, a model for all computations.
It also proved the undecidability of the halting problem and of the Entscheidungsproblem, and in doing so identified the limits of possible computation.

''Rekursive Funktionen''

The first textbook on the theory of recursive functions. The book went through many editions and earned Péter the Kossuth Prize from the Hungarian government. Reviews by Raphael M. Robinson and Stephen Kleene praised the book for providing an effective elementary introduction for students.

''Representation of Events in Nerve Nets and Finite Automata''

Description: this paper introduced finite automata, regular expressions, and regular languages, and established their connection.

[Computational complexity theory]

''Arora & Barak's ''Computational Complexity'' and Goldreich's ''Computational Complexity'' (both Cambridge)''

Besides the estimable press bringing these recent texts forward, they are very positively reviewed in ACM's SIGACT News by Daniel Apon of the University of Arkansas, who identifies them as "textbooks for a course in complexity theory, aimed at early graduate… or... advanced undergraduate students… numerous, unique strengths and very few weaknesses".
The reviewer notes "a definite attempt to include very up-to-date material, while Goldreich focuses more on developing a contextual and historical foundation for each concept presented," and "applaud[s] all… authors for their outstanding contributions."

''A machine-independent theory of the complexity of recursive functions''

Description: The Blum axioms.

''Algebraic methods for interactive proof systems''

Description: This paper showed that PH is contained in IP.

''The complexity of theorem proving procedures''

Description: This paper introduced the concept of NP-Completeness and proved that the Boolean satisfiability problem is NP-Complete. Note that similar ideas were developed independently, slightly later, by Leonid Levin in "Levin, Universal Search Problems. Problemy Peredachi Informatsii 9:265–266, 1973".

''Computers and Intractability: A Guide to the Theory of NP-Completeness''

Description: The main importance of this book is its extensive list of more than 300 NP-Complete problems, which became a common reference and source of definitions. Remarkably, such an extensive list was compiled only a few years after the concept was first defined.

''Degree of difficulty of computing a function and a partial ordering of recursive sets''

Description: This technical report was the first publication on what was later renamed computational complexity.

''How good is the simplex method?''

Description: Constructed the "Klee–Minty cube" in dimension D, whose 2^D corners are each visited by Dantzig's simplex algorithm for linear optimization.

''How to construct random functions''

Description: This paper showed that the existence of one-way functions leads to computational randomness.

''IP = PSPACE''

Description: IP is a complexity class whose characterization is quite different from the usual time/space-bounded computational classes. In this paper, Shamir extended the technique of the previous paper by Lund et al. to show that PSPACE is contained in IP, and hence IP = PSPACE, so that each problem in one complexity class is solvable in the other.

''Reducibility among combinatorial problems''

Description: This paper showed that 21 different problems are NP-Complete and showed the importance of the concept.

''The Knowledge Complexity of Interactive Proof Systems''

Description: This paper introduced the concept of zero knowledge.

''A letter from Gödel to von Neumann''

Description: Gödel discusses the idea of efficient universal theorem prover.

''On the computational complexity of algorithms''

Description: This paper named the field of computational complexity and provided its seminal results.

''Paths, trees, and flowers''

Description: Gave a polynomial-time algorithm for finding a maximum matching in a graph that is not bipartite, and took another step toward the idea of computational complexity.

''Theory and applications of trapdoor functions''

Description: This paper created a theoretical framework for trapdoor functions and described some of their applications, such as in cryptography. Note that the concept of trapdoor functions was introduced in "New directions in cryptography" six years earlier.

''Computational Complexity''

Description: An introduction to computational complexity theory; the book explains its author's characterization of PSPACE and other results.

''Interactive proofs and the hardness of approximating cliques''

''Probabilistic checking of proofs: a new characterization of NP''

''Proof verification and the hardness of approximation problems''

Description: These three papers established the surprising fact that certain problems in NP remain hard even when only an approximative solution is required. See PCP theorem.

''The Intrinsic Computational Difficulty of Functions''

Description: First definition of the complexity class P. One of the founding papers of complexity theory.

[Algorithms]

"A machine program for theorem proving"

Description: The DPLL algorithm. The basic algorithm for SAT and other NP-Complete problems.
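The core of the DPLL procedure, unit propagation plus case splitting, can be sketched as follows (pure-literal elimination and all modern heuristics are omitted; the clause encoding is an illustrative choice, not the paper's notation):

```python
def dpll(clauses):
    """clauses: iterable of clauses, each a set of nonzero ints
    (a negative literal denotes a negated variable). Returns satisfiability."""
    clauses = [set(c) for c in clauses]
    if any(not c for c in clauses):
        return False                     # empty clause: conflict
    unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
    if unit is not None:                 # unit propagation
        return dpll([c - {-unit} for c in clauses if unit not in c])
    if not clauses:
        return True                      # every clause satisfied
    lit = next(iter(clauses[0]))         # split on an unassigned literal
    return dpll(clauses + [{lit}]) or dpll(clauses + [{-lit}])
```

For example, `dpll([{1, 2}, {-1}, {-2, 3}])` reports satisfiable, while `dpll([{1}, {-1}])` reports unsatisfiable.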

"A machine-oriented logic based on the resolution principle"

Description: First description of resolution and unification used in automated theorem proving; used in Prolog and logic programming.
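The propositional core of resolution can be sketched by saturating a clause set with resolvents until the empty clause appears; the full first-order rule with unification, which is the paper's contribution, is omitted from this toy version.

```python
def resolve(c1, c2):
    # yield every resolvent of the two clauses on a complementary literal pair
    for lit in c1:
        if -lit in c2:
            yield frozenset((c1 - {lit}) | (c2 - {-lit}))

def unsatisfiable(clauses):
    """clauses: iterable of clauses over nonzero int literals."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:
                        return True      # empty clause derived: refutation
                    new.add(r)
        if new <= clauses:
            return False                 # saturated without contradiction
        clauses |= new
```

Saturation is refutation-complete for propositional logic, although exponentially slower than modern SAT solvers.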

"The traveling-salesman problem and minimum spanning trees"

Description: The use of an algorithm for minimum spanning tree as an approximation algorithm for the NP-Complete travelling salesman problem. Approximation algorithms became a common method for coping with NP-Complete problems.
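The MST-based 2-approximation for metric TSP can be sketched directly: build a minimum spanning tree, then shortcut a walk of it by visiting the vertices in preorder. This is a minimal illustration assuming a symmetric distance matrix obeying the triangle inequality, not the paper's own presentation.

```python
def mst_tsp_tour(dist):
    """dist: symmetric distance matrix satisfying the triangle inequality."""
    n = len(dist)
    # Prim's algorithm for the minimum spanning tree rooted at vertex 0
    in_tree = [False] * n
    parent = [0] * n
    best = [float("inf")] * n
    best[0] = 0
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        for v in range(n):
            if not in_tree[v] and dist[u][v] < best[v]:
                best[v], parent[v] = dist[u][v], u
    children = {v: [] for v in range(n)}
    for v in range(1, n):
        children[parent[v]].append(v)
    # preorder walk of the tree; shortcutting gives a tour of cost <= 2 * MST
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    return tour
```

Since the optimal tour costs at least the MST weight, the shortcut tour is within a factor of two of optimal.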

"A polynomial algorithm in linear programming"

Description: For a long time there was no provably polynomial-time algorithm for the linear programming problem. Khachiyan was the first to provide such an algorithm. Later, Narendra Karmarkar presented a faster one in: Narendra Karmarkar, "A new polynomial time algorithm for linear programming", Combinatorica, vol 4, no. 4, p. 373–395, 1984.

"Probabilistic algorithm for testing primality"

Description: The paper presented the Miller–Rabin primality test and outlined the program of randomized algorithms.
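The test itself is short enough to sketch: write n − 1 as d·2^s with d odd, then probe random bases; a composite n passes a random base with probability at most 1/4, so repeated rounds drive the error down geometrically. (The round count below is an illustrative choice.)

```python
import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d, s = n - 1, 0
    while d % 2 == 0:          # factor n-1 as d * 2^s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1) if n > 4 else 2
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False        # a is a witness of compositeness
    return True                 # probably prime
```

A "probably prime" answer can be wrong with probability at most 4^(−rounds); a "composite" answer is always correct.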

"Optimization by simulated annealing"

Description: This article described simulated annealing which is now a very common heuristic for NP-Complete problems.
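The heuristic is easy to sketch: always accept improving moves, and accept worsening moves with probability e^(−Δ/t) under a falling temperature t. The cooling schedule, neighbour move, and toy objective below are illustrative choices, not taken from the article.

```python
import math
import random

def anneal(cost, state, neighbour, t0=1.0, cooling=0.995, steps=5000):
    best = current = state
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        delta = cost(candidate) - cost(current)
        # accept improvements always, worsenings with probability e^(-delta/t)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= cooling               # geometric cooling schedule
    return best

random.seed(0)                     # reproducible demo run
# minimize the toy objective f(x) = (x - 3)^2 starting from x = 0
sol = anneal(lambda x: (x - 3) ** 2, 0.0,
             lambda x: x + random.uniform(-0.5, 0.5))
```

Early on, high temperature lets the search escape local minima; as t falls, it behaves increasingly like greedy descent.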

''The Art of Computer Programming''

Description: This monograph has four volumes covering popular algorithms. The algorithms are written in both English and MIX assembly language. This makes algorithms both understandable and precise. However, the use of a low-level programming language frustrates some programmers more familiar with modern structured programming languages.

''Algorithms + Data Structures = Programs''

Description: An early, influential book on algorithms and data structures, with implementations in Pascal.

''The Design and Analysis of Computer Algorithms''

Description: One of the standard texts on algorithms for the period of approximately 1975–1985.

''How to Solve It By Computer''

Description: Explains the whys of algorithms and data structures: the creative process, the line of reasoning, and the design factors behind innovative solutions.

''Algorithms''

Description: A very popular text on algorithms in the late 1980s. It was more accessible and readable than Aho, Hopcroft, and Ullman. There are more recent editions.

''Introduction to Algorithms''

Description: This textbook has become so popular that it is almost the de facto standard for teaching basic algorithms. The 1st edition was published in 1990, the 2nd edition in 2001, and the 3rd in 2009.

[Algorithmic information theory]

"On Tables of Random Numbers"

Description: Proposed a computational and combinatorial approach to probability.

"A formal theory of inductive inference"

Description: This was the beginning of algorithmic information theory and Kolmogorov complexity. Note that although Kolmogorov complexity is named after Andrey Kolmogorov, he himself said that the seeds of the idea are due to Ray Solomonoff. Kolmogorov contributed much to this area, but in later articles.

"Algorithmic information theory"

Description: An introduction to algorithmic information theory by one of the important people in the area.

[Information theory]

"A mathematical theory of communication"

Description: This paper created the field of information theory.

"Error detecting and error correcting codes"

Description: In this paper, Hamming introduced the idea of error-correcting code. He created the Hamming code and the Hamming distance and developed methods for code optimality proofs.
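Both ideas fit in a few lines: the Hamming distance counts differing positions, and the classic Hamming(7,4) code corrects any single-bit error via a syndrome that spells out the error position. This is a standard textbook presentation, not the paper's own notation.

```python
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def encode(d):                    # d: 4 data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4             # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4             # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4             # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):                   # fix at most one flipped bit
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # binary position of the error; 0 = none
    if syndrome:
        c[syndrome - 1] ^= 1
    return c
```

Because every pair of codewords differs in at least three positions, a single flipped bit still lies closest to the original codeword.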

"A method for the construction of minimum redundancy codes"

Description: The Huffman coding.
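Huffman's greedy construction repeatedly merges the two least frequent subtrees; each symbol's code length is its depth in the final tree. A minimal sketch (the tie-breaking counter is an implementation convenience, not part of the algorithm):

```python
import heapq

def huffman_codes(freqs):
    """freqs: dict mapping symbol -> weight. Returns symbol -> bit string."""
    # heap entries: (weight, tiebreak, {symbol: code-so-far})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # two lightest subtrees
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]
```

The resulting code is prefix-free and minimizes the expected code length among all symbol-by-symbol binary codes.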

"A universal algorithm for sequential data compression"

Description: The LZ77 compression algorithm.
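The idea can be sketched as a toy compressor that emits (offset, length, next character) triples by greedily finding the longest match in a sliding window; real LZ77 implementations differ in window size, encoding, and match search, so treat this purely as an illustration.

```python
def lz77_compress(data, window=255):
    out, i = [], 0
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):   # candidate match starts
            length = 0
            while (i + length < len(data) - 1
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

def lz77_decompress(triples):
    data = ""
    for off, length, ch in triples:
        for _ in range(length):      # copy byte-by-byte so overlaps work
            data += data[-off]
        data += ch
    return data
```

Copying one character at a time in the decompressor is what allows a match to overlap its own output, which is how runs like "aaaa…" compress to a single triple.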

''Elements of Information Theory''

Description: A popular introduction to information theory.

[Formal verification]

Assigning Meanings to Programs

Description: Robert Floyd's landmark paper Assigning Meanings to Programs introduces the method of inductive assertions and describes how a program annotated with first-order assertions may be shown to satisfy a pre- and post-condition specification. The paper also introduces the concepts of loop invariant and verification condition.
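The flavour of the method can be conveyed with a toy annotated loop, where runtime assertions stand in for the formal assertions (this is an illustration of the idea, not Floyd's notation):

```python
def sum_to(n):
    assert n >= 0                            # precondition
    i, total = 0, 0
    while i < n:
        assert total == i * (i + 1) // 2     # loop invariant, checked each pass
        i += 1
        total += i
    assert total == n * (n + 1) // 2         # postcondition
    return total
```

Proving that the invariant holds initially, is preserved by each iteration, and implies the postcondition at loop exit is exactly the verification-condition discipline the paper introduces.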

An Axiomatic Basis for Computer Programming

Description: Tony Hoare's paper An Axiomatic Basis for Computer Programming describes a set of inference rules for fragments of an Algol-like programming language described in terms of Hoare-triples.

Guarded Commands, Nondeterminacy and Formal Derivation of Programs

Description: Edsger Dijkstra's paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs proposes that, instead of formally verifying a program after it has been written, programs and their formal proofs should be developed hand-in-hand, a method known as program refinement, or sometimes "correctness-by-construction".

''Proving Assertions about Parallel Programs''

Description: The paper that introduced invariance proofs of concurrent programs.

''An Axiomatic Proof Technique for Parallel Programs I''

Description: In this paper, along with the same authors' paper "Verifying Properties of Parallel Programs: An Axiomatic Approach. Commun. ACM 19: 279–285", the axiomatic approach to verification of parallel programs was presented.

''A Discipline of Programming''

Description: Edsger Dijkstra's classic postgraduate-level textbook A Discipline of Programming extends his earlier paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs and firmly establishes the principle of formally deriving programs from their specification.

''Denotational Semantics''

Description: Joe Stoy's Denotational Semantics is the first book-length exposition of the mathematical approach to the formal semantics of programming languages.

The Temporal Logic of Programs

Description: The use of temporal logic was suggested as a method for formal verification.

''Characterizing correctness properties of parallel programs using fixpoints (1980)''

Description: Model checking was introduced as a procedure to check correctness of concurrent programs.

''Communicating Sequential Processes (1978)''

Description: Tony Hoare's communicating sequential processes paper introduces the idea of concurrent processes that do not share variables but instead cooperate solely by exchanging synchronous messages.
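The style can be roughly approximated in Python with threads that share no variables and interact only through a channel; a bounded queue stands in for CSP's synchronous rendezvous, so this is only a loose illustration of the model.

```python
import queue
import threading

def producer(ch):
    for x in range(3):
        ch.put(x)          # roughly "ch ! x" in CSP notation
    ch.put(None)           # end-of-stream marker

def consumer(ch, out):
    while True:
        x = ch.get()       # roughly "ch ? x"
        if x is None:
            break
        out.append(x * x)

ch = queue.Queue(maxsize=1)    # capacity 1 approximates a rendezvous
out = []
t = threading.Thread(target=producer, args=(ch,))
t.start()
consumer(ch, out)
t.join()
```

All cooperation happens through the channel, so neither process can observe or corrupt the other's state, which is the discipline the paper proposes.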

''A Calculus of Communicating Systems''

Description: Robin Milner's A Calculus of Communicating Systems paper describes a process algebra permitting systems of concurrent processes to be reasoned about formally, something which had not been possible for earlier models of concurrency.

''Software Development: A Rigorous Approach''

Description: Cliff Jones' textbook Software Development: A Rigorous Approach is the first full-length exposition of the Vienna Development Method, which had evolved at IBM's Vienna research lab over the previous decade. It combines the idea of program refinement as per Dijkstra with that of data refinement, whereby algebraically defined abstract data types are formally transformed into progressively more "concrete" representations.

''The Science of Programming''

Description: David Gries' textbook The Science of Programming describes Dijkstra's weakest-precondition method of formal program derivation, but in a much more accessible manner than Dijkstra's earlier A Discipline of Programming.
It shows how to construct programs that work correctly. It does this by showing how to use precondition and postcondition predicate expressions and program proving techniques to guide the way programs are created.
The examples in the book are all small-scale, and clearly academic. They emphasize basic algorithms, such as sorting and merging, and string manipulation. Subroutines are included, but object-oriented and functional programming environments are not addressed.

''Communicating Sequential Processes (1985)''

Description: Tony Hoare's Communicating Sequential Processes textbook presents an updated CSP model in which cooperating processes do not even have program variables and which, like CCS, permits systems of processes to be reasoned about formally.

''Linear logic (1987)''

Description: Girard's linear logic was a breakthrough in designing typing systems for sequential and concurrent computation, especially for resource conscious typing systems.

''A Calculus of Mobile Processes (1989)''

Description: This paper introduces the Pi-Calculus, a generalisation of CCS which allows process mobility. The calculus is extremely simple and has become the dominant paradigm in the theoretical study of programming languages, typing systems and program logics.

''Communication and Concurrency''

Description: Robin Milner's textbook Communication and Concurrency is a more accessible, although still technically advanced, exposition of his earlier CCS work.

''a Practical Theory of Programming''

Description: The up-to-date version of predicative programming and the basis for C.A.R. Hoare's UTP (Unifying Theories of Programming). One of the simplest and most comprehensive formal methods.