Computational particle physics


Computational particle physics refers to the methods and computing tools developed in and used by particle physics research. Like computational chemistry or computational biology, it is both a specific branch of particle physics and an interdisciplinary field relying on computer science, theoretical and experimental particle physics, and mathematics.
The main fields of computational particle physics are lattice field theory, the automatic calculation of particle interactions or decays, and event generators.

Computing tools

Particle physics played a role in the early history of the internet; the World Wide Web was created by Tim Berners-Lee while working at CERN in 1991.

Computer Algebra

Note: This section contains an excerpt from 'Computer Algebra in Particle Physics' by Stefan Weinzierl
Particle physics is an important field of application for computer algebra and exploits the capabilities of computer algebra systems (CAS). This leads to valuable feedback for the development of CAS. The first computer algebra systems date back to the 1960s and were almost entirely based on LISP, an interpreted language designed, as its name indicates, for the manipulation of lists. Its importance for early symbolic programs has been compared to that of FORTRAN for numerical programs of the same period. Already in this first period, the program REDUCE had some special features for applications in high-energy physics. An exception to the LISP-based programs was SCHOONSHIP, written in assembler language by Martinus J. G. Veltman and designed specifically for applications in particle physics. The use of assembler code led to an extremely fast program and allowed the calculation of more complex scattering processes in high-energy physics. It has been claimed that the program's importance was recognized in 1999, when half of the Nobel Prize in Physics was awarded to Veltman. The program MACSYMA also deserves explicit mention, since it triggered important developments with regard to algorithms.
In the 1980s, new computer algebra systems began to be written in C, which allowed better exploitation of the computer's resources while maintaining portability. This period also marked the appearance of the first commercial computer algebra systems, among which Mathematica and Maple are the best-known examples. In addition, a few dedicated programs appeared; an example relevant to particle physics is FORM by J. Vermaseren, a successor to SCHOONSHIP. More recently, the maintainability of large projects has become increasingly important, and the overall programming paradigm has shifted from procedural programming to object-oriented design.
In terms of programming languages, this was reflected by a move from C to C++. Following this change of paradigm, the library GiNaC was developed, which allows symbolic calculations within C++.
Code generation for computer algebra can also be used in this area.
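As a minimal illustration of the idea, the sketch below translates a symbolic expression tree into executable numerical code. This is a deliberately simplified toy (the tuple-based expression format and the function names are invented for this example); real systems such as FORM or GiNaC handle vastly larger expressions and generate optimized Fortran or C++.

```python
# Toy sketch of code generation from a symbolic expression tree.
# Expressions are nested tuples: ('+', a, b), ('*', a, b), the
# variable ('x',), or plain numbers. (Illustrative format only.)

def emit(expr):
    """Recursively translate an expression tree into Python source."""
    if isinstance(expr, (int, float)):
        return repr(expr)
    if expr[0] == 'x':
        return 'x'
    op, a, b = expr
    return f"({emit(a)} {op} {emit(b)})"

def compile_expr(expr):
    """Generate the source of a numerical function f(x) and compile it."""
    src = f"def f(x):\n    return {emit(expr)}\n"
    namespace = {}
    exec(src, namespace)
    return namespace['f']

# Symbolic expression (x + 3) * x, turned into a callable function:
tree = ('*', ('+', ('x',), 3), ('x',))
f = compile_expr(tree)
print(f(2.0))  # (2 + 3) * 2 = 10.0
```

In a real CAS pipeline, the symbolic side simplifies the expression first, and the emitted code is compiled ahead of time for fast repeated numerical evaluation.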

Lattice field theory

Lattice field theory was created by Kenneth Wilson in 1974. Simulation techniques were later developed from statistical mechanics.
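The statistical-mechanics simulation techniques referred to above can be sketched with a toy Metropolis Monte Carlo update on a 2D Ising lattice. This is only an analogy chosen for brevity: real lattice QCD updates SU(3) gauge links on a four-dimensional lattice, not Ising spins, and the lattice size and coupling below are arbitrary illustrative choices.

```python
# Toy Metropolis Monte Carlo on a 2D Ising lattice, illustrating the
# statistical-mechanics simulation techniques adopted by lattice
# field theory (not actual lattice QCD).
import math
import random

L = 8           # lattice size (illustrative)
beta = 0.44     # inverse temperature, near the Ising critical coupling
random.seed(1)  # fixed seed for reproducibility
spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def neighbor_sum(i, j):
    """Sum of the four nearest neighbors, with periodic boundaries."""
    return (spins[(i + 1) % L][j] + spins[(i - 1) % L][j] +
            spins[i][(j + 1) % L] + spins[i][(j - 1) % L])

def sweep():
    """One Metropolis sweep: propose flipping each spin once."""
    for i in range(L):
        for j in range(L):
            dE = 2 * spins[i][j] * neighbor_sum(i, j)  # energy cost of a flip
            # Accept the flip if it lowers the energy, or with
            # probability exp(-beta * dE) otherwise.
            if dE <= 0 or random.random() < math.exp(-beta * dE):
                spins[i][j] *= -1

for _ in range(100):
    sweep()

m = abs(sum(sum(row) for row in spins)) / (L * L)  # |magnetization| per site
print(f"|magnetization| per site after 100 sweeps: {m:.3f}")
```

The same accept/reject structure, applied to gauge fields and combined with far more elaborate algorithms, is what drives the enormous computing demand of lattice QCD.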
Since the early 1980s, lattice QCD (LQCD) researchers have pioneered the use of massively parallel computers in large scientific applications, using virtually all available computing systems, including traditional mainframes, large PC clusters, and high-performance systems. Lattice QCD has also been used as a benchmark for high-performance computing, starting with the IBM Blue Gene supercomputer.
Eventually, national and regional QCD grids were created: LATFOR, UKQCD and USQCD. The International Lattice Data Grid (ILDG), formed in 2002, is an international venture comprising grids from the UK, the US, Australia, Japan and Germany.