Profile-guided optimization


Profile-guided optimization (PGO), also known as profile-directed feedback or feedback-directed optimization, is a compiler optimization technique in computer programming that uses profiling to improve program runtime performance.

Method

Optimization techniques based on analysis of the source code alone rely on general assumptions about likely improvements. They are often applied without regard to whether a given section of code will actually be executed frequently, beyond recognizing that code within looping statements is worth extra attention.
The first high-level compiler, introduced as the Fortran Automatic Coding System in 1957, broke the code into blocks and devised a table of how frequently each block would be executed. The frequencies were obtained by a simulated execution of the code in a Monte Carlo fashion, in which the outcome of each conditional transfer was determined by a random number generator weighted according to whatever FREQUENCY statements the programmer had provided.
Rather than relying on programmer-supplied frequency information, profile-guided optimization uses the results of profiling test runs of an instrumented program to optimize the final generated code. The compiler uses profile data from a sample run of the program across a representative input set. The results indicate which areas of the program are executed more frequently and which are executed less frequently. Optimizations benefit from profile-guided feedback because they rely less on heuristics when making compilation decisions. The caveat, however, is that the sample of data fed to the program during the profiling stage must be statistically representative of the typical usage scenarios; otherwise, profile-guided feedback has the potential to harm the overall performance of the final build instead of improving it.
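A minimal sketch of this dual-compilation workflow, assuming GCC or Clang as the compiler; the program, file names and training input below are hypothetical, while the -fprofile-generate and -fprofile-use options named in the comments are the instrumented-PGO options those compilers provide.

/* hot_path.c — a toy program whose branch behavior a profile can reveal.
 *
 * Instrumented build:  gcc -O2 -fprofile-generate hot_path.c -o hot_path
 * Training run:        ./hot_path < representative_input.txt   (writes profile data, e.g. .gcda files)
 * Optimized rebuild:   gcc -O2 -fprofile-use hot_path.c -o hot_path
 *
 * With the profile, the compiler can see that process_common() dominates the
 * run, and can lay out code, inline, and predict branches accordingly.
 */
#include <stdio.h>

static long process_common(long x) { return x * 31 + 7; }   /* hot path  */
static long process_rare(long x)   { return x / 3 - 11; }   /* cold path */

int main(void)
{
    long value, sum = 0;
    /* In a representative run almost every input is non-negative, so the
     * profile marks the first branch as overwhelmingly likely. */
    while (scanf("%ld", &value) == 1) {
        if (value >= 0)
            sum += process_common(value);
        else
            sum += process_rare(value);
    }
    printf("%ld\n", sum);
    return 0;
}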
Just-in-time compilation can make use of runtime information to dynamically recompile parts of the executed code and generate more efficient native code. If the dynamic profile changes during execution, the runtime can deoptimize the previously generated native code and generate new code optimized with the information from the new profile.

Adoption

There is support for building Firefox using PGO. Even though PGO is effective, it has not been widely adopted by software projects, due to its tedious dual-compilation model. It is also possible to perform PGO without instrumentation by collecting a profile using hardware performance counters. This sampling-based approach has much lower overhead and does not require a special instrumented build.
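A hedged sketch of such a sampling-based workflow on Linux, reusing the hypothetical hot_path.c above; perf, the AutoFDO/llvm-profgen conversion step and the exact file names are assumptions about one typical toolchain, not the only possible setup.

/* Sampling-based PGO for hot_path.c, with no instrumented build:
 *
 *   1. Build normally, with debug info:   clang -O2 -g hot_path.c -o hot_path
 *   2. Sample a real run with hardware
 *      performance counters:              perf record -b ./hot_path < typical_input.txt
 *   3. Convert perf.data into a profile the compiler accepts, e.g. with the
 *      AutoFDO or llvm-profgen tools (exact invocation depends on the toolchain).
 *   4. Rebuild using the sampled profile: clang -O2 -fprofile-sample-use=hot_path.prof hot_path.c -o hot_path
 *
 * Because step 2 only reads hardware counters, the training run executes at
 * near-native speed, unlike a run of an instrumented binary.
 */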
The HotSpot Java virtual machine uses profile-guided optimization to dynamically generate native code. As a consequence, a software binary is optimized for the actual load it is receiving. If the load changes, adaptive optimization can dynamically recompile the running software to optimize it for the new load. This means that all software executed on the HotSpot JVM effectively makes use of profile-guided optimization.
PGO has been adopted in the Windows version of Google Chrome. PGO was enabled in the 64-bit edition of Chrome starting with version 53, and in the 32-bit edition starting with version 54.

Implementations

Examples of compilers that implement PGO are: