Apparent magnitude


Apparent magnitude is a measure of the brightness of a star or other astronomical object observed from the Earth. An object's apparent magnitude depends on its intrinsic luminosity, its distance from Earth, and any extinction of the object's light caused by interstellar dust along the line of sight to the observer.
The magnitude scale is reverse logarithmic: the brighter an object is, the lower its magnitude. An object that is measured to be 5 magnitudes higher than another object is 100 times dimmer. Consequently, a difference of 1.0 in magnitude corresponds to a brightness ratio of the fifth root of 100, or about 2.512. For example, a star of magnitude 2.0 is 2.512 times brighter than a star of magnitude 3.0 and is 100 times brighter than one of magnitude 7.0. The brightest astronomical objects have negative apparent magnitudes: for example, Venus at −4.2 or Sirius at −1.46. The faintest stars visible with the naked eye on the darkest night have apparent magnitudes of about +6.5, though this varies depending on a person's eyesight and with altitude and atmospheric conditions. The apparent magnitudes of known objects range from the Sun at −26.7 to objects in deep Hubble Space Telescope images of around magnitude +30.
The measurement of apparent magnitude is called photometry. Photometric measurements are made in the ultraviolet, visible, or infrared wavelength bands using standard passband filters belonging to photometric systems such as the UBV system or the Strömgren uvbyβ system.
Absolute magnitude is a measure of the intrinsic luminosity of a celestial object rather than its apparent brightness and is expressed on the same reverse logarithmic scale. Absolute magnitude is defined as the apparent magnitude that a star or object would have if it were observed from a distance of 10 parsecs (32.6 light-years). When referring to just "magnitude", apparent magnitude rather than absolute magnitude is normally intended.

History

Visible to typical human eye | Apparent magnitude | Brightness relative to Vega | Number of stars brighter than apparent magnitude in the night sky
Yes | −1.0 | 251%   | 1
Yes |  0.0 | 100%   | 4
Yes |  1.0 | 40%    | 15
Yes |  2.0 | 16%    | 48
Yes |  3.0 | 6.3%   | 171
Yes |  4.0 | 2.5%   | 513
Yes |  5.0 | 1.0%   |
Yes |  6.0 | 0.4%   |
Yes |  6.5 | 0.25%  |
No  |  7.0 | 0.16%  |
No  |  8.0 | 0.063% |
No  |  9.0 | 0.025% |
No  | 10.0 | 0.010% |

The scale used to indicate magnitude originates in the Hellenistic practice of dividing stars visible to the naked eye into six magnitudes. The brightest stars in the night sky were said to be of first magnitude, whereas the faintest were of sixth magnitude, which is the limit of human visual perception. Each grade of magnitude was considered twice the brightness of the following grade, although that ratio was subjective as no photodetectors existed. This rather crude scale for the brightness of stars was popularized by Ptolemy in his Almagest and is generally believed to have originated with Hipparchus. This cannot be proved or disproved because Hipparchus's original star catalogue is lost. The only preserved text by Hipparchus himself clearly documents that he did not have a system to describe brightnesses with numbers: he always uses terms like "big" or "small", "bright" or "faint", or even descriptions like "visible at full moon".
In 1856, Norman Robert Pogson formalized the system by defining a first-magnitude star as a star that is 100 times as bright as a sixth-magnitude star, thereby establishing the logarithmic scale still in use today. This implies that a star of magnitude m is about 2.512 times as bright as a star of magnitude m + 1. This figure, the fifth root of 100, became known as Pogson's Ratio. The zero point of Pogson's scale was originally defined by assigning Polaris a magnitude of exactly 2. Astronomers later discovered that Polaris is slightly variable, so they switched to Vega as the standard reference star, assigning the brightness of Vega as the definition of zero magnitude at any specified wavelength.
Apart from small corrections, the brightness of Vega still serves as the definition of zero magnitude for visible and near-infrared wavelengths, where its spectral energy distribution closely approximates that of a black body for a temperature of about 11,000 K. However, with the advent of infrared astronomy it was revealed that Vega's radiation includes an infrared excess, presumably due to a circumstellar disk consisting of dust at warm temperatures. At shorter wavelengths, there is negligible emission from dust at these temperatures. However, in order to properly extend the magnitude scale further into the infrared, this peculiarity of Vega should not affect the definition of the magnitude scale. Therefore, the magnitude scale was extrapolated to all wavelengths on the basis of the black-body radiation curve for an ideal stellar surface at that temperature, uncontaminated by circumstellar radiation. On this basis the spectral irradiance for the zero magnitude point, as a function of wavelength, can be computed. Small deviations are specified between systems using measurement apparatuses developed independently, so that data obtained by different astronomers can be properly compared. Of greater practical importance, however, is that magnitude is defined not at a single wavelength but over the response of the standard spectral filters used in photometry in various wavelength bands.
Telescope aperture (mm) | Limiting magnitude
35  | 11.3
60  | 12.3
102 | 13.3
152 | 14.1
203 | 14.7
305 | 15.4
406 | 15.7
508 | 16.4

With the modern magnitude systems, brightness over a very wide range is specified according to the logarithmic definition detailed below, using this zero reference. In practice such apparent magnitudes do not exceed 30. The brightness of Vega is exceeded by four stars in the night sky at visible wavelengths as well as the bright planets Venus, Mars, and Jupiter, and these must be described by negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has a magnitude of −1.4 in the visible. Negative magnitudes for other very bright astronomical objects can be found in the table below.
Astronomers have developed other photometric zeropoint systems as alternatives to the Vega system. The most widely used is the AB magnitude system, in which photometric zeropoints are based on a hypothetical reference spectrum having constant flux per unit frequency interval, rather than using a stellar spectrum or blackbody curve as the reference. The AB magnitude zeropoint is defined such that an object's AB and Vega-based magnitudes will be approximately equal in the V filter band.
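As an illustration of the conversion (a minimal sketch, assuming the conventional AB zero-point flux density of 3631 Jy, i.e. an object with a constant flux density of 3631 Jy has m_AB = 0 in every band):

```python
import math

def ab_magnitude_from_jansky(flux_jy: float) -> float:
    """AB magnitude for a flux density given in janskys.

    Assumes the conventional AB zero point of 3631 Jy, so a source with
    constant f_nu = 3631 Jy has m_AB = 0 in any band.
    """
    return -2.5 * math.log10(flux_jy / 3631.0)

# Example: a 1 mJy source has m_AB of roughly 16.4 in any band.
print(ab_magnitude_from_jansky(1e-3))
```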

Measurement

Precision measurement of magnitude requires calibration of the photographic or electronic detection apparatus. This generally involves contemporaneous observation, under identical conditions, of standard stars whose magnitude using that spectral filter is accurately known. Moreover, as the amount of light actually received by a telescope is reduced due to transmission through the Earth's atmosphere, the airmasses of the target and calibration stars must be taken into account. Typically one would observe a few different stars of known magnitude which are sufficiently similar. Calibrator stars close in the sky to the target are favoured. If those stars have somewhat different zenith angles then a correction factor as a function of airmass can be derived and applied to the airmass at the target's position. Such calibration obtains the brightnesses as would be observed from above the atmosphere, where apparent magnitude is defined.
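A minimal sketch of such a calibration, assuming a simple linear extinction model in which the observed magnitude increases by k magnitudes per unit airmass; the function names and the straight-line fit are illustrative rather than a standard pipeline:

```python
import numpy as np

def fit_extinction(instrumental_mags, catalog_mags, airmasses):
    """Fit a zero point and extinction coefficient from calibrator stars.

    Assumes the simple model  m_instr - m_catalog = zp + k * X,
    where X is the airmass and k is the extinction coefficient
    (magnitudes per airmass).  Returns (zp, k).
    """
    diffs = np.asarray(instrumental_mags, float) - np.asarray(catalog_mags, float)
    X = np.asarray(airmasses, float)
    # Least-squares fit of the straight line  diffs = zp + k * X
    A = np.vstack([np.ones_like(X), X]).T
    (zp, k), *_ = np.linalg.lstsq(A, diffs, rcond=None)
    return zp, k

def calibrate(instrumental_mag, airmass, zp, k):
    """Apparent magnitude as it would be observed above the atmosphere."""
    return instrumental_mag - zp - k * airmass
```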

Calculations

The dimmer an object appears, the higher the numerical value given to its magnitude, with a difference of 5 magnitudes corresponding to a brightness factor of exactly 100. Therefore, the magnitude $m_x$, in the spectral band $x$, would be given by
$$m_x = -5 \log_{100}\left(\frac{F_x}{F_{x,0}}\right),$$
which is more commonly expressed in terms of common logarithms as
$$m_x = -2.5 \log_{10}\left(\frac{F_x}{F_{x,0}}\right),$$
where $F_x$ is the observed flux density using spectral filter $x$, and $F_{x,0}$ is the reference flux (zero point) for that photometric filter. Since an increase of 5 magnitudes corresponds to a decrease in brightness by a factor of exactly 100, each magnitude increase implies a decrease in brightness by the factor $\sqrt[5]{100} \approx 2.512$. Inverting the above formula, a magnitude difference $m_1 - m_2 = \Delta m$ implies a brightness factor of
$$\frac{F_2}{F_1} = 100^{\Delta m/5} = 10^{0.4\,\Delta m} \approx 2.512^{\Delta m}.$$
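These relations translate directly into code; a short sketch (the flux units are arbitrary, as long as the observed and reference fluxes are in the same units):

```python
import math

def magnitude(flux, reference_flux):
    """Apparent magnitude m_x = -2.5 log10(F_x / F_x0) in a given band."""
    return -2.5 * math.log10(flux / reference_flux)

def brightness_factor(delta_mag):
    """Brightness ratio F2/F1 implied by a magnitude difference m1 - m2."""
    return 10 ** (0.4 * delta_mag)

# One magnitude step is Pogson's ratio, the fifth root of 100:
print(brightness_factor(1.0))    # ~2.512
# The Sun (-26.74) versus the full Moon (-12.74), a 14-magnitude difference:
print(brightness_factor(14.0))   # ~398,000
```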

Example: Sun and Moon

What is the ratio in brightness between the Sun and the full Moon?
The apparent magnitude of the Sun is −26.74, and the mean magnitude of the full moon is −12.74.
Difference in magnitude:
$$\Delta m = m_{\text{Moon}} - m_{\text{Sun}} = -12.74 - (-26.74) = 14.00$$
Brightness factor:
$$\frac{F_{\text{Sun}}}{F_{\text{Moon}}} = 10^{0.4\,\Delta m} = 10^{5.6} \approx 398\,000$$
The Sun appears about 400,000 times brighter than the full Moon.

Magnitude addition

Sometimes one might wish to add brightnesses. For example, photometry on closely separated double stars may only be able to produce a measurement of their combined light output. How would we reckon the combined magnitude of that double star knowing only the magnitudes of the individual components? This can be done by adding the brightnesses corresponding to each magnitude:
$$10^{-0.4 m_f} = 10^{-0.4 m_1} + 10^{-0.4 m_2}.$$
Solving for $m_f$ yields
$$m_f = -2.5 \log_{10}\!\left(10^{-0.4 m_1} + 10^{-0.4 m_2}\right),$$
where $m_f$ is the resulting magnitude after adding the brightnesses referred to by $m_1$ and $m_2$.
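The same relation written as a small helper, generalised here to any number of components (a straightforward extension of the two-star formula above):

```python
import math

def combined_magnitude(*mags):
    """Magnitude of the summed light of sources with the given magnitudes."""
    total_flux = sum(10 ** (-0.4 * m) for m in mags)
    return -2.5 * math.log10(total_flux)

# Two equal components of magnitude 5.0 combine to about 4.25:
# doubling the flux brightens the result by 2.5*log10(2), about 0.75 mag.
print(combined_magnitude(5.0, 5.0))
```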

Apparent bolometric magnitude

While magnitude generally refers to a measurement in a particular filter band corresponding to some range of wavelengths, the apparent or absolute bolometric magnitude is a measure of an object's apparent or absolute brightness integrated over all wavelengths of the electromagnetic spectrum. The zero point of the apparent bolometric magnitude scale is based on the definition that an apparent bolometric magnitude of 0 mag is equivalent to a received irradiance of 2.518×10⁻⁸ watts per square metre (W/m²).
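Given that zero point, conversion between apparent bolometric magnitude and received irradiance is a one-line calculation; a minimal sketch:

```python
import math

# Zero point: m_bol = 0 corresponds to 2.518e-8 W/m^2, as stated above.
F0_BOL = 2.518e-8  # W m^-2

def irradiance_from_bolometric_magnitude(m_bol: float) -> float:
    """Received irradiance in W/m^2 for an apparent bolometric magnitude."""
    return F0_BOL * 10 ** (-0.4 * m_bol)

def bolometric_magnitude_from_irradiance(f: float) -> float:
    """Apparent bolometric magnitude for an irradiance f in W/m^2."""
    return -2.5 * math.log10(f / F0_BOL)

# The Sun's total irradiance at Earth (~1361 W/m^2) gives m_bol of about -26.8.
print(bolometric_magnitude_from_irradiance(1361.0))
```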

Absolute magnitude

While apparent magnitude is a measure of the brightness of an object as seen by a particular observer, absolute magnitude is a measure of the intrinsic brightness of an object. Flux decreases with distance according to an inverse-square law, so the apparent magnitude of a star depends on both its absolute brightness and its distance. For example, a star at one distance will have the same apparent magnitude as a star four times brighter at twice that distance. In contrast, the intrinsic brightness of an astronomical object does not depend on the distance of the observer or any extinction.
The absolute magnitude, M, of a star or astronomical object is defined as the apparent magnitude it would have as seen from a distance of 10 parsecs (32.6 light-years). The absolute magnitude of the Sun is 4.83 in the V band and 5.48 in the B band.
In the case of a planet or asteroid, the absolute magnitude rather means the apparent magnitude it would have if it were one astronomical unit (AU) from both the observer and the Sun, and fully illuminated.
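For a star, this definition is equivalent to the distance modulus m − M = 5 log₁₀(d / 10 pc); a minimal sketch, ignoring extinction:

```python
import math

def apparent_from_absolute(absolute_mag: float, distance_pc: float) -> float:
    """Apparent magnitude of a star of given absolute magnitude at a
    distance in parsecs, ignoring interstellar extinction."""
    return absolute_mag + 5 * math.log10(distance_pc / 10.0)

def absolute_from_apparent(apparent_mag: float, distance_pc: float) -> float:
    """Absolute magnitude from apparent magnitude and distance in parsecs."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# The Sun (V-band absolute magnitude 4.83) seen from 10 pc would appear at 4.83;
# each tenfold increase in distance adds 5 to the apparent magnitude.
print(apparent_from_absolute(4.83, 10.0))   # 4.83
print(apparent_from_absolute(4.83, 100.0))  # 9.83
```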

Standard reference values

The magnitude scale is a reverse logarithmic scale. A common misconception is that the logarithmic nature of the scale is because the human eye itself has a logarithmic response. In Pogson's time this was thought to be true, but it is now believed that the response is a power law.
Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U, B and V. The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the human eye. When an apparent magnitude is discussed without further qualification, the V magnitude is generally understood.
Because cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, because they emit extremely little visible light, but are strongest in infrared.
Measures of magnitude need cautious treatment, and it is extremely important to measure like with like. On early 20th century and older orthochromatic photographic film, the relative brightnesses of the blue supergiant Rigel and the red irregular variable supergiant Betelgeuse are reversed compared to what human eyes perceive, because this archaic film is more sensitive to blue light than to red light. Magnitudes obtained from this method are known as photographic magnitudes, and are now considered obsolete.
For objects within the Milky Way with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object. For objects at very great distances, this relationship must be adjusted for redshifts and for non-Euclidean distance measures due to general relativity.
For planets and other Solar System bodies, the apparent magnitude is derived from the body's phase curve and its distances from the Sun and the observer.
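As a rough illustration of how such a calculation is organised, the sketch below uses a simple linear phase-coefficient approximation rather than a body's actual published phase curve; the default coefficient of 0.04 mag/deg is only a typical asteroid value, not a constant of the system:

```python
import math

def small_body_apparent_magnitude(H, d_sun_au, d_obs_au,
                                  phase_angle_deg=0.0,
                                  phase_coefficient=0.04):
    """Rough apparent magnitude of a planet or asteroid.

    H is the absolute magnitude (brightness at 1 AU from both the Sun and
    the observer, fully illuminated).  The phase dependence is approximated
    by a linear term (phase_coefficient in mag/deg); real work uses the
    body's measured phase curve.
    """
    return (H
            + 5 * math.log10(d_sun_au * d_obs_au)
            + phase_coefficient * phase_angle_deg)

# Example: a body with H = 10 at 2.5 AU from the Sun and 1.6 AU from the
# observer, seen at a 15-degree phase angle, appears at roughly magnitude 13.6.
print(small_body_apparent_magnitude(10.0, 2.5, 1.6, 15.0))
```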

List of apparent magnitudes

Some of the listed magnitudes are approximate. Telescope sensitivity depends on observing time, optical bandpass, and interfering light from scattering and airglow.