The magnitude of a star is a logarithmic measure of its brightness. It provides a convenient way to compare the apparent brightnesses or intrinsic luminosities of celestial objects across an enormous dynamic range.
There are two main types of magnitudes used in astronomy: apparent magnitude and absolute magnitude.
The apparent magnitude, denoted by \( m \), expresses how bright a celestial object appears from Earth. It is based on the ancient scale introduced by Hipparchus, where the brightest stars were of the *first magnitude* and the faintest visible stars of the *sixth magnitude*. This historical system was later formalized into a logarithmic scale defined by Norman Pogson (1856):
$$ m_2 - m_1 = 2.5 \log_{10}\!\left(\frac{F_1}{F_2}\right), $$
where \( F_1 \) and \( F_2 \) are the observed fluxes (energy per unit area per unit time) from two sources. A difference of 5 magnitudes corresponds exactly to a factor of 100 in brightness:
$$ \frac{F_1}{F_2} = 100^{(m_2 - m_1)/5}. $$
Hence, a decrease of 1 magnitude means an increase in brightness by a factor of about 2.512. Apparent magnitude depends on both the intrinsic luminosity of a star and its distance from the observer.
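As a quick numerical check of the Pogson relation, here is a minimal Python sketch (the function name and test values are illustrative, not from any library):

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Flux ratio F1/F2 implied by two apparent magnitudes m1 and m2."""
    return 100.0 ** ((m2 - m1) / 5.0)

# A 1-magnitude difference is a factor of ~2.512 in brightness;
# a 5-magnitude difference is exactly a factor of 100.
print(brightness_ratio(1.0, 2.0))  # ~2.512
print(brightness_ratio(1.0, 6.0))  # 100.0
```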
The apparent magnitude of an individual star is defined with respect to a reference flux \( F_0 \), such that
$$ m = -2.5 \log_{10}\!\left(\frac{F}{F_0}\right), $$
where \( F \) is the measured flux of the star and \( F_0 \) is the reference (zero-point) flux for the photometric band. For example, in the \(V\)-band (visual magnitude), \(F_0 \approx 3.6\times10^{-8}~\mathrm{W\,m^{-2}\,\mu m^{-1}}\).
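A minimal Python sketch of this definition, assuming the \(V\)-band zero point quoted above (a real photometric pipeline would use the precise zero point of its own system):

```python
import math

# Illustrative V-band zero-point flux from the text above (W m^-2 um^-1).
F0_V = 3.6e-8

def apparent_magnitude(flux: float, zero_point: float = F0_V) -> float:
    """Apparent magnitude from a measured flux and a band zero point."""
    return -2.5 * math.log10(flux / zero_point)

# A source 100x fainter than the zero-point flux has m = 5 by construction.
print(apparent_magnitude(3.6e-10))  # 5.0
```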
Example: The Sun has \( m_V = -26.74 \), the full Moon about \( -12.6 \), Sirius \( -1.46 \), and the faintest stars visible to the naked eye about \( +6 \).
The absolute magnitude, denoted by \( M \), is the apparent magnitude that a star would have if it were placed at a standard distance of 10 parsecs (\( 32.6 \) light-years) from the observer. It provides a measure of a star’s intrinsic luminosity, independent of distance.
The relationship between apparent and absolute magnitude is given by the distance modulus:
$$ m - M = 5 \log_{10}\!\left(\frac{d}{10\,\text{pc}}\right), $$
where \( d \) is the distance to the star in parsecs. Thus, knowing \( m \) and \( d \), one can compute \( M \), and vice versa.
For example, if a star has \( m = 5.0 \) and lies at a distance of 100 pc, then
$$ M = m - 5 \log_{10}\!\left(\frac{100}{10}\right) = 0.0. $$
This means that if it were at 10 pc, it would appear as bright as a zero-magnitude star.
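Both directions of the distance-modulus relation are easy to compute; this Python sketch (function names are illustrative) reproduces the worked example above:

```python
import math

def absolute_magnitude(m: float, d_pc: float) -> float:
    """Absolute magnitude from apparent magnitude and distance in parsecs."""
    return m - 5.0 * math.log10(d_pc / 10.0)

def distance_pc(m: float, M: float) -> float:
    """Distance in parsecs recovered from the distance modulus m - M."""
    return 10.0 ** ((m - M) / 5.0 + 1.0)

# The worked example: m = 5.0 at d = 100 pc gives M = 0.0, and back again.
print(absolute_magnitude(5.0, 100.0))  # 0.0
print(distance_pc(5.0, 0.0))           # 100.0
```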
The absolute magnitude is directly related to the luminosity \( L \) of a star through
$$ M - M_\odot = -2.5 \log_{10}\!\left(\frac{L}{L_\odot}\right), $$
where \( M_\odot \) and \( L_\odot \) are the Sun’s absolute magnitude and luminosity, respectively. This equation allows magnitudes to be converted into physical energy output.
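A short sketch of this conversion in Python, taking \(M_\odot \approx 4.83\) in the \(V\) band as an assumed input (for total energy output, the bolometric magnitudes discussed below should strictly be used):

```python
M_SUN_V = 4.83  # approximate absolute visual magnitude of the Sun

def luminosity_ratio(M: float, M_sun: float = M_SUN_V) -> float:
    """L / L_sun implied by an absolute magnitude M."""
    return 10.0 ** (-0.4 * (M - M_sun))

# The star from the example above (M = 0.0) is roughly 85 times
# more luminous than the Sun in this band.
print(luminosity_ratio(0.0))  # ~85.5
```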
Magnitudes can also be defined over specific wavelength ranges. The visual magnitude \( m_V \) refers to brightness measured through the \(V\) (visual) filter, centered around 550 nm. Similarly, magnitudes in other passbands such as \(B\) (blue) or \(R\) (red) are denoted by \(m_B\), \(m_R\), etc.
The bolometric magnitude, \( M_{\text{bol}} \), measures the total luminosity emitted over all wavelengths. It is related to the visual magnitude by a bolometric correction (BC):
$$ M_{\text{bol}} = M_V + \text{BC}. $$
Since a star always emits at least as much energy in total as in the \(V\) band alone, the BC in this convention is negative or close to zero. Hot stars emit much of their energy in the ultraviolet and cool stars much of theirs in the infrared, so both have large negative corrections; the BC is closest to zero for stars whose emission peaks near the visual band.
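A one-line application of the correction, using approximate solar values purely for illustration:

```python
def bolometric_magnitude(M_V: float, bc: float) -> float:
    """Apply a bolometric correction to an absolute visual magnitude."""
    return M_V + bc

# Roughly Sun-like values: M_V ~ 4.83 and BC ~ -0.08 give M_bol ~ 4.75.
print(bolometric_magnitude(4.83, -0.08))  # 4.75
```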
The difference between magnitudes measured in two passbands defines a color index, such as
$$ B - V = m_B - m_V. $$
This index provides an observational measure of stellar temperature: a smaller \(B - V\) indicates a bluer (hotter) star, while a larger \(B - V\) indicates a redder (cooler) star. The color index serves as the horizontal axis of the observational (color–magnitude) form of the Hertzsprung–Russell diagram.
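A final trivial sketch of the color index, with approximate literature magnitudes for Vega and the Sun used as assumed inputs:

```python
def color_index(m_B: float, m_V: float) -> float:
    """B - V color index from magnitudes in the B and V bands."""
    return m_B - m_V

# Approximate values for illustration:
print(color_index(0.03, 0.03))  # Vega (hot, blue):  B - V ~ 0.0
print(color_index(5.48, 4.83))  # Sun (cooler, using absolute mags): ~0.65
```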