Hi, hope this helps - found it with a Google search.
Calculations
As the amount of light actually received depends on the thickness of the Earth's atmosphere in the line of sight to the object, apparent magnitudes are normalized to the value they would have in the absence of the atmosphere. The dimmer an object appears, the higher its apparent magnitude. Note that brightness varies with distance; an extremely luminous object may appear quite dim if it is far away, because brightness varies inversely with the square of the distance. The absolute magnitude, M, of a celestial body (outside the solar system) is the apparent magnitude it would have if it were 10 parsecs (~32.6 light-years) away; that of a planet (or other solar system body) is the apparent magnitude it would have if it were 1 astronomical unit away from both the Sun and the Earth. The absolute magnitude of the Sun is 4.83 in the V band (yellow) and 5.48 in the B band (blue).[41]
The apparent magnitude, m, in the band x, can be defined as

    m_x = -2.5 log10(F_x / F_x,0)

where F_x is the observed flux in the band x, and F_x,0 is a reference flux in the same band x, such as that of the star Vega, for example. See Aller et al. 1982 for the most commonly used system.
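As a quick sketch of that relation in Python (the flux values here are made-up illustrative numbers, not real photometry):

```python
import math

def apparent_magnitude(flux, flux_ref):
    """Apparent magnitude m_x = -2.5 * log10(F_x / F_x,0),
    where flux_ref is the reference (zero-point) flux in the same band."""
    return -2.5 * math.log10(flux / flux_ref)

# A source delivering 1/100 of the reference flux is 5 magnitudes fainter.
print(apparent_magnitude(1.0, 100.0))  # 5.0
```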
An increase of 1 in the magnitude scale corresponds to a decrease in brightness by a constant factor; since an increase of 5 magnitudes corresponds to a decrease in brightness by a factor of exactly 100, that factor is the fifth root of 100, 100^(1/5), which is 2.512...
The variation in brightness between two luminous objects can be calculated another way: subtract the magnitude of the brighter object from the magnitude of the fainter object, then use the difference as an exponent for the base number 2.512; that is to say, m_f - m_b = x, and 2.512^x = variation in brightness.
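That recipe can be written as a small helper (a sketch; the function name is my own):

```python
def brightness_ratio(m_faint, m_bright, base=2.512):
    """Ratio of apparent brightness from a magnitude difference:
    base ** (m_faint - m_bright). With base=2.512 this matches the
    rounded factor used in the worked examples; the exact Pogson
    ratio is 100 ** (1/5), about 2.5119."""
    return base ** (m_faint - m_bright)

# Five magnitudes of difference is (almost exactly) a factor of 100.
print(brightness_ratio(5.0, 0.0))
```

Using the rounded base 2.512, five magnitudes come out to a factor of about 100.02 rather than exactly 100; passing base=100 ** (1/5) gives the exact value.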
Example 1 - Sun & Moon
What is the ratio in brightness between the Sun and the full moon?
2.512^x = variation in brightness
The apparent magnitude of the Sun is -26.74, and the mean apparent magnitude of the full moon is -12.74. The full moon is the fainter of the two objects, while the Sun is the brighter.

Difference in magnitude:
x = m_f - m_b = -12.74 - (-26.74) = 14.00

Variation in brightness:
2.512^14.00 = 398,359

In terms of apparent magnitude, the Sun is about 398,359 times brighter than the full moon.
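The Sun/Moon arithmetic, checked in Python (magnitudes as given in the text; note that with the exact Pogson ratio, 100 ** (1/5), the answer comes out near 398,107 rather than 398,359):

```python
m_sun = -26.74   # apparent magnitude of the Sun
m_moon = -12.74  # mean apparent magnitude of the full moon

x = m_moon - m_sun       # difference in magnitude: 14.00
ratio = 2.512 ** x       # rounded base, as in the text: ~398,359
exact = 100 ** (x / 5)   # exact Pogson ratio: ~398,107

print(round(x, 2), round(ratio), round(exact))
```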
Example 2 - Sirius & Polaris
What is the ratio in brightness between Sirius and Polaris?
2.512^x = variation in brightness
The apparent magnitude of Sirius is -1.44, and the apparent magnitude of Polaris is 1.97. Polaris is the fainter of the two stars, while Sirius is the brighter.

Difference in magnitude:
x = m_f - m_b = 1.97 - (-1.44) = 3.41

Variation in brightness:
2.512^3.41 = 23.124

In terms of apparent magnitude, Sirius is about 23.124 times brighter than Polaris, the North Star.
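And the Sirius/Polaris example, with the same magnitudes as above:

```python
m_sirius = -1.44   # apparent magnitude of Sirius
m_polaris = 1.97   # apparent magnitude of Polaris

x = m_polaris - m_sirius   # difference in magnitude: 3.41
ratio = 2.512 ** x         # ~23.1

print(round(x, 2), round(ratio, 3))
```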
Notice also that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... A common misconception is that the logarithmic nature of the scale is because the human eye itself has a logarithmic response. In Pogson's time this was thought to be true (see Weber-Fechner law), but it is now believed that the response is a power law (see Stevens' power law).[42]
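The 3.2-magnitude figure can be checked with the exact Pogson ratio:

```python
pogson = 100 ** (1 / 5)   # exact Pogson ratio, ~2.5119

# 100 ** (3.2 / 5) = 10 ** 1.28, i.e. ~19.054607
print(round(pogson ** 3.2, 6))
```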
Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way it varies depends on the type of light detector.