Thursday, February 9, 2012

Apparent vs. Absolute Magnitude



Last ADYK we discussed the magnitude scale and how astronomers use it to quantify how bright a star is. But there's a little more to this whole magnitude idea. Think about this scenario: the sun is very bright, about magnitude -27, and very big in the sky. What would the sun look like if I moved it very far away? It would still emit the same amount of light, but it would look much smaller and dimmer in the sky. I would no longer call it magnitude -27, but rather some larger (dimmer) magnitude. In other words, stars that are close appear brighter and therefore have lower (brighter) magnitudes.

So how do astronomers correct for this distance bias? They use two different magnitude definitions: apparent and absolute magnitude. Apparent magnitude is the one we discussed last time; it answers the question, "How bright does that star appear to be in the sky?" Absolute magnitude corrects for the fact that stars lie at different distances from Earth, and answers the question, "If all the stars were placed at the same distance from Earth (usually taken to be 10 parsecs), how bright would each one appear?"

Absolute magnitudes let you directly compare the light output of two stars without worrying about the fact that they might be at different distances. With some basic algebra, you can switch between the absolute and apparent magnitude of a star, as long as you know how far away it is (see the sketch below). Both of these magnitude scales are used by astronomers and are very handy when you are trying to observe or compare the properties of two stars.
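For the curious, here's a minimal sketch of that algebra in Python, using the standard distance-modulus relation M = m - 5 log10(d / 10 pc). The function names and the example numbers for the sun are just illustrative, not from any particular astronomy library.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance (parsecs):
    M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

def apparent_magnitude(absolute_mag, distance_pc):
    """The inverse relation: m = M + 5 * log10(d / 10 pc)."""
    return absolute_mag + 5 * math.log10(distance_pc / 10.0)

# Example: the sun has an apparent magnitude of about -26.7 as seen from
# 1 AU (roughly 4.85e-6 parsecs); placed at 10 parsecs, it would be a
# fairly ordinary-looking star of absolute magnitude about +4.8.
print(absolute_magnitude(-26.7, 4.848e-6))  # ~ +4.8
```

Note how moving the sun out to 10 parsecs turns the brightest object in our sky into something you could barely pick out with the naked eye, which is exactly the distance bias absolute magnitude is designed to remove.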

Image Credit: http://mrscreath.edublogs.org/2011/12/01/hr-diagram-day-2/