Thursday, February 9, 2012

Apparent vs. Absolute Magnitude



Last ADYK we discussed the magnitude scale and how astronomers use it to quantify how bright a star is. But there's a little more to this whole magnitude idea. Think about this scenario… The sun is very bright, about magnitude -27, and very big in the sky. What would the sun look like if I moved it very far away? It would still emit the same amount of light, but it would look much smaller and dimmer in the sky. I would no longer say it's magnitude -27, but rather some larger (dimmer) magnitude. In other words, stars that are close appear brighter and therefore have a lower (brighter) magnitude.

So how do astronomers correct for this distance bias? They use two different magnitude definitions: apparent and absolute magnitude. Apparent magnitude is the one we discussed last time; it answers the question "how bright does that star appear to be in the sky?" Absolute magnitude corrects for the fact that stars sit at different distances from Earth, and answers the question "if all the stars were the same distance from Earth (usually taken to be 10 parsecs), how bright would each one appear?" Absolute magnitudes let you directly compare the light output of two stars without worrying about the fact that they might be at different distances. With some basic algebra (see the sketch below), you can switch between the absolute and apparent magnitude of a star, as long as you know how far away it is. Both of these magnitude scales are used by astronomers and are very handy when you are trying to observe or compare the properties of two stars.
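For the curious, that "basic algebra" is the standard distance modulus relation, m - M = 5 log10(d / 10 pc), where m is apparent magnitude, M is absolute magnitude, and d is the distance in parsecs. Here's a minimal Python sketch of the conversion (the function names are my own, just for illustration):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Apparent -> absolute magnitude via the distance modulus:
    m - M = 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

def apparent_magnitude(absolute_mag, distance_pc):
    """Absolute -> apparent: how bright a star of a given absolute
    magnitude would look from a distance given in parsecs."""
    return absolute_mag + 5 * math.log10(distance_pc / 10)

# The sun: apparent magnitude ~ -26.7, at a distance of 1 AU,
# which is about 4.848e-6 parsecs.
print(absolute_magnitude(-26.7, 4.848e-6))  # ~ +4.9, close to the textbook +4.8
```

Run the numbers and the sun comes out around absolute magnitude +4.8: seen from 10 parsecs, it would be just another faint naked-eye star.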

Image Credit: http://mrscreath.edublogs.org/2011/12/01/hr-diagram-day-2/

Monday, February 6, 2012

The Magnitude Scale

If you've ever listened to a group of amateur or professional astronomers talk, you've probably heard something like: "Yeah, I should be able to image that star, it's magnitude 4." But what does magnitude 4 mean? In astronomy, we use a magnitude scale to describe how bright stars and other objects appear in the sky. To make it super confusing, the magnitude of a source can be a positive or negative value, and larger positive numbers mean the source is dimmer. You can thank Hipparchus for this: he was the first to catalog the brightness of stars, defining magnitude 1 as the brightest stars in the sky and magnitude 6 as the dimmest visible to the naked eye. Since then, astronomers have come up with equations to calculate the magnitude of stars, so the system no longer depends on how good your eyesight is. The modern scale is defined so that a difference of 5 magnitudes corresponds to a factor of 100 in brightness, which means a difference of 1 magnitude corresponds to a factor of 100^(1/5), or about 2.512. So how much brighter is star A at mag = 2 than star B at mag = 3? Star A is about 2.5 times brighter than star B.
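If you want to play with this yourself, here's a minimal Python sketch of the brightness-ratio rule (the function name is my own invention):

```python
def brightness_ratio(mag_a, mag_b):
    """How many times brighter is a source of magnitude mag_a
    than one of magnitude mag_b? (Lower magnitude = brighter.)"""
    # The scale is defined so that 5 magnitudes = a factor of 100
    # in brightness, so 1 magnitude = 100**(1/5), about 2.512.
    return 100 ** ((mag_b - mag_a) / 5)

print(brightness_ratio(2, 3))  # ~2.512: star A vs. star B from the example
print(brightness_ratio(1, 6))  # 100.0: Hipparchus's brightest vs. dimmest
```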

With today's telescopes, we can see stars as dim as about magnitude 30. Without a telescope, our eyes can't see anything dimmer than about magnitude 6. The chart below shows some common sky objects and how bright they appear. Don't forget: the bigger the number, the dimmer the object!
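To put those limits in perspective, the brightness_ratio sketch above gives brightness_ratio(6, 30) = 100**(24/5), which works out to roughly 4 billion: the dimmest objects a big telescope can detect are about 4 billion times fainter than the dimmest star your unaided eye can pick out.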