Friday, December 2, 2011

Brightness difference based on magnitude difference

This is pretty basic information but I'm still sharing it for two reasons. First, it has been completely overcast and windy for the past two days, so what would have been observing time gave way to projects like this. Second, even though Astronomy 101 teaches that a star with an apparent magnitude of 5.0 is 2.512 times brighter than a 6.0 magnitude star, these brightness differences amaze me.

That's particularly true when I apply these differences as I observe. For example, my site's visual limiting magnitude is about 6.0, while my telescope reaches stars fainter than magnitude 13. That difference of 7 magnitudes means the dimmest star I can see with my naked eye is about 631 times brighter than the dimmest star I can see through my telescope.

Or consider a variable star with a range of four magnitudes between its brightest and dimmest. Small numbers like that are deceptive; they just don't sound like that big of a deal. But a range of four magnitudes means the star is about 40 times brighter at its brightest than at its faintest. That's an impressive change.
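If you'd rather compute these ratios than look them up, here's a short Python sketch of the standard conversion. The magnitude scale is defined so that 5 magnitudes is exactly a factor of 100 in brightness, which is where the 2.512 comes from (it's the fifth root of 100).

```python
def brightness_ratio(dm):
    """How many times brighter the lower-magnitude object is,
    given a magnitude difference dm. 5 magnitudes = exactly 100x."""
    return 100 ** (dm / 5)

print(round(brightness_ratio(1), 3))  # one magnitude: ~2.512
print(round(brightness_ratio(7)))     # mag 6 eye limit vs. mag 13 telescope: ~631
print(round(brightness_ratio(4)))     # the variable star's 4-magnitude range: ~40
```

The same formula works in reverse, too: the magnitude difference for a given brightness ratio is 2.5 times the base-10 log of the ratio.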

Translating apparent magnitudes and magnitude differences into brightness ratios puts what I'm looking at in perspective and makes going out in the cold weather this time of year well worth the effort.

Here's an Excel table I made showing the brightness difference for every 1/10th of apparent magnitude difference from 0.1 to 15.9. The table is too wide to fit on this blog page so I apologize for posting it sideways. At least you can read it and copy/paste it if you want to use it.

I don't yet know how to attach a link to the Excel spreadsheet (I'm new at blogging). If you would like a copy of the spreadsheet, let me know and I'll email it to you. I'll also try to figure out how to post it, or at least a larger version.
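For readers who don't want to wait for the spreadsheet, the whole table can be regenerated with a few lines of Python using the same formula. This is a sketch, not the original Excel file: it prints one row per 0.1 magnitude difference from 0.1 to 15.9, with the corresponding brightness ratio.

```python
# One row per 0.1-magnitude step, 0.1 through 15.9 (159 rows total).
# Ratio = 100 ** (dm / 5), the standard magnitude-to-brightness conversion.
for tenths in range(1, 160):
    dm = tenths / 10
    print(f"{dm:4.1f}  {100 ** (dm / 5):>12,.1f}")
```

At the far end of the table the numbers get striking: a difference of 15.9 magnitudes works out to a ratio of roughly 2.3 million.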


Please comment - there's lots of room - after all, "The universe is a big place, perhaps the biggest." (Kilgore Trout)