Star magnitude. Magnitude of the sun and moon

  • 25.09.2019

magnitude

Magnitude is a dimensionless physical quantity characterizing the illumination created by a celestial object near the observer. Subjectively it is perceived as brightness or brilliance. The brightness of one source is indicated by comparing it with the brightness of another taken as a standard; such standards are usually specially selected non-variable stars. Magnitude was first introduced as an indicator of the apparent brightness of stars in the optical range, but was later extended to other radiation ranges, such as the infrared and ultraviolet. The magnitude scale is logarithmic, like the decibel scale: a difference of 5 magnitudes corresponds to a 100-fold difference in the light fluxes of the measured and reference sources, so a difference of 1 magnitude corresponds to a flux ratio of 100^(1/5) ≈ 2.512. Magnitude is denoted by the Latin letter "m" (from Latin magnitudo, "size"), written in italics as a superscript to the right of the number. The scale runs in reverse: the larger the value, the fainter the object. For example, a star of 2nd magnitude (2m) is 2.512 times brighter than a 3rd magnitude star (3m) and 2.512 × 2.512 = 6.310 times brighter than a 4th magnitude star (4m).

Apparent magnitude (m; often referred to simply as "magnitude") characterizes the radiation flux near the observer, i.e. the observed brightness of a celestial source, which depends not only on the object's actual radiated power but also on its distance. The scale of apparent magnitudes originates in the star catalog of Hipparchus (2nd century BC), in which all stars visible to the eye were first divided into 6 classes by brightness. The stars of the Big Dipper shine at about 2m, Vega at about 0m. For especially bright luminaries the magnitude is negative: for Sirius it is about -1.5m (i.e. the light flux from it is 4 times greater than from Vega), and the brightness of Venus at some moments almost reaches -5m (i.e. its light flux is almost 100 times greater than Vega's). We emphasize that apparent magnitude can be measured both with the naked eye and with a telescope, and both in the visual range of the spectrum and in others (photographic, UV, IR). Here "apparent" (English: apparent) means "observed" and is not specifically tied to the human eye.

Absolute magnitude (M) indicates what apparent magnitude the luminary would have if the distance to it were 10 parsecs and there were no interstellar absorption of light. Thus the absolute magnitude, unlike the apparent one, allows one to compare the true luminosities of celestial objects (in a given range of the spectrum).

As for spectral ranges, there are many systems of magnitudes differing in the choice of measurement range. When observing with the eye (naked or through a telescope), the visual magnitude (mv) is measured. From the image of a star on an ordinary photographic plate, obtained without additional filters, the photographic magnitude (mP) is determined. Since photographic emulsion is sensitive to blue light and insensitive to red, blue stars come out brighter on the plate than they appear to the eye. However, with an orthochromatic plate and a yellow filter one obtains the so-called photovisual magnitude scale (mPv), which almost coincides with the visual one. By comparing the brightness of a source measured in different ranges of the spectrum, one can find its color, estimate its surface temperature (if it is a star) or surface properties (if it is a planet), determine the degree of interstellar absorption of light, and other important characteristics. For this purpose standard photometric systems have been developed, defined mainly by the choice of light filters. The most popular is the three-color system: ultraviolet (U, Ultraviolet), blue (B, Blue) and yellow (V, Visual). The yellow range is very close to the photovisual one (V ≈ mPv), and the blue to the photographic (B ≈ mP).

Imagine that somewhere at sea, in the darkness of night, a light flickers quietly. Unless an experienced sailor explains what it is, you often cannot tell whether it is a lantern on the bow of a passing boat or the powerful searchlight of a distant lighthouse.

We are in the same position on a dark night, looking at the twinkling stars. Their apparent brilliance depends both on their true light output, called luminosity, and on their distance from us. Only knowing the distance to a star allows us to calculate its luminosity relative to the Sun. For example, the luminosity of a star that is in reality ten times less luminous than the Sun is expressed by the number 0.1.

The true light output of a star can be expressed in another way: by calculating what magnitude it would appear to have if it stood at a standard distance of 32.6 light-years from us, i.e. the distance that light, traveling at 300,000 km/s, covers in that time.

Adopting such a standard distance proved convenient for various calculations. The brightness of a star, like that of any light source, varies inversely with the square of the distance. This law allows one to calculate the absolute magnitudes or luminosities of stars once the distances to them are known.
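The rule above can be sketched numerically. The function below is a minimal illustration (the name is mine, not the author's): it converts an apparent magnitude and a distance in parsecs into an absolute magnitude, using the inverse-square law and the fact that 5 magnitudes correspond to a factor of 100 in flux.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Magnitude the star would show from the standard distance of
    10 parsecs (32.6 light-years): flux falls off as 1/d^2, and
    5 magnitudes correspond to a factor of 100 in flux."""
    return apparent_mag + 5 - 5 * math.log10(distance_pc)

# A star seen from exactly 10 pc keeps its apparent magnitude:
print(absolute_magnitude(4.8, 10.0))   # -> 4.8

# A star seen from 1 pc would look 100x brighter at that distance
# than from 10 pc, i.e. its absolute magnitude is 5 units fainter:
print(absolute_magnitude(4.8, 1.0))    # -> 9.8
```

The same function reappears later in the article in the equivalent parallax form M = m + 5 + 5 lg π.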

When the distances to the stars became known, we were able to calculate their luminosities; that is, we could, as it were, line the stars up in a single row and compare them under the same conditions. It must be confessed that the results were astonishing, for it had previously been assumed that all stars were "similar to our Sun." Their luminosities turned out to be amazingly diverse.

Let us give only extreme examples of luminosity in the world of stars.

For a long time the faintest known star was one 50 thousand times fainter than the Sun, with an absolute magnitude of +16.6. However, even fainter stars were later discovered, whose luminosity is millions of times less than the Sun's!

Dimensions in space are deceptive: from Earth, Deneb shines brighter than Antares, while the Pistol Star is not visible at all. Yet to an observer on our planet both Deneb and Antares seem mere insignificant points compared to the Sun. How wrong that impression is can be judged from a simple fact: the Pistol Star emits as much light in a second as the Sun does in a year!

At the other end of the stellar line stands S Doradus, visible only from the countries of the Earth's Southern Hemisphere, and even there only as a faint star (that is, not visible without a telescope!). In fact, it is 400 thousand times brighter than the Sun, and its absolute magnitude is -8.9.

The absolute magnitude of our Sun is +5. Not so much! From a distance of 32.6 light-years we would hardly see it without binoculars.

If the brightness of the Sun is likened to that of an ordinary candle, then by comparison S Doradus is a powerful searchlight, and the faintest star is feebler than the most miserable firefly.

So the stars are distant suns, but their light output can be utterly different from our luminary's. Figuratively speaking, one would have to be careful about exchanging our Sun for another: the light of one would blind us, while by the light of another we would wander as if at twilight.

Magnitudes

Since the eye was the first instrument of measurement, we must know the simple rules that govern our estimates of the brightness of light sources. Our estimate of a brightness difference is relative rather than absolute. Comparing two faint stars, we see that they differ noticeably from each other; but for two bright stars the same difference in brightness goes unnoticed, since it is negligible compared to the total amount of light emitted. In other words, our eyes evaluate the relative, not the absolute, difference in brightness.

Hipparchus was the first to divide the stars visible to the naked eye into six classes according to brightness. Later this rule was somewhat refined without changing the system itself. The magnitude classes were distributed so that a 1st magnitude star (the average of the twenty brightest) gives a hundred times more light than a 6th magnitude star, which lies at the limit of visibility for most people.

A difference of one magnitude corresponds to a brightness ratio of 2.512, the fifth root of 100. A difference of two magnitudes corresponds to 6.31 (2.512 squared), three magnitudes to 15.85 (2.512 cubed), four magnitudes to 39.82 (2.512 to the fourth power), and five magnitudes to 100 (2.512 to the fifth power).

A 6th magnitude star gives us a hundred times less light than a 1st magnitude star, and an 11th magnitude star ten thousand times less. A star of the 21st magnitude is fainter by a factor of 100,000,000.
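These factors are easy to reproduce: one magnitude is the fifth root of 100, so a difference of Δm magnitudes corresponds to a flux ratio of 100^(Δm/5). A minimal sketch (the function name is mine):

```python
def flux_ratio(delta_mag):
    """Light-flux ratio corresponding to a magnitude difference:
    one magnitude is the fifth root of 100, about 2.512."""
    return 100 ** (delta_mag / 5)

# The factors quoted in the text:
print(round(flux_ratio(1), 2))   # -> 2.51
print(round(flux_ratio(5)))      # -> 100
print(round(flux_ratio(10)))     # -> 10000
print(round(flux_ratio(20)))     # -> 100000000
```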

As should already be clear, absolute and apparent magnitudes are entirely incomparable things. To an "earthbound" observer, Deneb in the constellation Cygnus appears as a mere point of light; and yet the entire orbit of the Earth would barely be enough to contain the circumference of this star.

To classify stars correctly (and they all differ from one another), care must be taken that the brightness ratio of 2.512 is maintained over the whole interval between neighboring magnitudes. This cannot be done with the naked eye; special instruments are needed, such as Pickering's photometers, which use the Pole Star or even an "average" artificial star as a standard.

Also, for convenience of measurement, the light of very bright stars must be attenuated; this can be achieved either with a polarizing device or with a photometric wedge.

Purely visual methods, even with large telescopes, cannot extend our magnitude scale to faint stars. Moreover, visual measurements can be made only directly at the telescope. Purely visual classification has therefore been abandoned in our time in favor of photographic photometry.

How can one compare the amount of light received by a photographic plate from two stars of different brightness? To make them appear identical, the light of the brighter star must be attenuated by a known amount. The easiest way is to stop down the telescope's objective with a diaphragm. The amount of light entering a telescope varies with the area of the lens, so the attenuation of any star's light can be measured accurately.

Let us choose some star as a standard and photograph it with the telescope's full aperture. Then we determine what aperture must be used, at the same exposure, to obtain the same image of a brighter star as in the first case. The ratio of the reduced to the full aperture area gives the ratio of the brightnesses of the two objects.

This method of measurement gives an error of only 0.1 magnitude for any of the stars in the range from 1st to 18th magnitude. The magnitudes obtained in this way are called photovisual.
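The aperture method just described can be put in numbers. A hedged sketch (the function name and the 200/100 mm figures are illustrative, not from the text): since light grasp scales with lens area, the attenuation in magnitudes follows from the ratio of aperture areas.

```python
import math

def magnitude_drop(full_diameter, stopped_diameter):
    """Attenuation, in magnitudes, produced by stopping a telescope
    down: light grasp scales with lens area (diameter squared)."""
    area_ratio = (full_diameter / stopped_diameter) ** 2
    return 2.5 * math.log10(area_ratio)

# Halving the aperture diameter cuts the light 4x, about 1.5 magnitudes:
print(round(magnitude_drop(200, 100), 2))  # -> 1.51
```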

Even people far from astronomy know that stars differ in brightness. The brightest stars are easily visible even in light-polluted urban skies, while the faintest are barely perceptible under ideal viewing conditions. To characterize the brightness of stars and other celestial bodies (for example, planets, meteors, the Sun and the Moon), scientists have developed the scale of stellar magnitudes.

The concept of "stellar magnitude" has been used by astronomers for over 2000 years. It was probably first introduced by the famous ancient Greek astronomer and mathematician Hipparchus in the 2nd century BC. Regularly observing the starry sky from the island of Rhodes in the Aegean Sea, Hipparchus once witnessed the appearance of a new bright star in the constellation Scorpius. Impressed by the event, he decided to compile a catalog of stars so that new ones could be quickly recognized in the future. In the end he cataloged 1025 stars: he not only gave coordinates for each star, but also divided them into 6 magnitudes.

Hipparchus assigned the first magnitude to the brightest stars and the sixth to the dimmest, barely visible to the eye. The stars of the 2nd magnitude were considered as many times fainter than those of the 1st as the stars of the 3rd were fainter than those of the 2nd, and so on: an arithmetic progression of brightness classes was obtained. Hipparchus's catalog contained 15 stars of the first magnitude, 45 of the second, 208 of the third, 474 of the fourth, 217 of the fifth and 49 of the sixth (plus several nebulae).

Why did Hipparchus call this characteristic of a star's brightness its "magnitude"?

In ancient times people believed that the stars were fixed to the celestial sphere, all at the same distance from the Earth, so the differences in their brightness were explained by differences in their actual size, or magnitude.

Hence the stars of the first magnitude were thought to be much larger than those of the sixth.

On the scale introduced by Hipparchus, stars such as Deneb or Capella had the first magnitude (written 1m); these were the largest, most "important" stars. The stars of the Big Dipper averaged about 2m; these were already "smaller" stars. In time astronomers realized that magnitude does not measure a star's real size but only its brilliance, that is, the illumination it creates on Earth; nevertheless they continued to use Hipparchus's scale.

It should be remembered that the magnitude scale runs in reverse: the brighter the star, the smaller its magnitude; conversely, the dimmer the star, the larger its magnitude.

By the middle of the 19th century the progress of science demanded that the brightness of stars be determined more precisely. In particular, it turned out that human vision works in a peculiar way: when illumination changes in geometric progression, our sensation of it changes in arithmetic progression. It also turned out that it takes not 6 stars of the 6th magnitude to match the illumination from one star of the 1st (as previously assumed), but a whole hundred!

In 1856 the English astronomer Norman Pogson proposed building the magnitude scale with this psychophysical law of vision in mind. By Pogson's definition, a 1st magnitude star creates an illumination exactly 100 times greater than a 6th magnitude star. The modern magnitude scale is thus logarithmic: a 1st magnitude star is about 2.512 times brighter than a 2nd magnitude star, which in turn is 2.512 times brighter than a 3rd magnitude star, and so on.
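Pogson's relation can also be read the other way round: a ratio of illuminations converts to a magnitude difference via a logarithm. A minimal sketch (the function name is mine):

```python
import math

def magnitude_difference(flux_bright, flux_faint):
    """Pogson's relation: delta_m = 2.5 * log10(F_bright / F_faint)."""
    return 2.5 * math.log10(flux_bright / flux_faint)

print(magnitude_difference(100, 1))              # -> 5.0  (100x flux = 5 magnitudes)
print(round(magnitude_difference(2.512, 1), 2))  # -> 1.0  (one magnitude step)
```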

Magnitude is a dimensionless characteristic of the brilliance of a heavenly body. This image shows the famous Double Cluster in the constellation Perseus. The brightest stars in the photo have a magnitude of about 6, the dimmest about 17. By Pogson's formula, the brightest stars here are 25,000 times brighter than the barely visible ones. © New Forest Observatory

But from what point do we count? What is taken as the zero point?

As everyone knows, astronomy is an exact science, so any physical characteristic must be measured in definite units: force in newtons, energy in joules. In this sense stellar magnitude is a dimensionless characteristic of the brilliance of celestial bodies. Pogson proposed taking the brightness of the Pole Star as exactly 2m (just as Celsius took the freezing point of water as 0°) and determining the magnitudes of all other stars from it. Later it turned out that the brightness of the Pole Star is not constant, so Vega was adopted as the standard instead. Today 0m is defined as a fixed illumination with an energy value of E = 2.48×10⁻⁸ W/m².

In fact, it is illumination that astronomers measure during observations; only afterwards is it converted into stellar magnitudes.

This is done not only because "it is more familiar," but because magnitude has proved a very convenient concept. Measuring light in watts per square meter is extremely cumbersome: for the Sun the value is enormous, and for faint telescopic stars it is minuscule. Magnitudes, being a logarithmic scale, are far easier to handle. Thus the brightness of the Sun is -26.73m, while the faintest objects photographed by the Hubble telescope are at about 31.50m. As you can see, the difference is only 58 "steps".
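With the zero point E = 2.48×10⁻⁸ W/m² quoted above, illumination converts to magnitude directly. A sketch (the solar-constant figure of about 1360 W/m² is my rough assumption for a sanity check; it lumps all wavelengths together, so the result only approximates the -26.73m quoted in the text):

```python
import math

E0 = 2.48e-8  # W/m^2: the illumination taken as magnitude 0 (see text)

def magnitude_from_illumination(E):
    """Convert an illumination in W/m^2 into a stellar magnitude."""
    return -2.5 * math.log10(E / E0)

# Rough check with the solar constant (~1360 W/m^2, all wavelengths):
print(round(magnitude_from_illumination(1360), 1))  # about -26.8
```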

In the beginning, magnitude was used to indicate the brightness of stars observed optically (i.e., visually or photographically). Later the scale was extended to the ultraviolet and infrared radiation ranges. Since stars radiate unevenly at different wavelengths, the magnitude of a heavenly body depends on the spectral sensitivity of the radiation receiver.

The visual magnitude mv corresponds to the spectral sensitivity of the human eye (peaking at a wavelength of λ = 555 nm).

The photovisual magnitude V (or yellow) practically coincides with the visual one, and it is in the scale of photovisual magnitudes that the brilliance of stars and other celestial bodies is given today in catalogs intended for amateur astronomers.

The photographic magnitude B (or blue) is determined by measuring the brightness of a star on a photographic plate sensitive to blue rays, or with a photomultiplier fitted with a blue filter.

Finally, the bolometric magnitude mbol corresponds to the star's total radiated power across all ranges of the spectrum. For the Sun the bolometric magnitude is only slightly less than the visual one, since almost all of its radiation falls in the visible range. By contrast, the bolometric magnitudes of red dwarfs are noticeably smaller (brighter) than their visual magnitudes, because most of their energy is emitted in the infrared. The same holds for hot stars of spectral classes O and B, which radiate mainly in the ultraviolet.

Scale of stellar magnitudes. Drawing: Big Universe

Until now, in speaking of magnitude we have meant apparent magnitude, i.e. the one recorded directly when observing a celestial body. "Apparent" here means "observed" and says nothing about the real luminosity of the body. For example, Venus looks much brighter in the sky than any star; its brightness reaches -4.67m at maximum. Yet this does not mean the planet emits more light than the stars: the great brilliance of Venus is due to its proximity to the Earth.

To compare the real fluxes of light energy coming from celestial bodies, astronomers notionally place them at a standard distance of 10 parsecs from the Earth. The absolute magnitude (M) shows what apparent magnitude a heavenly body would have if the distance to it were 10 parsecs.

Apparent stellar magnitudes of some celestial bodies

The Sun: -26.73
The Moon (at full moon): -12.74
Venus (at maximum brightness): -4.67
Jupiter (at maximum brightness): -2.91
Sirius: -1.44
Vega: 0.03
The faintest stars visible to the naked eye: about 6.0
The Sun from a distance of 100 light-years: 7.30
Proxima Centauri: 11.05
The brightest quasar: 12.9
The faintest objects photographed by the Hubble telescope: 31.5

Let us continue our algebraic excursion to the heavenly bodies. On the scale used to evaluate the brightness of stars, a place can be found not only for the fixed stars but for other luminaries as well: the planets, the Sun, the Moon. We shall speak of the brightness of the planets separately; here we give the stellar magnitudes of the Sun and Moon. The magnitude of the Sun is expressed by the number minus 26.8, and of the full1) Moon by minus 12.6. Why both numbers are negative should be clear to the reader after all that has been said before. But perhaps he will be perplexed by the seemingly small difference between the magnitudes of the Sun and the Moon: the first is "only about twice" the second.

Let us not forget, however, that the designation of magnitude is essentially a logarithm (to the base 2.5). Just as one cannot compare numbers by dividing their logarithms one by another, so it makes no sense, when comparing stellar magnitudes, to divide one magnitude by the other. The following calculation shows the result of a correct comparison.

If the magnitude of the Sun is minus 26.8, then the Sun is brighter than a star of the first magnitude by 2.5^27.8 times. The Moon is brighter than a star of the first magnitude by 2.5^13.6 times.

This means that the brightness of the Sun exceeds the brightness of the full Moon by

2.5^27.8 / 2.5^13.6 = 2.5^14.2 times.

Calculating this value (using tables of logarithms), we get 447,000. Here, then, is the correct ratio of the brightnesses of the Sun and the Moon: in clear weather the daytime luminary illuminates the Earth 447,000 times more strongly than the full Moon on a cloudless night.
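The arithmetic is easy to verify in a couple of lines, using the text's rounded base of 2.5 per magnitude:

```python
sun_vs_first_mag = 2.5 ** 27.8    # Sun (-26.8) against a 1st-magnitude star
moon_vs_first_mag = 2.5 ** 13.6   # full Moon (-12.6) against the same star

ratio = sun_vs_first_mag / moon_vs_first_mag  # = 2.5 ** 14.2
print(round(ratio))  # about 447,000, as in the text
```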

Since the amount of heat reflected to us by the Moon is proportional to the amount of light it scatters (and this is probably close to the truth), we must conclude that the Moon sends us 447,000 times less heat than the Sun. It is known that every square centimeter at the boundary of the earth's atmosphere receives from the Sun about 2 small calories of heat per minute. Hence the Moon sends each square centimeter of the Earth no more than 1/225,000 of a small calorie per minute (i.e., it can warm 1 g of water by 1/225,000 of a degree per minute). This shows how groundless are all attempts to attribute to moonlight any influence on the earth's weather2).

1) In the first and last quarter, the magnitude of the Moon is minus 9.

2) The question of whether the Moon can influence the weather by its attraction will be considered at the end of the book (see "Moon and Weather").

The common belief that clouds often melt away under the rays of the full moon is a gross misconception: the disappearance of clouds at night (due to quite different causes) simply becomes noticeable only by moonlight.

Let us now leave the Moon and calculate how many times the Sun is brighter than the most brilliant star of the entire sky, Sirius. Reasoning as before, we obtain the ratio of their brightnesses:

2.5^27.8 / 2.5^2.6 = 2.5^25.2,

i.e., the Sun is 10 billion times brighter than Sirius.

The following calculation is also very interesting: how many times is the illumination given by the full Moon brighter than the combined illumination of the whole starry sky, i.e., of all the stars visible to the naked eye in one celestial hemisphere? We have already calculated that the stars from the first to the sixth magnitude inclusive shine together like a hundred first-magnitude stars. The problem thus reduces to calculating how many times the Moon is brighter than a hundred stars of the first magnitude.

This ratio is equal to

2.5^13.6 / 100 ≈ 2700.

So, on a clear moonless night we receive from the starry sky only 1/2700 of the light the full Moon sends us, and 2700 × 447,000, i.e., about 1,200 million times less than the Sun gives on a cloudless day.
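This estimate, too, can be checked in a few lines; here the more precise step of 100^(1/5) ≈ 2.512 per magnitude is used, which lands close to the text's rounded figure of 2700:

```python
step = 100 ** (1 / 5)                  # one magnitude = ~2.512x in flux
moon_vs_first_mag = step ** 13.6       # full Moon (-12.6) vs a 1st-magnitude star
moon_vs_sky = moon_vs_first_mag / 100  # the naked-eye sky ~ 100 such stars
print(round(moon_vs_sky))  # about 2,700
```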

We may add that the stellar magnitude of a standard international candle at a distance of 1 m is minus 14.2; hence at that distance a candle illuminates more brightly than the full Moon by 2.5^(14.2-12.6), i.e., four times.

It is perhaps also interesting to note that a searchlight of an aviation beacon with a power of 2 billion candles would be visible from the distance of the Moon as a star of magnitude 4½, i.e., it could be distinguished by the naked eye.

The true brilliance of the stars and the sun

All the brightness estimates we have made so far referred to apparent brightness. The figures given express the brilliance of the luminaries at the distances at which each actually stands. But we know well that the stars are not equally distant from us; the apparent brilliance of a star thus tells us both of its true brilliance and of its distance from us, or rather of neither, until the two factors are separated. Meanwhile, it is important to know what the comparative brightness, or as they say the "luminosity," of various stars would be if they stood at the same distance from us.

Putting the question this way, astronomers introduce the concept of the "absolute" magnitude of a star. The absolute magnitude of a star is the one the star would have if it were at a distance of 10 "parsecs" from us. The parsec is a special measure of length used for stellar distances; its origin we shall discuss separately later. Here we shall only say that one parsec is about 30,800,000,000,000 km. It is not difficult to calculate the absolute magnitude itself if one knows the star's distance and takes into account that brightness must fall off as the square of the distance1).

We shall acquaint the reader with the results of just two such calculations: for Sirius and for our Sun. The absolute magnitude of Sirius is +1.3, of the Sun +4.8. This means that from a distance of 30,800,000,000,000 km Sirius would shine for us as a star of magnitude 1.3, and our Sun as one of magnitude 4.8, i.e., weaker than Sirius by

2.5^3.8 / 2.5^0.3 = 2.5^3.5 ≈ 25 times,

although the apparent brilliance of the Sun is 10,000,000,000 times that of Sirius.
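The 25-fold difference follows directly from the two absolute magnitudes:

```python
# Absolute magnitudes from the text: Sirius +1.3, the Sun +4.8.
ratio = 2.5 ** (4.8 - 1.3)  # = 2.5 ** 3.5
print(round(ratio))  # -> 25
```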

We have seen that the Sun is far from the brightest star in the sky. Still, one should not regard our Sun as a complete pygmy among the stars surrounding it: its luminosity is above average. According to stellar statistics, the average luminosity of the stars surrounding the Sun out to a distance of 10 parsecs corresponds to the ninth absolute magnitude. Since the absolute magnitude of the Sun is 4.8, it is brighter than the average "neighboring" star by

2.5^8 / 2.5^3.8 = 2.5^4.2 ≈ 50 times.

Though 25 times absolutely dimmer than Sirius, the Sun is still 50 times brighter than the average of the stars surrounding it.

The brightest star known

The greatest known luminosity belongs to a star of the eighth magnitude, invisible to the naked eye, in the constellation Dorado, designated by the

1) The calculation can be carried out by the following formula, whose origin will become clear to the reader a little later, when he becomes better acquainted with the "parsec" and "parallax":

2.5^M = 2.5^m × 100 × π²

Here M is the absolute magnitude of the star, m its apparent magnitude, and π the star's parallax in seconds of arc. The successive transformations are:

M lg 2.5 = m lg 2.5 + 2 + 2 lg π,
0.4M = 0.4m + 2 + 2 lg π,
M = m + 5 + 5 lg π.

For Sirius, for example, m = -1.6 and π = 0″.38. Therefore its absolute magnitude is

M = -1.6 + 5 + 5 lg 0.38 = 1.3.

Latin letter S. The constellation Dorado lies in the southern hemisphere of the sky and is not visible from the temperate latitudes of our hemisphere. The star in question belongs to a neighboring stellar system, the Large Magellanic Cloud, whose distance from us is estimated to be about 12,000 times the distance to Sirius. At such a huge distance a star must possess absolutely exceptional luminosity to appear even of the eighth magnitude. Sirius, thrown equally deep into space, would shine as a star of the 17th magnitude, i.e., would be barely visible in the most powerful telescope.

What is the luminosity of this remarkable star? The calculation gives: minus the eighth magnitude. This means our star is, in absolute terms, roughly 400,000 times brighter than the Sun! With such exceptional brightness, this star, if placed at the distance of Sirius, would appear nine magnitudes brighter than Sirius, i.e., would have roughly the brightness of the Moon in its quarter phase! A star that from Sirius's distance could flood the Earth with so bright a light has an indisputable right to be considered the brightest star known to us.
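The footnote's formula M = m + 5 + 5 lg π is easy to check against the Sirius numbers used there (a minimal sketch; the function name is mine):

```python
import math

def absolute_from_parallax(m, parallax_arcsec):
    """Absolute magnitude from apparent magnitude and parallax:
    M = m + 5 + 5 * lg(parallax), with the parallax in arcseconds."""
    return m + 5 + 5 * math.log10(parallax_arcsec)

# Sirius with the older values used in the text: m = -1.6, parallax = 0.38"
print(round(absolute_from_parallax(-1.6, 0.38), 1))  # -> 1.3
```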

The stellar magnitudes of the planets in the earthly and alien skies

Let us now return to the mental journey to other planets (made in the section "Alien skies") and evaluate more precisely the brilliance of the luminaries shining there. First of all, let us give the stellar magnitudes of the planets at their maximum brightness in the earth's sky. Here is the table.

In the sky of the Earth:

Venus.............................

Saturn..............................

Mars..................................

Uranus..................................

Jupiter...........................

Neptune.............................

Mercury......................

Looking through it, we see that Venus is brighter than Jupiter by almost two magnitudes, i.e. 2.5^2 = 6.25 times, and brighter than Sirius by 2.5^2.7 ≈ 13 times

(the brilliance of Sirius is -1.6 magnitude). The same table shows that the dim planet Saturn is still brighter than all the fixed stars except Sirius and Canopus. Here we find the explanation of the fact that the planets (Venus, Jupiter) are sometimes visible to the naked eye in daytime, while the stars are completely inaccessible to the naked eye in daylight.