Astronomers use some weird units to measure celestial objects, but the reasons behind them are not so strange after all. These units accommodate the immense sizes and distances of objects in space and make calculations and comparisons more manageable. In this article, we explore some of the unusual units astronomers use and the reasons behind them.
In recent times, journalists and social media users have become fascinated by astronomers' odd units. The sizes and distances involved in space are often so vast that familiar units become impractical, which has led to unique and unfamiliar units for describing celestial objects. While some of these units may seem absurd at first, they are what allow astronomers to make sense of the universe.
Why do astronomers use strange units?
The universe is vast, and many of the objects in space are too large or too far away to be measured in familiar units. For instance, using kilometers to measure the distance between two galaxies would require an astronomical number. Similarly, using kilograms to measure the mass of a celestial object like a pulsar or a black hole would be unwieldy.
In light of this, astronomers use unique units of measurement that make more sense for the vast distances and sizes involved in space. The following are some of the units used by astronomers:
Astronomical units and parsecs
Astronomers use the astronomical unit (AU) to measure distances within the solar system. One astronomical unit is the average distance between the Earth and the sun, about 149.6 million kilometers. This unit is convenient because the Earth-sun distance is a fundamental yardstick in the solar system.
However, this unit is not enough when measuring distances beyond the solar system. For such distances, astronomers use the parsec: the distance at which one astronomical unit subtends an angle of one arcsecond, so a star with a parallax of one arcsecond lies exactly one parsec away. One parsec is equivalent to about 3.26 light-years, or 206,265 astronomical units. This direct link to parallax is what makes the parsec so useful for measuring distances to nearby stars and galaxies.
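The relationship between parallax and distance can be sketched in a few lines. The conversion constants below are the approximate values quoted above, and the Proxima Centauri parallax of roughly 0.768 arcseconds is used purely as a worked example:

```python
# Convert a stellar parallax (in arcseconds) to a distance.
# By the definition of the parsec, a star with a parallax of
# 1 arcsecond is 1 parsec away, so distance in parsecs is
# simply the reciprocal of the parallax.

AU_KM = 149.6e6          # kilometers per astronomical unit (approximate)
AU_PER_PARSEC = 206_265  # astronomical units per parsec
LY_PER_PARSEC = 3.26     # light-years per parsec (approximate)

def parallax_to_parsecs(parallax_arcsec: float) -> float:
    """Distance in parsecs from an annual parallax in arcseconds."""
    return 1.0 / parallax_arcsec

# Proxima Centauri has a parallax of roughly 0.768 arcseconds.
d_pc = parallax_to_parsecs(0.768)
print(f"{d_pc:.2f} parsecs")                      # ~1.30 pc
print(f"{d_pc * LY_PER_PARSEC:.2f} light-years")  # ~4.25 ly
print(f"{d_pc * AU_PER_PARSEC * AU_KM:.3e} km")
```

The reciprocal relationship is also why parallax surveys quote their precision in milliarcseconds: halving the parallax error doubles the distance out to which stars can be placed reliably.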
Magnitude
Astronomers use a unit called magnitude to measure the brightness of celestial objects. The ancient Greek astronomer Hipparchus devised this scale in the second century BC: the brightest stars were assigned a value of 1, while the faintest were assigned a value of 6. A brighter star therefore has a lower magnitude value.
However, the magnitude system is not a linear scale but a logarithmic one. An increase of one magnitude corresponds to a decrease in brightness by a factor of about 2.512, the fifth root of 100, so a difference of five magnitudes is exactly a factor of 100. For instance, a star with magnitude 1 is about 2.512 times brighter than a star with magnitude 2.
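The logarithmic scale above reduces to a one-line formula. This sketch computes how many times brighter one star is than another from their magnitudes:

```python
# Brightness ratio between two stars from their magnitudes.
# Each step of 1 magnitude changes brightness by a factor of
# 100 ** (1/5) ≈ 2.512, so 5 magnitudes is exactly a factor of 100.

def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """How many times brighter the lower-magnitude star is."""
    return 100 ** ((m_faint - m_bright) / 5)

print(brightness_ratio(2, 1))  # ≈ 2.512
print(brightness_ratio(6, 1))  # = 100.0 (Hipparchus's full range)
```

Note the inversion built into the formula: the fainter star's magnitude goes first, because larger magnitudes mean dimmer objects.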
The milliCrab
Astronomers use a unit called the milliCrab to measure the brightness of X-ray sources. The Crab is a pulsar embedded in the remnant of an exploded star, and it is extremely bright when observed with X-ray telescopes. The brightness of the Crab has been used to calibrate X-ray telescopes since the 1970s.
The milliCrab is one-thousandth of the brightness of the Crab. If an X-ray source is five-thousandths as bright as the Crab, astronomers say it is 5 milliCrab bright. The catch is that the Crab's brightness is not truly constant: it varies with the energy of the X-ray light observed, and over time.
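The conversion itself is just a ratio. In the sketch below, the reference flux value is a placeholder chosen for illustration, not a calibrated number; as the paragraph above notes, the Crab's real flux depends on the energy band and varies over time, so actual analyses must state which band and epoch they assume:

```python
# Express an X-ray source's brightness in milliCrab: the source's
# flux divided by the Crab's flux, times 1000.
# CRAB_FLUX below is a hypothetical reference value for illustration;
# the true value depends on the energy band observed and varies over time.

CRAB_FLUX = 2.4e-8  # erg / s / cm^2 (placeholder, not a calibrated value)

def to_millicrab(source_flux: float, crab_flux: float = CRAB_FLUX) -> float:
    """Brightness relative to the Crab, in thousandths (milliCrab)."""
    return 1000 * source_flux / crab_flux

# A source five-thousandths as bright as the Crab is 5 milliCrab:
print(to_millicrab(0.005 * CRAB_FLUX))
```

Because the unit is a ratio, the choice of reference flux cancels out as long as source and reference are measured in the same band with the same instrument, which is precisely why the Crab makes a convenient calibration standard.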