It is really quite hard to answer the question as posed, because as you observe deeper and deeper (e.g. using a larger telescope or observing for longer), more and more (fainter) objects become apparent.

Every telescope that you use (and indeed your eye) has a finite angular resolution - the smallest angle between two objects at which they can still be resolved, i.e. the closest two objects can be while there is still some perceived "gap" between them.
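
For a rough sense of scale, the diffraction-limited resolution is often estimated with the Rayleigh criterion, $\theta \approx 1.22\,\lambda/D$. Here is a small sketch; the aperture sizes and wavelength are illustrative choices, not numbers from the question:

```python
ARCSEC_PER_RAD = 206265  # arcseconds per radian

def rayleigh_resolution_arcsec(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution from the Rayleigh criterion."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

# Illustrative numbers: a 2.4 m mirror observing at 550 nm, and a dark-adapted
# eye with a ~5 mm pupil.  (In practice, atmospheric seeing limits ground-based
# telescopes to roughly an arcsecond unless adaptive optics are used.)
print(rayleigh_resolution_arcsec(550e-9, 2.4))   # ~0.06 arcsec
print(rayleigh_resolution_arcsec(550e-9, 5e-3))  # ~28 arcsec
```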

Since every telescope has a finite resolution, but you can in principle just observe deeper and deeper, you eventually reach the stage where the whole sky is almost full of very faint objects, with few discernible gaps between them (at least for the telescopes in use today).

As a rough idea, there are often said to be at least $10^{11}$ galaxies in the observable universe and at least $10^{11}$ stars in our own galaxy. If we were to spread these evenly over the sky (probably OK for galaxies), then the separation between adjacent galaxies or stars would be about 2 arcseconds. For the stars, most big telescopes at good observing sites would be able to resolve these (though the stars are unevenly distributed, and telescopes are incapable of resolving all the stars towards the plane of the Milky Way, for example, because the density there is much higher). Galaxies, though, have a finite size - e.g. a galaxy of diameter 10 kpc seen at a distance of 1 Gpc has an angular size of about 2 arcseconds.
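
To make that arithmetic explicit, here is a short back-of-the-envelope sketch using only the round numbers quoted above ($10^{11}$ objects, a 10 kpc galaxy at 1 Gpc):

```python
import math

ARCSEC_PER_RAD = 206265            # arcseconds per radian
sky_arcsec2 = 41253 * 3600**2      # whole sky (41253 deg^2) in arcsec^2

# Mean separation if 1e11 objects were spread uniformly over the sky
n_objects = 1e11
print(math.sqrt(sky_arcsec2 / n_objects))        # ~2.3 arcsec

# Angular size of a 10 kpc galaxy at 1 Gpc (small-angle approximation,
# ignoring cosmological corrections)
size_kpc, distance_kpc = 10.0, 1.0e6
print(size_kpc / distance_kpc * ARCSEC_PER_RAD)  # ~2.1 arcsec
```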

Thus with the best telescopes and the deepest exposures, if you look very closely there is a galaxy (or at least the blurred image of a galaxy) intercepting almost every line of sight.

However, if you were to define some brightness limit for your pictures and ignore the likelihood that there were fainter objects in the "gaps", then you could attempt to put some percentage figure on it. For example, here is an image from the Hubble Ultra Deep Field.

[Image: the Hubble Ultra Deep Field]

You might estimate (by eye) that about 20% of the pixels are filled with a galaxy of some sort. There are about 10,000 identified galaxies in this $3.1 \times 3.1$ arcmin$^2$ image, so adjacent galaxies are in fact separated by only about 2 arcsec on average (see the calculation above), and you would be hard-pressed to count those 10,000 galaxies by eye, since most of them are extremely faint blurs occupying what you might initially perceive as gaps.
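
A quick check of those numbers, plus the naive extrapolation of that surface density to the whole sky (only illustrative, since the HUDF is a single small field):

```python
import math

n_gal = 10_000
area_arcsec2 = (3.1 * 60) ** 2            # the 3.1 x 3.1 arcmin field in arcsec^2
print(math.sqrt(area_arcsec2 / n_gal))    # ~1.9 arcsec mean separation

# Scale the same surface density up to the full sky (41253 deg^2)
sky_arcsec2 = 41253 * 3600**2
print(n_gal * sky_arcsec2 / area_arcsec2) # ~1.5e11 galaxies, of order the 10^11 above
```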

Finally, the answer you get will depend not only on the depth of your image but also on the resolution of the instrument taking it. To quote your own comment:

> For a given resolution, driving the depth to infinity sends the proportion of "nothing" to 0, but for a given depth (i.e. exposure time), increasing the resolution increases the proportion of sky that has nothing?

That is an accurate summary, at least with current instrumentation.

Edit (for the dedicated reader)

To further explain a few things: the answer above considers an "object" to be a resolved thing in the sky. Clearly galaxies, consisting of unresolved stars, are mostly empty space, and so the vast majority of sightlines will not intercept the surface of a star or anything else. However, that does not mean such a sightline is "dark", because all the instrumentation we have has a finite resolution that blurs the light from these stars into the image of a galaxy.

Some have commented on the finite age of the universe and Olbers' paradox, and have possibly misunderstood what is meant by "depth" in the quote above. Galaxies and stars have a vast range of luminosities, and the least luminous things are much more common than the more luminous. Even if you can only observe out to a set distance (e.g. set by the finite time since stars and galaxies were first formed), increasing the exposure time, or "depth", of your image will still reveal more and more of the less luminous objects.
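
As a sketch of why that is, consider a Schechter luminosity function with an illustrative faint-end slope of $\alpha = -1.3$ (the slope and normalisation here are assumptions for the illustration, not values from this answer): the cumulative number of galaxies above a limiting luminosity keeps climbing as the limit is pushed fainter.

```python
import numpy as np
from scipy.integrate import quad

# Schechter luminosity function: phi(x) dx = phi_star * x**alpha * exp(-x) dx,
# with x = L / L*.  alpha and the normalisation are illustrative choices.
alpha, phi_star = -1.3, 1.0

def number_above(x_min):
    """Relative number density of galaxies with L > x_min * L* (arbitrary units)."""
    val, _ = quad(lambda x: phi_star * x**alpha * np.exp(-x), x_min, np.inf)
    return val

for x_min in [1.0, 0.1, 0.01, 0.001]:
    print(f"L_min = {x_min:g} L*: N(>L_min) = {number_above(x_min):.2f}")

# The counts climb steeply as the limiting luminosity drops; for a faint-end
# slope alpha < -1 the total number formally diverges as L_min -> 0, even
# though the total luminosity stays finite for alpha > -2.
```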

If there were a lower limit to the luminosity of a galaxy, then in principle, yes, there might come a time when instrumentation was so good that increasing the exposure time would not reveal any more objects - but we aren't there yet. Even if that were the case, there is no guarantee at all, even with excellent angular resolution, that sightlines will not intercept any galaxies, because galaxies have a finite angular size - and angular size actually increases at large redshifts in the currently accepted cosmological model.
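
A minimal sketch of that last point, assuming a flat $\Lambda$CDM cosmology with round, roughly Planck-like parameters ($H_0 = 70$ km/s/Mpc, $\Omega_m = 0.3$; these are illustrative choices, not values from the answer): the angular size of a 10 kpc galaxy reaches a minimum near $z \approx 1.6$ and then slowly grows again.

```python
import numpy as np
from scipy.integrate import quad

H0 = 70.0                 # km/s/Mpc (illustrative)
Om, Ol = 0.3, 0.7         # flat Lambda-CDM
c = 299792.458            # km/s
D_H = c / H0              # Hubble distance in Mpc

def angular_diameter_distance(z):
    """Angular diameter distance (Mpc) in a flat universe: D_A = D_C / (1 + z)."""
    E = lambda zp: np.sqrt(Om * (1 + zp)**3 + Ol)
    comoving, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return D_H * comoving / (1.0 + z)

size_mpc = 0.01           # a 10 kpc galaxy
for z in [0.5, 1.0, 1.6, 3.0, 6.0, 10.0]:
    theta = size_mpc / angular_diameter_distance(z) * 206265
    print(f"z = {z:>4}: {theta:.2f} arcsec")

# The apparent size stops shrinking beyond z ~ 1.6 and slowly increases again,
# so very distant galaxies never become arbitrarily small on the sky.
```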

Finally, we should talk about wavelength. It is far easier to find "empty sky" in the optical (e.g. in the Hubble Deep Field), because the light from distant galaxies gets redshifted out of the visible range. It will be interesting to see how crowded JWST deep fields will be in the infrared at an equivalent depth. They will certainly be more crowded, but whether they present an "infrared wall" will depend on the uncertain details of the formation timescale, size scale and star-formation history of early galaxies, and on the shape of the bottom end of the galaxy luminosity function as a function of redshift.
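
As a simple illustration of that redshifting, observed wavelength scales as $\lambda_{\rm obs} = (1+z)\,\lambda_{\rm rest}$, so rest-frame green light from a high-redshift galaxy arrives well into the infrared (the wavelength and redshifts below are arbitrary examples):

```python
# Rest-frame 0.5 micron (green) light observed at various redshifts
for z in [1, 3, 6, 10]:
    print(f"z = {z:>2}: observed at {0.5 * (1 + z):.1f} micron")
```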
