Why are the (extremely) distant type Ia supernovae dimmer than expected? Relative to what? Their cosmological redshift? Well, what else? Please don't just say "Because the expansion is accelerating". Of course I accept that, but what's the actual mechanism? Is it simply because the cosmological redshift $z$ increases linearly with distance while the intensity decreases as the inverse square of distance, so the intensity is affected more by the increased distance?

1 Answer

The supernovae are observed at some fixed redshift. Their peak brightness depends on how far away they are, and changing the cosmological parameters changes the relationship between redshift and distance. In this case, introducing the cosmological constant means that, for a given redshift, the supernovae are further away than expected in a decelerating expansion, and hence dimmer.
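
To make "further away, hence dimmer" quantitative: a standard candle of intrinsic luminosity $L$ at luminosity distance $d_L$ is observed with flux

$$f = \frac{L}{4\pi d_L^2},$$

so anything that increases $d_L$ at a fixed redshift makes the supernova appear fainter.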

A point of confusion here is that redshift is only directly related to proper distance now, at the time of observation. The Hubble "constant" in fact changes with time in a complicated way (it was bigger in the past), depending on the values of the matter and dark energy densities.
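
For the simplest relevant case, a spatially flat universe containing only matter and a cosmological constant, the Friedmann equation gives this time dependence explicitly:

$$H(z) = H_0\sqrt{\Omega_m(1+z)^3 + \Omega_\Lambda},$$

so $H$ was indeed larger in the past, by an amount set by the density parameters $\Omega_m$ and $\Omega_\Lambda$.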

If we estimate the distance to a supernova using Hubble's law and the value of $H_0$ today, then we overestimate its light-travel distance if the expansion has been slowing down. On the other hand, if the expansion has accelerated over the last few billion years, then that brings the estimated distance back up towards the naive Hubble's-law value, and it even exceeds it for redshifts of $z\sim 0.5$. Thus, when we say the distant supernovae are dimmer than expected, we mean they are further away than expected for a decelerating universe containing only matter.
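
Here is a minimal numerical sketch of that comparison, assuming a flat FLRW universe with illustrative values $H_0 = 70~\mathrm{km\,s^{-1}\,Mpc^{-1}}$ and $(\Omega_m, \Omega_\Lambda) = (1, 0)$ versus $(0.3, 0.7)$; the function names are my own, not from any particular library:

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458   # speed of light [km/s]
H0 = 70.0             # Hubble constant today [km/s/Mpc] -- illustrative value

def E(z, omega_m, omega_lambda):
    """Dimensionless expansion rate H(z)/H0 for a flat FLRW universe."""
    return np.sqrt(omega_m * (1.0 + z)**3 + omega_lambda)

def luminosity_distance(z, omega_m, omega_lambda):
    """d_L = (1+z) * (c/H0) * integral_0^z dz'/E(z')  [Mpc], flat universe."""
    integral, _ = quad(lambda zp: 1.0 / E(zp, omega_m, omega_lambda), 0.0, z)
    return (1.0 + z) * (C_KM_S / H0) * integral

z = 0.5
d_decel = luminosity_distance(z, 1.0, 0.0)  # matter-only: decelerating
d_accel = luminosity_distance(z, 0.3, 0.7)  # with cosmological constant

# A standard candle's flux falls as 1/d_L^2, so the magnitude difference is
# Delta m = 5 log10(d_accel / d_decel); positive means "appears dimmer".
delta_m = 5.0 * np.log10(d_accel / d_decel)
print(f"d_L at z={z}: matter-only {d_decel:.0f} Mpc, Lambda-CDM {d_accel:.0f} Mpc")
print(f"the Lambda-CDM supernova appears {delta_m:.2f} mag dimmer")
```

At $z = 0.5$ the accelerating model gives a luminosity distance roughly 20 per cent larger than the matter-only model, i.e. the supernova appears a few tenths of a magnitude dimmer, comparable in size to the effect the supernova surveys measured.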

  • You are right. The cosmological redshift and the inverse-square intensity do not disagree; it is simply relative to what was expected, which was a decelerating expansion. Commented Jan 30 at 6:51
