$\begingroup$

The $n$-point correlation functions of QCD, which define the theory, are computed by performing functional derivatives on $Z_{QCD}[J]$, the generating functional of QCD,

$$\frac{\delta^nZ_{QCD}[J]}{\delta J(x_1)...\delta J(x_n)},$$

where $J$ stands collectively for the different field sources, which are set to zero after differentiating. In a perturbative approach, we can expand in the (weak) gluon coupling constant, but this expansion produces loop integrals that generally diverge, and one has to deal with these singularities, which arise because the integrations run over every possible energy at every point in space-time.
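Schematically (suppressing gauge fixing, ghosts, and all color and flavor indices), the generating functional I have in mind is

$$Z_{QCD}[J,\bar\eta,\eta]=\int\mathcal{D}A\,\mathcal{D}\bar\psi\,\mathcal{D}\psi\;\exp\left(iS_{QCD}[A,\bar\psi,\psi]+i\int d^4x\left(J^a_\mu A^\mu_a+\bar\eta\,\psi+\bar\psi\,\eta\right)\right),$$

so that, for example, the quark propagator is obtained from two functional derivatives with respect to $\eta$ and $\bar\eta$, evaluated at vanishing sources.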

The first step is regularizing the theory: a regulator is introduced in the loop integrals so that they become convergent. One can then redefine the gluon coupling constant, the quark masses and the field strengths that appear in the generating functional of QCD in such a way that these infinities cancel. This redefinition is called renormalization, and with it we can compute finite, renormalized correlation functions.
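As a rough sketch of what I mean (in, say, dimensional regularization), the bare quantities in the Lagrangian are traded for renormalized, scale-dependent ones through renormalization constants that absorb the divergences,

$$g_0=Z_g\,g(\mu),\qquad m_0=Z_m\,m(\mu),\qquad A_0^\mu=Z_3^{1/2}A^\mu,\qquad \psi_0=Z_2^{1/2}\psi,$$

with the $Z$'s chosen order by order so that the $1/\epsilon$ poles cancel in the correlation functions.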

From this perturbative approach, we also see that the renormalized coupling and masses depend on the energy scale in such a way that the correlation functions are independent of that scale.
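Concretely, the statement I have in mind is the renormalization group equation (writing the standard one-loop result for definiteness),

$$\mu\frac{dg(\mu)}{d\mu}=\beta(g),\qquad \beta(g)=-\frac{g^3}{16\pi^2}\left(11-\frac{2}{3}n_f\right)+\mathcal{O}(g^5),$$

whose one-loop solution is the running coupling $\alpha_s(\mu)=\alpha_s(\mu_0)\Big/\left[1+\frac{\beta_0}{2\pi}\alpha_s(\mu_0)\ln(\mu/\mu_0)\right]$ with $\beta_0=11-\tfrac{2}{3}n_f$.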

Now, start again from the step of computing the correlation functions by functional derivatives of $Z_{QCD}$, but from a non-perturbative approach (forgetting everything about the perturbative one). I understand that there is no way to do this computation directly and that one has to use non-perturbative methods such as lattice QCD. But where does the need to regularize and renormalize QCD come in? Everywhere I have looked, the answers rely on the perturbative approach, using arguments such as problematic divergences or a scale-dependent mass and coupling, and I don't see how these are present in the non-perturbative approach.

The only two things that, to my eyes, could cause problems are the facts that the coupling and mass are bare parameters with no physical value and that the theory is defined by allowing every possible energy at every point in space-time, but I don't understand where these apparent pathologies come in.

$\endgroup$
  • 2
    $\begingroup$ Lattice QCD (any lattice approach) inherently introduces a regularization simply by discretizing space and computing in a finite "box". Roughly speaking: the lattice spacing introduces a UV regularization and the lattice extent an IR regularization. To extract results, a continuum limit and an infinite-volume limit have to be performed, and those two involve "non-perturbative renormalization in lattice field theory", which basically entails how to get physical values for observables from lattice ones. I am no expert on the details, but maybe this gives an idea and the correct buzzword. $\endgroup$
    – N0va
    Commented Aug 17, 2023 at 22:38
  • 1
    $\begingroup$ Just to add a bit: as N0va mentioned, the lattice is a regulator itself, and all measured quantities will be finite and well defined at fixed spacing. If you take the continuum limit (lattice spacing $\to 0$) naively, you'll find that your measured quantities diverge numerically; performing renormalization of your lattice operators ensures you obtain the correct, finite, physical, renormalized results in the continuum limit. You also don't have to do the non-perturbative renormalization N0va mentioned; the more naive way is just to do lattice perturbation theory to renormalize. $\endgroup$ Commented Aug 18, 2023 at 20:28

1 Answer

10
$\begingroup$

The Wilsonian viewpoint of renormalization (see this excellent answer by Abdelmalek Abdesselam) is not conceptually tied to perturbative expansions at all. Rather, it conceives of a quantum field theory as having an inherent scale $\Lambda$ (in the simplest case a hard momentum cutoff for Fourier modes in the path integral), and "renormalizing" is starting from a theory at scale $\Lambda$ and going to a scale $\Lambda'$ by integrating out more modes - this transformation is the renormalization (semi-)group flow.
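For a concrete (if schematic) picture, take a scalar theory in Euclidean signature and split the field into modes below and above the new scale, $\phi=\phi_<+\phi_>$, with $\phi_>$ supported on momenta $\Lambda'<|k|\le\Lambda$. The Wilsonian effective action at the lower scale is defined by integrating out only the high modes,

$$e^{-S_{\Lambda'}[\phi_<]}=\int\mathcal{D}\phi_>\;e^{-S_{\Lambda}[\phi_<+\phi_>]},$$

and iterating this step generates the RG flow; nothing in this construction refers to an expansion in a small coupling.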

In this framework, QFT really deals with the trajectory under the RG flow: at each scale $\Lambda$ we have one theory, connected by renormalization to all the other theories along the trajectory. It's not that we start with one theory and then renormalize it; rather, "the theory" is really given by its versions at all scales. But because of the renormalization group equations it suffices to give one point along the trajectory to determine the full trajectory, so we tend to think of this as "starting" at one scale and renormalizing to the others.
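In terms of the couplings $g_i(\Lambda)$ parametrizing the action at scale $\Lambda$, the trajectory is the solution of the flow equations

$$\Lambda\frac{dg_i(\Lambda)}{d\Lambda}=\beta_i\big(g_1(\Lambda),g_2(\Lambda),\dots\big),$$

a first-order system, which is why specifying the couplings at a single scale is enough to determine the whole trajectory.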

Now, the "scaleless" theory you mean when you talk about the non-perturbative QCD $Z_\text{QCD}$ is really the $\Lambda\to\infty$ limit of this trajectory. In practice for most realistic theories it will turn out that you cannot compute this without hitting divergences even outside of perturbation theory: The standard approach is to put the theory on the lattice with a momentum cutoff and you will typically find that the limit where the lattice spacing goes to zero and the cutoff to infinity introduces ugly divergences in the correlators: The problem is just that something like $\langle \phi(x)^2\rangle$ will always diverge unless you renormalize.

From yet another viewpoint, renormalization is simply "resolving" an ambiguity in the definition of the quantum field theory: while its concrete implementation is tied to perturbation theory, the core insight of Epstein-Glaser renormalization is that something like $\phi(x)^4$ is actually ill-defined. While there is no problem with writing down such a term in a classical field theory, in quantum field theory the quantum field has to be an operator-valued distribution (see this answer of mine), not a function, and the pointwise product of distributions does not, in general, exist - at least not uniquely and not without further specification.
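A standard way to see the problem (a free-field sketch, not specific to QCD): already the simplest Wick power requires a subtraction, because the two-point function of a massless scalar in four dimensions behaves, up to the usual $i\epsilon$ prescription, like

$$\langle 0|\phi(x)\phi(y)|0\rangle\sim\frac{1}{4\pi^2}\frac{1}{(x-y)^2},$$

which blows up in the coincidence limit $y\to x$. The naive square $\phi(x)^2$ therefore has no meaning as it stands; one defines $:\!\phi^2(x)\!:\,=\lim_{y\to x}\big[\phi(x)\phi(y)-\langle 0|\phi(x)\phi(y)|0\rangle\big]$, and interacting powers like $\phi(x)^4$ require analogous, but less unique, specifications.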

In this viewpoint, all the infinities the other approaches encounter are just the price they have to pay for ignoring this fundamental flaw in the setup of the theory, and getting rid of the infinities is a post-hoc fix for this. Garbage in (an action containing ill-defined quantities), garbage out (divergences).

So here renormalization turns out to simply be what's missing to make the theory well-defined - we have to specify, for each of the pointwise products of fields in the Lagrangian, how that product is supposed to work. The renormalization parameters then arise as the freedom of choice we have in making this specification.
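As a concrete (and standard) example of such a freedom, in Euclidean position space the square of the massless propagator in four dimensions is, away from coinciding points, proportional to $1/(x^2)^2$, which is not locally integrable at $x=0$ and hence not yet a distribution on all of spacetime. It can be extended to $x=0$, but any two admissible extensions differ by a local term,

$$\left[\frac{1}{(x^2)^2}\right]_{\text{ext}}-\left[\frac{1}{(x^2)^2}\right]'_{\text{ext}}=c\,\delta^{(4)}(x),$$

and the undetermined constant $c$ is precisely a renormalization parameter that has to be fixed by a renormalization condition (for worse scaling behaviour, derivatives of the delta distribution appear as well).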

In any case, both the Wilsonian and the Epstein-Glaser viewpoint agree that renormalization is neither inherently perturbative nor inherently related to "infinities" - it is simply a necessary part of what you have to consider when you really think in depth about what "a QFT" really is.

$\endgroup$
  • $\begingroup$ I like the "ill-defined product" approach the most. Does this mean a good way to think of it is that quantization - i.e. the idea of trying to "turn quantum" a "classical" theory is simply insufficient to define the theory when it comes to field systems? To what extent can we fix the distributional product using the total corpus of empirical data we have today? Can we make that "approximate" product fully Lorentz symmetric? With the approximate product, can we, say, prove a bound hydrogen atom exists in fully relativistic treatment? $\endgroup$ Commented Aug 18, 2023 at 0:13
  • $\begingroup$ That is to say, given we have the Standard Model Lagrangian, with lots of examples of distributional products, and given we have a massive pile of data from all the particle experiments so far that haven't yet really contradicted it, can we take all that data and somehow "big data" out of it the "shape" of the specific distributional product Nature/our universe "prefers"? $\endgroup$ Commented Aug 18, 2023 at 0:14
  • $\begingroup$ So the idea that we should conceive "quantum field theory as having an inherent scale" comes from the fact that "In practice, for most realistic theories it will turn out that you cannot compute this without hitting divergences even outside of perturbation theory"? (not taking into account the viewpoint of Epstein-Glaser renormalization) $\endgroup$
    – orochi
    Commented Aug 18, 2023 at 0:38
  • $\begingroup$ @The_Sympathizer I probably wasn't clear enough about this point: We cannot, in general, define "the distributional product" - this is mathematically impossible. What we need to do is to define for each specific "product" term we write into our Lagrangian what we actually mean by it, and defining that is precisely choosing the renormalization parameters. E-G renormalization is not more powerful than the usual renormalization procedures, it just avoids running into explicit infinities. $\endgroup$
    – ACuriousMind
    Commented Aug 18, 2023 at 7:27
  • 1
    $\begingroup$ @orochi This whole affair is much more complicated than I can explain in a single answer. In the end, the Wilsonian viewpoint has notions of UV limits/fixed points that do correspond to sending the cutoff to infinity without running into divergences, it's just that because we already "dealt" with renormalization properly in this approach this limit isn't related to the "naive" scale-less theory you'd have written down in a simple way. $\endgroup$
    – ACuriousMind
    Commented Aug 18, 2023 at 7:37
