
I have a (Gaussian) random function (aka "stochastic process" or "random field") $(f(t))_{t\in \mathbb{R}^d}$. I now want to consider the vector valued random function $g=(f, \nabla f)$. The question is: When is the covariance function of $g$ strictly positive definite?

I am looking for sufficient conditions, especially for the case where $f$ is isotropic, and in particular for the covariance functions that are valid in all dimensions, which are of the form $$ \tag{standard isotropic} C_f(x,y) = \int_{[0,\infty)} \exp\bigl( -\frac{\|x-y\|^2t^2}2\bigr) \mu(dt) $$ for some finite measure $\mu$ on $[0,\infty)$. Ideally the result holds for all of them.
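For concreteness, covariances of this family are easy to evaluate for a discrete mixing measure. The helper below is a small sketch (its name and the discrete $\mu$ are my choices, for illustration): with $\mu = \delta_1$ it recovers the Gaussian (RBF) kernel.

```python
import numpy as np

def standard_isotropic_cov(x, y, ts, weights):
    """C_f(x, y) = sum_i w_i * exp(-||x - y||^2 * t_i^2 / 2),
    i.e. the 'standard isotropic' form with a discrete mixing
    measure mu = sum_i w_i * delta_{t_i} (illustrative choice)."""
    r2 = float(np.sum((np.asarray(x) - np.asarray(y)) ** 2))
    return float(np.sum(np.asarray(weights) * np.exp(-r2 * np.asarray(ts) ** 2 / 2.0)))

# mu = delta_1 recovers the Gaussian kernel exp(-||x - y||^2 / 2):
x, y = np.array([0.0, 0.0]), np.array([1.0, 2.0])
rbf = np.exp(-np.sum((x - y) ** 2) / 2.0)
assert np.isclose(standard_isotropic_cov(x, y, [1.0], [1.0]), rbf)
```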

Positive Definite Refresher

The covariance function $C_f(x,y) = \text{Cov}(f(x),f(y))$ of $f$ is (strictly) positive definite if for any finite number of distinct points $x_1,\dots, x_n$ the matrix $$ (C_f(x_i, x_j))_{i,j=1,\dots,n} $$ is (strictly) positive definite. Strict positive definiteness is thus equivalent to: for all $v\in \mathbb{C}^n\setminus\{0\}$ we have $$ \sum_{i,j=1}^n v_i \bar{v}_j C_f(x_i, x_j) > 0. $$

For matrix valued covariance functions such as $C_g$, the natural extension is to either treat the index of $g$ as another input, i.e. $g(i,x) = g_i(x)$, ensuring that $g$ is scalar valued again, or alternatively to require, for vector valued coefficients $v_i$ (not all zero), $$ \sum_{i,j=1}^n v^*_j C_g(x_i, x_j)v_i > 0. $$ Both approaches are equivalent.

Another equivalent formulation of strict positive definiteness for random functions is that $$ \mathbb{V}\Bigl(\sum_{i=1}^n v_i f(x_i) \Bigr) > 0, $$ i.e. it is impossible to obtain something deterministic from a non-trivial finite linear combination of random function evaluations. In the case of $g$ this means $$ 0<\mathbb{V}\Bigl(\sum_{j=0}^d\sum_{i=1}^n v_i^j g_j(x_i) \Bigr) = \mathbb{V}\Bigl[\sum_{i=1}^n \Bigl(v^0_i f(x_i)+\sum_{j=1}^d v_i^j \partial_j f(x_i)\Bigr)\Bigr].$$
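As a quick numerical illustration of the scalar definition (a sketch, using the Gaussian kernel, which is strictly positive definite): the Gram matrix at distinct points should have a strictly positive smallest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))  # 8 distinct points in R^3

# Gram matrix of the Gaussian kernel C_f(x, y) = exp(-||x - y||^2 / 2)
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq / 2.0)

# Strict positive definiteness of the matrix <=> smallest eigenvalue > 0
lam_min = np.linalg.eigvalsh(K).min()
assert lam_min > 0
```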

Possible Approaches to this Problem

Sufficient conditions for $C_f$ to be strictly positive definite

  1. If $C_f$ is continuous and stationary, and there exists no non-zero trigonometric polynomial which vanishes on the support of the spectral measure of $C_f$ (Sasvári 2013, Theorem 3.1.4). Special cases:

    1. If the support of the spectral measure of $C_f$ contains a non-empty open set (Theorem 3.1.6)
    2. If $C_f$ is radial (i.e. $f$ is isotropic) and the dimension is greater than 2 (Theorem 3.1.5)
  2. If $C_f$ is a universal kernel, it is strictly positive definite (Carmeli et al. 2010), although this statement is made there without proof (I am missing a reference). It also appears to be sufficient that the support of the spectral measure has positive Lebesgue measure (https://math.stackexchange.com/a/4446283/445105)
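A non-example may help to see why the support condition in point 1 matters: $C(x,y)=\cos(x-y)$ is stationary with spectral measure $\tfrac12(\delta_{-1}+\delta_1)$, and the obstruction shows up as a rank deficiency of every Gram matrix with more than two points (a numerical sketch):

```python
import numpy as np

# C(x, y) = cos(x - y) = cos(x)cos(y) + sin(x)sin(y), so every Gram matrix
# is c c^T + s s^T with c_i = cos(x_i), s_i = sin(x_i): rank <= 2, hence
# singular (not strictly positive definite) for n >= 3 distinct points.
x = np.array([0.0, 0.7, 1.9, 3.1])
K = np.cos(x[:, None] - x[None, :])
lams = np.linalg.eigvalsh(K)  # ascending order

assert lams.min() > -1e-12    # still positive semi-definite
assert abs(lams[1]) < 1e-10   # two (near-)zero eigenvalues: rank <= 2
```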

So under the standard isotropy assumption, we know that $C_f$ is certainly strictly positive definite, as the covariance is radial. The question is how to get anything about $g$. Since treating the index as an input via $g(i,x)=g_i(x)$ results in a covariance function that is not even stationary with regard to the input $(i,x)$, there is no spectral measure for $g$ to work with. So I am a bit out of ideas.
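A numerical sanity check of what is being asked (a sketch, not a proof): for the Gaussian kernel $\exp(-\|x-y\|^2/2)$, i.e. $\mu=\delta_1$, the joint covariance matrix of $g=(f,\nabla f)$ can be assembled from the standard derivative identities of that kernel, and its smallest eigenvalue comes out strictly positive for randomly chosen distinct points. The function name and point layout are my choices.

```python
import numpy as np

def joint_cov(X):
    """Covariance matrix of g = (f, grad f) at the rows of X for the
    Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2), using
      Cov(f(x), d_j f(y))     = (x - y)_j k
      Cov(d_i f(x), f(y))     = -(x - y)_i k
      Cov(d_i f(x), d_j f(y)) = (delta_ij - (x - y)_i (x - y)_j) k."""
    n, d = X.shape
    K = np.zeros((n * (d + 1), n * (d + 1)))
    for a in range(n):
        for b in range(n):
            r = X[a] - X[b]
            k = np.exp(-r @ r / 2.0)
            ia, ib = a * (d + 1), b * (d + 1)
            K[ia, ib] = k
            K[ia, ib + 1: ib + 1 + d] = r * k
            K[ia + 1: ia + 1 + d, ib] = -r * k
            K[ia + 1: ia + 1 + d, ib + 1: ib + 1 + d] = (np.eye(d) - np.outer(r, r)) * k
    return K

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 2))          # 5 distinct points in R^2
lam_min = np.linalg.eigvalsh(joint_cov(X)).min()
assert lam_min > 0                       # empirically strictly positive definite
```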


1 Answer


We need to show, for all $a\in \mathbb{R}^n$ and all $v_1,\dots, v_n\in \mathbb{R}^d$, not all zero, that $$ \text{Var}\Bigl[\sum_{k=1}^n \bigl(a_k f(x_k) + D_{v_k}f(x_k)\bigr)\Bigr] > 0. $$ We may assume $f$ to be centered, switching to $\tilde{f}=f - \mathbb{E}[f]$ if needed. Then \begin{align*} \text{Var}\Bigl[\sum_{k=1}^n \bigl(a_k f(x_k) + D_{v_k} f(x_k)\bigr)\Bigr] = \sum_{k,j=1}^n \begin{aligned}[t] \Bigl(&a_k a_j\text{Cov}(f(x_k), f(x_j)) \\ &+ a_k \text{Cov}(f(x_k), D_{v_j}f(x_j)) \\ &+ a_j \text{Cov}(D_{v_k}f(x_k), f(x_j)) \\ &+ \text{Cov}(D_{v_k} f(x_k), D_{v_j}f(x_j)) \Bigr). \end{aligned} \end{align*}

We use the spectral representation of $C_f$, i.e. $$ C_f(x,y) = \int e^{i \langle x-y, t\rangle} \sigma(dt). $$ Using the notation $C_f(D_v x, y) = \frac{d}{ds}\big|_{s=0} C_f(x+sv, y)$ (and similarly in the second argument) we have \begin{align*} C_f(D_v x, y) &= \int e^{i \langle x-y, t\rangle} i \langle v, t\rangle \sigma(dt) \\ C_f(D_v x, D_w y) &= \int e^{i \langle x-y, t\rangle} \langle v, t\rangle\langle w, t\rangle \sigma(dt). \end{align*} This implies \begin{align*} &\text{Var}\Bigl[\sum_{k=1}^n \bigl(a_k f(x_k) + D_{v_k} f(x_k)\bigr)\Bigr] \\ &= \int \sum_{k,j=1}^n \begin{aligned}[t] \Bigl(&a_k a_j e^{i \langle x_k - x_j, t\rangle} - a_k e^{i \langle x_k - x_j, t\rangle}i \langle v_j, t\rangle \\ &+ a_j e^{i \langle x_k - x_j, t\rangle} i \langle v_k, t\rangle + e^{i \langle x_k - x_j, t\rangle} \langle v_k,t\rangle\langle v_j,t\rangle \Bigr) \sigma(dt) \end{aligned} \\ &= \int \begin{aligned}[t] &\Bigl|\sum_{k=1}^n a_k e^{i \langle x_k, t\rangle}\Bigr|^2 \\ &+\Bigl(\sum_{k=1}^n a_k e^{i \langle x_k, t\rangle}\Bigr)\Bigl(\sum_{j=1}^n e^{-i\langle x_j, t\rangle}(-i\langle v_j,t\rangle)\Bigr) \\ &+ \Bigl(\sum_{k=1}^n e^{i\langle x_k,t\rangle}i\langle v_k, t\rangle\Bigr)\Bigl(\sum_{j=1}^n a_j e^{-i \langle x_j, t\rangle}\Bigr) \\ &+ \Bigl( \sum_{k=1}^n e^{i \langle x_k, t\rangle} i\langle v_k,t\rangle\Bigr) \Bigl( \sum_{j=1}^n e^{-i \langle x_j, t\rangle} (-i\langle v_j,t\rangle)\Bigr) \,\sigma(dt) \end{aligned} \\ &= \int P_a(t)\overline{P_a(t)} + P_a(t)\overline{P_v(t)} + P_v(t)\overline{P_a(t)} + P_v(t)\overline{P_v(t)} \,\sigma(dt) \\ &= \int \bigl|P_a(t) + P_v(t)\bigr|^2 \sigma(dt) \end{align*} with \begin{align*} P_a(t) &:= \sum_{k=1}^n a_k e^{i \langle x_k, t\rangle} \\ P_v(t) &:= i\sum_{k=1}^n\langle v_k,t\rangle e^{i \langle x_k, t\rangle}. \end{align*}

In the standard isotropic case the spectral measure has support $\mathbb{R}^d$ (as soon as $\mu$ is not concentrated in $0$), so by the continuity of $$ P(t) = P_a(t) + P_v(t) = \sum_{k=1}^n (a_k + i\langle v_k,t\rangle) e^{i \langle x_k, t\rangle} $$ the integral can only vanish if $P \equiv 0$, and it remains to show that $P \equiv 0$ implies $a_k = 0$ and $v_k = 0$ for all $k$ (this is shown here: https://math.stackexchange.com/a/4903537/445105).
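The final identity can be checked by Monte Carlo for the Gaussian kernel $\exp(-\|x-y\|^2/2)$, whose spectral measure $\sigma$ is the standard normal distribution: the closed-form variance (via the kernel derivatives) should agree with $\mathbb{E}\,|P_a(T)+P_v(T)|^2$ for $T\sim\mathcal{N}(0,I_d)$. A sketch, with randomly chosen $x_k$, $a_k$, $v_k$:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 2, 4
X = rng.standard_normal((n, d))   # points x_k
a = rng.standard_normal(n)        # coefficients a_k
V = rng.standard_normal((n, d))   # directions v_k

# Left-hand side: Var[sum_k a_k f(x_k) + D_{v_k} f(x_k)] via the derivatives
# of k(x, y) = exp(-||x - y||^2 / 2):
#   Cov(f(x), D_w f(y))     = <w, x - y> k
#   Cov(D_v f(x), f(y))     = -<v, x - y> k
#   Cov(D_v f(x), D_w f(y)) = (<v, w> - <v, x - y><w, x - y>) k
var = 0.0
for k in range(n):
    for j in range(n):
        r = X[k] - X[j]
        c = np.exp(-r @ r / 2.0)
        var += c * (a[k] * a[j]
                    + a[k] * (V[j] @ r)
                    - a[j] * (V[k] @ r)
                    + V[k] @ V[j] - (V[k] @ r) * (V[j] @ r))

# Right-hand side: int |P_a + P_v|^2 dsigma with sigma = N(0, I_d) and
# P_a(t) + P_v(t) = sum_k (a_k + i<v_k, t>) e^{i<x_k, t>}
T = rng.standard_normal((200_000, d))
P = ((a + 1j * (T @ V.T)) * np.exp(1j * (T @ X.T))).sum(axis=1)
mc = np.mean(np.abs(P) ** 2)

assert var > 0
assert abs(mc - var) / var < 0.1  # Monte Carlo agreement within 10%
```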

