
Given an infinite sequence of Bernoulli RVs $X_1,X_2,...$ (which may be differently distributed and mutually dependent), we are given that for every $n>0$, $$\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right] \leq a\log n~,$$ with $a>0$ some deterministic parameter. Intuitively, this means that the expected number of failures up to time $n$ grows only logarithmically, so for large $n$ it is negligible relative to $n$.

My question is whether this implies a bound on $\mathbb{E}\tau$ as a function of $a$ (any guarantee that $\mathbb{E}\tau$ is finite would already satisfy me), where $$\tau \triangleq \min \{t>0:X_t=1\} $$ is the time of the first success.
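For concreteness, here is one illustrative family (my own example, just to fix ideas): if the $X_t$ are independent with failure probabilities $1-p_t=\min\{1,a/t\}$, then

$$\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right]=\sum_{t=1}^{n}\min\{1,a/t\}=O(a\log n),$$

so a condition of this logarithmic form holds up to constants, whereas an i.i.d. sequence with a fixed success probability $p<1$ gives $n(1-p)$, which grows linearly and violates it for large $n$.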

Thank you!

Edit: Sorry for the confusion in my original post; I have edited it to clarify that the bound on $\mathbb{E}\tau$ should not depend on $n$ and that $\tau$ denotes the first success at any time.

  • You ask whether $E[\tau]$ can be bounded in terms of $n$; so to be sure, are you assuming a finite collection $X_1,...,X_n$? Commented Mar 21, 2023 at 3:59

1 Answer


Let's say $X_t\sim \operatorname{Bern}(p_t)$. Your given inequality equivalently states

$$E\left[\sum_{t=1}^n (1-X_t)\right]=n-(p_1+...+p_n)\leq a\log n.$$

Also, since $\tau:=\min\{t:1\leq t\leq n,\ X_t=1\}$ (with the convention $\tau=n$ when there is no success by time $n$) takes values in $\{1,...,n\}$, the tail-sum formula gives the following:

$$E[\tau]=\sum_{T=1}^n P(\tau \geq T).$$
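For completeness, this identity follows by writing $P(\tau\geq T)=\sum_{k=T}^{n}P(\tau=k)$ and swapping the order of summation:

$$\sum_{T=1}^n P(\tau\geq T)=\sum_{T=1}^n\sum_{k=T}^{n}P(\tau=k)=\sum_{k=1}^{n}\sum_{T=1}^{k}P(\tau=k)=\sum_{k=1}^{n}k\,P(\tau=k)=E[\tau].$$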

Note for $T\geq 2$, $$P(\tau\geq T)=P(X_1=0,X_2=0,...,X_{T-1}=0)\leq P(X_{T-1}=0)=1-p_{T-1},$$

and note that this inequality a) extends to $T=1$ by defining $p_0:=0$ (since $P(\tau\geq 1)=1=1-p_0$), and b) cannot be sharpened in general, because it holds with equality when the $X_i$ are identically distributed and perfectly correlated, i.e., when they all take the same value.
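To see the equality case concretely: if $X_1=X_2=\cdots$ almost surely, each with success probability $p$, then for every $2\leq T\leq n$,

$$P(\tau\geq T)=P(X_1=0,...,X_{T-1}=0)=P(X_{T-1}=0)=1-p.$$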

Thus, we have

$$E[\tau]=\sum_{T=1}^n P(\tau\geq T)\leq \sum_{T=1}^n (1-p_{T-1})= n-(p_1+...+p_n)+p_n-p_0\leq a\log n+p_n\leq 1+a\log n.$$
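To get a sense of scale (purely illustrative numbers, taking $\log$ as the natural logarithm): with $a=2$ and $n=100$ this gives $E[\tau]\leq 1+2\log 100\approx 10.2$. Note that the bound still grows with $n$, so by itself it does not yield a finite bound for the untruncated first-success time discussed in the comments below.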

  • Thank you for your comment! I'm sorry for the confusion, but actually I'm looking for a bound on the expected value of $\tau\triangleq\min \{t>0:X_t=1\}$, which may depend on $a$; I will be glad as long as it's not infinite (I'll edit my post to clarify this). For instance, consider the following cases, where the bound scales as $n$ instead of $\log n$: 1) i.i.d. sequence: $\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right] =n(1-p)$, and still $\mathbb{E} \tau =\frac{1}{p}$. 2) Perfectly correlated: $\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right] =n(1-p)$ but $\mathbb{E}\tau$ is infinite.
    – Tanakak
    Commented Mar 21, 2023 at 8:11
  • @OrRaveh But your question is not well-posed now. You want the expected value to be finite, but, as you mention, it can be infinite (as the perfectly correlated case shows). Commented Mar 21, 2023 at 12:52
  • But unless I'm missing something, the perfectly correlated case doesn't satisfy the condition that $\mathbb{E}\left[\sum_{t=1}^{n}(1-X_t)\right] \leq a\log n$, as the expected sum until time $n$ grows linearly in $n$.
    – Tanakak
    Commented Mar 21, 2023 at 12:55
  • @OrRaveh I see. That condition is met when $1-a{\log n \over n}\leq {1 \over n}\sum_{i=1}^n p_i$, so it holds when $p_i$ is large enough in the long run, e.g. something like the limsup of $p_i$ being 1. To get a finite expectation, you would need to sharpen the first inequality in the last line of my proof, so I have to think about it. Anyway, I'll leave the answer up in case it helps anyone else. Commented Mar 21, 2023 at 13:23
  • Thank you, I will also keep thinking about it.
    – Tanakak
    Commented Mar 21, 2023 at 14:18
