I have seen a lot of posts that describe the case for just 2 random variables.
Independent random variables and function of them
Are functions of independent variables also independent?
If $X$ and $Y$ are independent then $f(X)$ and $g(Y)$ are also independent.
If $X$ and $Y$ are independent. How about $X^2$ and $Y$? And how about $f(X)$ and $g(Y)$?
Are squares of independent random variables independent?
Prove that if $X$ and $Y$ are independent, then $h(X)$ and $g(Y)$ are independent in BASIC probability -- can we use double integration? (oh I actually asked the 2 variable elementary case here, but there's no answer)
I have yet to see a post that describes the case for at least 3 random variables.
Please answer for the following 2 situations:
1 - for advanced probability theory:
Let $X_i: \Omega \to \mathbb R$, $i \in I$, be independent random variables on $(\Omega, \mathscr F, \mathbb P)$, where $I$ is any index set, I think (or maybe it has to be countable). Of course, assume $\operatorname{card}(I) \ge 3$. Then show that the $f_i(X_i)$ are independent; that is, give conditions on the $f_i$ such that the $f_i(X_i)$ are independent. I read in the posts above that the condition is 'measurable', which I guess means $\mathscr F$-measurable, but I could have sworn I read before that the condition is supposed to be 'bounded and Borel-measurable', as in bounded and $\mathscr B(\mathbb R)$-measurable for $(\mathbb R, \mathscr B(\mathbb R), \text{Lebesgue})$.
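For what it's worth, here is the sketch I have seen for the 'measurable' version, assuming each $f_i$ is Borel-measurable (this is my own attempt, so please correct it if it's off):
$$\sigma\big(f_i(X_i)\big) = \big\{ X_i^{-1}\big(f_i^{-1}(B)\big) : B \in \mathscr B(\mathbb R) \big\} \subseteq \sigma(X_i),$$
since $f_i^{-1}(B) \in \mathscr B(\mathbb R)$ whenever $B \in \mathscr B(\mathbb R)$. So if the $\sigma$-algebras $\sigma(X_i)$, $i \in I$, are independent, then the smaller $\sigma$-algebras $\sigma(f_i(X_i))$ are independent as well, and hence so are the $f_i(X_i)$. Notice that boundedness is never used here, which is part of why I'm confused about the 'bounded and Borel-measurable' condition.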
2 - for elementary probability theory:
Let $X_i: \Omega \to \mathbb R$ be independent random variables that have pdf's. Use the elementary definition of independence, i.e. 'independent if the joint pdf splits up', or something like that. I guess the index set $I$ need not be finite, in which case I think the definition is that the joint pdf of any finite subcollection factors. Give conditions on the $f_i$ such that the $f_i(X_i)$ are independent. Of course we can't exactly say that $f_i$ is 'measurable' here.
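To make 'the joint pdf splits up' concrete, what I mean is that for any finite subcollection $X_1, \dots, X_n$,
$$f_{X_1,\dots,X_n}(x_1,\dots,x_n) = f_{X_1}(x_1)\cdots f_{X_n}(x_n).$$
For example, assuming each $f_i$ is strictly increasing and differentiable (so the change-of-variables formula applies), the transformed variables $Y_i = f_i(X_i)$ have joint pdf
$$f_{Y_1,\dots,Y_n}(y_1,\dots,y_n) = \prod_{i=1}^n f_{X_i}\big(f_i^{-1}(y_i)\big)\,\frac{d}{dy_i} f_i^{-1}(y_i),$$
which again factors, so the $Y_i$ are independent in this elementary sense. But that only handles nice invertible $f_i$, not something like $f_i(x) = x^2$.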
Context for the elementary case: I'm trying to justify the computation behind the formula for the moment-generating function of a linear combination of independent random variables. See here: Proving inequality of probabilty to derive upper bound for moment-generating functions
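Specifically, with $Y = \sum_{i=1}^n a_i X_i$ for finitely many independent $X_i$, and assuming all the MGFs involved exist, the computation I want to justify is
$$M_Y(t) = E\Big[e^{t\sum_i a_i X_i}\Big] = E\Big[\prod_i e^{t a_i X_i}\Big] \overset{?}{=} \prod_i E\big[e^{t a_i X_i}\big] = \prod_i M_{X_i}(a_i t),$$
where the questionable step is exactly 'the $e^{t a_i X_i}$ are functions of independent random variables, hence independent, hence the expectation of the product is the product of the expectations'.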
Based on the application of the Riemann–Stieltjes integral (or the Lebesgue–Stieltjes integral) to probability, I think the condition is: any $f_i$ such that $E[f_i(X_i)]$ exists (i.e. $E[|f_i(X_i)|]$ is finite).
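By 'exists' I mean in the Stieltjes sense: writing $F_{X_i}$ for the cdf of $X_i$,
$$E[f_i(X_i)] = \int_{-\infty}^{\infty} f_i(x)\, dF_{X_i}(x), \qquad \text{which exists provided } \int_{-\infty}^{\infty} |f_i(x)|\, dF_{X_i}(x) < \infty.$$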
This is the same condition used in Larsen and Marx, Introduction to Mathematical Statistics and Its Applications.
I think $f$ being bounded implies this, but not conversely.
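My reasoning for the forward direction: if $|f| \le M$, then
$$E[|f(X)|] = \int_{-\infty}^{\infty} |f(x)|\, dF_X(x) \le M \int_{-\infty}^{\infty} dF_X(x) = M < \infty.$$
For the failure of the converse, take for example $f(x) = x$ with $X$ standard normal: $E[|f(X)|] < \infty$ even though $f$ is unbounded.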
Update: Also related, through another question: If $g$ is a continuous and increasing function of $x$, prove that $g(X)$ is a random variable. --> More generally, for what functions $g$ is $g(X)$ a random variable? Of course, in advanced probability we just say $g$ is Borel-measurable (or $\mathscr F$-measurable, or whatever), but I think in elementary probability we say: $g$ such that $E[g(X)]$ exists, i.e. $E[|g(X)|] < \infty$, EVEN THOUGH this is, I believe, a stronger condition than $g$ being 'measurable', whatever that means in elementary probability. But then again this is kind of weird, since we don't even necessarily expect $E[X]$ itself to exist (i.e. $E[|X|] < \infty$), or any higher moment $E[X^n]$ for that matter.
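For that linked question, the sketch I have in mind (assuming $g$ is continuous and strictly increasing, so that $g^{-1}$ exists on the range of $g$) is
$$\{\omega : g(X(\omega)) \le y\} = \{\omega : X(\omega) \le g^{-1}(y)\} \quad \text{for } y \text{ in the range of } g,$$
which is an event because $X$ is a random variable, so $g(X)$ is again a random variable, with no integrability assumption on $g$ anywhere. That is partly why the '$E[|g(X)|] < \infty$' condition above feels stronger than necessary to me.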