For questions involving the notion of independence of events, of independence of collections of events, or of independence of random variables. Use this tag along with the tags (probability), (probability-theory) or (statistics). Do not use this tag for linear independence of vectors and the like.
For events: Two events $A$ and $B$ are independent if $$P(A\cap B)=P(A)P(B).$$ More generally, a family $\mathscr F$ of events is independent if, for every finite number of distinct events $A_1$, $A_2$, $\ldots$, $A_n$ in $\mathscr F$, $$P\left(\bigcap_{i=1}^nA_i\right) =\prod_{i=1}^nP(A_i).$$
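Note that the definition quantifies over every finite subfamily, not just pairs. A minimal Python sketch (the helper `P` and the event names are ad hoc illustrations, not from the text) of the classic two-coin example shows why: three events can be pairwise independent without the family being independent.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin tosses, uniform probability.
omega = list(product("HT", repeat=2))

def P(event):
    """Exact probability of an event (a predicate on outcomes)."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"    # first toss is heads
B = lambda w: w[1] == "H"    # second toss is heads
C = lambda w: w[0] == w[1]   # the two tosses agree

both = lambda e, f: (lambda w: e(w) and f(w))
all3 = lambda w: A(w) and B(w) and C(w)

# Each pair satisfies P(X ∩ Y) = P(X) P(Y)...
assert P(both(A, B)) == P(A) * P(B)
assert P(both(A, C)) == P(A) * P(C)
assert P(both(B, C)) == P(B) * P(C)
# ...but the family {A, B, C} is not independent:
# P(A ∩ B ∩ C) = 1/4, while P(A) P(B) P(C) = 1/8.
assert P(all3) != P(A) * P(B) * P(C)
```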
Two collections of events (for example, two $\sigma$-algebras) $\mathscr F$ and $\mathscr G$ are mutually independent (or simply, independent) if every $A$ in $\mathscr F$ is independent of every $B$ in $\mathscr G$.
More generally, some collections $\mathscr F_i$ of events, indexed by some finite or infinite set $I$, are mutually independent (or simply, independent) if, for every finite subset $\{i_1,i_2,\ldots,i_n\}$ of $I$ and all events $A_1\in\mathscr F_{i_1}$, $A_2\in\mathscr F_{i_2}$, $\ldots$, $A_n\in\mathscr F_{i_n}$, the family $\{A_1,\ldots,A_n\}$ is independent.
For random variables: Two random variables $X$ and $Y$ (defined on the same probability space) are independent if their $\sigma$-algebras $\sigma(X)$ and $\sigma(Y)$ are (mutually) independent.
In particular, two events $A$ and $B$ are independent if and only if the indicator random variables $1_A$ and $1_B$ are independent.
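For a finite sample space this equivalence can be checked directly: if $A$ and $B$ are independent events, the joint distribution of $(1_A,1_B)$ factorizes at every point. A small Python sketch (the helper `P` and the chosen events are ad hoc illustrations):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair dice, uniform probability.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    """Exact probability of an event (a predicate on outcomes)."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0      # first die is even
B = lambda w: w[0] + w[1] == 7   # the sum equals 7

ind_A = lambda w: 1 if A(w) else 0   # indicator 1_A
ind_B = lambda w: 1 if B(w) else 0   # indicator 1_B

# The events are independent: 1/12 = (1/2)(1/6)...
assert P(lambda w: A(w) and B(w)) == P(A) * P(B)

# ...and the indicators' joint pmf factorizes at all four points.
for a in (0, 1):
    for b in (0, 1):
        assert P(lambda w: ind_A(w) == a and ind_B(w) == b) \
            == P(lambda w: ind_A(w) == a) * P(lambda w: ind_B(w) == b)
```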
More generally, a family $\mathscr X$ of random variables (defined on the same probability space) is independent if, for every finite sub-family $\{X_1,X_2,\ldots,X_n\}$ of $\mathscr X$, the $\sigma$-algebras $\sigma(X_{1})$, $\sigma(X_{2})$, $\ldots$, $\sigma(X_{n})$ are (mutually) independent.
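For discrete random variables on a finite space, independence of $\sigma(X)$ and $\sigma(Y)$ reduces to the joint pmf factorizing, $P(X=x,\,Y=y)=P(X=x)P(Y=y)$ for all $x,y$. A brief Python sketch under that finite-case assumption (the helpers `P` and `independent` are ad hoc names):

```python
from fractions import Fraction
from itertools import product

# Sample space: an ordered pair of fair dice, uniform probability.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    """Exact probability of an event (a predicate on outcomes)."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

X = lambda w: w[0]           # first die
Y = lambda w: w[1]           # second die
S = lambda w: w[0] + w[1]    # their sum: clearly depends on X

def independent(U, V):
    # Finite discrete case: U and V are independent iff their joint
    # pmf factorizes at every pair of values.
    U_vals = {U(w) for w in omega}
    V_vals = {V(w) for w in omega}
    return all(
        P(lambda w: U(w) == u and V(w) == v)
        == P(lambda w: U(w) == u) * P(lambda w: V(w) == v)
        for u in U_vals for v in V_vals
    )

assert independent(X, Y)        # distinct fair dice are independent
assert not independent(X, S)    # X and X + Y are not
```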