I am paraphrasing a question once again from Professor Ery Arias-Castro's Principles of Statistical Analysis.
Suppose we toss a fair coin $n$ times. Define the event $A_i$ as the event that the $i$th toss results in heads, and show that $A_1, A_2, \dots, A_n$ are mutually independent, that is,
$$ \mathbb{P}(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) = \mathbb{P}(A_{i_1})\mathbb{P}(A_{i_2}) \cdots \mathbb{P}(A_{i_k}) $$
for any $k$-tuple $1 \leq i_1 < \cdots < i_k \leq n$. In fact, show that the distribution is the uniform distribution ($\mathbb{U}(A) = |A|/|\Omega|$ for any $A \subseteq \Omega$) if and only if the tosses are fair and mutually independent.
I edited the question for clarity (the original depended on several other exercises in the text). Mutual independence follows from the law of multiplication, $$ \mathbb{P}(A_1 \cap A_2 \cap \cdots \cap A_n) = \prod_{k = 1}^n \mathbb{P}(A_k \mid A_1 \cap A_2 \cap \cdots \cap A_{k - 1}). $$ For the if-and-only-if statement, I believe the forward direction follows, since for each $A_i$ we have
\begin{align} \mathbb{U}(A_i) &= \frac{|A_i|}{|\Omega|}\\ &= \frac{2^{n - 1}}{2^n}\\ &= \frac{1}{2}, \end{align}
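As a quick sanity check of the counting argument (my own check, not part of the exercise), we can enumerate the sample space $\Omega = \{H, T\}^n$ for a small $n$ and confirm that $|A_i| = 2^{n-1}$, so $\mathbb{U}(A_i) = 1/2$:

```python
from itertools import product

# Enumerate Omega = {H, T}^n for a small n (n = 4 is an arbitrary choice).
n = 4
omega = list(product("HT", repeat=n))  # |Omega| = 2**n

for i in range(n):
    # A_i: the event that toss i lands heads.
    A_i = [w for w in omega if w[i] == "H"]
    # |A_i| = 2**(n-1), so U(A_i) = 2**(n-1) / 2**n = 1/2.
    assert len(A_i) == 2 ** (n - 1)
    assert len(A_i) / len(omega) == 0.5
```

The count $|A_i| = 2^{n-1}$ comes from fixing toss $i$ to heads and letting the other $n-1$ tosses vary freely.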
proving fairness. Additionally, for any indices $1 \leq i_1 < \cdots < i_k \leq n$, we have
\begin{align} \mathbb{U}(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) &= \frac{|A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}|}{|\Omega|} = \frac{2^{n - k}}{2^n}\\ &= \frac{1}{2^k}\\ &= \mathbb{U}(A_{i_1})\mathbb{U}(A_{i_2}) \cdots \mathbb{U}(A_{i_k}), \end{align}
proving mutual independence.
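The product identity can also be verified by brute force for a small $n$ (again my own check, not from the text): for every nonempty index subset, the uniform measure of the intersection equals the product of the individual measures.

```python
from itertools import combinations, product

n = 4
omega = list(product("HT", repeat=n))

def U(event):
    # Uniform measure: U(A) = |A| / |Omega|.
    return len(event) / len(omega)

# events[i] is A_i, the event that toss i lands heads.
events = [[w for w in omega if w[i] == "H"] for i in range(n)]

for k in range(1, n + 1):
    for idx in combinations(range(n), k):
        # Intersection A_{i_1} ∩ ... ∩ A_{i_k}: heads at every index in idx.
        inter = [w for w in omega if all(w[i] == "H" for i in idx)]
        prod_p = 1.0
        for i in idx:
            prod_p *= U(events[i])
        # U(intersection) = 2**(n-k) / 2**n = (1/2)**k = product of U(A_i).
        assert U(inter) == prod_p == 0.5 ** k
```

(The equalities are exact here because powers of $1/2$ are represented exactly in binary floating point.)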
I am not quite sure how to approach the backwards direction. We need to show that $\mathbb{P}(A) = |A| / |\Omega|$ for any $A \subseteq \Omega$, given that the coin is fair (I am assuming this means $\mathbb{P}(A_i) = 1/2$ for each $i$, since no rigorous definition was provided) and that the events $A_1, A_2, \dots, A_n$ are mutually independent. Every outcome $\omega \in \Omega$ is determined by which tosses land heads, so each singleton $\{\omega\}$ can be written as $\bigcap_{i = 1}^n B_i$ with $B_i \in \{A_i, A_i^c\}$; maybe there's something there to get any $A \subseteq \Omega$ from the $A_i$s. I would appreciate any help.