![Venn diagram of events X and Y with region probabilities a, b, c, d](https://cdn.statically.io/img/i.sstatic.net/DWRYB.png)
In the above universe, $a,b,c,d$ denote the probabilities of the four disjoint regions. \begin{align}&\text{events }X \text{ and } Y \text{ are }\textbf{independent}
\\\iff &P(X\cap Y)=P(X)P(Y)
\\\iff &\frac{c}{a+b+c+d}=\frac{b+c}{a+b+c+d}\times\frac{c+d}{a+b+c+d}
\\\iff &ac=bd.\end{align}
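This equivalence is easy to sanity-check numerically. Here is a small Python sketch (the function names are mine) that compares the definition of independence against the $ac=bd$ shortcut on integer region weights:

```python
from fractions import Fraction as F

def independent(a, b, c, d):
    """Definition check. Region weights: b = X only, c = X∩Y,
    d = Y only, a = neither; probabilities are weights / total."""
    s = a + b + c + d
    return F(c, s) == F(b + c, s) * F(c + d, s)

def ac_equals_bd(a, b, c, d):
    """The shortcut derived above."""
    return a * c == b * d

# The two criteria agree:
assert independent(4, 2, 1, 2) == ac_equals_bd(4, 2, 1, 2) == True   # independent
assert independent(1, 1, 1, 2) == ac_equals_bd(1, 1, 1, 2) == False  # dependent
```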
Imagine two events $X$ and $Y$ that are not mutually exclusive, such that $P(X) = 0.3 \,$ and $P(Y)=0.4.$ Consider their Venn diagram.
It should be clear that the only value of $P(X \cap Y)$ that makes $X$ and $Y$ independent is $P(X \cap Y) = 0.12.$
$$b=0.3-c\tag1$$$$d=0.4-c\tag2$$$$a=1-0.3-(0.4-c)=0.3+c\tag3$$
For events $X$ and $Y$ to be independent, $$ac=bd.\tag4$$
Solving $(1),(2),(3),(4)$ gives $$c=0.12,$$ as required.
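The algebra above can be verified with exact arithmetic: substituting $(1),(2),(3)$ into $(4)$ makes the quadratic terms cancel, leaving $c=P(X)P(Y)$. A quick Python check:

```python
from fractions import Fraction as F

px, py = F(3, 10), F(4, 10)  # P(X) = 0.3, P(Y) = 0.4
# Substituting b = px - c, d = py - c, a = 1 - px - py + c into ac = bd:
#   (1 - px - py + c)c = (px - c)(py - c)
#   c - px*c - py*c + c^2 = px*py - px*c - py*c + c^2
#   c = px*py
c = px * py
assert c == F(12, 100)                 # c = 0.12, as required
a, b, d = 1 - px - py + c, px - c, py - c
assert a * c == b * d                  # equation (4) holds
assert a + b + c + d == 1              # the four regions exhaust the universe
```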
Independence of events is not straightforward to intuit from a Venn diagram (unlike mutual exclusivity, which can be read off directly from whether the sets overlap).
For example:
![Venn diagrams for universes U1, U2, U3](https://cdn.statically.io/img/i.sstatic.net/vCZSb.png)
$$\begin{array}{r} \begin{array}{c|c|c|c}
\style{font-family:inherit}{} & \style{font-family:inherit}{U_1} & \style{font-family:inherit}{U_2} & \style{font-family:inherit}{U_3}
\\\hline
P(X\cap Y) & 0 & \frac14 & \frac14
\\\hline
P(X)P(Y) & \frac14\times\frac12=\frac18 & \frac14\times\frac34=\frac38 & \frac12\times\frac12=\frac14
\\\hline
\text{$\therefore X$ and $Y$ are$\ldots$} & \textbf{dependent} & \textbf{dependent} & \textbf{independent}
\end{array}\hskip-5.5pt \end{array}$$
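The verdicts in this table can be reproduced by plugging the probabilities read off each diagram into the definition. A short Python check, using the values from the table:

```python
from fractions import Fraction as F

# (P(X∩Y), P(X), P(Y)) read off each universe's diagram
universes = {
    "U1": (F(0),    F(1, 4), F(1, 2)),
    "U2": (F(1, 4), F(1, 4), F(3, 4)),
    "U3": (F(1, 4), F(1, 2), F(1, 2)),
}
for name, (pxy, px, py) in universes.items():
    verdict = "independent" if pxy == px * py else "dependent"
    print(name, verdict)   # only U3 comes out independent
```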
[Universe $U_1$ above is also an example of the fact that for events with nonzero probabilities, $\big(\text{mutual exclusivity}\implies\text{dependence}\big)$.]
For these multiple-event examples, note that conditions (i) and (ii) together are equivalent to (iii), i.e. $\big(\text{(i) and (ii)}\big)\iff\text{(iii)}$:
![Venn diagrams for universes U4, U5, U6, U7](https://cdn.statically.io/img/i.sstatic.net/vnFYELo7.png)
$$\begin{array}{r} \begin{array}{c|c|c|c|c}
\style{font-family:inherit}{} & \style{font-family:inherit}{U_4} & \style{font-family:inherit}{U_5} & \style{font-family:inherit}{U_6} & \style{font-family:inherit}{U_7}
\\\hline
\text{(i) } \mathbf{P(X\cap Y\cap Z)=P(X)P(Y)P(Z)} & \text{no} & \text{no} & \text{yes} & \text{yes}
\\\hline
\text{(ii) } \text{$X,Y,Z$ are }\textbf{pairwise independent} & \text{yes} & \text{yes} & \text{no} & \text{yes}
\\\hline
\text{(iii) } \text{$X,Y,Z$ are }\textbf{(mutually) independent} & \text{no} & \text{no} & \text{no} & \text{yes}
\end{array}\hskip-5.5pt \end{array}$$
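A standard concrete instance of the $U_4/U_5$ pattern (pairwise independent yet not mutually independent; this example is not tied to the diagrams above) is two fair coin tosses with $X=$ "first toss heads", $Y=$ "second toss heads", $Z=$ "both tosses agree". The following Python sketch verifies it by brute force over the four equally likely outcomes:

```python
from fractions import Fraction as F
from itertools import product

omega = list(product("HT", repeat=2))   # sample space: HH, HT, TH, TT

def P(event):
    """Probability of an event over the uniform sample space."""
    return F(sum(1 for w in omega if event(w)), len(omega))

X = lambda w: w[0] == "H"               # first toss heads
Y = lambda w: w[1] == "H"               # second toss heads
Z = lambda w: w[0] == w[1]              # both tosses agree

# (ii) holds: every pair is independent ...
assert P(lambda w: X(w) and Y(w)) == P(X) * P(Y)
assert P(lambda w: X(w) and Z(w)) == P(X) * P(Z)
assert P(lambda w: Y(w) and Z(w)) == P(Y) * P(Z)
# ... but (i) fails, so the three events are not mutually independent:
assert P(lambda w: X(w) and Y(w) and Z(w)) != P(X) * P(Y) * P(Z)
```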
When each outcome is classified by just two attributes (here, the time of and the reason for a complaint), a two-way table like this is a good way to understand/visualise conditional probability as restricting to a reduced sample space:
$$ \begin{array}{r} \begin{array}{c|c|c|c}
\style{font-family:inherit}{\text{time of complaint}\bigg\backslash \text{reason for complaint}} & \style{font-family:inherit}{\textbf E\text{lectrical}} & \style{font-family:inherit}{\textbf M\text{echanical}} & \style{font-family:inherit}{\textbf L\text{ooks}}
\\\hline
\style{font-family:inherit}{\textbf D\text{uring guarantee period}} & 18\% & 13\% & 32\%
\\[0pt]\hline
\style{font-family:inherit}{\textbf A\text{fter guarantee period}} & 12\% & 22\% & 3\%
\end{array}\hskip-5.5pt \end{array} $$
The calculation (notice that the figure ‘$32$’ sits at the intersection of column $L$ and row $D$) $$P(L\mid D)=\frac{P(L\cap D)}{P(D)}=\frac{32}{18+13+32}=\frac{32}{63}\approx51\%\neq35\%=32\%+3\%=P(L)$$ shows that $L$ and $D$ are dependent events.
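The same computation can be done in Python with exact fractions over the table's percentages:

```python
from fractions import Fraction as F

# percentages from the two-way table: (time, reason) -> %
table = {
    ("D", "E"): 18, ("D", "M"): 13, ("D", "L"): 32,
    ("A", "E"): 12, ("A", "M"): 22, ("A", "L"): 3,
}
total = sum(table.values())                                         # 100
p_L_and_D = F(table[("D", "L")], total)                             # 32/100
p_D = F(sum(v for (t, r), v in table.items() if t == "D"), total)   # 63/100
p_L = F(sum(v for (t, r), v in table.items() if r == "L"), total)   # 35/100

p_L_given_D = p_L_and_D / p_D   # 32/63, about 51%
assert p_L_given_D != p_L       # so L and D are dependent events
```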