
My understanding is that events are subsets of the sample space, i.e. sets of possible outcomes. So if two events are mutually exclusive, then they (the sets) do not overlap in the sample space. This can be seen graphically.

If events are independent, I understand that the occurrence of one does not affect the probability of the other occurring, and vice versa; if events are dependent, the occurrence of one does affect the probability of the other. Now suppose we define $P(A) = \frac{|A|}{|S|}$, where $A$ is the event and $S$ is the (finite) sample space. If events $A$ and $B$ are dependent, does that mean that the cardinalities of $A$ and $B$ are related to each other in some way, with one affecting the other? Shouldn't that be true, since the cardinalities of the sets directly determine the probabilities of the events? If so, would we be able to see these effects graphically when we talk about conditional probability?
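
For example, taking a fair six-sided die as the sample space, $S = \{1,2,3,4,5,6\}$, with $A = \{2,4,6\}$ (roll an even number) and $B = \{1,2,3\}$ (roll at most 3):
$$P(A) = \frac{|A|}{|S|} = \frac{3}{6} = \frac{1}{2}, \qquad P(A \mid B) = \frac{|A \cap B|}{|B|} = \frac{|\{2\}|}{3} = \frac{1}{3},$$
so knowing that $B$ occurred changes the probability of $A$. Is that the kind of relationship between the cardinalities that I am asking about?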


2 Answers


Recall that we can think about the dependence between two random variables $X_1$ and $X_2$ geometrically, by treating their sampled values as vectors. If the two variables are independent (and hence uncorrelated), the corresponding vectors should be approximately perpendicular, i.e. the angle between them is close to 90 degrees. To see this, recall that the angle $\theta$ between two vectors satisfies \begin{equation} \cos(\theta) = \frac{X_1^\prime X_2}{ \| X_1 \| \cdot \| X_2 \| } \end{equation} with $\| v \|$ denoting the Euclidean ($L_2$) norm of the vector $v$.

To demonstrate this point, consider the following R code:

> N <- 10^5
> x1 <- as.matrix(rnorm(N))    # two independent standard normal samples
> x2 <- as.matrix(rnorm(N))
> x1 <- x1/sqrt(sum(x1^2))     # normalize each vector to unit length
> x2 <- x2/sqrt(sum(x2^2))
> (acos(sum(x1*x2))/pi)*180    # angle between the two vectors, in degrees
[1] 89.94086

By generating two independent random variables, we can see that the angle between them is (approximately) 90 degrees. On the other hand, suppose we simulate two correlated variables with correlation 0.5; then we can see that the angle decreases:

> library(MASS)
> rho <- 0.5
> Sig <-  matrix(c(1,rho,rho,1),2,2)      # correlation matrix with off-diagonal rho
> X <- mvrnorm(N,mu=c(0,0),Sigma = Sig)   # bivariate normal draws with correlation rho
> x1 <- as.matrix(X[,1])
> x2 <- as.matrix(X[,2])
> x1 <- x1/sqrt(sum(x1^2))
> x2 <- x2/sqrt(sum(x2^2))
> (acos(sum(x1*x2))/pi)*180
[1] 59.90822

This should be intuitive: if the correlation were 1, the two vectors would point in the same direction, with an angle of 0 degrees. If we instead repeat the above with a negative correlation, we observe the opposite effect:

> rho <- -0.5
> Sig <-  matrix(c(1,rho,rho,1),2,2)
> X <- mvrnorm(N,mu=c(0,0),Sigma = Sig)
> x1 <- as.matrix(X[,1])
> x2 <- as.matrix(X[,2])
> x1 <- x1/sqrt(sum(x1^2))
> x2 <- x2/sqrt(sum(x2^2))
> (acos(sum(x1*x2))/pi)*180
[1] 119.8541
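
In fact, since the simulated variables have mean approximately zero, the cosine of the angle is essentially the sample correlation, so the angle should be close to $\arccos(\rho)$. As a quick sanity check (this snippet is an added illustration, not part of the simulations above):

> (acos(c(0.5, -0.5))/pi)*180   # theoretical angles for rho = 0.5 and rho = -0.5
[1]  60 120

These match the simulated angles of roughly 60 and 120 degrees.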

I found these notes helpful for better understanding the idea behind this.


If two events $A$ and $B$ are independent (and $P(B) > 0$), then $$P(A) = P(A|B)$$ In words, the probability of $A$ happening (with respect to the entire sample space) is the same as the probability of $A$ happening given that $B$ also happens.

Now, using your cardinality notation: $$P(A) = \frac{|A|}{|S|}$$ and $$P(A|B) = \frac{|A \cap B|}{|B|}$$ If the events are independent, these two expressions are equal, so $$\frac{|A|}{|S|} = \frac{|A \cap B|}{|B|}$$ or $$\frac{|A|}{|S|}|B| = |A \cap B|$$ Stating this result in common parlance: the proportion of $B$ occupied by the overlap $A \cap B$ is the same as the proportion of $S$ occupied by $A$.

If it helps, you can think of it this way: if 35% of $S$ is $A$, then 35% of $B$ is $A \cap B$; independence requires these percentages to be the same. That is, knowing that $B$ occurred does not affect the probability of $A$.
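
To make this concrete, here is a small check in R (a toy example, not from the question, using a standard 52-card deck, where "drawing a heart" and "drawing a face card" happen to be independent):

> S <- expand.grid(rank = c(2:10,"J","Q","K","A"), suit = c("hearts","diamonds","clubs","spades"))
> A <- S$suit == "hearts"           # event A: the card is a heart
> B <- S$rank %in% c("J","Q","K")   # event B: the card is a face card
> sum(A)/nrow(S)                    # |A|/|S| = 13/52
[1] 0.25
> sum(A & B)/sum(B)                 # |A n B|/|B| = 3/12
[1] 0.25

The two proportions agree, so by the criterion above the two events are independent.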

