Normally, when we want to find the prime factorization of a number, we keep dividing it by the smallest prime ($2$) until it is no longer divisible, then move on to the next prime ($3$), and so on until the quotient becomes $1$, right?
For example, for $18$: keep dividing by $2$ $(18/2 = 9)$; $9$ can't be divided by $2$, so we move on to $3$: $9/3 = 3$, $3/3 = 1$. So $18 = 2\cdot3\cdot3$.
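Here is a rough sketch of what I mean, in Python (the function names `is_prime` and `factorize_with_primes` are just illustrative); notice that it needs a primality test:

```python
# Usual approach: divide only by primes, so a primality test is needed first.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def factorize_with_primes(n):
    factors = []
    p = 2
    while n > 1:
        if is_prime(p):
            while n % p == 0:   # divide out p as many times as possible
                factors.append(p)
                n //= p
        p += 1
    return factors

print(factorize_with_primes(18))  # [2, 3, 3]
```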
But the above method requires us to know the primes first $(2,3,5,\dots)$. I found a method on the internet which doesn't care about primes at all: instead of dividing by $2,3,5,\dots$ (primes only), keep dividing by $2,3,4,5,6,\dots,n$ (following the same rule as above). The problem is I don't know why it works.
For example, number $420$:
$$420/2 = 210;\quad 210/2 = 105;\quad 2 \nmid 105$$
$$105/3 = 35;\quad 3 \nmid 35$$
$$4 \nmid 35;\quad 35/5 = 7;\quad 5, 6 \nmid 7;\quad 7/7 = 1.$$ So $420 = 2\cdot2\cdot3\cdot5\cdot7$.
My question is: how is it that, after dividing out the smaller primes as many times as possible, the remaining quotient can't be divided by the composite numbers that come next? For example, after dividing by $2$ and $3$ as many times as possible, the quotient is no longer divisible by $4$; after dividing by $2,3,5,7$ as many times as possible, the quotient is no longer divisible by $8$, $9$, or $10$; etc. How can this be proved?
I think this is quite important, since if you write a program to find the prime factorization of a number, you don't need a separate primality test any more. Just divide by every integer from $2$ up to $n$.
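Something like this minimal sketch in Python (again, the name `factorize` is just illustrative), which does exactly the trial division described above:

```python
# "Divide by every integer" method: no primality test needed.
def factorize(n):
    factors = []
    d = 2
    while n > 1:
        while n % d == 0:       # divide out d as many times as possible
            factors.append(d)
            n //= d
        d += 1                  # move on to the next integer, prime or not
    return factors

print(factorize(420))  # [2, 2, 3, 5, 7]
```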