The conclusion is true even if $X_{n}$ converges in probability to $0$, which is a weaker assumption than yours. You can prove it in two ways.
Method 1
Fix $\epsilon>0$ and $\delta>0$.
As $Y_{n}=O_{p}(\beta_{n})$, find an $M_{\epsilon}>0$ such that $P(|Y_{n}/\beta_{n}|>M_{\epsilon})<\epsilon$.
Now, by definition of convergence in probability, there exists $N(\delta,\epsilon)$ such that for $n\geq N$, you have $P(|X_{n}|>\frac{\delta}{M_{\epsilon}})<\epsilon$.
Now $$P(|X_{n}Y_{n}/\beta_{n}|>\delta)\leq P(|X_{n}|>\frac{\delta}{M_{\epsilon}})+ P(|Y_{n}/\beta_{n}|>M_{\epsilon})+P(|X_{n}|>\frac{\delta}{M_{\epsilon}})\leq 3\epsilon$$ for all $n\geq N(\epsilon,\delta)$.
The first inequality holds because \begin{align}\{|X_{n}Y_{n}/\beta_{n}|>\delta\}=&\{|X_{n}Y_{n}/\beta_{n}|>\delta , |X_{n}|>\delta/M_{\epsilon},|Y_{n}/\beta_{n}|\leq M_{\epsilon}\}\\\\&\cup\{|X_{n}Y_{n}/\beta_{n}|>\delta,|X_{n}|\leq\delta/M_{\epsilon},|Y_{n}/\beta_{n}|>M_{\epsilon}\}\\\\&\cup\{|X_{n}Y_{n}/\beta_{n}|>\delta, |X_{n}|>\delta/M_{\epsilon},|Y_{n}/\beta_{n}|>M_{\epsilon}\}\end{align} (the remaining case $|X_{n}|\leq\delta/M_{\epsilon}$, $|Y_{n}/\beta_{n}|\leq M_{\epsilon}$ forces $|X_{n}Y_{n}/\beta_{n}|\leq\delta$, so it contributes nothing), and the three events on the right are contained in $\{|X_{n}|>\delta/M_{\epsilon}\}$, $\{|Y_{n}/\beta_{n}|>M_{\epsilon}\}$ and $\{|X_{n}|>\delta/M_{\epsilon}\}$ respectively.
This holds for all $\epsilon>0$. This means $P(|X_{n}Y_{n}/\beta_{n}|>\delta)\xrightarrow{n\to\infty} 0$.
And the above holds for all $\delta>0$. Thus $X_{n}Y_{n}/\beta_{n}\xrightarrow{P}0$.
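As a quick sanity check (not part of the proof), here is a small Monte Carlo sketch of the statement. The concrete choices are mine, purely for illustration: $\beta_{n}=n$, $Y_{n}=nT$ with $T$ standard Cauchy (so $Y_{n}/\beta_{n}$ is tight, hence $O_{p}(1)$, even though it has no mean), and $X_{n}\sim N(0,1/n)$, which converges to $0$ in probability.

```python
import math
import random

def tail_prob(n, delta=0.5, trials=100_000, seed=0):
    """Estimate P(|X_n Y_n / beta_n| > delta) by Monte Carlo, with
    beta_n = n, Y_n = n*T (T standard Cauchy, so Y_n/beta_n is O_p(1)),
    and X_n ~ N(0, 1/n), which converges to 0 in probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.gauss(0.0, 1.0 / math.sqrt(n))        # draw of X_n
        t = math.tan(math.pi * (rng.random() - 0.5))  # standard Cauchy draw
        if abs(x * t) > delta:
            hits += 1
    return hits / trials

fracs = [tail_prob(n) for n in (10, 100, 10_000)]
print(fracs)  # estimates shrink toward 0 as n grows
```

The estimated probabilities $P(|X_{n}Y_{n}/\beta_{n}|>\delta)$ shrink as $n$ grows, matching the conclusion, even though $Y_{n}/\beta_{n}$ is heavy-tailed.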
The second method is a subsequence argument and is more for measure theory enthusiasts.
First note that for a sequence of random variables $X_{n}$, you have that $X_{n}\xrightarrow{P}X$ if and only if for every subsequence $X_{n_{k}}$, there exists a further subsequence $X_{n_{k_{l}}}$ such that $ X_{n_{k_{l}}}\xrightarrow{P} X$. This is often a very useful characterization of convergence in probability.
The proof is very simple. You take the real sequence $a_{n}=P(|X_{n}-X|>\epsilon)$. And now, you have that for each subsequence $a_{n_{k}}$, there exists a further subsequence $a_{n_{k_{l}}}$ which converges to $0$. This means that $a_{n}$ converges to $0$. (This is a property of sequences in arbitrary metric spaces).
Now, let $X_{n_{k}}Y_{n_{k}}/\beta_{n_{k}}$ be an arbitrary subsequence of $X_{n}Y_{n}/\beta_{n}$.
But WLOG, for simplicity of notation, assume that the subsequence is the sequence itself, i.e. we work with $X_{n}Y_{n}/\beta_{n}$ directly.
By definition of $O_{p}$, for each $k\geq 1$ there exist $M_{k}>0$ and an index $n_{k}$ (WLOG strictly increasing in $k$) such that $P(|\frac{Y_{n}}{\beta_{n}}|>M_{k})\leq \frac{1}{2^{k}}$ for all $n\geq n_{k}$.
So you have that $\sum_{k=1}^{\infty}P(|\frac{Y_{n_{k}}}{\beta_{n_{k}}}|>M_{k})<\infty$.
Thus, the Borel-Cantelli Lemma implies that, almost surely, $|Y_{n_{k}}(\omega)/\beta_{n_{k}}|\leq M_{k}$ for all large enough $k$. (Say this occurs for all $\omega\in A$, where $P(A)=1$.)
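(As an aside, a tiny simulation can illustrate the first Borel-Cantelli Lemma being used here. The setup below, independent events $A_{k}$ with $P(A_{k})=2^{-k}$, is just an illustrative stand-in for the events $\{|Y_{n_{k}}/\beta_{n_{k}}|>M_{k}\}$: since $\sum_{k}2^{-k}<\infty$, almost every sample path triggers only finitely many $A_{k}$, so the fraction of paths with any occurrence beyond a fixed index $K$ is at most $\sum_{k>K}2^{-k}$.)

```python
import random

rng = random.Random(1)
trials, K_max, K = 20_000, 30, 10
late = 0  # number of sample paths on which some A_k with k > K occurs
for _ in range(trials):
    # independent events A_k with P(A_k) = 2^{-k}, for k = K+1, ..., K_max
    if any(rng.random() < 0.5 ** k for k in range(K + 1, K_max + 1)):
        late += 1
frac_late = late / trials
print(frac_late)  # should be small, near sum_{k>10} 2^{-k} ~ 0.001
```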
Case 1: Suppose $\sup_{k}M_{k}=\infty$. WLOG assume $M_{k}\uparrow \infty$.
Then, since $X_{n}\xrightarrow{P}0$, for each fixed $k$ we have $P(|X_{n}|>\frac{1}{M_{k}^{2}})\to 0$ as $n\to\infty$.
So, choosing an increasing sequence $k_{l}$ and pushing the corresponding indices $n_{k_{l}}$ further out if necessary (which is harmless, since the $O_{p}$ bound $P(|Y_{n}/\beta_{n}|>M_{k})\leq 2^{-k}$ holds for all $n\geq n_{k}$), we can arrange $$P\left(|X_{n_{k_{l}}}|>\frac{1}{M_{k_{l}}^{2}}\right)\leq \frac{1}{2^{k_{l}}}$$
So again, by the Borel-Cantelli Lemma, you have that, almost surely, $\displaystyle |X_{n_{k_{l}}}(\omega)|\leq \frac{1}{M_{k_{l}}^{2}}$ for all large enough $l$. (Say this occurs for all $\omega\in B$, where $P(B)=1$.)
Thus, on a set of probability $1$ (namely $A\cap B$),
$$|X_{n_{k_{l}}}(\omega)Y_{n_{k_{l}}}(\omega)/\beta_{n_{k_{l}}}|\leq \dfrac{M_{k_{l}}}{M_{k_{l}}^{2}}=\dfrac{1}{M_{k_{l}}} $$ for all large enough $l$. And the right hand side goes to $0$ because $M_{k}\uparrow\infty$.
Thus $X_{n_{k_{l}}}(\omega)Y_{n_{k_{l}}}(\omega)/\beta_{n_{k_{l}}}\to 0$ for every $\omega\in A\cap B$, i.e. almost surely.
Thus $X_{n_{k_{l}}}Y_{n_{k_{l}}}/\beta_{n_{k_{l}}}\xrightarrow{a.s.}0$ which implies $X_{n_{k_{l}}}Y_{n_{k_{l}}}/\beta_{n_{k_{l}}}\xrightarrow{P}0$.
Case 2:
If $\sup_{k}M_{k}<\infty$, then you have that $|Y_{n_{k}}/\beta_{n_{k}}|\leq M_{k}\leq\sup_{k}M_{k}$ almost surely for all large enough $k$.
Thus $\sup_{k}|Y_{n_{k}}(\omega)/\beta_{n_{k}}|\leq C(\omega)<\infty$ almost surely, where the random constant $C(\omega)$ also absorbs the finitely many initial terms.
Now as $X_{n_{k}}$ converges in probability to $0$, find a subsequence $X_{n_{k_{l}}}$ that converges almost surely to $0$.
Thus you have $\bigg|X_{n_{k_{l}}}Y_{n_{k_{l}}}/\beta_{n_{k_{l}}}\bigg|\leq C(\omega)\,|X_{n_{k_{l}}}|$ for all large enough $l$, and the RHS goes to $0$ almost surely.
Thus $X_{n_{k_{l}}}Y_{n_{k_{l}}}/\beta_{n_{k_{l}}}\xrightarrow{a.s.}0$, which also means $X_{n_{k_{l}}}Y_{n_{k_{l}}}/\beta_{n_{k_{l}}}\xrightarrow{P}0$.
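(The standard fact used in Case 2, that convergence in probability yields an almost surely convergent subsequence, can be seen concretely on the classical "typewriter" sequence on $[0,1)$: for $n=2^{j}+r$ with $0\leq r<2^{j}$, let $X_{n}=\mathbf{1}_{[r2^{-j},(r+1)2^{-j})}$. Then $X_{n}\xrightarrow{P}0$, but $X_{n}(\omega)$ converges at no fixed $\omega\in(0,1)$, while the subsequence $X_{2^{k}}=\mathbf{1}_{[0,2^{-k})}$ converges to $0$ everywhere on $(0,1)$. A quick script, purely illustrative:)

```python
def X(n, w):
    """Typewriter sequence on [0, 1): for n = 2**j + r (0 <= r < 2**j),
    X_n is the indicator of the interval [r / 2**j, (r+1) / 2**j)."""
    j = n.bit_length() - 1
    r = n - 2**j
    return 1 if r / 2**j <= w < (r + 1) / 2**j else 0

w = 0.3                                      # a fixed sample point omega
full = [X(n, w) for n in range(1, 2**12)]    # one hit per dyadic block: 1s forever
sub = [X(2**k, w) for k in range(1, 12)]     # subsequence n_k = 2**k
print(sum(full), sub)                        # 12 hits in full; sub is eventually 0
```

Along the full sequence, $X_{n}(0.3)$ equals $1$ exactly once in every dyadic block $[2^{j},2^{j+1})$, so it never converges; along $n_{k}=2^{k}$ it is $0$ from $k=2$ on.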
Thus combining all cases, what we have shown is that given any subsequence of the original sequence, there exists a further subsequence which converges in probability to $0$. This means that $\dfrac{X_{n}Y_{n}}{\beta_{n}}\xrightarrow{P}0$.
NOTE: Essentially, Case $2$ is just a subcase of Case $1$, because if $M_{k}$ is bounded, we can always replace this sequence by a sequence $L_{k}$ such that $L_{k}\geq M_{k}$ and $L_{k}\uparrow\infty$ and work with that instead. This works because $P(|Y_{n}/\beta_{n}|>L_{k})\leq P(|Y_{n}/\beta_{n}|>M_{k})$.