The standard transpose operation ${{}\cdot{}}^T$ (which flips a matrix along its diagonal) is equivalent to the Euclidean metric, in the sense that each determines the other. If you have defined ${{}\cdot{}}^T,$ then you can define the (Euclidean) dot product as $x\cdot y=x^Ty.$ If you instead have the dot product but not the transpose, you can define the transpose $x^T$ of any vector $x$ as the unique row vector $r$ satisfying $ry=x\cdot y$ for all $y.$ Because the standard transpose is tied to the Euclidean metric in this way, it has no physical meaning in relativity. In particular, you are wrong to identify ${\Lambda_\rho}^\nu$ as the transpose $\Lambda^T$ of a Lorentz transform ${\Lambda^\rho}_\nu.$
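As a quick numerical sketch (not part of the answer itself, just an illustration with made-up vectors): treating $x$ and $y$ as column vectors, the matrix product $x^Ty$ reproduces the Euclidean dot product $x\cdot y.$

```python
import numpy as np

# Sketch: the standard transpose encodes the Euclidean dot product.
# x and y are column vectors, so x^T y is a 1x1 matrix whose entry is x.y.
x = np.array([[1.0], [2.0], [3.0]])
y = np.array([[4.0], [-1.0], [0.5]])

xTy = (x.T @ y).item()              # x^T y, extracted as a scalar
dot = np.dot(x.ravel(), y.ravel())  # x . y

assert np.isclose(xTy, dot)  # both give 3.5
```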
In relativity, indices are "raised" or "lowered" by applying the metric. To invent a word, the "relativistic transpose" of a vector $x^\mu$ is the covector $x_\mu=\eta_{\mu\nu}x^\nu.$ ($x_\mu$ is the unique covector $r_\mu$ having $r_\mu y^\mu=\eta_{\nu\mu}x^\nu y^\mu$ for all $y^\mu,$ so the situation is just like with the standard transpose and Euclidean metric.) To turn a covector $k_\mu$ into a vector, you instead apply the inverse of the metric like so: $k^\mu=\eta^{\mu\nu}k_\nu.$ The process extends to any number of indices: contract the metric against any contravariant index you would like to lower and contract the inverse metric against any covariant index you would like to raise. For the case in your question, ${\Lambda_\rho}^\nu=\eta_{\rho\alpha}\eta^{\nu\beta}{\Lambda^\alpha}_\beta.$
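In components, that last formula is easy to check numerically. Here is a hedged sketch using the $(+,-,-,-)$ signature and an $x$-boost as an example (both the signature convention and the particular boost are my choices for illustration, not anything fixed by the answer); `np.einsum` makes the index contractions explicit.

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, (+,-,-,-) convention
eta_inv = np.linalg.inv(eta)            # inverse metric (equals eta here)

# A boost along x with rapidity phi, components Lambda^alpha_beta.
phi = 0.3
ch, sh = np.cosh(phi), np.sinh(phi)
Lam = np.array([[ch,  sh, 0.0, 0.0],
                [sh,  ch, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]])

# Lambda_rho^nu = eta_{rho alpha} eta^{nu beta} Lambda^alpha_beta:
Lam_lowered = np.einsum('ra,nb,ab->rn', eta, eta_inv, Lam)
```

For this boost, lowering one index and raising the other flips the sign of the off-diagonal $\sinh$ entries (each picks up one spatial $-1$ from the metric), so `Lam_lowered` is not the same matrix as `Lam`.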
All that said, to answer your question, a Lorentz transform $\Lambda$ is orthogonal under the Minkowski inner product. That is, $\eta_{\mu\nu}{\Lambda^\mu}_\alpha{\Lambda^\nu}_\beta=\eta_{\alpha\beta}.$ (Replace $\eta$ with the Euclidean metric and you get $\delta_{\mu\nu}{\Lambda^\mu}_\alpha{\Lambda^\nu}_\beta=\delta_{\alpha\beta},$ i.e. $\Lambda^T\Lambda=I,$ which is the more common definition of orthogonal.) $\Lambda$ will generally not be orthogonal in the usual sense because the usual sense of orthogonal is defined with the wrong metric! (The set of all Lorentz transforms is even called $O(1,3),$ generalizing the notation $O(n)$ used for the set of $n$-by-$n$ orthogonal matrices.)
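The contrast between the two notions of orthogonality can be checked directly. In matrix form the Minkowski condition reads $\Lambda^T\eta\Lambda=\eta.$ A sketch, again using a sample $x$-boost and the $(+,-,-,-)$ signature as illustrative assumptions:

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, (+,-,-,-) convention

# A boost along x with rapidity phi.
phi = 0.7
ch, sh = np.cosh(phi), np.sinh(phi)
Lam = np.array([[ch,  sh, 0.0, 0.0],
                [sh,  ch, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]])

# Minkowski-orthogonal: eta_{mu nu} Lambda^mu_alpha Lambda^nu_beta = eta_{alpha beta}
assert np.allclose(Lam.T @ eta @ Lam, eta)

# ...but NOT orthogonal in the usual Euclidean sense:
assert not np.allclose(Lam.T @ Lam, np.eye(4))
```

A pure spatial rotation, by contrast, would pass both checks, since it leaves the time components alone.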