In CLRS, there's a question in the first chapter that, for certain running-time functions $f(n)$, asks for the largest problem size $n$ solvable in a given amount of time. The math works out simply. For example, for $f(n) = \sqrt{n}$ microseconds and a 1-second budget, there are $n = \left(\frac{1\ second}{10^{-6}\ seconds}\right)^2 = 10^{12}$ inputs, which I got by solving $\sqrt{n}\ microseconds = 1\ second$ for $n$.
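To sanity-check that arithmetic, here is a quick sketch (Python, my own variable names, not from the book) that expresses the budget in microseconds and inverts $f(n) = \sqrt{n}$:

```python
# Sanity check for f(n) = sqrt(n) microseconds with a 1-second budget.
# Solving sqrt(n) microseconds = 1 second = 10**6 microseconds
# gives n = (10**6)**2.

MICROSECONDS_PER_SECOND = 10**6

budget_us = 1 * MICROSECONDS_PER_SECOND  # 1 second, in microseconds
n_max = budget_us**2                     # from sqrt(n_max) = budget_us

print(n_max)  # 10**12 inputs

# Verify: the running time at n_max exactly uses up the budget.
assert n_max**0.5 == budget_us
```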
My confusion comes from trying to set up a proportion for this, the way I would for a dimensional-analysis problem. I write $\frac{\sqrt{n}\ microseconds}{n\ inputs}$ on one side and $\frac{1\ second}{x\ inputs}$ on the other. Solving for $x$ gives $x = 10^6\sqrt{n}$, which still has an $n$ in it. I think my dimensions don't make sense somehow, but in my head, saying "it takes square root of $n$ microseconds for $n$ inputs" comes out as $\frac{\sqrt{n}\ microseconds}{n\ inputs}$.
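For what it's worth, a small numeric experiment (Python, my own names) with that fraction shows whether $\frac{\sqrt{n}\ microseconds}{n\ inputs}$ behaves like a fixed conversion factor the way $\frac{10^6\ microseconds}{1\ second}$ does:

```python
import math

# The "rate" implied by the proportion: sqrt(n) microseconds per n inputs.
def us_per_input(n):
    return math.sqrt(n) / n  # equals 1/sqrt(n), so it shrinks as n grows

for n in (100, 10_000, 1_000_000):
    print(n, us_per_input(n))  # 0.1, 0.01, 0.001 microseconds per input
```

Unlike a true unit-conversion factor, the ratio comes out different at every $n$.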
Thank you!!