84

While I do know that $\frac{dy}{dx}$ isn't a fraction and shouldn't be treated as such, in many situations manipulations like multiplying both sides by $dx$ and integrating, cancelling terms, or writing $\frac{dy}{dx} = \frac{1}{\frac{dx}{dy}}$ work out just fine.

So I wanted to know: Are there any particular cases (in single-variable calculus) we have to look out for, where treating $\frac{dy}{dx}$ as a fraction gives incorrect answers, in particular, at an introductory level?
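For concreteness, here is the kind of manipulation I mean (a standard separable equation, chosen purely as an illustration): starting from $$\frac{dy}{dx} = 2xy,$$ "multiplying by $dx$" gives $\frac{dy}{y} = 2x\,dx$, and integrating both sides yields $\ln|y| = x^2 + C$, i.e. $y = Ae^{x^2}$, which is exactly what the rigorous substitution-based argument produces.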

Note: Please provide specific instances and examples where treating $\frac{dy}{dx}$ as a fraction fails.

8
  • 1
    Linking a related post here: Is $\displaystyle\frac{dy}{dx}$ not a ratio?
    – Frenzy Li
    Commented Aug 28, 2016 at 13:07
  • 7
    Duplicates? math.stackexchange.com/questions/774145/…, math.stackexchange.com/questions/46530/… Commented Aug 28, 2016 at 13:17
  • 2
    Huh. I've always found it helpful to treat it like a fraction. :/
    – user345895
    Commented Aug 28, 2016 at 21:06
  • @HansLundmark At least the first one isn't, as this question deals specifically with single-variable calculus, and the answers there are all about multivariable calculus.
    – user345895
    Commented Aug 28, 2016 at 21:10
  • 4
    @HansLundmark The reason I created this thread is that in all of the duplicates you've mentioned, none of the answers have shown an example where treating $dy/dx$ as a fraction in single-variable calculus has led to incorrect answers, or given circumstances where we should not treat it like a fraction, i.e., none of them have really answered the question.
    – xasthor
    Commented Aug 29, 2016 at 15:56

8 Answers

26

This works because of the extraordinary power of Leibniz's differential notation, which allows you to treat the differentials as fractions while solving problems. The justification for this mechanical process comes from the following general result:

Let $ y=h(x)$ be any solution of the separated differential equation $A(y)\dfrac{dy}{dx} = B(x)$... (i) such that $h'(x)$ is continuous on an open interval $I$, where $B(x)$ and $A(h(x))$ are assumed to be continuous on $I$. If $g$ is any primitive of $A$ (i.e. $g'=A$) on $I$, then $h$ satisfies the equation $g(y)=\int {B(x)dx} + c$...(ii) for some constant $c$. Conversely, if $y$ satisfies (ii) then $y$ is a solution of (i).
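For a minimal worked instance of this result (the choice $A(y)=y$, $B(x)=x$ is mine, purely for illustration): the equation $y\,\dfrac{dy}{dx}=x$ has $g(y)=\dfrac{y^2}{2}$ as a primitive of $A$, so (ii) reads $$\frac{y^2}{2}=\int x\,dx + c=\frac{x^2}{2}+c,$$ which is exactly what the informal shortcut "multiply both sides by $dx$ and integrate, $\int y\,dy=\int x\,dx$" produces.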

Also, it would be advisable to say $\dfrac{dy}{dx}=\dfrac{1}{\dfrac{dx}{dy}}$ only when the function $y(x)$ is invertible.

Say you are asked to find the equation of the normal to a curve $y(x)$ at a particular point $(x_1,y_1)$. In general you should write its slope as $-\dfrac{1}{\dfrac{dy}{dx}}\Big|_{(x_1,y_1)}$ rather than simply as $-\dfrac{dx}{dy}\Big|_{(x_1,y_1)}$, since the latter presupposes the invertibility of the function, a check that is otherwise unnecessary here. The numerical calculations will, however, come out the same in either case.
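For a concrete case (the curve is my own choice): take $y=x^2$ at $(1,1)$. Then $\dfrac{dy}{dx}\Big|_{(1,1)}=2$, so the normal has slope $-\dfrac{1}{2}$ and equation $y-1=-\dfrac{1}{2}(x-1)$. Since $y=x^2$ happens to be invertible near $(1,1)$ (with $x=\sqrt{y}$), one can also compute $-\dfrac{dx}{dy}\Big|_{(1,1)}=-\dfrac{1}{2\sqrt{y}}\Big|_{y=1}=-\dfrac{1}{2}$, the same number, as claimed.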

EDIT.

The Leibniz notation ensures that no problem will arise if one treats the differentials as fractions, because everything works out beautifully in single-variable calculus. But explicitly calling them 'fractions' in an exam or test could cost one the all-important marks: one could be criticised for not being formal enough in one's approach.

Also have a look at this answer which explains the likely pitfalls of the fraction treatment.

8
  • I think you mixed up $h$ and $g$ in the yellow box. For example, you write "then $h$ satisfies the equation [...]" but there is no $h$ in that equation.
    – Vincent
    Commented Aug 28, 2016 at 17:07
  • @Vincent It means $h$ is a solution of that equation. Commented Aug 28, 2016 at 17:26
  • 12
    -1, because I don't think this answers the question. Commented Aug 28, 2016 at 19:36
  • 2
    @MartinArgerami nothing here answers the real question.
    – user312097
    Commented Aug 28, 2016 at 23:03
  • @MartinArgerami I guess it is clear now. Commented Aug 29, 2016 at 6:36
20

In calculus we have this relationship between differentials: $dy = f^{\prime}(x) dx$ which could be written $dy = \frac{dy}{dx} dx$. If you have $\frac{dy}{dx} = \sin x$, then it's legal to multiply both sides by $dx$. On the left you have $\frac{dy}{dx} dx$. When you replace it with $dy$ using the above relationship, it looks just like you've cancelled the $dx$'s. Such a replacement is so much like division we can hardly tell the difference.
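Written out for this example, just to make the apparent cancellation explicit: from $\frac{dy}{dx}=\sin x$ we get $dy=\sin x\,dx$, and integrating both sides gives $$\int dy=\int\sin x\,dx,\qquad y=-\cos x+C,$$ the same answer one obtains without ever mentioning differentials.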

However if you have an implicitly defined function $f(x,y) = 0$, the total differential is $f_x \;dx + f_y \;dy = 0$. "Solving" for $\frac{dy}{dx}$ gives $$\frac{dy}{dx} = -\frac{f_x}{f_y} = -\frac{\partial f / \partial x}{\partial f /\partial y}.$$ This is the correct formula for implicit differentiation, which we arrived at by treating $\frac{dy}{dx}$ as a ratio. But then look at the last fraction: if you "simplify" it by cancelling the $\partial f$'s, it yields the equation $$\frac{dy}{dx} = -\frac{dy}{dx}.$$ That pesky minus sign sneaks in because we reversed the roles of $x$ and $y$ between the two partial derivatives. Maddening.
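If you want to double-check that the fraction-style formula nevertheless gives correct answers in a concrete case, here is a small SymPy sketch (the unit circle and the variable names are my own choices, not part of the formula above):

```python
# Check the "fraction-style" implicit-differentiation formula dy/dx = -f_x/f_y
# on the unit circle x^2 + y^2 = 1.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2 - 1                      # implicit relation f(x, y) = 0

slope = -sp.diff(f, x) / sp.diff(f, y)   # fraction-style answer: -x/y

# Compare against differentiating the explicit upper branch y = sqrt(1 - x^2).
y_branch = sp.sqrt(1 - x**2)
explicit = sp.diff(y_branch, x)          # -x/sqrt(1 - x^2)

# Substituting the branch into -x/y reproduces the explicit derivative.
assert sp.simplify(slope.subs(y, y_branch) - explicit) == 0
```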

8
  • 2
    I like the last example... I found it last year myself and showed it to my classmates... no one was able to understand what happened (they did not know partial differentiation).
    – Kartik
    Commented Aug 28, 2016 at 14:19
  • 40
    But the last fraction has partial derivatives, which are not only written differently, but also are something different. So to arrive at your final result, you would first have to replace $\partial$ with $d$, and only then "cancel" $df$. So it's not a real counterexample (not to mention that the question explicitly asked for single-variable calculus).
    – celtschk
    Commented Aug 28, 2016 at 14:28
  • 9
    I second @celtschk's remark. Partial derivatives are too often used misleadingly in attempts to discredit the approach to $\frac{dy}{dx}$ as a ratio. It is not a valid objection. Commented Aug 28, 2016 at 16:24
  • 8
    This is a bit like worrying whether we hurt Pluto's feelings when we decided it wasn't a planet. I'm not trying to "discredit" anything. In this case, 1. Try to define the difference between $d$ and $\partial$ so that it matters for this question. 2. No, you don't have to change the $\partial$ to $d$ to make the $\partial F$'s cancel. 3. I am assuming the OP is a student, in which case communication needs to precede rigor.
    – B. Goddard
    Commented Aug 28, 2016 at 17:51
  • 2
    This is not an example from single-variable calculus, as the question requests. Partial derivatives belong to multivariable calculus.
    – pregunton
    Commented Aug 30, 2016 at 15:16
13

There are places where it is "obvious" that we should not blindly apply the laws of arithmetic to $\frac{dy}{dx}$ as if it were a ratio of real numbers $dy$ and $dx$. An example from another question is $$ \frac{dy}{dx}+\frac{du}{dv} \overset ?= \frac{dy\,dv+dx\,du}{dx\, dv}, $$ where the left-hand side has a clear interpretation but the right-hand side does not.

As for any false equation that you might actually be tempted to write by treating $\frac{dy}{dx}$ as a ratio, however, I have not seen any actual counterexamples in any of the several related questions and their answers (including the question already mentioned, this question, or this question).

In practice, the problem I see with treating $\frac{dy}{dx}$ as if it were a ratio is not whether an equation is true or not, but how we know that it is true. For example, if you write $\frac{dy}{dx} \, \frac{dx}{dt} = \frac{dy}{dt}$ because it seems to you that the $dx$ terms cancel, without having first learned (or discovered) the chain rule and having recognized that it justifies this particular equation, then I would say you're just making an ill-educated guess about this equation rather than doing mathematics. (I'll grant that the equation is valid mathematics even if you don't remember that it's called the "chain rule". I think that particular detail is mainly important when teaching or when answering questions on calculus exams that are designed to test whether you were paying attention when that rule was introduced.)
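As a sanity check that the equation itself is true (and not merely a lucky cancellation), here is a small SymPy sketch on one concrete composition; the particular functions $x=t^2+1$ and $y=\sin x$ are my own choices, purely for illustration:

```python
# Verify dy/dx * dx/dt == dy/dt for one concrete composition.
import sympy as sp

t, u = sp.symbols('t u')

x = t**2 + 1                            # x as a function of t
y_of_u = sp.sin(u)                      # y as a function of an intermediate variable u

dy_dx = sp.diff(y_of_u, u).subs(u, x)   # cos(x(t))
dx_dt = sp.diff(x, t)                   # 2*t
dy_dt = sp.diff(y_of_u.subs(u, x), t)   # derivative of the composition sin(t**2 + 1)

# The "cancelled dx" identity holds, exactly as the chain rule guarantees.
assert sp.simplify(dy_dx * dx_dt - dy_dt) == 0
```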

7

In single-variable calculus, I am not aware of a single instance of getting incorrect results by treating $\frac{dy}{dx}$ as a ratio. This is in fact why there are so few mistakes in Leibniz, who did treat it as a ratio. However, there are certainly times when you should not treat it as a ratio: you shouldn't do it on the exams in the course, because the instructor's reaction will surely be to take off points.

7

The big point is that there is an object called a "differential", and we can make expressions like $\mathrm{d}y$ or $\mathrm{d}f(t)$ mean one of those.

We can multiply differentials by functions (e.g. $x^2 \mathrm{d}x$), and we can add differentials, and these operations will behave like you expect them to.

Don't try to multiply two differentials, though: the right way to do that probably does not behave the way you expect it to.

$\mathrm{d}$ satisfies the 'laws' of differentiation; e.g. $\mathrm{d}f(t) = f'(t) \mathrm{d}t$ and $\mathrm{d}(xy) = x \mathrm{d}y + y \mathrm{d}x$.

Don't try to differentiate a differential either; the usual way to do that again doesn't behave how you expect, and is probably unrelated to what you wanted to do anyways.

Anyways, if you have an equation like $\mathrm{d}y = 2x \mathrm{d}x$ (e.g. by applying $\mathrm{d}$ to the equation $y = x^2$) and $\mathrm{d}x$ is "nonzero" in a suitable sense, then it makes sense to define $\frac{\mathrm{d}y}{\mathrm{d}x}$ to mean the ratio between the differentials.
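As a small cross-check of these rules against each other (the example is mine): writing $y = x^2$ as the product $x \cdot x$ and applying the product rule for differentials gives $$\mathrm{d}y = x\,\mathrm{d}x + x\,\mathrm{d}x = 2x\,\mathrm{d}x,$$ the same differential that $\mathrm{d}f(t) = f'(t)\,\mathrm{d}t$ gives directly, and dividing by a "nonzero" $\mathrm{d}x$ recovers $\frac{\mathrm{d}y}{\mathrm{d}x} = 2x$.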

Single variable calculus is peculiar in that all of the variables and expressions you work with will have differentials that are multiples of one another. This isn't true in general; e.g. if $x$ and $y$ are independent, then $\mathrm{d}x$ and $\mathrm{d}y$ are not multiples of one another, and $\frac{\mathrm{d}y}{\mathrm{d}x}$ is utter nonsense.

Differentials are still very useful in such a setting, although the "usual" approach tends to neglect them.

There is a notion called a "partial derivative", often given similar notation $\frac{\partial y}{\partial x}$, but it really doesn't pay to treat it like a fraction, and there isn't really a corresponding notion of $\partial x$.

2

I don't know if this qualifies as an answer or not, but the problems come in the second derivative. While $\frac{dy}{dx}$ can be used as a fraction, the standard notation for the second derivative, $\frac{d^2y}{dx^2}$, cannot, at least in that form. An alternative expression for the second derivative, which can be used as a fraction, is $\frac{d^2y}{dx^2} - \frac{dy}{dx}\frac{d^2x}{dx^2}$, which can be derived simply by applying the quotient rule to the first derivative (which shows another place where $\frac{dy}{dx}$ can be treated as a quotient!).
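Sketching that derivation (here $d^2y$ abbreviates $d(dy)$ and $dx^2$ abbreviates $(dx)^2$, which is how I read the formula above): apply the quotient rule for differentials to $\frac{dy}{dx}$ and then divide by $dx$: $$\frac{d\!\left(\dfrac{dy}{dx}\right)}{dx} = \frac{1}{dx}\cdot\frac{dx\,d(dy) - dy\,d(dx)}{(dx)^2} = \frac{d^2y}{dx^2} - \frac{dy}{dx}\,\frac{d^2x}{dx^2},$$ which is exactly the expression above.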

Someone also mentioned partial derivatives. You can treat those as fractions, too, provided you use sufficient notation. The reason you can't split $\frac{\partial y}{\partial x}$ is that there is information about the numerator contained in the denominator. In other words, the $\partial y$ in $\frac{\partial y}{\partial x}$ is a different quantity from the $\partial y$ in $\frac{\partial y}{\partial t}$. If you use notation that distinguishes them as different values (say, $\partial_ty$ and $\partial_xy$), then you can, in fact, split these fractions successfully. If you do this, the problem pointed out by B. Goddard goes away.

1
  • Just to note, additional details for these points can be found in the paper "Total and Partial Differentials as Algebraically Manipulable Entities". arxiv.org/abs/2210.07958
    – johnnyb
    Commented Jul 12 at 14:39
1

Treating derivatives as ratios allows us to come up with a heuristic argument for the product rule, if we agree that the product of two differentials is negligible:
\begin{align}
d(xy) &= (x+dx)(y+dy) - xy \\[5pt]
&=x \, dy+y\,dx+dx \, dy \\[5pt]
&= x \, dy+y \, dx \\[5pt]
\frac{d(xy)}{dt} &= x\frac{dy}{dt}+y\frac{dx}{dt}
\end{align}
However, try doing the same thing with the quotient rule, and things don't work out so nicely:
\begin{align}
d\left(\frac{x}{y}\right)&=\frac{x+dx}{y+dy}-\frac{x}{y} \\[5pt]
&= \frac{y(x+dx)-x(y+dy)}{y(y+dy)} \\[5pt]
&= \frac{y \, dx-x \, dy}{y^2+y \, dy} \, .
\end{align}
Here, it is difficult to argue why the term $y \, dy$ should be negligible, meaning that the quotient rule cannot be derived in a straightforward manner by treating derivatives as ratios.

0

Proposition 1 is an example of a situation in which ${\mathrm dy\over\mathrm dx}$ cannot be seen as a quotient.

(The definitions of derivative and differential are given at the end.)

Proposition 1:
For every function $f$ with $x=f(t)$ that is differentiable at $t_0$ (its derivative there being written ${\mathrm dx\over\mathrm dt}$), if $g$ with $y=g(x)$ is also differentiable at $f(t_0)$, then the derivative of $g\circ f$ at $t_0$ is $${\mathrm dy\over\mathrm dt}={\mathrm dy\over \mathrm dx}\cdot{\mathrm dx\over\mathrm dt}$$ (End of proposition.)

The conclusion is that at least one of ${\mathrm dy\over \mathrm dx}$ and ${\mathrm dx\over \mathrm dt}$ cannot be understood as a quotient. The explanation is as follows. If both of them were quotients, then $\mathrm dx$ would simply cancel; but in that case the chain rule would not be a meaningful statement (only a sentence without mathematical meaning), whereas the chain rule certainly is meaningful. The reason the "both are quotients" reading makes it meaningless: not every function $f$ that is differentiable at $t_0$ fits the statement, since $\mathrm dx$ can be $0$ and $0$ cannot stand in a denominator; furthermore, quantities carrying the same notation (such as $\mathrm dx$ in that statement) must be read as identical in mathematics. Therefore at least one of the two cannot be seen as a quotient.
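(A concrete instance, with the functions chosen purely for illustration: take $x=f(t)=\sin t$ and $y=g(x)=x^2$ at $t_0=\pi/2$. Then $\mathrm dx=f'(t_0)\,\mathrm dt=0$, so ${\mathrm dy\over\mathrm dx}$ cannot literally be a quotient with $\mathrm dx$ in the denominator; yet the chain rule is perfectly meaningful and gives ${\mathrm dy\over\mathrm dt}=g'(f(t_0))\,f'(t_0)=2\cdot 0=0$.)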

Proposition 2 is an example in which symbols of the form ${\mathrm dy\over\mathrm dx}$ can only be seen as quotients.

Proposition 2:
Suppose the function $f$ with $x=f(t)$ is differentiable at $t_0$, and $g$ with $y=g(u)$ is differentiable at $u_0$. Then $${\mathrm dx\over\mathrm dt}\cdot{\mathrm dy\over\mathrm du}={\mathrm dy\over \mathrm dt}\cdot{\mathrm dx\over\mathrm du}$$ (End of proposition)

In this proposition, ${\mathrm dy\over\mathrm dt}$ and ${\mathrm dx\over\mathrm du}$ can only be understood as quotients; otherwise the proposition is either a false statement or a meaningless sentence. There are two situations if one understands them the other way round (as derivatives):

  1. If $y$ is not regarded as a function of $t$, then ${\mathrm dx\over\mathrm du}$ and ${\mathrm dy\over\mathrm dt}$ are not defined, and the sentence is meaningless.
  2. If $y$ is regarded as a (constant) function of $t$ and $x$ as a (constant) function of $u$, then ${\mathrm dx\over\mathrm du}={\mathrm dy\over\mathrm dt}=0$, so the sentence is meaningful, but it is a false statement.

It follows that both ${\mathrm dx\over\mathrm du}$ and ${\mathrm dy\over\mathrm dt}$ in this proposition must be treated as quotients.

Definition of derivative:
Suppose $f$ is a function defined on a subset $D$ of the real numbers, and $x_0\in D$. $$\forall A\Big(\forall\varepsilon\in\mathbf R_+\,\exists\delta\in\mathbf R_+\,\forall x\in\mathring U(x_0,\delta)\Big(\Big|{f(x)-f(x_0)\over x-x_0}-A\Big|<\varepsilon\Big)$$ $$\leftrightarrow A\ \text{equals the derivative of}\ f\ \text{at}\ x_0\Big)$$ $A$ is denoted by $\displaystyle{\mathrm df\over\mathrm dx}$.

Definition of differential:
Suppose the derivative of the function $f$ at the point $x_0$ is $A$. $$\forall g\big(\forall h\in\mathbf R\,(g(x_0,h)=Ah)\leftrightarrow g\ \text{equals the differential of}\ f\ \text{at}\ x_0\big)$$ The quantity $Ah$ is denoted by $\mathrm df$ and is called the dependent variable of the differential.
