When we say $f$ is increasing over an interval $I$, we are not making a statement about $f'$ at all; we are making a statement about $f$ itself, namely that for $a,b\in I$ with $a<b$ we have $f(a)<f(b)$.

Since this holds on every interval for the function $f(x)=x^3$, we say the function is increasing on every interval.

Note that this definition applies to functions that are not even differentiable everywhere, so we certainly don't need (or want) to use the derivative in our definition of when a function is increasing.
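To emphasize that the definition only ever consults $f$ itself, here is a minimal numerical sketch (in Python; the helper name `is_increasing_on_sample` is just for illustration) that checks the inequality $f(a)<f(b)$ over sampled pairs $a<b$ and never touches a derivative:

```python
import itertools

def is_increasing_on_sample(f, points):
    """Check f(a) < f(b) for every sampled pair a < b.

    A finite sample only certifies the inequality at those points, so this
    is an illustration of the definition, not a proof of monotonicity.
    """
    pts = sorted(points)
    return all(f(a) < f(b) for a, b in itertools.combinations(pts, 2))

# x^3 passes on a sample of [-2, 2], even though its derivative vanishes at 0.
sample = [i / 10 for i in range(-20, 21)]
print(is_increasing_on_sample(lambda x: x**3, sample))  # True
```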

Remark

A $C^1$ function (i.e., one with a continuous derivative) satisfying $f'(a)>0$ is guaranteed to be increasing on some interval containing $a$, but as your example of $x^3$ shows, the converse is not necessarily true.
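Here is a quick sketch of the standard argument behind that guarantee: since $f'$ is continuous and $f'(a)>0$, there is some $\delta>0$ with $f'(x)>0$ for all $x\in(a-\delta,a+\delta)$. Then for any $s<t$ in that interval, the mean value theorem gives
$$f(t)-f(s)=f'(c)\,(t-s)>0$$
for some $c\in(s,t)$, so $f$ is increasing on $(a-\delta,a+\delta)$ in exactly the sense defined above.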

Update

Having seen the discussion in the comments, I'm gathering that you have something of a philosophical objection to the above characterization of "increasing", and would prefer an infinitesimal definition based on the derivative, as more in keeping with "the very idea of calculus", as you put it in the main post.

The definition of "increasing" I've given is essentially standard (some would add the modifier "strictly" for clarity), but based on your concerns, let me also offer a sort of philosophical justification for why we define things the way we do.

While infinitesimal concepts (like the derivative) are certainly an important part of calculus, a perhaps even more important part of the subject concerns the relationship between infinitesimal information and the local and global behavior of a function.

This is perhaps most famously seen in the fundamental theorem of calculus, which relates the infinitesimal rate of change to the global change over an interval, but it is also seen in statements like "If $f$ has a local extremum at $x$, and $f$ is differentiable at $x$, then $f'(x)=0$" (note that the converse is not true, as the example of $x^3$ shows).
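In symbols, for a continuously differentiable $f$ on $[a,b]$ the fundamental theorem of calculus says
$$\int_a^b f'(x)\,dx = f(b)-f(a),$$
turning the infinitesimal rate of change on the left into the global change over the interval on the right.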

In other words, while we obtain infinitesimal information from the derivative, we are often doing so not for its own sake, but rather in order to answer questions that are not about infinitesimal behavior but about the local and global behavior of a function. And whether $f$ is increasing or not is one such question.

Final clarification

[upgraded from comments since it relates to the original question]

I see near the end of your post you mentioned the example of $f(x)=x^2$. Here there is a subtle distinction at play. It is true that $f$ is increasing over $[0,\infty)$. However, when we say $f$ is increasing at a point, that's usually shorthand for "increasing over some open interval containing that point."

This creates two slightly different questions:

  1. What's the largest interval over which $f$ is increasing? (answer: $[0,\infty)$).
  2. What's the set of points at which $f$ is increasing? (answer: $(0,\infty)$).

However, the first question doesn't always have a well-defined answer. Consider something like $f(x)=2x^3-3x^2$, which is increasing on $(-\infty,0]$ and on $[1,\infty)$, but not on the union of those intervals (since $f(1)<f(0)$), so which interval would you pick as the answer? For that reason, we usually ask the second question instead (for this final example the answer would be $(-\infty,0)\cup(1,\infty)$).
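For this particular example (though not in general, as $x^3$ shows), the set of points at which $f$ is increasing coincides with the set where $f'>0$, so the answer to the second question can be checked symbolically. A minimal sketch, assuming sympy is available:

```python
# Sketch: find where f'(x) > 0 for f(x) = 2x^3 - 3x^2. For this example,
# that set happens to equal the set of points at which f is increasing.
from sympy import symbols, diff, solve_univariate_inequality

x = symbols('x', real=True)
f = 2*x**3 - 3*x**2

increasing_set = solve_univariate_inequality(diff(f, x) > 0, x, relational=False)
print(increasing_set)  # Union(Interval.open(-oo, 0), Interval.open(1, oo))
```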
