3
$\begingroup$

I'm struggling to understand what exactly T(n), and f(n) is in the above text:

When we compute the time complexity T(n) of an algorithm we rarely get an exact result, just an estimate. That’s fine, in computer science we are typically only interested in how fast T(n) is growing as a function of the input size n.

For example, if an algorithm increments each number in a list of length n, we might say: “This algorithm runs in O(n) time and performs O(1) work for each element”.
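For concreteness, that quoted algorithm might look like this (the function name is assumed, not from the text):

```python
# Hypothetical example: increment every element of a list.
# The loop runs n times and the body does constant work,
# so the algorithm is O(n) overall with O(1) work per element.
def increment_all(nums):
    for i in range(len(nums)):  # n iterations
        nums[i] += 1            # O(1) work per element
    return nums

increment_all([1, 2, 3])  # -> [2, 3, 4]
```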

Here is the formal mathematical definition of Big O:

Let T(n) and f(n) be two positive functions. We write T(n) ∊ O(f(n)), and say that T(n) has order of f(n), if there are positive constants M and n₀ such that T(n) ≤ M·f(n) for all n ≥ n₀.

[Figure: graph of T(n) and M·f(n), with T(n) below M·f(n) for all n ≥ n₀]

This graph shows a situation where all of the conditions in the definition are met.
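The witnesses M and n₀ in the definition can also be checked numerically. A small sketch, with example functions I've chosen (they are not from the quoted text):

```python
# Assumed example: T(n) = 3n + 5 is in O(n), witnessed by
# M = 4 and n0 = 5, since 3n + 5 <= 4n exactly when n >= 5.
def T(n):
    return 3 * n + 5

def f(n):
    return n

M, n0 = 4, 5
# The definition's condition, checked over a finite range:
assert all(T(n) <= M * f(n) for n in range(n0, 1000))
```

Note that the inequality fails for n = 4 (17 > 16), which is exactly why the definition only requires it to hold from some n₀ onward.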

I'm taking Tim Roughgarden's course and reading his books. I'm still at the beginning and I can follow the explanations, but sometimes I struggle to understand the mathematical meaning of things... there are implicit assumptions about what T(n) or f(n) is (in this case) that are not explained at all.

$\endgroup$

2 Answers

1
$\begingroup$

In simple terms, $T(n)$ is the running time, or time complexity, of an algorithm, while $F(n)$ can be thought of as a bound on its growth. $F(n)$ can be:

  • Constant: $F(n) = c$
  • Linear: $F(n) = a n + c$
  • Quadratic: $F(n) = a n^2 + b n + c$
  • Cubic: $F(n) = a n^3 + b n^2 + c n + d$
  • Logarithmic, exponential, etc.

$F(n)$ takes an input of size $n$ and tells us how much time the algorithm needs to process all $n$ elements. In this view, the running time $T(n)$ is exactly $F(n)$.

However, it wouldn't be convenient* to use the exact $F(n)$ as a bound, so instead of saying "the algorithm runs in exactly $2n^2 - 3n + 12$ steps" we say "the algorithm runs in $O(F(n)) = O(n^2)$ time", using big O notation.

  • Constant: $F(n) = c \implies O(F(n))=O(1)$
  • Linear: $F(n) = a n + c \implies O(F(n))=O(n)$
  • Quadratic: $F(n) = a n^2 + b n + c \implies O(F(n))=O(n^2)$
  • Cubic: $F(n) = a n^3 + b n^2 + c n + d \implies O(F(n))=O(n^3)$

*for simplicity. not an actual reason for using big O :)
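To tie this back to the formal definition, here is a quick numeric check of the quadratic example above. The constants $M$ and $n_0$ are chosen by me, not given in the answer:

```python
# T(n) = 2n^2 - 3n + 12 is O(n^2) with witnesses M = 3 and n0 = 3,
# since 3n^2 - (2n^2 - 3n + 12) = n^2 + 3n - 12 >= 0 for all n >= 3.
def T(n):
    return 2 * n**2 - 3 * n + 12

M, n0 = 3, 3
assert all(T(n) <= M * n**2 for n in range(n0, 10_000))
```

The lower-order terms $-3n + 12$ stop mattering once $n$ is large enough, which is precisely what big O captures.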

$\endgroup$
2
$\begingroup$

T(n) is a function conceived in an effort to describe the behavior of a program. There are many other quantities one could measure, but in the context of computer science, running time as a function of input size is the usual choice.

That doesn't mean we will actually write T(n) down, because i) it can be too complicated, and ii) it is probably not essential to do so.

Then comes f(n), which is another function. It denotes a set of functions, written O(f(n)), to which T(n) belongs. That alone doesn't say much about f(n), but we can use this construct to divide the universe of functions into categories, each one corresponding to a different f(n) chosen from a list of "archetypal" functions.

The definition of O(f(n)) may seem tricky, but it is consistent and general, which is important if you're going to treat the subject scientifically.

This division has immense practical benefits, since it gives us a language to talk about the behavior of programs in a way that is both precise and meaningful. Each of these function categories has functions that have something important in common, in the way that they grow.
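One way to see what those categories have in common is to sample a few "archetypal" functions at increasing input sizes. A sketch (the functions here are illustrative choices, not from the answer):

```python
import math

# Sample a few archetypal growth categories at a few sizes;
# each row grows strictly faster than the one before it.
archetypes = {
    "log n": lambda n: math.log2(n),
    "n":     lambda n: n,
    "n^2":   lambda n: n ** 2,
    "2^n":   lambda n: 2 ** n,
}
for name, g in archetypes.items():
    print(f"{name:6s}", [g(n) for n in (8, 16, 32)])
```

Doubling the input adds 1 to log n, doubles n, quadruples n², and squares 2ⁿ; that difference in growth is what the categories encode.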

The notation has limits, of course. It's not the only tool in our belt.

$\endgroup$
