I'm just starting a course on Computational Number Theory and have very little Computer Science background, but I definitely know enough about big-O notation.
I currently have an assignment to work on:
An algorithm reads in the binary digits of a positive integer $N$ and determines whether $N$ is a multiple of 3. Show that the algorithm takes time at least of order $n$, where $n$ is the number of binary digits of $N$.
Basically, I have thought up an algorithm which I'm pretty sure is correct. (I don't know the best/correct way to write an algorithm, but here's my attempt.)
Start with a binary number $x=a_na_{n-1}\dots a_1a_0$.
Then compute $x'=\sum_{i=0}^{n} \left(\frac{3}{2}+\frac{1}{2}(-1)^{i+1}\right)a_i$.
Then $x'\equiv 0 \pmod 3 \iff 3\mid x$.
This works because $2^{2k}\equiv 1\pmod 3$ and $2^{2k+1}\equiv 2\pmod 3$.
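To check my understanding, here is a minimal Python sketch of the algorithm above (the function name and digit ordering are my own choices; I take the digits least-significant first so that index $i$ matches $a_i$ in the sum):

```python
def divisible_by_3(bits):
    """bits[i] is the binary digit a_i, with a_0 the least significant.

    Each digit a_i is weighted by 2^i mod 3: weight 1 when i is even
    (since 2^{2k} = 1 mod 3) and weight 2 when i is odd (since
    2^{2k+1} = 2 mod 3), which is exactly 3/2 + (1/2)(-1)^(i+1).
    """
    x_prime = sum((1 if i % 2 == 0 else 2) * b for i, b in enumerate(bits))
    return x_prime % 3 == 0

# 21 = 10101_2, digits least-significant first: x' = 1 + 1 + 1 = 3
print(divisible_by_3([1, 0, 1, 0, 1]))  # True, since 21 = 3 * 7
# 7 = 111_2: x' = 1 + 2 + 1 = 4
print(divisible_by_3([1, 1, 1]))        # False
```

Note the loop touches every digit once, which is where the order-$n$ running time comes from.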
I can guess that my algorithm takes time at least of order $n$, because I need order $n$ steps to compute $x'$ (one per digit).
However, I have my doubts:
How do I know my algorithm is the optimal algorithm?
Am I 'cheating' by saying $x'\equiv 0 \pmod 3 \iff 3\mid x$?
Am I even answering the question correctly by providing such an algorithm or is there some other sort of analysis to show lower bounds for algorithm time?