I watch this YouTuber's mostly excellent videos on .NET features (let's call him Nick). More often than not, he uses BenchmarkDotNet to demonstrate the performance differences between alternative approaches to programming problems.
The results this tool reports for single statements, or for full methods containing a fair amount of logic, are often just a couple of nanoseconds, sometimes even less than a single nanosecond.
With my understanding of modern microprocessors, I can't help but think this cannot be right.
Even if I take into account that the test is executed many times (warming the caches and pipelines, training the branch predictor, and all those goodies, likely rendering the results meaningless anyway), the numbers don't add up for me. A 3 GHz processor can, under optimal conditions, execute a few instructions per cycle, and a cycle is 1/3 ns. That is still a far cry from even a rudimentary C# statement, which would take tens to hundreds of instructions.
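To make that back-of-the-envelope arithmetic concrete, here is a small sketch of the calculation, assuming the 3 GHz clock mentioned above and a hypothetical benchmark result of 2 ns per invocation (the "2 ns" figure and "4 instructions per cycle" are illustrative assumptions, not numbers from any specific video):

```python
# Back-of-the-envelope check: how many instructions fit in a reported
# benchmark time? Assumed figures: 3 GHz clock, 2 ns reported result.
CLOCK_HZ = 3e9                      # 3 GHz processor (from the question)
cycle_ns = 1e9 / CLOCK_HZ           # duration of one cycle in nanoseconds
print(cycle_ns)                     # roughly 0.333 ns per cycle

reported_ns = 2.0                   # hypothetical reported result
cycles = reported_ns / cycle_ns     # cycles available in that time
print(cycles)                       # roughly 6 cycles

# Even at an optimistic 4 instructions retired per cycle (superscalar
# execution), the budget is only a few dozen instructions:
instructions = cycles * 4
print(instructions)
```

So a 2 ns result leaves room for only a few dozen instructions at best, which is the gap between the reported numbers and my mental model of what a C# statement costs.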
What is the deal with those numbers? What am I missing, or how do they arrive at these kinds of results?