
In Bedrock Edition, when we start a single-player world, it runs on a local server. This isn't about framerate, which is all I found answers to when I tried searching. I want to know how my processor type affects the tick speed (or maybe the milliseconds per tick), and whether there is a set equation I can apply using the clock speed listed for my processor.

For example, how many milliseconds does it take a 1.4 GHz CPU to compute a tick, as opposed to a 4.4 GHz CPU?

I'm trying my best to explain my problem, but sorry if something's unclear; you can just ask for clarification.

  • How do you know or why do you think the CPU has an influence on the duration of a tick? – Joachim, Nov 27, 2021 at 12:01

2 Answers


Unfortunately, there are far too many variables in play to calculate how quickly a CPU can process a single tick, and multithreaded code is even harder to reason about.

Note that the software may very well have a specified tick speed. This means the server would wait for as long as necessary to ensure that the tick length is exactly n milliseconds. In this case, your CPU speed doesn't even matter, as long as it's fast enough.
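For illustration, here is a minimal sketch of such a fixed-rate loop in C++ (not Minecraft's actual code, and the 50 ms budget is just the usual 20-ticks-per-second figure): the server does its work, then sleeps for whatever is left of the budget.

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto tick_length = std::chrono::milliseconds(50);  // 20 ticks per second

    while (true) {
        auto tick_start = clock::now();

        // ... simulate the world for one tick here ...

        auto elapsed = clock::now() - tick_start;
        if (elapsed < tick_length) {
            // Fast CPU: pad the tick out to exactly 50 ms.
            std::this_thread::sleep_for(tick_length - elapsed);
        }
        // Slow CPU: the tick overruns and the game simply falls behind.
    }
}
```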


Modern CPUs use many tricks to improve their performance. For example, a modern 1.4 GHz Intel Alder Lake CPU might come very close to the single-threaded performance of an old 4.4 GHz Pentium 4 (if such a chip even existed), if it isn't already faster.

In fact, nobody even bothers calculating software performance (at best, people calculate algorithmic complexity, which is related, but still very different). It's much easier to measure the performance instead, by attaching a profiler or having the software log the current timestamp on every tick.
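As a rough sketch of that measuring approach (illustrative C++, not anything from the actual game), timing each tick and logging the result could look like this:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;

    for (int tick = 0; tick < 100; ++tick) {
        auto start = clock::now();

        // ... run one tick's worth of game logic here ...

        auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(
            clock::now() - start);
        std::printf("tick %d took %lld us\n", tick,
                    static_cast<long long>(elapsed.count()));
    }
}
```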

But let's try anyway.

Assuming you are using the same CPU, and over-/underclocking it to 1.4 GHz and 4.4 GHz respectively, we can at least tell that the CPU will be roughly 3.14 times faster (ideally) at 4.4 GHz than at 1.4 GHz, because:

4.4 GHz / 1.4 GHz ≈ 3.14

And if we ignore any and all performance tricks, such as pipelining and branch prediction (and many, many more), and don't consider what code actually runs, we can calculate the minimum amount of time required to complete a single CPU tick (one clock cycle):

1 / (1.4 billion / s) = 1 s / 1.4 billion ≈ 0.714 nanoseconds

for the 1.4 GHz CPU, and

1 / (4.4 billion / s) = 1 s / 4.4 billion ≈ 0.227 nanoseconds

for the 4.4 GHz one. Multiply that number by the number of "CPU ticks per server tick", and you'll know how long a server tick takes.
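If you really want to plug numbers in, here is that back-of-the-envelope calculation as a tiny C++ program; the "cycles per server tick" figure is invented purely for illustration, since nobody knows the real number without measuring:

```cpp
#include <cstdio>

int main() {
    const double clock_hz[] = {1.4e9, 4.4e9};      // 1.4 GHz and 4.4 GHz
    const double cycles_per_server_tick = 5.0e7;   // hypothetical value

    for (double hz : clock_hz) {
        double ns_per_cycle = 1.0e9 / hz;                        // one CPU tick
        double ms_per_tick  = cycles_per_server_tick / hz * 1e3; // idealised server tick
        std::printf("%.1f GHz: %.3f ns per cycle, %.2f ms per server tick\n",
                    hz / 1e9, ns_per_cycle, ms_per_tick);
    }
}
```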

Unfortunately, the reality is far more complex, so this calculation is ultimately pointless.

First of all, the CPU cannot perform any calculations without any data. "Data" also includes the code to be executed. After all, the CPU cannot perform calculations without knowing what calculation needs to be performed in the first place. Even if the CPU is infinitely fast, it still needs to wait for the data to arrive. That's where the CPU cache comes in.

The CPU cache is where the CPU stores the data it needs to do its job. It's extremely fast (faster than RAM), but also extremely small. Modern CPUs can have multiple levels of cache, with the lowest level being the smallest and fastest, and the highest level being the largest but slowest.

How often the CPU can serve data from its caches, how often it has to fall back to the much slower RAM (if the data isn't in the cache), and how often it has to reach for the even slower HDD/SSD (if the data isn't in RAM either) can severely affect the performance of the software.
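To see why this matters in practice, here is a toy C++ benchmark (invented for illustration, nothing to do with Minecraft) that sums the same array twice: once walking it in order, once in a shuffled order. The random walk defeats the cache and is typically several times slower, even though the amount of arithmetic is identical:

```cpp
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;  // ~16 million ints, far larger than any CPU cache
    std::vector<int> data(n, 1);

    std::vector<std::size_t> order(n);
    std::iota(order.begin(), order.end(), 0);  // 0, 1, 2, ... (sequential order)

    auto run = [&](const char* label) {
        auto start = std::chrono::steady_clock::now();
        long long sum = 0;
        for (std::size_t i : order) sum += data[i];
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::printf("%s: sum=%lld, %lld ms\n", label, sum, static_cast<long long>(ms));
    };

    run("sequential access");
    std::shuffle(order.begin(), order.end(), std::mt19937{42});  // cache-hostile order
    run("random access");
}
```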

Secondly, some CPUs may be capable of executing certain tasks faster than others. For example, vector arithmetic can be performed more quickly if a CPU supports AVX, performing multiple operations in a single CPU tick instead of spreading them over several.
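As a toy illustration (assuming an AVX-capable CPU and a compiler flag such as -mavx; again, this is not game code), here is the same eight-element addition done one float at a time and then as a single 256-bit AVX instruction:

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    alignas(32) float scalar_sum[8], vector_sum[8];

    // Scalar version: eight separate additions.
    for (int i = 0; i < 8; ++i) {
        scalar_sum[i] = a[i] + b[i];
    }

    // AVX version: all eight additions issued as one 256-bit instruction.
    __m256 va = _mm256_load_ps(a);
    __m256 vb = _mm256_load_ps(b);
    _mm256_store_ps(vector_sum, _mm256_add_ps(va, vb));

    for (int i = 0; i < 8; ++i) {
        std::printf("%g %g\n", scalar_sum[i], vector_sum[i]);
    }
}
```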

There may be many, many other aspects that can affect a CPU's speed in any given situation, making it impossible to answer the question.


Minecraft caps its tick rate at 20 ticks per second; unless you use mods, you cannot change this. However, you can see the MSPT (milliseconds per tick, i.e. how long each tick takes to process) in the F3 menu. TPS = 1000 / MSPT, capped at 20. Unless your CPU is a potato, your ticks should stay at or below 50 MSPT, which keeps you at the maximum 20 TPS.
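As a quick illustration of that relationship (a made-up C++ snippet, not anything from the game), here is TPS computed from a few example MSPT values, capped at 20:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double mspt_values[] = {10.0, 50.0, 100.0};  // example measurements

    for (double mspt : mspt_values) {
        // 1000 ms per second divided by ms per tick, capped at the 20-tick limit.
        double tps = std::min(1000.0 / mspt, 20.0);
        std::printf("MSPT = %5.1f ms -> TPS = %.1f\n", mspt, tps);
    }
}
```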

