
I'm running a regular Python program on an i7 with 64 GB RAM. I have lots of repeats, so I have about 10 instances of this program running at the same time. Looking at the system resources (on Ubuntu 18.04), I have all 8 cores working at 100% capacity, but I'm still only using 22 GB of RAM. I'm curious: why are all cores at maximum capacity if there is RAM still left to use?

1
  • An unoptimized endless spinning loop like while(true){} will consume as much CPU as the OS will give it, but essentially no RAM, because the program never asks for any.
    – LawrenceC
    Commented Mar 3, 2019 at 16:03
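The comment above can be sketched in Python (a toy busy loop, not the asker's program): it pegs one core at 100% while allocating essentially nothing.

```python
import time

def busy_loop(seconds):
    """Spin the CPU for `seconds` doing pure arithmetic, no allocation of note."""
    end = time.monotonic() + seconds
    iterations = 0
    while time.monotonic() < end:
        iterations += 1  # pure CPU work; memory footprint stays flat
    return iterations

if __name__ == "__main__":
    # One core sits near 100% for ~1 second, yet RAM usage barely moves.
    print(busy_loop(1))
```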

2 Answers

3

I think your program is using all available processing power - that's why all cores are at 100%. But this doesn't mean that all RAM has to be used. RAM is not a substitute for CPU; it's memory. Maybe your program just doesn't require any more memory - it doesn't have to use all of the memory all the time.
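A minimal sketch of that distinction (a toy computation, not the asker's program): a CPU-bound loop can run flat out while its peak traced allocation stays tiny.

```python
import tracemalloc

def cpu_heavy(n):
    """Sum of squares computed in a loop: lots of CPU work, O(1) memory."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    tracemalloc.start()
    result = cpu_heavy(1_000_000)
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # peak stays in the bytes-to-kilobytes range, nowhere near gigabytes
    print(result, peak)
```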

3
  • 1
    So if I'm understanding correctly, the CPU is the limiting resource, not RAM. But does that also mean I'll never be able to use my 64GB of RAM, since the CPU will max out first? Should I have just bought 32GB? Commented Feb 24, 2019 at 9:02
  • 1
    With this particular Python program you may not be able to use 100% of your RAM, but there should be a way to use all of it somehow - you just need to find a RAM-intensive program to run. In this case, it seems like the CPU is the bottleneck. If this is a multi-threaded app, maybe it could run faster on a CPU with more cores.
    – Ignas
    Commented Feb 24, 2019 at 9:10
  • 1
    I'm not sure if you have a specific task in mind or just want to see what it looks like when 100% of RAM is used. But you can try to find some RAM benchmarking software, or search for RAM-intensive software, to try to utilize all of it.
    – Ignas
    Commented Feb 24, 2019 at 9:15
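If the goal is simply to exercise RAM rather than CPU, an allocation-bound sketch like the one below fills memory in fixed-size chunks and holds them live. The 200 MB cap and 50 MB chunk size are arbitrary choices for the demo; scale them up carefully.

```python
def allocate(total_bytes, chunk_bytes):
    """Allocate and hold roughly `total_bytes` of RAM in `chunk_bytes` pieces."""
    chunks = []
    while len(chunks) * chunk_bytes < total_bytes:
        chunks.append(bytearray(chunk_bytes))  # keeping the reference holds the memory
    return chunks

if __name__ == "__main__":
    # ~200 MB in 50 MB chunks: memory climbs while the CPU stays mostly idle.
    held = allocate(200 * 1024 * 1024, 50 * 1024 * 1024)
    print(f"holding ~{sum(len(c) for c in held) / 1024**2:.0f} MB")
```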
0

Under normal conditions on a multi-core platform, CPUs are limited mainly by their clock frequency (which depends on heat dissipation), by instructions per clock cycle (IPC, related to bus width; https://en.wikipedia.org/wiki/Instructions_per_cycle), and therefore by pipeline support (https://en.wikipedia.org/wiki/Instruction_pipelining). How fast the memory channels can feed the pipelines with data and instructions matters when CPUs are below 100% usage.

On Ubuntu 18.04 you could install the netdata system monitor, which shows the relation between CPU and memory usage in detailed charts, viewed in a web browser at localhost:19999. While running your Python program, you can then see how much CPU utilisation and RAM usage the program needs over a period of time (by default, roughly the last quarter hour).

Virtualization or database management systems are examples of workloads with high memory usage but varying CPU utilisation.

