So I am running Node.js on Windows Subsystem for Linux (WSL), and I ran a small experiment with Node.js `os.loadavg()`. Keep in mind that my machine has 16 CPU cores.
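
For reference, here is roughly how I confirmed the core count first (a minimal sketch; checkCores.js is just a throwaway name):

checkCores.js:

const os = require('os');

// What Node sees on this machine: platform, logical core count,
// and the load averages before the experiment starts.
console.log(os.platform());      // 'linux' under WSL
console.log(os.cpus().length);   // 16 on my machine
console.log(os.loadavg());       // baseline reading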

Consider 2 Node.js processes:

loadTest.js:

// busy loop that never yields: keeps one CPU core fully occupied
let x = 0;
while (true) {
  x += Math.random() * Math.sin(x);
}

index.js:

const os = require('os');

// prints the 1-, 5- and 15-minute load averages
console.log(os.loadavg());

Now,

Before running `loadTest.js`, let's run `index.js` and look at the output (we only care about the first element, the 1-minute average):

[ 0, 0.08, 0.09 ]

Now let's start `loadTest.js` and run `index.js` again at 20 seconds, 40 seconds, and 1 minute after it starts.

20 seconds:

[ 0.43, 0.47, 0.41 ]

40 seconds:

[ 0.55, 0.5, 0.42 ]

1 minute:

[ 0.68, 0.53, 0.43 ]
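
To make the sampling easier to reproduce, one could also poll the value automatically instead of re-running `index.js` by hand; a rough sketch (the 5-second interval is arbitrary):

poll.js:

const os = require('os');

// Print a timestamped load-average sample every 5 seconds so the
// ramp-up can be watched while loadTest.js is running.
setInterval(() => {
  const [one, five, fifteen] = os.loadavg();
  console.log(new Date().toISOString(), one.toFixed(2), five.toFixed(2), fifteen.toFixed(2));
}, 5000);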

Assuming that `loadTest.js` hogs one CPU core, I would expect the first value to be 1 after 1 minute, and if we were to run 16 instances of `loadTest`, my machine would be completely saturated and the first value would be 16.

Why do we get 0.68? If there is one process that always wants the CPU over the span of a minute, shouldn't the first number be at least 1? I understand that the operating system swaps processes in and out, but why would a CPU core ever be idle if `loadTest` is always doing computations?
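
To rule out `os.loadavg()` itself being the problem, a direct cross-check against the kernel's own numbers would look something like this (assuming WSL exposes /proc/loadavg, which it appears to; on Linux both values should come from the same kernel counters):

crossCheck.js:

const fs = require('fs');
const os = require('os');

// Compare Node's view of the load average with /proc/loadavg;
// the first three fields of the file are the 1-, 5- and 15-minute averages.
console.log('os.loadavg()  :', os.loadavg());
console.log('/proc/loadavg :', fs.readFileSync('/proc/loadavg', 'utf8').trim());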
