Tags: javascript, math, random, ceil

Math.ceil ; Math.random


Here's a very basic issue that I cannot resolve in my mind:

In JS, if I do:

Math.random(), I will get random numbers between 0 and 1 (including 0, not including 1).

So, if I do Math.ceil(Math.random() * 100), I will get random numbers between 0 and 100. Am I right?

I will get 0 if Math.random() returns exactly 0, which is possible according to the documentation.

However, I read everywhere that Math.ceil(Math.random() * 100) gives numbers between 1 and 100.

Please, can someone clarify this situation?

Thanks!

Math.random() -> [0, 1); Math.ceil(Math.random() * 100) -> [0, 100]?
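A quick way to see the ranges empirically (the iteration count here is arbitrary, just for illustration):

    // Sample Math.ceil(Math.random() * 100) repeatedly and record the extremes.
    let min = Infinity;
    let max = -Infinity;

    for (let i = 0; i < 1000000; i++) {
      const n = Math.ceil(Math.random() * 100);
      if (n < min) min = n;
      if (n > max) max = n;
    }

    // In practice this prints "min: 1 max: 100"; a 0 would only appear if
    // Math.random() returned exactly 0, which is possible but vanishingly rare.
    console.log("min:", min, "max:", max);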


Solution

  • Math.random() will give you a random number between 0 (inclusive) and 1 (exclusive), but the chance that it's exactly 0 is incredibly low.

    Math.ceil() rounds up. Given that the chances you get exactly 0 are astronomically low, you will likely get something above 0 from Math.random(), so it's almost (but not entirely) impossible to get 0 from Math.ceil(Math.random()*100).

    So Math.ceil(Math.random()*100) has a tiny, tiny chance of giving you 0, but otherwise gives you a number between 1 and 100 (inclusive).

    Usually what you want is Math.floor(), which rounds down. It doesn't have this edge case: because Math.random() never returns 1, Math.floor(Math.random()*100) can never give you 100. It's always 0-99 inclusive, evenly distributed. Add 1 if you want 1-100, as shown in the sketch below.
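    If the goal is an integer uniformly distributed from 1 to 100, the usual pattern is to floor and then shift by one. A minimal sketch (the helper name randomInt1to100 is just for illustration, not a built-in):

        // Uniform integer in [1, 100]: Math.random() is in [0, 1),
        // so Math.random() * 100 is in [0, 100), Math.floor(...) gives 0-99,
        // and adding 1 gives 1-100 with equal probability for each value.
        function randomInt1to100() {
          return Math.floor(Math.random() * 100) + 1;
        }

        console.log(randomInt1to100());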