I'm trying to generate a random number between two given values. I'm able to produce this with a pretty standard little function; however, when I try to set the maximum and minimum values through an input field, I get some unexpected results.
This is using jQuery, which isn't necessary for this particular function but is needed for the larger project.
Here's an example of what I'm finding:
https://jsfiddle.net/u2k41hzd/
function randomNumber(min, max) {
    points = Math.floor(Math.random() * (max - min + 1) + min);
}

$( "button" ).on( "click", function ( event ) {
    minPoints = $( ".min-points" ).val();
    maxPoints = $( ".max-points" ).val();

    randomNumber(minPoints, maxPoints);
    $( ".random" ).html(points);
});
In the case of the minimum number being 1 and the maximum being 6, I would expect to get numbers between 1 and 6. However, I get numbers between 0 and 5.
If the minimum number is 2 and the maximum 6, I would expect to get numbers between 2 and 6, but get numbers between 0 and 4. Passing in 3 and 6 gives numbers between 0 and 3, and so on.
Ignoring the input values and hard-coding them instead seems to produce the expected results with no issue. Essentially, I'm just unsure why the input values are behaving as they are. I'm sure I've misunderstood something or made a mistake somewhere, but I haven't been able to determine the reason!
Answer
The issue is that you need to add the min to the rounded number, not to the randomly generated number:
function randomNumber(min, max) {
    points = Math.floor(Math.random() * (max - min + 1)) + min;
}
To explain further, for the case of min = 2 and max = 9:
- Math.random() generates a number between 0 and 0.999999999...
- max - min + 1 = 8
- So the scaled number will be in the range 8 * 0 to 8 * 0.99999999..., i.e. 0 to 7.99999999...
- Flooring it rounds down to an integer in the range [0, 7]
- That result then needs to be offset by the starting number, i.e. the minimum allowed value (2 here), which shifts the final range to [2, 9]
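One related detail worth noting: jQuery's .val() returns a string, so the + min inside Math.floor() in the original code was concatenating rather than adding, which is likely why the hard-coded values behaved differently from the input values. Below is a minimal sketch of the full handler under that assumption, using the same selectors as the fiddle and converting the inputs to numbers first (returning the value rather than setting a global is just a small tidy-up):

// Sketch: convert the .val() strings to numbers before doing the maths,
// otherwise "+" on a string concatenates instead of adding.
function randomNumber(min, max) {
    // Floor the scaled random value first, then offset by min
    return Math.floor(Math.random() * (max - min + 1)) + min;
}

$( "button" ).on( "click", function () {
    var minPoints = Number($( ".min-points" ).val());
    var maxPoints = Number($( ".max-points" ).val());

    $( ".random" ).html(randomNumber(minPoints, maxPoints));
});

With min set to 2 and max set to 6, this produces integers from 2 to 6 inclusive.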