Saturday, 30 September 2017

math - Why does adding two decimals in JavaScript produce a wrong result?

Why does JS screw up this simple math?

console.log(.1 + .2)  // 0.30000000000000004
console.log(.3 + .6)  // 0.8999999999999999

The first example comes out greater than the correct result, while the second comes out less. How do you fix this? Do you have to convert decimals to integers before every operation? And is it only addition I need to worry about (* and / didn't show the same problem in my quick tests)?
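
Digging a bit, the culprit seems to be binary floating point: none of 0.1, 0.2, 0.3, or 0.6 has an exact base-2 representation, so JS stores the nearest 64-bit double instead. You can inspect the stored values with toPrecision (shown here to 21 significant digits):

console.log((0.1).toPrecision(21))  // 0.100000000000000005551
console.log((0.2).toPrecision(21))  // 0.200000000000000011102
console.log((0.3).toPrecision(21))  // 0.299999999999999988898
console.log((0.6).toPrecision(21))  // 0.599999999999999977796

Adding the two slightly-too-large values in the first example lands above 0.3, while adding the slightly-too-small 0.3 and 0.6 lands below 0.9, which matches what the console prints.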

I've looked in a lot of places for answers. Some tutorials (shopping-cart examples, for instance) pretend the problem doesn't exist and just add the values together. Gurus offer complex replacement routines for the various math functions, or mention in passing that JS "does a poor job" with decimals, but I have yet to see an actual explanation.
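
For what it's worth, multiplication and division aren't actually immune (0.1 * 3 prints 0.30000000000000004 for me); the errors are just harder to hit by accident. The closest thing to a fix I've seen is scaling to integers before the arithmetic and dividing back at the end, along these lines (addMoney is just a name made up for this sketch, and it assumes two decimal places are enough):

// Work in integer cents so the addition itself is exact
function addMoney(a, b) {
  // Math.round repairs scaling artifacts like 0.1 * 100 === 10.000000000000002
  const cents = Math.round(a * 100) + Math.round(b * 100)
  return cents / 100
}

console.log(addMoney(0.1, 0.2))  // 0.3
console.log(addMoney(0.3, 0.6))  // 0.9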
