Monday, 7 August 2017

c# - Rounding issues with decimal

I'm using the decimal data type throughout my project in the belief that it would give me the most accurate results. However, I've come across a situation where rounding errors are creeping in and it appears I'd be better off using doubles.



I have this calculation:



decimal tempDecimal = 15M / 78M;           // 15/78 is a recurring decimal, so the stored value is truncated
decimal resultDecimal = tempDecimal * 13M; // multiplying the truncated value gives 2.4999... instead of 2.5


Here resultDecimal comes out as 2.4999999999999999, when the correct answer for 13 * 15 / 78 is 2.5. It seems this is because 15/78 is a recurring decimal, so tempDecimal only holds a truncated approximation of it.
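Printing the intermediate value seems to confirm this (a quick diagnostic snippet; the exact digits are just what I understand decimal's roughly 28-digit precision to produce):

decimal tempDecimal = 15M / 78M;
decimal resultDecimal = tempDecimal * 13M;

// The recurring digits of 15/78 get cut off at decimal's precision limit,
// so the later multiplication can't recover the exact 2.5.
Console.WriteLine(tempDecimal);   // 0.19230769230769... (recurring digits truncated)
Console.WriteLine(resultDecimal); // 2.4999... rather than 2.5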



I subsequently round this result to zero decimal places (away from zero). Since the true value is 2.5, I was expecting 3, but I actually get 2.
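To be clear about the rounding, this is roughly the call I'm using (a sketch, assuming Math.Round with MidpointRounding.AwayFromZero; that mode only matters at an exact .5 midpoint, which 2.4999... never reaches):

// Using resultDecimal from the snippet above.
decimal roundedDecimal = Math.Round(resultDecimal, 0, MidpointRounding.AwayFromZero);
Console.WriteLine(roundedDecimal); // 2 - away-from-zero never applies because the value is just under 2.5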



If I use doubles instead:



double tempDouble = 15D / 78D;           // binary floating-point approximation of 15/78
double resultDouble = tempDouble * 13D;  // happens to round back to exactly 2.5


Then I get 2.5 in resultDouble, which is the answer I'm looking for.
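Rounding that result the same way gives me the 3 I'm after (again assuming the same Math.Round call as above):

// Using resultDouble from the snippet above; it comes out as exactly 2.5,
// so away-from-zero midpoint rounding bumps it up to 3.
double roundedDouble = Math.Round(resultDouble, 0, MidpointRounding.AwayFromZero);
Console.WriteLine(roundedDouble); // 3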



From this example it feels like I'm better off using doubles or floats, even though they have lower precision. I'm assuming I get the inexact result of 2.4999999999999999 simply because decimal stores the result to that many decimal places, whereas the double rounds it off.



Should I use doubles instead?



EDIT: This calculation is used in financial software to decide how many contracts are allocated to different portfolios, so deciding between 2 and 3 is important. I am more concerned with the correct calculation than with speed.
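For what it's worth, the only workaround I've found so far is reordering the arithmetic so the division happens last, which keeps the decimal result exact in this particular case (just an illustration that the intermediate division is the problem, not necessarily something I want to rely on in general):

// Multiply before dividing: 15 * 13 = 195, and 195 / 78 is exactly 2.5 in decimal.
decimal reordered = (15M * 13M) / 78M;
Console.WriteLine(reordered); // 2.5, which rounds away from zero to 3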
