You should probably scale your decimal values by 100 and represent all monetary values as whole cents. This avoids problems with floating-point arithmetic: JavaScript has no decimal data type, and its only numeric type is binary floating-point. It is therefore generally recommended to handle money as 2550 cents instead of 25.50 dollars.
Consider that in JavaScript:
var result = 1.0 + 2.0; // (result === 3.0) returns true
But:
var result = 0.1 + 0.2; // (result === 0.3) returns false
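Printing the sum shows why: the nearest double-precision value to the mathematical result is slightly larger than 0.3.

console.log(0.1 + 0.2); // 0.30000000000000004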
The expression 0.1 + 0.2 === 0.3 returns false, but fortunately integer arithmetic in floating-point is exact, so decimal representation errors can be avoided by scaling.¹
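As a minimal sketch of this approach (the helper names toCents and formatDollars are illustrative, not from any library):

// Convert a dollar amount to an integer number of cents.
// Math.round guards against artifacts such as 0.29 * 100 === 28.999999999999996.
function toCents(dollars) {
  return Math.round(dollars * 100);
}

// Format a cent amount back into a dollar string for display.
function formatDollars(cents) {
  return (cents / 100).toFixed(2);
}

var price = toCents(25.50);  // 2550
var tax = toCents(2.04);     // 204
var total = price + tax;     // 2754 -- exact integer arithmetic
console.log(formatDollars(total)); // "27.54"

The key point is that all arithmetic happens on the integer cent values; conversion back to dollars is done only for display.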
Note that while the set of real numbers is infinite, only a finite number of them (18,437,736,874,454,810,627, to be exact) can be represented exactly in the JavaScript floating-point format; all other numbers can only be approximated.²
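One practical caveat when scaling: integer arithmetic in floating-point is only exact up to 2^53, so cent values must stay within that range (roughly 90 trillion dollars). Modern JavaScript (ES2015 and later) exposes the limit as Number.MAX_SAFE_INTEGER, though the limit itself applies to every version.

console.log(Number.MAX_SAFE_INTEGER);               // 9007199254740991 (2^53 - 1)
console.log(9007199254740992 === 9007199254740993); // true -- precision is lost above 2^53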
¹ Douglas Crockford: JavaScript: The Good Parts. Appendix A: Awful Parts (page 105).
² David Flanagan: JavaScript: The Definitive Guide, Fourth Edition. 3.1.3 Floating-Point Literals (page 31).