This question is more out of curiosity; I have already addressed the problem in my code.
I made a money mask using JavaScript, and it was working perfectly until I entered the value 4.9.
Debugging the code, I discovered that the problem was in multiplying 4.9 by 100; the result of this multiplication was:
490.00000000000006
which made my mask buggy.
Other values, such as 4.8, display correctly:
480
The code below illustrates it:
alert(4.9*100);
// No problems here
alert(4.8*100);
// Another, more serious example
alert(4.6*100);
My question is this: why does JavaScript behave this strangely? I did the same multiplication in PHP and it worked normally.
And for those who ran into the same problem and just want to solve it, I solved it like this (note that toFixed returns a string):
var valor = 4.9;
var resultado = (valor * 100).toFixed(2);
alert(resultado); // "490.00"

valor = 4.6;
resultado = (valor * 100).toFixed(2);
alert(resultado); // "460.00"
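If you would rather keep a number instead of the string that toFixed returns, rounding to whole cents also works for a money mask (this variant is my own addition, not part of the original fix):

```javascript
// Working in whole cents sidesteps the stray fraction entirely:
// Math.round snaps 490.00000000000006 back to the integer 490.
var valor = 4.9;
var centavos = Math.round(valor * 100); // 490, a number
console.log(centavos);

valor = 4.6;
centavos = Math.round(valor * 100); // 460
console.log(centavos);
```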