Hi guys,
I've been trying to write a script that adds decimal numbers. I know JavaScript's floating-point math doesn't handle decimals exactly, so you have to work around it.
I've looked around on the net and it seems most people use the Math.pow() method, something like the sketch below. However, I don't understand why. But then again I was never good at math. Considering I flunked every year :-)
The only thing I want to do is add. So if I have 2.48 twice, I would like it to show 4.96. Right now it just gives me 4.
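For reference, the pattern I keep running into looks roughly like this. This is my own reconstruction from the examples I've found, so the details may be off:

```javascript
// Binary floating point can't represent most decimal fractions
// exactly (e.g. 0.1 + 0.2 gives 0.30000000000000004), so the trick
// is to scale up to whole units, add as integers, then scale back.
var factor = Math.pow(10, 2); // 2 decimal places -> scale by 100

var a = 2.48;
var b = 2.48;

// Round after scaling to shake off any leftover float error,
// add the whole numbers, then divide back down.
var sum = (Math.round(a * factor) + Math.round(b * factor)) / factor;

console.log(sum); // 4.96
```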
If anyone could clear it up for me, that would be nice.