I'm counting total pennies in one of my games. It's purely informational and may or may not ever need more than 32 bits. I just figured I'd use LongInteger since it's available. If it weren't and the counter just wrapped around, it would be no big deal.
If one gets a penny for every second they play, it will take 2^31 seconds, or about 68 years, to hit the signed 32-bit limit, and they'll have ~$21M at that point (do you offer a cash-out option? :twisted: )
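A quick back-of-envelope check of those numbers (plain Python here, but the arithmetic is the same in BrightScript or any other language):

```python
# One penny per second until a signed 32-bit counter would wrap.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

limit = 2**31 - 1                  # max value of a signed 32-bit int
years = limit / SECONDS_PER_YEAR   # ~68 years of continuous play
dollars = limit / 100              # pennies -> dollars, ~$21.4M

print(f"{years:.1f} years, ${dollars:,.2f}")
```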
For counting money though, just use Double: per IEEE 754 it's guaranteed to preserve at least 15 significant decimal digits on conversion. That means you can hold amounts up to the US federal budget ($3+ trillion) with precision down to a cent. The nice thing is that even if you exceed that, it starts "failing" gracefully, i.e. dropping the cents, like the IRS does. Not to mention there are about two digits of hidden headroom (internally it works at the equivalent of roughly 17 decimal digits of precision).
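A small sketch of that claim, assuming the counter holds a whole number of pennies in an IEEE 754 double (which is what BrightScript's Double is). Integers are exact in a double up to 2^53, so cent precision survives well past a multi-trillion-dollar total and only starts eroding beyond that:

```python
# $3 trillion, tracked as a penny count in a double: still exact.
budget_pennies = 3_000_000_000_000 * 100     # 3e14 pennies, < 2**53
total = float(budget_pennies) + 1.0          # add one penny
assert total - float(budget_pennies) == 1.0  # the penny is not lost

# Past 2**53 pennies (~$90 trillion) the cents start to go:
big = float(2**53)
assert big + 1.0 == big                      # an added penny is absorbed
```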
"renojim" wrote: But what if it's 1000000 pennies at a time? 😄
Then you can afford ~2,000 of those $10,000 wins.
Or if you switch to Double, which I advocate, you can handle 1,000,000,000 events like the above; at one per second it would take 30+ years before you even lose penny precision (let alone overflow!). And after that, umm... why exactly do you need to pinch every penny? 😛