"Komag" wrote:
Your guess is good, it's about a lighting system that updates only when the lighting changes, based on the player walking into doorways between "chambers". It's also affected by which other open doors are nearby. I'm planning to rework that bit to update only changed squares instead of scanning through the whole room, but in this case it would save very little, perhaps reducing from 20ms down to 5ms if I can cut out 3/4 of the work (and that's on an old classic Roku). I'll try to make similar reductions in work elsewhere, where the effect will be felt.
Let me ask this: how often is that code executed?
I was guessing it ran on every screen redraw. If you're striving for 30 frames per second, that leaves you 33ms per frame for everything, to re-calculate and re-draw. If that were the case, reducing from 20ms down to 5ms would be a huge difference.
But I see now you said it "updates only when the lighting changes based on player walking into doorways", which makes me think that recalc is needed only once every few seconds, which is about 2 orders of magnitude (100x) less often. In that case, decreasing from 20ms to 5ms is peanuts, not worth it. In the first case, you had the potential to improve performance by almost 50% (15ms out of the 33ms frame budget); in the second, the payoff is only ~0.5%, totally not worth the effort.
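To make the arithmetic concrete, here is the back-of-the-envelope math in a few lines of Python. The 20ms/5ms figures come from the posts above; the "once every 3 seconds" recalc rate is my assumption for illustration:

```python
FRAME_BUDGET_MS = 1000 / 30   # ~33.3 ms per frame at 30 fps
OLD_MS, NEW_MS = 20.0, 5.0    # recalc cost before/after optimizing
saved_ms = OLD_MS - NEW_MS    # 15 ms saved per recalc

# Case 1: the recalc runs on every frame.
print(saved_ms / FRAME_BUDGET_MS)   # ~0.45 -> ~45% of the frame budget freed

# Case 2: the recalc runs only when the player crosses a doorway,
# assumed here to be roughly once every 3 seconds (~90 frames at 30 fps).
frames_between_recalcs = 3 * 30
print(saved_ms / frames_between_recalcs / FRAME_BUDGET_MS)  # ~0.005 -> ~0.5%
```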
Should we do optimizations that we don't need? Most often the answer is "no", since they come at the cost of something else. It may be readability, and in the case of the algorithmic optimization mentioned above, it comes at the cost of complexity. E.g. if, whenever the lighting needs a refresh, you re-calculate the whole map, that's a single piece of code that always works the same. If you optimize it into two pieces though, one full recalc and an incremental/differential recalc that covers only the cells where you know something has changed, now you have two pieces to maintain and keep in sync whenever you change your structures. You are also running the risk of missing a condition under which a certain cell needs a refresh, whereas the "brute force" approach always gets every cell right since it always recalculates all of them (see the sketch below). Readability and correctness are worth more than speed, most of the time.
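To show the trade-off concretely, here's a minimal sketch of the two shapes of code being compared. Everything in it is hypothetical; the grid cells, the toy falloff model, and all the names are invented for illustration, not taken from Komag's actual code:

```python
from dataclasses import dataclass

# Hypothetical illustration only; none of these names or structures
# come from the actual game code discussed above.

@dataclass
class Cell:
    x: int
    y: int
    light: float = 0.0

def compute_light(cell, lights):
    # Toy falloff model: brightness contributed by the nearest light source.
    return max((1.0 / (1 + abs(cell.x - lx) + abs(cell.y - ly))
                for lx, ly in lights), default=0.0)

def recalc_all(cells, lights):
    """Brute force: recompute every cell. One code path, always correct."""
    for cell in cells:
        cell.light = compute_light(cell, lights)

def recalc_dirty(dirty_cells, lights):
    """Incremental: recompute only cells flagged as changed. Faster, but now
    every event that can affect lighting (a door opening, the player moving
    between chambers, ...) must remember to mark the right cells dirty,
    or those cells silently keep stale light values."""
    for cell in dirty_cells:
        cell.light = compute_light(cell, lights)
    dirty_cells.clear()
```

The brute-force version stays correct no matter what changes elsewhere; the incremental one is only as correct as the dirty-marking logic scattered through the rest of the code.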
So I say: optimize with an eye on the goal. Is there a need for the program to be faster? And if so, is there much juice to be squeezed from that particular piece? If not, never mind re-working it.