Not really a bug, but I'm curious what the logic is here: I have my Game of Life gadget running at full blast, set to a glider pattern so its workload is deterministic. It computes about 24 frames per second, with the timer rate set to the maximum of 45 fps.
I make an identical copy of the gadget and reset both to the test pattern. The old copy now computes 28-29 fps, while the new copy, running the exact same script asset, manages ~24-25 fps.
How does making a new copy of the same script, with both running busy calculations at full pace, speed up the original one? And why is the copy itself also a little faster on average? If I recompile the new copy with a tiny change to make sure they're not the same asset, the speeds stay the same. If I stop the new copy, or give it a very slow timer rate like 1/s, the first one returns to 24 fps.
Really not a big deal, but demanding scripts speeding up other scripts on the region just confuses me, and it makes side-by-side comparison of finetuned versions hard. It could of course just be an artifact of the timer resolution and how the scripts measure their timings (a simple "count frames; once time_delta >= N seconds, display frames/delta and reset the count" method, sketched below), but it's curiously consistent in how it behaves.
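For reference, here's a minimal sketch of that measurement approach as an LSL timer script. This is my reconstruction, not the gadget's actual code: the 5-second window, the llOwnerSay reporting, and the placeholder for the generation step are all assumptions.

```lsl
integer frames = 0;      // generations computed in the current window
float   WINDOW = 5.0;    // assumed reporting interval, in seconds

default
{
    state_entry()
    {
        llResetTime();
        // Request the fastest timer; the sim clamps it to one event
        // per server frame, i.e. 45/s at best.
        llSetTimerEvent(1.0 / 45.0);
    }

    timer()
    {
        // ... compute one Game of Life generation here ...
        ++frames;

        float elapsed = llGetTime();
        if (elapsed >= WINDOW)
        {
            // Report frames/delta over the window, then reset both.
            llOwnerSay("fps: " + (string)(frames / elapsed));
            frames = 0;
            llResetTime();
        }
    }
}
```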