I have a NetLogo model, simplified to this:
```
to setup
  clear-all
  create-turtles 1000 [
    fd 100
  ]
end
```
When I add a monitor widget to the UI with a reporter like `mean [xcor] of turtles` and then run `setup`, the value in the monitor changes slightly but constantly. It might show 0.2305090322262271 one moment, then 0.2305090322262268 the next, and then another similar number, on and on.

What is making my monitor widget flicker or flash like this? How can I prevent it?
This is caused by a combination of a few things:

- Monitor widgets constantly re-run their reporter to keep the displayed value up to date, even when the model is not running.
- Agentsets like `turtles` are always returned in a random order.

So the monitor constantly re-runs its `mean [xcor] of turtles` reporter, but the `turtles` agentset hands back the turtles in a random order, so the floating-point inaccuracies in `mean` accumulate in a slightly different way each time. The end result is that you see very slightly different numbers flashing through the monitor widget while nothing is happening.
You would see the same problem with `sum [xcor] of turtles` or `variance [xcor] of turtles`: it shows up anytime you reduce a bunch of floating-point numbers from an agentset into a single value. You can also see the problem by running your reporter code repeatedly in the command center, without a monitor widget at all.
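To see the order dependence in isolation, you can add the same few numbers in two different orders in the command center. This is ordinary floating-point rounding, nothing specific to NetLogo or agentsets:

```
;; the same three numbers, summed left-to-right in two different
;; orders, give two very slightly different results
show 0.1 + 0.2 + 0.3   ;; prints 0.6000000000000001
show 0.3 + 0.2 + 0.1   ;; prints 0.6
```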
The fixes are fortunately pretty easy:

1. Sort the values before reducing them: `mean sort [xcor] of turtles`, `sum sort [xcor] of turtles`, `variance sort [xcor] of turtles`. If the numbers are in the same order, you'll still have small floating-point inaccuracies, but they'll be the same every time, so you won't see the values change. This is probably the best solution, but it can be slow if you have a really large agentset.
2. Change the "Decimal places" setting of your monitors to a number where you don't notice the changing values. Since the differences in results should be small, this is usually possible.
3. Calculate the value once per `tick` in the `go` procedure instead of directly in the monitor, as sketched below.
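Here is a minimal sketch of that last fix; the global name `mean-xcor` is just an illustration. The monitor's reporter becomes simply `mean-xcor`, which is a plain variable read, so it cannot flicker on its own:

```
globals [ mean-xcor ]

to setup
  clear-all
  create-turtles 1000 [ fd 100 ]
  set mean-xcor mean [xcor] of turtles  ;; so the monitor isn't stale after setup
  reset-ticks
end

to go
  ask turtles [ fd 1 ]
  ;; recompute once per tick; the monitor just reports mean-xcor
  set mean-xcor mean [xcor] of turtles
  tick
end
```

The value can still differ from tick to tick (the agentset order is still random each time it's computed), but it only changes when the model actually runs, so the monitor stays steady while the model is idle.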