We're optimizing a game we're developing and have reached the phase where every CPU cycle counts. We use radians for the position calculations of objects circling around other objects, and I want to cut needless accuracy from my lookup tables. For that, we make heavy use of a predefined Pi.

So, my question is: how accurate should this Pi be?
You might as well make it as accurate as the floating-point type you store it in allows. Calculations using a more accurate value of the same type don't take any longer to perform.
Accuracy is generally measured as the number of significant digits; you'll need to decide for yourself how many digits you actually need. If you use a less accurate value for pi, its inaccuracy will propagate into every calculation that uses it.
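For illustration, here's a minimal C++ sketch (my own example, not from your code) showing that you can write pi out to more digits than the type can hold and let the compiler round it; the "extra" digits in the literal cost nothing at run time, while a `float` pi carries roughly 10^8 times the error of a `double` pi:

```cpp
#include <cstdio>

int main() {
    // Pi written with more digits than either type can represent;
    // the compiler rounds each literal to the nearest representable value.
    const double pi_d = 3.14159265358979323846;
    const float  pi_f = 3.14159265358979323846f;

    // Compare each stored value against a long double reference literal.
    const long double pi_ref = 3.14159265358979323846L;

    printf("double pi: %.17g (error ~ %g)\n",
           pi_d, (double)(pi_ref - (long double)pi_d));
    printf("float  pi: %.9g  (error ~ %g)\n",
           (double)pi_f, (double)(pi_ref - (long double)pi_f));
    return 0;
}
```

The point is simply that the precision is capped by the type, not by how many digits you type, so there's no performance reason to truncate the constant itself; if you want to save cycles, the lookup-table resolution is the place to trade accuracy, not pi.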