I found an interesting problem on SQL Server yesterday. Try this:
DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45 -- this is 6015.6521730000
DECLARE @b DECIMAL (18,10) =
    CASE
        WHEN 1 > 2 THEN @a / 100
        WHEN 1 = 2 THEN 56
        ELSE @c
    END
SELECT @b
It seems to be a problem with precision and with how the CASE expression is typed when it is compiled: SELECT @b doesn't return the 6015.6521730000 held in @c, but a value that has lost precision. It can easily be remedied by casting the REAL @a to a DECIMAL inside the CASE, but as it's @c we're returning and the other branches should never be hit, it's a strange problem to come across. Does anyone know enough about SQL compilation to explain this?
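For reference, here is a minimal sketch of that workaround, using the same variables as above; the cast to DECIMAL(18,10) is just my assumption, any DECIMAL with enough scale should behave the same:

DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45
DECLARE @b DECIMAL (18,10) =
    CASE
        -- casting @a means no branch carries the REAL type any more
        WHEN 1 > 2 THEN CAST(@a AS DECIMAL(18,10)) / 100
        WHEN 1 = 2 THEN 56
        ELSE @c
    END
SELECT @b -- should now return 6015.6521730000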
It's down to Data Type Precedence, as the documentation for CASE explains:
[CASE] Returns the highest precedence type from the set of types in result_expressions and the optional else_result_expression.
http://msdn.microsoft.com/en-us/library/ms181765(v=sql.100).aspx
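You can see which type SQL Server picks for the expression directly, without the variable assignment getting in the way. A quick sketch using SQL_VARIANT_PROPERTY with the same values as the question:

DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45
SELECT SQL_VARIANT_PROPERTY(
    CASE
        WHEN 1 > 2 THEN @a / 100
        WHEN 1 = 2 THEN 56
        ELSE @c
    END, 'BaseType') -- I'd expect 'real' here, not 'decimal'

REAL only carries about 7 significant digits, which is why 6015.6521730000 comes back with the tail mangled once it has been squeezed through that type.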
You can break it entirely by adding the line
WHEN 1 = 3 then getdate()
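GETDATE() returns a datetime, which sits above both real and decimal in the precedence list, so with that branch present the whole CASE is typed as datetime, and I'd expect the assignment to a DECIMAL variable to fail with a conversion error rather than just lose precision. A sketch of the broken version:

DECLARE @a REAL = 0.0000
DECLARE @c DECIMAL (18,10) = 20754/3.45
-- the assignment below should fail: with a datetime branch present the CASE
-- itself is typed as datetime, which won't implicitly convert back to DECIMAL
DECLARE @b DECIMAL (18,10) =
    CASE
        WHEN 1 > 2 THEN @a / 100
        WHEN 1 = 2 THEN 56
        WHEN 1 = 3 THEN GETDATE() -- datetime outranks every other branch type
        ELSE @c
    END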
There's a more detailed explanation here
I imagine that the compiler assumes that all cases may be possible (your example is of course deliberately perverse :) )