We have a stored procedure that uses SUM on an int column in its SELECT statement. For a few parameter values, SUM(integer column) exceeds the int range, so the query fails with the error:
Arithmetic overflow error converting expression to data type int
The fix would be to convert the column to bigint inside the aggregate, i.e. SUM(CONVERT(bigint, column)). But since this is only happening for a few specific parameter values at this point, I'm concerned that I'll end up penalizing the cases where the conversion isn't needed. How much of a performance difference does the conversion usually make? Is there any other way to handle this issue?
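For reference, here's a minimal repro of the overflow (the table and values are made up; int maxes out at 2,147,483,647):

```sql
-- Hypothetical repro: two int values whose sum exceeds the int range.
CREATE TABLE dbo.Demo (n int NOT NULL);
INSERT INTO dbo.Demo (n) VALUES (2000000000), (2000000000);

SELECT SUM(n) FROM dbo.Demo;                  -- Msg 8115: arithmetic overflow
SELECT SUM(CONVERT(bigint, n)) FROM dbo.Demo; -- 4000000000, succeeds
```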
1 Answer
The cost of the conversion is unlikely to be significant in the context of the entire query. The worst-case scenario is a query where the conversion is essentially the only work besides reading the rows, such as SELECT SUM(CONVERT(bigint, integer_column)) FROM dbo.HugeTable;. Once you introduce joins and other clauses, the conversion cost quickly becomes an insignificant percentage of the overall resource cost.
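One way to check this on your own data is to run the plain and converted aggregates back to back with timing statistics enabled. A sketch, assuming a hypothetical dbo.HugeTable with an int column:

```sql
-- SET STATISTICS TIME reports CPU and elapsed time per statement,
-- so the two runs can be compared directly.
SET STATISTICS TIME ON;

SELECT SUM(integer_column)                  FROM dbo.HugeTable; -- may overflow on large data
SELECT SUM(CONVERT(bigint, integer_column)) FROM dbo.HugeTable; -- safe

SET STATISTICS TIME OFF;
```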
You can avoid the explicit conversion by changing the int column itself to bigint. That has pros (no query ever needs the conversion) and cons (an additional 4 bytes per row).
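If you go that route, the change itself is a one-liner, though on a large table it can be a size-of-data operation. A sketch with the same hypothetical names:

```sql
-- Widen the column in place; on a large table this can rewrite every row
-- and hold a schema-modification lock for the duration.
ALTER TABLE dbo.HugeTable
    ALTER COLUMN integer_column bigint NOT NULL;
```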