I am testing the numeric type in Postgres with respect to precision and scale. I declared a column as numeric(1000,900), which gives me 900 digits after the decimal point, but when I test it by repeatedly halving 1 and inserting the result, it stops after the 1023rd halving and throws: ERROR: value out of range: overflow (SQL state: 22003).
I use this statement to insert:
insert into numeric_test
select ((1 / power(2, i))::numeric) + 10000000,
       pg_column_size((1 / power(2, i)::numeric) + 10000000)
from generate_series(1, 1024) as i;
And this is my table:
create table numeric_test (
pos numeric(1000,900),
sz integer
);
alter table numeric_test add column index_x integer;
Is there a limit to how big a numeric value can get? The pg_column_size of the 1023rd iteration is 127 bytes, and the 1024th iteration throws the overflow error.
1 Answer
The numeric(1000,900) column has nothing to do with it; the select alone produces the overflow error.
Calling power() with integer arguments defaults to the floating-point (double precision) version of the function. power(2, 1024) is what overflows, as the maximum base-2 exponent a double precision value can hold is 1023.
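You can see the cutoff directly with a minimal check (run each statement on its own):

-- 2^1023 still fits in a double precision value (about 8.99e307)
select power(2, 1023);

-- 2^1024 exceeds the double precision range and fails with
-- ERROR: value out of range: overflow
select power(2, 1024);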
There is also a version of power() which takes numeric arguments and will handle 2^1024 without overflowing. If you cast your inputs to numeric, it will call this version instead:
select (1 / power(2, i::numeric)) + 10000000,
       pg_column_size((1 / power(2, i::numeric)) + 10000000)
from generate_series(1, 1024) as i;
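If you want to confirm which overload gets resolved, pg_typeof() makes it visible (the column aliases here are just for illustration):

-- integer arguments resolve to power(double precision, double precision);
-- a numeric argument resolves to power(numeric, numeric)
select pg_typeof(power(2, 10))          as float_version,
       pg_typeof(power(2, 10::numeric)) as numeric_version;
-- float_version    | numeric_version
-- double precision | numeric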