How does arbitrary precision work by setting the column to numeric?
From the Postgres manual we can see that the numeric type supports up to:
131072 digits before the decimal point
16383 digits after the decimal point
These are the default limits; however, it is possible to declare the precision explicitly with:
numeric(num1, num2)
Here num1 is the precision, i.e. the total number of digits the value may have, and num2 is the scale, i.e. how many of those digits are decimals.
It should be noted that num2 counts within num1; that is, it is not added on top of the precision, it is taken out of it, e.g.:
EX1 : numeric(12,2) // I am saying I will have 12 digits, 2 of them decimals.
EX2 : numeric(8,6) // I am saying I will have 8 digits, 6 of them decimals.
In this situation EX1 can hold at most 9,999,999,999.99, while EX2 cannot exceed 99.999999, because I defined a total of 8 digits, 6 of them decimals.
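A minimal sketch of how those limits behave in practice (the table and column names below are only illustrative):

    -- hypothetical table using the two declarations above
    CREATE TABLE prices (
        total numeric(12,2),  -- up to 10 digits before the point, 2 after
        rate  numeric(8,6)    -- up to 2 digits before the point, 6 after
    );

    INSERT INTO prices VALUES (9999999999.99, 99.999999);  -- fits: the maximum values
    INSERT INTO prices VALUES (10000000000.00, 0.5);       -- fails: numeric field overflow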
Subtracting num2 from num1 gives the number of digits supported before the decimal point in our column.
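As a quick check of that rule, assuming a psql session and reusing the EX2 declaration from above:

    -- numeric(8,6): 8 - 6 = 2 digits before the decimal point
    SELECT 99.999999::numeric(8,6);   -- ok, the maximum value
    SELECT 100.0::numeric(8,6);       -- ERROR: numeric field overflow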