The SQL:2003 standard (§6.1 Data Types), whose relevant excerpt you can see transcribed in that answer on SOen (the standard itself cannot be consulted for free; it must be purchased), establishes a small difference between these two types:
- `NUMERIC` must have exactly the specified precision, as well as the scale;
- `DECIMAL` must have at least the specified precision, and exactly the specified scale.
In other words, if an implementation chooses to represent `DECIMAL` with more precision than requested, it is free to do so. But if it implements the type with exactly the requested precision, making it functionally equivalent to `NUMERIC`, that also conforms to the standard.
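A minimal sketch of what this difference permits (the table and values are hypothetical; the comments describe what the standard allows, not what any particular DBMS does):

```sql
-- Both columns declare precision 5 and scale 2.
CREATE TABLE prices (
    a NUMERIC(5, 2), -- standard: exactly 5 total digits, 2 after the point
    b DECIMAL(5, 2)  -- standard: AT LEAST 5 total digits, exactly 2 after the point
);

-- 999.99 is the largest value that fits in 5 digits with scale 2,
-- so both columns must accept it.
INSERT INTO prices (a, b) VALUES (999.99, 999.99);

-- 1234.56 needs 6 digits: column a must reject it (exact precision),
-- while column b may accept it if the implementation chose a higher precision.
-- INSERT INTO prices (b) VALUES (1234.56);
```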
MySQL, as pointed out by bfavaretto in the comments, is one of the DBMSs that make no distinction between the two types (SQL Server is another). According to the MySQL documentation, "`NUMERIC` is implemented as `DECIMAL`", so their behavior is identical. And, as required by the standard, the precision used is exactly the one requested.
As for using one or the other, I do not have enough experience to comment, but bigown's argument in his answer, that using `DECIMAL` can hurt portability (i.e. potentially producing different results when the database is migrated from one DBMS to another), is already a good reason, in my opinion, to avoid that type.
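To make that portability risk concrete, here is a hypothetical scenario (the permissive "DBMS X" is invented for illustration; the MySQL behavior described in the comments is real):

```sql
CREATE TABLE t2 (d DECIMAL(5, 2));

-- 123456.78 needs 8 digits, more than the declared 5.
INSERT INTO t2 VALUES (123456.78);

-- On a hypothetical DBMS X that implements DECIMAL(5,2) with extra
-- precision, this insert could succeed with the value kept intact.
-- On MySQL (exact precision), it fails with an out-of-range error in
-- strict mode, or is clamped to 999.99 with a warning otherwise.
-- Code that relied on the extra digits would break after migration.
```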