Actually, until you have a reason to choose something other than `int`, you should choose it.
If you want to use `char`, use it. It always has 1 byte. And I doubt I'll find an active platform where the byte is anything other than 8 bits. Certainly none has fewer than 7, or standard C could not work at all, since it must be able to represent at least 95 different characters. That would be the only case where `int_fast8_t` would not be a `char`, but in practice it does not exist.
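As a quick sanity check, here is a minimal sketch (assuming a hosted implementation with `<stdio.h>`, `<limits.h>` and `<stdint.h>`) that prints the byte width and the sizes involved on the current platform:

```c
#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* CHAR_BIT is the number of bits in a byte; the standard requires at least 8 */
    printf("bits per byte:       %d\n", CHAR_BIT);
    /* sizeof(char) is 1 by definition */
    printf("sizeof(char):        %zu\n", sizeof(char));
    /* often 1 byte in practice, but only "at least 8 bits" is guaranteed */
    printf("sizeof(int_fast8_t): %zu\n", sizeof(int_fast8_t));
    return 0;
}
```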
The question there says that `int_least8_t` is always a `char`. It can be used to give different semantics and to indicate more clearly that this is not a character but an integer with at least 8 bits. In practice it does not change anything.
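A small sketch of that semantic distinction (the variable names are just illustrative):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    char initial = 'A';       /* clearly meant to hold a character */
    int_least8_t count = 0;   /* clearly meant to hold a small integer of at least 8 bits */

    count += 5;
    printf("initial = %c, count = %d\n", initial, (int)count);
    return 0;
}
```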
Choose it for what? Without a problem definition, any solution can be good.
If you are going to develop for several platforms and need to guarantee the best possible performance for an integer of at least 8 bits on all of them, that is what `int_fast8_t` is for. It will almost always be `int`. I would not "waste time" with something like this unless it proves absolutely necessary.
C and C++ require `int` to be at least 16 bits, and it cannot be wider than `long int`. If it really had 10 bits, the implementation would not be standard-conforming, so anything could happen according to the will of the implementer, but in practice it would no longer be the C or C++ language.
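Those guarantees can even be checked at compile time; a sketch using C11 `_Static_assert` (in C++ it would be `static_assert`):

```c
#include <limits.h>

/* int must cover at least the 16-bit range -32767..32767 */
_Static_assert(INT_MAX >= 32767, "int is narrower than 16 bits: not a conforming implementation");
_Static_assert(INT_MIN <= -32767, "int is narrower than 16 bits: not a conforming implementation");

/* and the range of long int must cover the range of int */
_Static_assert(LONG_MAX >= INT_MAX, "long int is narrower than int: not a conforming implementation");

int main(void) { return 0; }
```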
In theory, each language implementer can put whatever he wants into these types for each platform he generates code for, as long as he obeys the rules of the specification and stays within the standard. When in doubt, they will pick `int`, which quite commonly, but not necessarily, is the processor word.
Actually the question is more imaginary than theoretical.