Should I use int_least8_t or int_fast8_t?

3

I'm studying the integer type limits that C++11 brought and noticed that there are now officially several kinds of integers.

In addition to the usual int declaration, I can now declare:

  • int8_t
  • int_fast8_t
  • int_least8_t

In this question you have a good explanation of each type.

Consider a platform where int is 10 bits in size: there the type int8_t could not exist, since it must have exactly 8 bits.

Which is better for me to choose: int_fast8_t or int_least8_t?

asked by anonymous 28.02.2017 / 17:10

2 answers

3

Actually, unless you have a reason to choose something other than int , you should stick with int .

If you want to use char , use it. It always has 1 byte. And I doubt you will find an active platform where the byte is not 8 bits. Certainly none has fewer than 7 bits, or standard C could not work, since it must be able to represent at least 95 distinct characters. That would be the only case where int_least8_t would not be char , but in practice it does not exist.

The linked question says that int_least8_t is always char . It can be used to give different semantics and indicate more clearly that this is not a character but an integer with at least 8 bits. In practice it changes nothing.

Choose it for what? Without a problem definition, any solution can be good.

If you are going to develop for several platforms and need to guarantee, on all of them, the best possible performance for an integer of at least 8 bits, it will almost always be int . I would not "waste time" with something like this unless it proves absolutely necessary.

C and C++ require int to be at least 16 bits and no larger than long int . If it really had 10 bits, the implementation would not be standard, so anything could happen at the implementer's discretion; in practice it would no longer be the C or C++ language.

In theory, each language implementer can map whatever he wants to these types on each platform he generates code for, as long as he obeys the specification and stays within the standard. When in doubt, they will choose int , which is quite common, but it is not required to match the processor's word size.

In the end, the question is more hypothetical than practical.

28.02.2017 / 17:40
3

What's best for me to choose, int_fast8_t or int_least8_t ?

  • int_fast8_t is the fastest type with at least 8 bits; choose it if you prioritize performance

  • int_least8_t is the smallest type with at least 8 bits; choose it if you prioritize memory

These definitions exist to make your intentions explicit when choosing the type. A mathematical library that requires high performance may prefer int_fast8_t , while a library that needs to store a lot of data may prefer int_least8_t . Since both types guarantee 8 bits of precision, the programmer can be sure of performing calculations in the range [-128, 127] with int_fast8_t , and of storing values in the range [-128, 127] with int_least8_t .

In most personal-computer processors (where I usually program), both types map to signed char , but suppose I develop a program and later want to port it to another platform (such as compiling a game for some cell phone or video game console): the compiler (or programmer) can then change int_fast8_t to some faster type on the new processor, increasing the performance of the math functions without changing the operation of the code or other parts of the program.

(this may sound obvious to anyone who speaks English, but expanding the names of the types helps:)

  • int_fast8_t : fast integer type with 8 bits

  • int_least8_t : integer type with at least 8 bits

28.02.2017 / 18:49