I've always understood that char is the only type whose size is fixed by the specification: its size is 1, regardless of the architecture. But I came across sizeof('a') returning 4 rather than 1.
How can that be? Did I learn it wrong?
You learned it halfway. When you ask for the sizeof of a variable of type char, or of the type char itself, the result will always be 1. It will never change, so there is no reason to use an expression to get its size. Use 1 and that's it.
You might say "just in case", "out of habit", "it could change one day". It won't change: a language specification cannot change just because someone wants it to. Programming cannot be based on beliefs; there is right and there is wrong, and at most style is a matter of taste. If you like writing sizeof(char), I can only lament it, since taste is taste. But thinking it might not be 1 is nonsense.
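If you want that guarantee made explicit, you can assert it at compile time. A minimal sketch, assuming a C11 compiler (the assertion can never fail on a conforming implementation; it only documents the rule):

/* C11: the standard defines sizeof(char) as exactly 1. */
_Static_assert(sizeof(char) == 1, "sizeof(char) is 1 by definition");

int main() {
}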
In fact, applying sizeof to a character literal does result in the size of an int, because in C a character literal has type int, not char. In my view it was a mistake for the language to specify it this way, and I see no use for it. Don't use this form either; there is no need, just use 1.
It is so wrong that C++ thought it best to break compatibility here, and in that language the result is 1.
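A minimal sketch, assuming a C11 compiler, that makes the type of the literal visible with _Generic:

#include <stdio.h>

int main() {
    /* In C, 'a' has type int, so _Generic picks the int branch. */
    printf("%s\n", _Generic('a', char: "char", int: "int", default: "other"));
    printf("%zu\n", sizeof('a'));  /* same as sizeof(int), typically 4 */
}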
In C:
#include <stdio.h>

int main() {
    char a = 'a';
    printf("%zu\n", sizeof(char)); /* always 1, by definition */
    printf("%zu\n", sizeof(a));    /* also 1: a has type char */
    printf("%zu\n", sizeof('a'));  /* sizeof(int), typically 4 */
}
See it running on ideone and on Coding Ground. I also put it on GitHub for future reference.
In C++:
#include <iostream>
using namespace std;

int main() {
    char a = 'a';
    cout << sizeof(char) << "\n"; // always 1
    cout << sizeof(a) << "\n";    // also 1
    cout << sizeof('a') << "\n";  // 1 in C++: the literal has type char
}
See it running on ideone and on Coding Ground. I also put it on GitHub for future reference.