How does the use of other numerical bases in C# work?

3

When I was learning the basics of C# variables, I learned that integer literals ( Int32 ) are written in base 10, which is what we use most commonly in our day to day. A little later, I discovered that I could also declare Int32 variables in hexadecimal by adding the 0x prefix.
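For example, something like this (a minimal illustration, the variable names are mine):

    int fromDecimal = 255;   // ordinary base-10 literal
    int fromHex = 0xFF;      // the same value written in hexadecimal
    Console.WriteLine(fromDecimal == fromHex);  // True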

With this, my questions are:

  • How does this behave in the compiler? Does it convert everything to decimal before turning it into binary?
  • Are there other numeric bases we can use in C# without having to do conversions?
asked by anonymous 28.07.2017 / 16:47

1 answer

4

It's just syntax, there so the programmer can feel more comfortable writing the number the way the domain they are dealing with usually works. Most of the time that is plain decimal. There are some typical problems, when dealing with computer-level things, where hexadecimal or binary is more appropriate. To the application it is just a number, no matter how it was represented in the code. It's only a convenience, nothing special. This has nothing to do with the type system.

There is no such thing as converting everything to decimal, because decimal itself is just a representation. In fact, we can say that everything is always converted to binary, which is what the computer understands. And it really is binary: they are bits. What you see in the code in binary notation is text made of the characters 0 and 1 ; it is not a binary number, just a textual representation of one.

C# did not adopt the octal notation that other languages usually have because it is practically no longer used and often causes confusion. There are methods that convert text to and from that base.
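For example, a minimal sketch using the standard Convert methods (the values here are just for illustration):

    // Parse text written in octal (base 8) into an Int32
    int value = Convert.ToInt32("755", 8);       // 493 in decimal
    // Format an Int32 back as octal text
    string octal = Convert.ToString(value, 8);   // "755"
    Console.WriteLine($"{value} -> {octal}");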

So in hexadecimal you can represent a number with the 0x prefix (e.g. 0xD08C ), in binary with the 0b prefix (e.g. 0b1001_1101 ), and decimal is the default, needing nothing extra.

Note that you can use the digit separator ( _ ) anywhere among the digits, in any representation.
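A short sketch putting it together (the value is arbitrary); all three literals produce the same Int32:

    int dec = 53388;                      // default decimal notation
    int hex = 0xD0_8C;                    // same value in hexadecimal, with a separator
    int bin = 0b1101_0000_1000_1100;      // same value in binary
    Console.WriteLine(dec == hex && hex == bin);  // True: the base only exists in the source code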

28.07.2017 / 17:01