How to convert ASCII to binary?


I am trying to convert text to its binary representation, and I found this code on the internet:

using System.Text; // needed for Encoding

static string ASCII_binary(string texto)
{
    string converted = string.Empty;
    byte[] byteArray = Encoding.ASCII.GetBytes(texto);

    for (int i = 0; i < byteArray.Length; i++)
    {
        for (int j = 0; j < 8; j++)
        {
            // Test the most significant bit, then shift the next bit into place
            converted += (byteArray[i] & 0x80) > 0 ? "1" : "0";
            byteArray[i] <<= 1;
        }
    }

    return converted;
}

But I cannot understand what 0x80 is. Is there any other way to do this conversion?

asked by anonymous 30.11.2015 / 13:25

2 answers


Let's walk through the code step by step; that should help you understand it a little better.

The 0x80 is hexadecimal for the decimal number 128, which in binary is 10000000 . When the & operator (bitwise AND) is applied, each bit of one number is compared with the corresponding bit of the other, and the result bit is 1 only if both bits are 1; in every other case it is 0. That is how this operator works. So in the example of the letter t (ASCII 116, or 01110100 in binary), it starts with

01110100
&
10000000
--------
00000000

This gives zero, so the code appends "0" .
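A minimal C# sketch of this first step (assuming the letter t , ASCII 116):

using System;

byte b = (byte)'t';        // 01110100 (116)
int result = b & 0x80;     // 01110100 & 10000000
Console.WriteLine(result); // 0, so "0" is appended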

In the next step, the << operator is applied, which is the left shift; that is, it moves all the bits one position to the left, so it looks like this:

11101000
&
10000000
--------
10000000

This gives 128, which is greater than zero, so the code knows it should append the string 1 .

And it does this for all the other bits.
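Putting the two steps together for a single byte, here is a sketch using the same logic as the code above:

using System;
using System.Text;

byte b = (byte)'t';
var bits = new StringBuilder();
for (int j = 0; j < 8; j++)
{
    bits.Append((b & 0x80) > 0 ? '1' : '0'); // test the most significant bit
    b <<= 1;                                 // shift the next bit into position
}
Console.WriteLine(bits); // 01110100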

Note that by using 128 (0x80) we are always comparing only the first bit. The other bits will always give 0 in any situation, since in the number 128 every bit is 0 except the first. The AND operator, in this case, varies only according to the first bit of the tested number; the rest will be 0.

This is how the code walks step by step through the bits, analyzing only the first one each time. There are other ways, but they are less efficient.

See the less efficient way on dotNetFiddle. It may seem more readable to anyone who does not understand bit operators.
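The fiddle is not reproduced here, but a more readable version along those lines might look like this (the method name is mine, and Convert.ToString is one possible replacement for the bit operators):

using System;
using System.Text;

// A hypothetical, more readable version: Convert.ToString does the
// base-2 conversion and PadLeft restores the leading zeros.
static string ASCII_binary_readable(string texto)
{
    var sb = new StringBuilder();
    foreach (byte b in Encoding.ASCII.GetBytes(texto))
        sb.Append(Convert.ToString(b, 2).PadLeft(8, '0'));
    return sb.ToString();
}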

See the performance test on dotNetFiddle.

Even if you consider that the other operations of the algorithm have a fairly high cost, simply using the bit operator instead of the arithmetic operator probably gains more than an order of magnitude on the isolated operation.

The test should be run on your own computer; dotNetFiddle is not reliable for timing because it has multiple processes running at the same time. But it gives an initial basis.
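To reproduce the test locally, a simple sketch with Stopwatch (assuming the two methods above are in scope) could be:

using System;
using System.Diagnostics;

var sw = Stopwatch.StartNew();
for (int i = 0; i < 100000; i++)
    ASCII_binary("The quick brown fox");          // bit-operator version
sw.Stop();
Console.WriteLine("bit operators: " + sw.ElapsedMilliseconds + " ms");

sw.Restart();
for (int i = 0; i < 100000; i++)
    ASCII_binary_readable("The quick brown fox"); // Convert.ToString version
sw.Stop();
Console.WriteLine("Convert.ToString: " + sw.ElapsedMilliseconds + " ms");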

30.11.2015 / 14:11

Basically what the code is doing on the line

converted += (byteArray[i] & 0x80) > 0 ? "1" : "0";

is comparing the most significant bit of the current byte in byteArray, checking whether it is 1 or 0 by doing a bitwise AND with 0x80.

Then it takes the byte value and shifts its bits one position to the left, repeating this process for all 8 bits of the character.

The representation of 0x80 in binary is this: 10000000.
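You can check this quickly in C#:

using System;

Console.WriteLine(Convert.ToString(0x80, 2)); // 10000000
Console.WriteLine(0x80);                      // 128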

30.11.2015 / 14:00