After finally understanding how to decode opcodes while building a CHIP-8 emulator, I decided to return to an old project of mine: a Game Boy emulator. I started by implementing the CPU and its registers. The Game Boy's CPU is similar to the 8080 and the Z80: it has six 16-bit registers, most of which are formed by joining two 8-bit registers, as described in this documentation.
I reached a point where I needed to decrement register B, which is the high byte of BC. I thought of using the bit-shift operators, but that would probably clobber the value of C. When I asked in a Discord group about Game Boy development how I could do this, a member showed me the following code:
#include <cstdint>

typedef union RegPair {
    std::uint16_t value;
    struct byte {
#ifdef LITTLE_ENDIAN_HOST
        std::uint8_t l, h; // low byte first on a little-endian host
#else
        std::uint8_t h, l; // high byte first on a big-endian host
#endif // LITTLE_ENDIAN_HOST
    } b;
} RegPair;

struct Registers {
    RegPair AF, BC, DE, HL;
    std::uint16_t SP; // Stack Pointer
    std::uint16_t PC; // Program Counter
};
This part was easy enough to understand: since a union's members share the same storage, `h` and `l` overlap with the two bytes of `value`. What puzzles me is LITTLE_ENDIAN_HOST. It clearly refers to the host machine's byte order, but what I don't understand is:
Macros are handled by the preprocessor at compile time, not at run time. I checked that my machine is little-endian, so the program would be built with the little-endian layout; would it still run correctly on a big-endian computer?
And where is this macro defined? I searched on Google, but I only find code snippets, no site explaining its definition, which header it lives in, etc.