"#define" defines a global variable?


I have always used some #defines, but now a question has arisen: when I use #define, am I creating a global variable? Is there any downside to this usage? Example:

Write a program that reads 10 numbers:

I create a #define with the amount 10 and use that quantity in the for parameters.
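A minimal sketch of what is described, assuming a hypothetical macro name QUANTITY:

    #include <stdio.h>

    #define QUANTITY 10   /* the preprocessor replaces QUANTITY with the text 10 */

    int main(void) {
        int number;
        for (int i = 0; i < QUANTITY; i++) {   /* reads QUANTITY (10) numbers */
            if (scanf("%d", &number) != 1) {
                break;   /* stop on invalid input */
            }
        }
        return 0;
    }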
asked by anonymous 30.11.2018 / 22:52

1 answer


No, not even close.

#define just tells the preprocessor that one piece of text stands for another, nothing more. So everywhere your code contains that first text, it is replaced by the second text during the initial processing, before the compiler proper ever sees it.
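For example, a sketch of that replacement, with SIZE as a hypothetical name (the substituted form can be inspected by running only the preprocessor, e.g. gcc -E file.c):

    #define SIZE 10   /* just associates the text SIZE with the text 10 */

    int v[SIZE];      /* after preprocessing the compiler sees: int v[10]; */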

There is no variable at all. It is not even a constant, although it looks like one. Therefore it makes no sense to talk about scope either: that name does not exist in the code that is actually compiled. The replacement is done throughout the code where the #define is in effect, that is, from the point where it is found until the end of that translation unit.
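A sketch of that lack of scope, with hypothetical names: the macro remains in effect past the function's closing brace, because the preprocessor ignores blocks entirely:

    void f(void) {
    #define MAX 100   /* written "inside" a function... */
    }

    int g(void) {
        return MAX;   /* ...yet still replaced here: macros have no block scope */
    }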

The way you are using it, it looks like a constant: a name that is replaced by a literal value. That value only exists at each place it is used; again, there is no scope.
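A sketch of the difference, with hypothetical names: a real constant object has an address, while the macro is only text that vanishes after preprocessing:

    #define MAX 100
    const int max = 100;

    int main(void) {
        const int *p = &max;   /* fine: max is an object with an address */
        /* const int *q = &MAX;   error: expands to &100, which is invalid */
        (void)p;
        return 0;
    }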

In more modern code it is recommended to avoid almost all use of the preprocessor, including #define. Not that it should never be used, but it is best to avoid it when there are alternatives.
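For a value like this one, alternatives that give a real, typed name (shown with hypothetical identifiers) could be:

    enum { QUANTITY = 10 };    /* an integer constant expression with a real name */
    const int quantity = 10;   /* a typed, read-only object (in C, not a constant expression) */

    int main(void) {
        int v[QUANTITY];       /* OK: enum constants work as array sizes */
        (void)v;
        (void)quantity;
        return 0;
    }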


30.11.2018 / 22:59