#define

  • srinivas reddy

    #define

    I have defined some variables using the #define preprocessor directive.
    And then later I checked whether I had defined a variable. Sometimes,
    even though a variable has been defined, the #if !defined(var) construct
    returns true. I am using gcc 3.0.1 on SunOS 8. I would appreciate it if
    anybody could tell me whether this is a bug or I am doing something
    wrong.

    tia,
    Srinivas
  • David White

    #2
    Re: #define

    srinivas reddy <srinivasreddy_m@yahoo.com> wrote in message
    news:ff8ef364.0307071857.7d478746@posting.google.com...
    > I have defined some variables using #define preprocessor instruction.
    > And then later I checked whether I had defined a variable. Sometimes
    > even though a variable has been defined, #if !defined(var) construct
    > is returning true. I am using gcc 3.0.1 on SunOS 8. I would appreciate
    > it if anybody could tell me whether this is a bug or I am doing
    > something wrong.

    I would have said that this surely is a bug, but I wouldn't put anything
    past the C++ preprocessor.

    Incomprehensibly, #if var1 == var2 simply converts var1 and var2 to "0"
    (yes, "0", even though the preprocessor is a _text_ replacer) if it hasn't
    come across definitions of them (something like Basic assuming that any
    undefined variable it comes across must be an int; and I thought C++ got
    rid of implicit this and implicit that because they are thought unsafe).
    0 == 0 is true, of course.

    I can only assume that those with influence who wish to see the end of the
    preprocessor altogether are trying to accelerate its death by ensuring that
    it works as badly as possible.

    DW




    • Pete Becker

      #3
      Re: #define

      David White wrote:
      >
      > I can only assume that those with influence who wish to see the end of the
      > preprocessor altogether are trying to accelerate its death by ensuring that
      > it works as badly as possible.
      >

      Then it must be that the folks who originally came up with the idea of
      the preprocessor thirty years ago tried to accelerate its death, because
      replacing undefined symbols with 0 in arithmetic expressions has been
      the rule since the beginning.

      --

      Pete Becker
      Dinkumware, Ltd. (http://www.dinkumware.com)


      • Howard

        #4
        Re: #define


        "srinivas reddy" <srinivasreddy_ m@yahoo.com> wrote in message
        news:ff8ef364.0 307071857.7d478 746@posting.goo gle.com...[color=blue]
        > I have defined some variables using #define preprocessor instruction.
        > And then later I checked whether I had defined a variable. Sometimes
        > even though a variable have been defined, #if !defined(var) construct
        > is returning true. I am using gcc3.0.1 on SunOS8. I would appreciate
        > if any body can tell me whether this is a bug or I am doing something
        > wrong.
        >
        > tia,
        > Srinivas[/color]

        Perhaps you have defined the variable, but not in the compilation unit in
        which your #if statement is written? In other words, if you #define the
        variable in unit1.h, and make your check in unit2.cpp, then you need to add
        #include "unit1.h" in unit2.cpp before checking if the variable exists.

        (I usually put my #define's in a precompiled header if they're going to be
        widely used in my projects. But that's with CodeWarrior...I don't know how
        to use gcc so it may be different.)

        Just a thought...

        -Howard




        • Paul Mensonides

          #5
          Re: #define

          David White wrote:
          > srinivas reddy <srinivasreddy_m@yahoo.com> wrote in message
          > news:ff8ef364.0307071857.7d478746@posting.google.com...
          >> I have defined some variables using #define preprocessor instruction.
          >> And then later I checked whether I had defined a variable. Sometimes
          >> even though a variable has been defined, #if !defined(var) construct
          >> is returning true. I am using gcc 3.0.1 on SunOS 8. I would appreciate
          >> it if anybody could tell me whether this is a bug or I am doing
          >> something wrong.
          >
          > I would have said that this surely is a bug, but I wouldn't put
          > anything past the C++ preprocessor.
          >
          > Incomprehensibly, #if var1 == var2 simply converts var1 and var2 to
          > "0" (yes, "0", even though the preprocessor is a _text_ replacer) if
          > it hasn't come across definitions of them (something like Basic
          > assuming that any undefined variable it comes across must be an int;
          > and I thought C++ got rid of implicit this and implicit that because
          > they are thought unsafe). 0 == 0 is true, of course.
          >
          > I can only assume that those with influence who wish to see the end
          > of the preprocessor altogether are trying to accelerate its death by
          > ensuring that it works as badly as possible.

          It isn't the preprocessor that is bad--even the conversion to 0 that you
          mention here. It is *misuse* of the preprocessor that is bad. The
          preprocessor is actually a critical component of the C and C++
          compilation process. It makes it possible to write code that works on
          multiple platforms, as well as write code that works on various current
          compilers (as opposed to the idealistic perfect C++ implementation).

          Regards,
          Paul Mensonides



          • David White

            #6
            Re: #define

            Pete Becker <petebecker@acm.org> wrote in message
            news:3F0AD150.8B6D2F5E@acm.org...
            > David White wrote:
            > >
            > > I can only assume that those with influence who wish to see the end of the
            > > preprocessor altogether are trying to accelerate its death by ensuring that
            > > it works as badly as possible.
            > >
            >
            > Then it must be that the folks who originally came up with the idea of
            > the preprocessor thirty years ago tried to accelerate its death, because
            > replacing undefined symbols with 0 in arithmetic expressions has been
            > the rule since the beginning.

            I accept that, but why hasn't it been fixed along with everything else?
            Implicit int, matching of function argument types, insistence that function
            definitions be present, etc. have been some of the many improvements to C++
            since C. I don't think anyone disputes that these are all good things. The
            more programmer errors you can detect at compile time the better. Why leave
            something there that's so obviously bad?

            DW




            • Pete Becker

              #7
              Re: #define

              David White wrote:
              >
              > Pete Becker <petebecker@acm.org> wrote in message
              > news:3F0AD150.8B6D2F5E@acm.org...
              > > David White wrote:
              > > >
              > > > I can only assume that those with influence who wish to see the end of the
              > > > preprocessor altogether are trying to accelerate its death by ensuring that
              > > > it works as badly as possible.
              > > >
              > >
              > > Then it must be that the folks who originally came up with the idea of
              > > the preprocessor thirty years ago tried to accelerate its death, because
              > > replacing undefined symbols with 0 in arithmetic expressions has been
              > > the rule since the beginning.
              >
              > I accept that, but why hasn't it been fixed along with everything else?

              Because it's not broken.
              > Implicit int, matching of function argument types, insistence that function
              > definitions be present, etc. have been some of the many improvements to C++
              > since C. I don't think anyone disputes that these are all good things. The
              > more programmer errors you can detect at compile time the better. Why leave
              > something there that's so obviously bad?
              >

              The fact that you don't understand it doesn't make it bad.

              --

              Pete Becker
              Dinkumware, Ltd. (http://www.dinkumware.com)


              • David White

                #8
                Re: #define

                Pete Becker <petebecker@acm.org> wrote in message
                news:3F0B4B74.39BC048A@acm.org...
                > David White wrote:
                > >
                > >
                > > I accept that, but why hasn't it been fixed along with everything else?
                >
                > Because it's not broken.
                >
                > > Implicit int, matching of function argument types, insistence that function
                > > definitions be present, etc. have been some of the many improvements to C++
                > > since C. I don't think anyone disputes that these are all good things. The
                > > more programmer errors you can detect at compile time the better. Why leave
                > > something there that's so obviously bad?
                > >
                > The fact that you don't understand it doesn't make it bad.

                What have I said that indicates that I don't understand it? Did I describe
                it wrongly?

                I'm interested to know: do you think that assuming that an undefined
                preprocessor symbol is "0" is a good thing, or something that wouldn't be
                improved by a compiler error saying that the symbol is undefined? If so, why
                not extend the principle to assuming that any symbol in a C++ expression is
                an 'int'?

                myVariable = 7;
                // myVariable not defined anywhere: so it must be an 'int'.

                Okay?

                myVariable = myFunction(3, "abc", 2.65);
                // myFunction not defined anywhere: so it must be
                // int myFunction(int, char *, double);

                Okay?

                DW




                • Pete Becker

                  #9
                  Re: #define

                  David White wrote:
                  >
                  > Pete Becker <petebecker@acm.org> wrote in message
                  > news:3F0B4B74.39BC048A@acm.org...
                  > > David White wrote:
                  > > >
                  > > >
                  > > > I accept that, but why hasn't it been fixed along with everything else?
                  > >
                  > > Because it's not broken.
                  > >
                  > > > Implicit int, matching of function argument types, insistence that function
                  > > > definitions be present, etc. have been some of the many improvements to C++
                  > > > since C. I don't think anyone disputes that these are all good things. The
                  > > > more programmer errors you can detect at compile time the better. Why leave
                  > > > something there that's so obviously bad?
                  > > >
                  > > The fact that you don't understand it doesn't make it bad.
                  >
                  > What have I said that indicates that I don't understand it? Did I describe
                  > it wrongly?

                  You said earlier that the preprocessor is "a _text_ replacer."

                  > I'm interested to know: do you think that assuming that an undefined
                  > preprocessor symbol is "0" is a good thing, or something that wouldn't be
                  > improved by a compiler error saying that the symbol is undefined?

                  No. It would make some things much more verbose, and would only help
                  beginners.

                  > If so, why
                  > not extend the principle to assuming that any symbol in a C++ expression is
                  > an 'int'?

                  Non sequitur.

                  --

                  Pete Becker
                  Dinkumware, Ltd. (http://www.dinkumware.com)


                  • David White

                    #10
                    Re: #define

                    Pete Becker <petebecker@acm.org> wrote in message
                    news:3F0B68CC.191AC09E@acm.org...
                    > David White wrote:
                    > >
                    > > What have I said that indicates that I don't understand it? Did I
                    > > describe it wrongly?
                    >
                    > You said earlier that the preprocessor is "a _text_ replacer."

                    Yes, and that statement was _clearly_ made in the context of #define and
                    #if.

                    #define X Y

                    Doesn't this replace the symbol 'X' found anywhere in the source code with
                    the text 'Y'?

                    Also: "Because they rearrange the program text before the compiler proper
                    sees it, macros are..." - The C++ Programming Language (3rd ed.), page 160.

                    Given that macros _do_ replace text, why should an undefined symbol become
                    '0' rather than ''?
                    > > I'm interested to know: do you think that assuming that an undefined
                    > > preprocessor symbol is "0" is a good thing, or something that wouldn't
                    > > be improved by a compiler error saying that the symbol is undefined?
                    >
                    > No. It would make some things much more verbose,

                    Such as?

                    And is the increased verbosity worse than no message from the compiler when
                    a symbol is used without having been defined?

                    Speaking of verbosity, the way to ensure that preprocessor symbols are not
                    silently converted to 0 is:

                    #if !defined(REACTOR_TYPE) || !defined(REACTOR_NEW_MODEL)
                    #error REACTOR_TYPE or REACTOR_NEW_MODEL not defined
                    #endif

                    Apart from the fact that if one remembers to do this then one would have
                    ensured that the symbols were defined, is it not verbose to place this in
                    every source file in which these symbols are used?
                    > and would only help
                    > beginners.

                    I see. So, only beginners would ever forget to ensure that both of these are
                    #defined somewhere?

                    #if REACTOR_TYPE == REACTOR_NEW_MODEL

                    > > If so, why
                    > > not extend the principle to assuming that any symbol in a C++
                    > > expression is an 'int'?
                    >
                    > Non sequitur.

                    void f(int reactorType)
                    {
                        // No definition of REACTOR_NEW_MODEL given
                        if(reactorType == REACTOR_NEW_MODEL)
                        {
                            // ...
                        }
                    }

                    Why should this be an error, but not the preprocessor version?

                    DW




                    • David White

                      #11
                      Re: #define

                      "Paul Mensonides" <leavings@comca st.net> wrote in message
                      news:ZoHOa.1206 5$H17.3639@sccr nsc02...[color=blue]
                      > David White wrote:[color=green]
                      > > Incomprehensibl y, #if var1 == var2 simply converts var1 and var2 to
                      > > "0" (yes, "0", even though the preprocessor is a _text_ replacer) if
                      > > it hasn't come across definitions of them (something like Basic
                      > > assuming that any undefined variable it comes across must be an int;
                      > > and I thought C++ got rid of implicit this and implicit that because
                      > > they are thought unsafe). 0 == 0 is true, of course.
                      > >
                      > > I can only assume that those with influence who wish to see the end
                      > > of the preprocessor altogether are trying to accelerate its death by
                      > > ensuring that it works as badly as possible.[/color]
                      >
                      > It isn't the preprocessor that is bad--even the conversion to 0 that you[/color]
                      mention[color=blue]
                      > here.[/color]

                      Well, I think the conversion to 0 _is_ bad. Given that you can use #ifdef
                      or #if defined() for things such as:
                      #ifdef __cplusplus

                      how can the implicit conversion to 0 of an undefined symbol be a good
                      thing? Why is it better than issuing an error?
                      > It is *misuse* of the preprocessor that is bad. The preprocessor is
                      > actually a critical component of the C and C++ compilation process.

                      I agree. That's why I'd like it to work _safely_.
                      > It makes it
                      > possible to write code that works on multiple platforms, as well as
                      > write code that works on various current compilers (as opposed to the
                      > idealistic perfect C++ implementation).

                      Yes, but I want to do it safely. I do not want the outcome of an #if to be
                      one of these two possibilities:
                      1. The result of the expression of previously defined symbols.
                      2. A programmer's mistake in forgetting to include the defined symbols.

                      This is inherently unsafe. The possibility of no. 2 is the reason that C++
                      insists on all function definitions being present and that there is a
                      suitable match for every argument. Does not one other person here think that
                      this is a problem?

                      DW




                      • Alexander Terekhov

                        #12
                        Re: #define


                        David White wrote:
                        [...]
                        > This is inherently unsafe. The possibility of no. 2 is the reason that C++
                        > insists on all function definitions being present and that there is a
                        > suitable match for every argument. Does not one other person here think that
                        > this is a problem?


                        (Subject: Using a define that hasn't been #defined)

                        regards,
                        alexander.

                        --
                        "Status quo, you know, that is Latin for ``the mess we're in.''"

                        -- Ronald Reagan


                        • Ron Natalie

                          #13
                          Re: #define


                          "David White" <no@email.provi ded> wrote in message news:KbLOa.8912 $eE.124878@nasa l.pacific.net.a u...
                          [color=blue]
                          >
                          > Doesn't this replace the symbol 'X' found anywhere in the source code with
                          > the text 'Y'?[/color]

                          No, it doesn't. Pete is right, you seem not to understand the preprocessor.
                          [color=blue]
                          > Given that macros _do_ replace text, why should an undefined symbol become
                          > '0' rather than ''?[/color]

                          They do not replace text, they replace tokens.



                          • Paul Mensonides

                            #14
                            Re: #define

                            Ron Natalie wrote:
                            > "David White" <no@email.provided> wrote in message
                            > news:KbLOa.8912$eE.124878@nasal.pacific.net.au...
                            >
                            >>
                            >> Doesn't this replace the symbol 'X' found anywhere in the source
                            >> code with the text 'Y'?
                            >
                            > No, it doesn't. Pete is right, you seem not to understand the
                            > preprocessor.
                            >
                            >> Given that macros _do_ replace text, why should an undefined symbol
                            >> become '0' rather than ''?
                            >
                            > They do not replace text, they replace tokens.

                            Even more specifically, they replace macro invocations:

                            #define X() Y

                            X // X

                            Regards,
                            Paul Mensonides



                            • Paul Mensonides

                              #15
                              Re: #define

                              David White wrote:

                              >> It isn't the preprocessor that is bad--even the conversion to 0 that
                              >> you mention here.
                              >
                              > Well, I think the conversion to 0 _is_ bad. Given that you can use
                              > #ifdef or #if defined() for things such as:
                              > #ifdef __cplusplus
                              >
                              > how can the implicit conversion to 0 of an undefined symbol be a good
                              > thing? Why is it better than issuing an error?

                              Because it is a "reasonable default." Reasonable defaults make code less
                              verbose. This happens with templates also:

                              template<class T> void f(int reactorType)
                              {
                                  // No definition of REACTOR_NEW_MODEL given
                                  if(reactorType == T::REACTOR_NEW_MODEL)
                                  {
                                      // ...
                                  }
                              }

                              The compiler will pass this with no problem even though it still parses
                              the expression. The reasonable default here is "non-type". The point is
                              that the language has to deal with unknown names and make assumptions
                              about what they mean in various places.

                              That is just the way it is. You know what the behavior is, so writing
                              "safe" code is up to you: use the language in safe ways. C and C++
                              certainly don't protect you from unsafe usage in many areas, so why
                              should they do that here?

                              If you changed the behavior to an error, how would you do this in a non-verbose
                              way:

                              # if !__cplusplus && __STDC_VERSION__ >= 199901L

                              You'd have to do something really annoying because you cannot use any
                              conditional test that uses the name outside the defined operator. You can't
                              even do this:

                              #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L

                              ...because that constitutes an error under your model if __STDC_VERSION__
                              is not defined. You'd have to separate the test for definition from the
                              conditional expression:

                              # if !defined __cplusplus && defined __STDC_VERSION__
                              #   if __STDC_VERSION__ >= 199901L
                              #     // 1
                              #   else
                              #     // 2
                              #   endif
                              # else
                              #   // 2
                              # endif

                              ...and that is a code doubler for point 2.

                              If you changed the behavior to expanding to nil instead of 0, you'd have
                              silent changes in other ways. You'd also end up seeing a lot of "hacks"
                              like this:

                              # if !(__cplusplus+0) && (__STDC_VERSION__+0) >= 199901L

                              In order to simulate the common scenario that we already have built into the
                              preprocessor.
                              >> It is *misuse* of the preprocessor that is bad. The preprocessor is
                              >> actually a critical component of the C and C++ compilation process.
                              >
                              > I agree. That's why I'd like it to work _safely_.

                              It does work safely if used correctly. I said it before already: The #if and
                              #elif directives are not designed to implicitly perform the kind of verification
                              that you want--because that kind of verification (if done by default) is
                              downright annoying.

                              Further, the root problem here is 1) forgetting to include a file, or
                              2) design error. Assuming that it is just a case of forgetting to
                              include the file that defines the symbols, there are many ways in which
                              a program can silently change meaning in C++ by not including a file
                              (e.g. silently choosing different function overloads or different
                              template specializations).
                              >> It makes it
                              >> possible to write code that works on multiple platforms, as well as
                              >> write code that works on various current compilers (as opposed to
                              >> the idealistic perfect C++ implementation).
                              >
                              > Yes, but I want to do it safely. I do not want the outcome of an #if
                              > to be one of these two possibilities:
                              > 1. The result of the expression of previously defined symbols.

                              It is totally ill-conceived. You can do what you want reasonably, but you
                              cannot do what it already does reasonably. You already have the option to do
                              what you want:

                              #if defined(REACTOR_TYPE) \
                                  && defined(REACTOR_NEW_MODEL) \
                                  && REACTOR_TYPE == REACTOR_NEW_MODEL

                              You can't go back the other way.
                              > 2. A programmer's mistake in forgetting to include the defined
                              > symbols.
                              >
                              > This is inherently unsafe.

                              No it isn't _inherently_ unsafe. It can be unsafe in certain contexts, and you
                              have to be aware of that when you write code. However, the alternative is much
                              worse. You can simulate what you want with a small amount of code; you cannot
                              simulate what it already does with a small amount of code.
                              > The possibility of no. 2 is the reason
                              > that C++ insists on all function definitions being present and that
                              > there is a suitable match for every argument. Does not one other
                              > person here think that this is a problem?

                              C++ does not insist that all the function declarations you've defined
                              in a group of files be present at each overload resolution--which can
                              cause silent differences in overload resolution, etc.

                              Regards,
                              Paul Mensonides

