To convert a character into an integer

  • iaslam
    New Member
    • Jul 2007
    • 5

    To convert a character into an integer

    How can we convert a character into an integer?
    char a[40];
    My input to this may be something like 21466845687552.
    Now I want to access each element, convert it into an integer, and then apply arithmetic operations. Can I do this?
  • Girish Kanakagiri
    New Member
    • May 2007
    • 93

    #2
    Originally posted by iaslam
    How can we convert a character into an integer?
    char a[40];
    My input to this may be something like 21466845687552.
    Now I want to access each element, convert it into an integer, and then apply arithmetic operations. Can I do this?
    Use the function atoi().

    This will convert a character string to an integer.

    Regards,
    Girish.


    • archonmagnus
      New Member
      • Jun 2007
      • 113

      #3
      I try to avoid atoi(). In C++, it is better (in my opinion) to use stringstreams. Try this:

      [code=cpp]
      #include <sstream>

      char a[40];
      int integerValue[40];
      std::stringstream foo;

      // ... read in character array...

      // This converts each character from char to int
      for (int i = 0; i < 40; i++)
      {
          foo.clear();             // reset the stream state between passes
          foo << a[i];
          foo >> integerValue[i];
      }

      // ... do your manipulations, etc. ...
      [/code]

      Of course, the code is a bit more robust if you use vectors instead of arrays. Then, you aren't limited to fixed sizes.


      • weaknessforcats
        Recognized Expert Expert
        • Mar 2007
        • 9214

        #4
        I think the OP said each element of the array had to be converted to an integer.

        If that's the case, all you do is:
        [code=cpp]
        char a[40];

        int i = a[1] - 48;
        [/code]

        Fashion a loop around this and off you go.


        • JosAH
          Recognized Expert MVP
          • Mar 2007
          • 11453

          #5
          Originally posted by weaknessforcats
          I think the OP said each element of the array had to be converted to an integer.

          If that's the case, all you do is:
          [code=cpp]
          char a[40];

          int i = a[1] - 48;
          [/code]

          Fashion a loop around this and off you go.
          Yuck, that fails miserably on EBCDIC machines (IBM). Better make that:

          [code=c]
          int i = a[1] - '0';
          [/code]

          kind regards,

          Jos ( <--- nitpicker ;-)


          • weaknessforcats
            Recognized Expert Expert
            • Mar 2007
            • 9214

            #6
            Originally posted by JosAH
            Yuck, that fails miserably on EBCDIC machines (IBM). Better make that:
            Really? Are you saying that C on a 370 isn't ASCII??

            EBCDIC is a 16 bit character set. So, sizeof(char) is 2 bytes and all the C library functions (and the sorts since EBCDIC sorts differently from ASCII) have been rewritten to EBCDIC??

            Really??


            • JosAH
              Recognized Expert MVP
              • Mar 2007
              • 11453

              #7
              Originally posted by weaknessforcats
              Really? Are you saying that C on a 370 isn't ASCII??

              EBCDIC is a 16 bit character set. So, sizeof(char) is 2 bytes and all the C library functions (and the sorts since EBCDIC sorts differently from ASCII) have been rewritten to EBCDIC??

              Really??
              Last time I checked, EBCDIC was also an 8 bit character encoding. Have a look
              at this table. EBCDIC used 8 bits all the time where ASCII used only 7 bits;
              the 8th bit came into play when they realized that they had to go 'international' ;-)
              IBM never had those intentions; quite smart actually because ASCII lacks the
               bitwidth for true Unicode anyway (except for those UTF-8 hacks).

              This is a nice link that clearly shows where EBCDIC came from.
              And where it's going ;-)

              kind regards,

              Jos

              ps. There's no need to re-implement anything w.r.t. the sort method, i.e. the
              things just get sorted according to EBCDIC, that's all.


              • JosAH
                Recognized Expert MVP
                • Mar 2007
                • 11453

                #8
                Originally posted by weaknessforcats
                EBCDIC is a 16 bit character set. So, sizeof(char) is 2 bytes and all the C library funcitons (and the sorts since EBCDIC sorts differently from ASCII) have been rewritten to EBCDIC??
                (forgot to reply to this part of your reply)

                It doesn't matter how many bits are in a char. The sizeof() operator returns 1 by
                definition. All that matters is that sizeof(char) <= sizeof(short) <= sizeof(int) etc.
                and their minimum maximal values as defined in limits.h.

                e.g. on a typical old ARM processor a char might be 32 bits and all sizeof()
                values would be equal to 1.

                kind regards,

                Jos

                ps. I don't think there were C implementations for those old Arms; I might be wrong.


                • weaknessforcats
                  Recognized Expert Expert
                  • Mar 2007
                  • 9214

                  #9
                  I guess I remembered EBCDIC as more than 8 bits.

                  However, it was really cruel to bring up an 029 key punch. To this day I can still read those Hollerith punches.

                  Thank you for straightening me out again.


                  • JosAH
                    Recognized Expert MVP
                    • Mar 2007
                    • 11453

                    #10
                    Originally posted by weaknessforcats
                    I guess I remembered EBCDIC as more than 8 bits.

                    However, it was really cruel to bring up an 029 key punch. To this day I can still read those Hollerith punches.

                    Thank you for straightening me out again.
                    I remember those IBM 29s too; remember that 'control card' behind that little
                    window in the middle of that noisy thing? You could actually program that darn
                    thing a bit, like advance to position so-and-so before typing; release the card
                    after position so-and-so; oh the utmost fun we had sabotaging these things! ;-)

                    kind regards,

                    Jos


                    • maan
                      New Member
                      • Jul 2007
                      • 3

                      #11
                      I hope this program will convert a character into an integer.

                      <complete spoonfed code deleted>

                      Don't supply full code solutions. Read this part of the guidelines.

                      Please don't do this again; you stand the risk of a ban.

                      kind regards,

                      Jos

