signed char and unsigned char difference

  • dam_fool_2003@yahoo.com

    signed char and unsigned char difference

    For int data type the default range starts from signed to unsigned. If
    we don't want negative value we can force an unsigned value. The same
    goes for long also.
    But I don't understand why we have signed char which is -256. Does it
    means that we can assign the same ASCII value to both signed and
    unsigned. That means the ASCII value can be represented with a type of
    signed char and also unsigned char?
    For example
    int main(void)
    {
    signed char a= 'a';
    unsigned char b = 'b';
    printf("%i %c",a,b);
    return 0;
    }

    The above code does not warn about the assignment. I went through the
    FAQ, Section 8, but I didn't find the answer. Can anyone give a pointer
    regarding the above subject?
  • Jens.Toerring@physik.fu-berlin.de

    #2
    Re: signed char and unsigned char difference

    dam_fool_2003@yahoo.com wrote:
    > For int data type the default range starts from signed to unsigned. If
    > we don't want negative value we can force an unsigned value. The same
    > goes for long also.

    Sorry, but these sentences don't make sense to me. For signed ints
    the data range is INT_MIN to INT_MAX, for unsigned ints it's 0 to
    UINT_MAX. INT_MIN must be at most -32767, INT_MAX at least +32767
    and UINT_MAX at least 65535. For long there are similar minimum
    ranges, with "INT" replaced by "LONG" (i.e. LONG_MAX instead of
    INT_MAX), the minimum requirements being LONG_MIN == -(2^31-1),
    LONG_MAX == 2^31-1 and ULONG_MAX == 2^32-1. Implementations are
    allowed to support larger ranges. The actual values can be found
    in <limits.h>.
    > But I don't understand why we have signed char which is -256.

    The range for signed chars is SCHAR_MIN to SCHAR_MAX. Quite often
    (on machines with 8 bits in a char and 2's complement) this is the
    range between -128 and +127. The range for unsigned char is 0 to
    UCHAR_MAX (quite often this is 0 to 255). The ranges of -127 to
    127 for signed and 0 to 255 for unsigned chars are the minimum
    requirements, so you can be sure you can store numbers from these
    ranges wherever you have a standard-compliant C compiler. While
    there probably are some machines where you could also store -256
    in a signed char, you shouldn't rely on this; on many machines it
    won't work.
    > Does it
    > means that we can assign the same ASCII value to both signed and
    > unsigned. That means the ASCII value can be represented with a type of
    > signed char and also unsigned char?

    Yes: ASCII characters are all in the range between 0 and 127, so
    they can always be stored in a signed as well as an unsigned
    char.
    > For example
    > int main(void)
    > {
    > signed char a= 'a';
    > unsigned char b = 'b';

    There's nothing the compiler should complain about as long as you're
    using ASCII (it's different with EBCDIC because there 'a' is 129 and
    also most of the other letters are above 127, so you would better use
    unsigned char).
    Regards, Jens
    --
    \ Jens Thoms Toerring ___ Jens.Toerring@physik.fu-berlin.de
    \______________________________ http://www.toerring.de


    • Tim Prince

      #3
      Re: signed char and unsigned char difference


      <Jens.Toerring@physik.fu-berlin.de> wrote in message
      news:2mi2i6Fn791cU1@uni-berlin.de...
      > dam_fool_2003@yahoo.com wrote:
      > > Does it
      > > means that we can assign the same ASCII value to both signed and
      > > unsigned. That means the ASCII value can be represented with a type of
      > > signed char and also unsigned char?
      >
      > Yes, since ASCII characters are all in the range between 0 and 127,
      > thus they can always be stored in a signed as well as an unsigned
      > char.
      >
      > > For example
      > > int main(void)
      > > {
      > > signed char a= 'a';
      > > unsigned char b = 'b';
      >
      > There's nothing the compiler should complain about as long as you're
      > using ASCII (it's different with EBCDIC because there 'a' is 129 and
      > also most of the other letters are above 127, so you would better use
      > unsigned char).
      It's been a while, but I once used a PRIMOS system where the default ASCII
      representation had the high bit set.



      • Darrell Grainger

        #4
        Re: signed char and unsigned char difference

        On Sun, 25 Jul 2004 dam_fool_2003@yahoo.com wrote:

        > For int data type the default range starts from signed to unsigned. If
        > we don't want negative value we can force an unsigned value. The same
        > goes for long also.

        First sentence doesn't make sense to me. The rest seems obviously true.
        > But I don't understand why we have signed char which is -256. Does it
        > means that we can assign the same ASCII value to both signed and
        > unsigned. That means the ASCII value can be represented with a type of
        > signed char and also unsigned char?

        The first sentence of this paragraph makes no sense to me. There are
        systems where a char is 16 bits. On such systems you can have a signed
        char with a value of -256. Maybe the confusion is the assumption that
        char is only used to hold characters. It can be used for that purpose,
        but it can also be used as an integer data type with a very small
        range of values. If you need to save space and you never need anything
        outside the range of a char, then use a char.

        As to your question, the ASCII character set is in the range 0 to 127. A
        signed char is typically in the range -128 to 127. An unsigned char is
        typically in the range 0 to 255. The ASCII character set will fit in both
        ranges. If you are on a system where CHAR_BIT (see <limits.h>) is greater
        than 8 the range could be even larger, so it would still hold true.
        > For example
        > int main(void)
        > {
        > signed char a= 'a';
        > unsigned char b = 'b';
        > printf("%i %c",a,b);
        > return 0;
        > }

        If you give printf a %i it is expecting an int. You are passing it a
        signed char. This will have undefined behaviour. Did you mean to use:

        printf("%c %c\n", a, b);
        > The above code does not warn about the assignment. I went through the
        > faq , Section 8 but I don't find the answer. Can any one give any
        > pointer regarding the above subject?

        --
        Send e-mail to: darrell at cs dot toronto dot edu
        Don't send e-mail to vice.president@whitehouse.gov


        • Keith Thompson

          #5
          Re: signed char and unsigned char difference

          Jens.Toerring@physik.fu-berlin.de writes:
          [...]
          > There's nothing the compiler should complain about as long as you're
          > using ASCII (it's different with EBCDIC because there 'a' is 129 and
          > also most of the other letters are above 127, so you would better use
          > unsigned char).

          On any system that uses EBCDIC as the default encoding, plain char
          will almost certainly be unsigned.

          To oversimplify slightly:

          Use plain char to hold characters (the implementation will have chosen
          an appropriate representation). Use unsigned char to hold bytes. Use
          signed char to hold very small numeric values. (I actually haven't
          seen much use for explicitly signed char.)

          --
          Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.net/~kst>
          San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
          We must do something. This is something. Therefore, we must do this.


          • Jack Klein

            #6
            Re: signed char and unsigned char difference

            On Sun, 25 Jul 2004 16:06:37 GMT, "Tim Prince"
            <tprince@nospamcomputer.org> wrote in comp.lang.c:
            > <Jens.Toerring@physik.fu-berlin.de> wrote in message
            > news:2mi2i6Fn791cU1@uni-berlin.de...
            > > dam_fool_2003@yahoo.com wrote:
            >
            > > > Does it
            > > > means that we can assign the same ASCII value to both signed and
            > > > unsigned. That means the ASCII value can be represented with a type of
            > > > signed char and also unsigned char?
            > >
            > > Yes, since ASCII characters are all in the range between 0 and 127,
            > > thus they can always be stored in a signed as well as an unsigned
            > > char.
            > >
            > > > For example
            > > > int main(void)
            > > > {
            > > > signed char a= 'a';
            > > > unsigned char b = 'b';
            > >
            > > There's nothing the compiler should complain about as long as you're
            > > using ASCII (it's different with EBCDIC because there 'a' is 129 and
            > > also most of the other letters are above 127, so you would better use
            > > unsigned char).
            > It's been a while, but I once used a PRIMOS system where the default ASCII
            > representation had the high bit set.

            Then it wasn't ASCII.

            --
            Jack Klein
            Home: http://JK-Technology.Com
            FAQs for
            comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
            comp.lang.c++ http://www.parashift.com/c++-faq-lite/
            alt.comp.lang.learn.c-c++


            • Jack Klein

              #7
              Re: signed char and unsigned char difference

              On 25 Jul 2004 19:31:17 GMT, darrell@NOMORESPAMcs.utoronto.ca.com
              (Darrell Grainger) wrote in comp.lang.c:
              > On Sun, 25 Jul 2004 dam_fool_2003@yahoo.com wrote:
              >
              > > For int data type the default range starts from signed to unsigned. If
              > > we don't want negative value we can force an unsigned value. The same
              > > goes for long also.
              >
              > First sentence doesn't make sense to me. The rest seems obviously true.
              >
              > > But I don't understand why we have signed char which is -256. Does it
              > > means that we can assign the same ASCII value to both signed and
              > > unsigned. That means the ASCII value can be represented with a type of
              > > signed char and also unsigned char?
              >
              > The first sentence of this paragraph makes no sense to me. There are
              > systems where a char is 16 bits. For these systems you can have a signed
              > char with a value of -256. Maybe the confusion is that char is not used to
              > hold characters. It can be used for that purpose but it can also be used
              > as an integer data type with a very small range of values. If you need to
              > save space and you never need anything outside the range of a char, then
              > use a char.
              >
              > As to your question, the ASCII character set is in the range 0 to 127. A
              > signed char is typically in the range -128 to 127. An unsigned char is
              > typically in the range 0 to 255. The ASCII character set will fit in both
              > ranges. If you are on a system where CHAR_BIT (see <limits.h>) is greater
              > than 8 the range could be even larger, so it would still hold true.
              >
              > > For example
              > > int main(void)
              > > {
              > > signed char a= 'a';
              > > unsigned char b = 'b';
              > > printf("%i %c",a,b);
              > > return 0;
              > > }
              >
              > If you give printf a %i it is expecting an int. You are passing it a
              > signed char. This will have undefined behaviour. Did you mean to use:

              No, he is not. One can't pass any type of char as argument to a
              variadic function beyond the specified ones. The char 'a' will be
              promoted to int and the behavior is perfectly defined.

              Technically, passing unsigned char 'b' to printf() with a "%c"
              conversion specifier could be undefined because:

              1. The implementation might have UCHAR_MAX > INT_MAX (in other words,
              UCHAR_MAX == UINT_MAX) and so 'b' will be converted to unsigned,
              rather than signed, int.

              2. The standard suggests, but does not require, that the signed and
              unsigned integer types be interchangeable as function argument and
              return types.

              So this just could be undefined on a platform where the character
              types have the same number of bits as int (there are some, believe me)
              and unsigned ints are passed to variadic functions differently than
              signed ints are.

              I would not hold my breath waiting for such an implementation to
              appear. From a QOI point of view it would be horrible.

              --
              Jack Klein
              Home: http://JK-Technology.Com
              FAQs for
              comp.lang.c http://www.eskimo.com/~scs/C-faq/top.html
              comp.lang.c++ http://www.parashift.com/c++-faq-lite/
              alt.comp.lang.learn.c-c++


              • Giorgos Keramidas

                #8
                Re: signed char and unsigned char difference

                dam_fool_2003@yahoo.com writes:
                > For int data type the default range starts from signed to unsigned. If
                > we don't want negative value we can force an unsigned value. The same
                > goes for long also.

                True.
                > But I don't understand why we have signed char which is -256.

                We don't.

                The smallest value that fits in 8 bits (which is the minimum
                width a signed char can have, IIRC) is not -256 but -128. But
                your programs shouldn't depend on that. Use SCHAR_MIN instead
                of an inline "magic" value and you'll be fine ;-)
                > Does it means that we can assign the same ASCII value to both signed
                > and unsigned.

                An unsigned char can hold values up to UCHAR_MAX. I'm not sure if
                converting this value to signed char and back to unsigned will always
                work as expected.
                > That means the ASCII value can be represented with a type of signed
                > char and also unsigned char?

                No. SCHAR_MAX is usually 127 (if char values have 8 bits), which is
                smaller than some of the values that unsigned characters can store.
                > For example
                >
                > int main(void)
                > {
                > signed char a= 'a';
                > unsigned char b = 'b';
                > printf("%i %c",a,b);
                > return 0;
                > }
                >
                > The above code does not warn about the assignment.

                It depends on the warnings you have enabled. Here it doesn't even build
                because printf() is called before a prototype is visible:

                foo.c:5: warning: implicit declaration of function `printf'

                Giorgos


                • Old Wolf

                  #9
                  Re: signed char and unsigned char difference

                  Jens.Toerring@physik.fu-berlin.de wrote:

                  > > For example
                  > > int main(void)
                  > > {
                  > > signed char a= 'a';
                  > > unsigned char b = 'b';
                  >
                  > There's nothing the compiler should complain about as long as you're
                  > using ASCII (it's different with EBCDIC because there 'a' is 129 and
                  > also most of the other letters are above 127, so you would better use
                  > unsigned char).

                  The type 'char' has to be able to represent all members of the basic
                  character set, which includes 'a'. If the machine had 8-bit chars and
                  'a' == 129, then 'char' would have to be unsigned.


                  • Peter Nilsson

                    #10
                    Re: signed char and unsigned char difference

                    "Jack Klein" <jackklein@spamcop.net> wrote...
                    > (Darrell Grainger) wrote in comp.lang.c:
                    > > On Sun, 25 Jul 2004 dam_fool_2003@yahoo.com wrote:
                    > > > For example
                    > > > int main(void)
                    > > > {
                    > > > signed char a= 'a';

                    Unless plain char is signed, there's no requirement that the value of 'a'
                    fit within the range of signed char.
                    > > > unsigned char b = 'b';
                    > > > printf("%i %c",a,b);
                    > > > return 0;
                    > > > }
                    > >
                    > > If you give printf a %i it is expecting an int. You are passing it a
                    > > signed char. This will have undefined behaviour. Did you mean to use:
                    >
                    > No, he is not. One can't pass any type of char as argument to a
                    > variadic function beyond the specified ones. The char 'a' will be
                    > promoted to int

                    This is ambiguous since a literal 'a' is already of type int and no
                    promotion would be required. Of course, Jack is talking about the
                    promotion of signed and unsigned chars a and b respectively when used
                    as arguments to printf.

                    > and the behavior is perfectly defined.

                    --
                    Peter

