I set about trying to find a portable way to define UCHAR_MAX myself. At
first, I thought the following would work:
#define UCHAR_MAX ~( (unsigned char)0 )
However, it didn't work for me. Could someone please explain what's
going on? I would have thought that the following happens:
(1) The literal 0, whose type is int, gets converted to an unsigned char:
        0000 0000
(2) The resultant unsigned char then has all its bits flipped:
        1111 1111
My hunch is that there's some sort of integer promotion at work, but I
don't know exactly how it works.
Could someone please enlighten me?
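Here is a minimal test case that shows what I mean (assuming CHAR_BIT is 8
and a two's-complement int; the last printf is only a sketch of what I
gather is a portable alternative, namely converting -1 to unsigned char):

    #include <stdio.h>

    int main(void)
    {
        /* If my hunch is right, the operand of ~ is promoted from
           unsigned char to int before the bits are flipped, so the
           result is an int with all bits set (-1 on a two's-complement
           machine) rather than 255. */
        printf("~((unsigned char)0)        = %d\n", ~((unsigned char)0));
        printf("sizeof ~((unsigned char)0) = %zu\n",
               sizeof ~((unsigned char)0)); /* sizeof(int), not 1 */

        /* Converting -1 to an unsigned type yields that type's maximum
           value, so this should give UCHAR_MAX portably. */
        printf("(unsigned char)-1          = %u\n",
               (unsigned)(unsigned char)-1); /* 255 when CHAR_BIT == 8 */

        return 0;
    }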
--
Frederick Gotham