I wrote a small program to check the ranges of signed int and signed
long on my machine:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("INT_MIN: %d INT_MAX: %d", INT_MIN, INT_MAX);
    /* note: long values need the %ld specifier, not %d */
    printf("\nLONG_MIN: %ld LONG_MAX: %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}
The output I get:
INT_MIN:-2147483648 INT_MAX: 2147483647
LONG_MIN:-2147483648 LONG_MAX: 2147483647
Basically the same thing. Why is this happening? Also, these ranges
seem to contradict the ones given in K&R 2. Does it have something to
do with how the numbers are represented on a particular machine?