Hello - Someone recently mentioned there may be a difference in efficiency between an int and an unsigned int. Has anyone heard this, or does anyone have more information? My boss wants me to experiment with them to check, but I'm not sure what to look at. Thank you.
Efficiency of int vs. unsigned int.
-
kind regards,
Jos -
-
I found an answer to this question, in case anyone else needs to know. The processor docs only discuss front-end details, so to see what the whole compiler actually does, create global variables of both signed and unsigned type and write a for loop over each. Run the compiler with the switch that emits assembly output (-S for GCC targeting ARM). In the output you can see the instructions generated for each loop, and how the compiler handles the two types.
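A minimal sketch of the kind of test file described above (file and variable names are illustrative): compile it to assembly with something like `arm-none-eabi-gcc -O2 -S loops.c` and diff the code emitted for the two loops.

```c
/* loops.c -- hypothetical test file for comparing the assembly the
   compiler emits for signed vs. unsigned loop counters. */

int          s_total;   /* signed accumulator   */
unsigned int u_total;   /* unsigned accumulator */

void signed_loop(void)
{
    int i;
    for (i = 0; i < 100; i++)
        s_total += i;           /* signed compare and increment */
}

void unsigned_loop(void)
{
    unsigned int i;
    for (i = 0u; i < 100u; i++)
        u_total += i;           /* unsigned compare and increment */
}
```

Both loops compute the same sum (4950), so any difference you see in the `.s` output comes purely from how the compiler treats the two types.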
-
ARM ints
I can tell you one huge difference when it comes to ARM code, and it may apply to other architectures.
When you are using C on ARM, all your autos (local variables) should be 32 bits. If you use shorts or chars, then after every operation the compiler emits a shift-left/shift-right pair to zero-extend or sign-extend the result back to its declared width, so that it behaves as a proper 8- or 16-bit value held in a 32-bit register. This is of course horribly inefficient. If your autos are 32 bits, there is nothing to normalize. On ARM, the only time you should use shorts or chars is for storage in memory.
Depending on the compiler and the architecture, it may be that it wants to sign-extend a signed int but not extend an unsigned int, though I have not seen this.
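A small sketch of the normalization cost described above. The C behaviour here is standard; the extra instructions only show up in the assembly, where the char version needs a zero-extension (e.g. UXTB, or a shift pair on older cores) and the int version does not. Function names are illustrative.

```c
/* Sub-word autos force the compiler to renormalize after arithmetic. */

unsigned char inc_char(unsigned char c)
{
    /* The sum is truncated back to 8 bits: 255 + 1 wraps to 0.
       Producing that wrap costs an extension instruction on ARM. */
    return (unsigned char)(c + 1);
}

unsigned int inc_int(unsigned int i)
{
    /* Already register-width: no normalization instruction needed. */
    return i + 1u;
}
```

Calling `inc_char(255)` yields 0 while `inc_int(255)` yields 256, which is exactly the truncation the extra instructions exist to enforce.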
-
Strictly, you should use an int (the ARM is a 32-bit processor, so int will be 32 bits).
For any platform, int should be the most efficient type for that platform to process (that is how it is defined in the standard). Obviously you have to weigh that against portability, given the possible variation in the size of int.
This is why, of all the integer types, int is the one most often found to have different sizes when you compare platforms.