Here's a little quiz: What value does `m` have after running the following code?

```c
unsigned int one = 1;
long minusOne = -1;
long m = MAX(one, minusOne);
```
If your answer was "it depends" – congratulations, you can probably stop reading now.
Here's the thing: If you run this on a modern Mac, `m` will be 1, just as you'd expect. On iOS, however, it will be -1, which might surprise you, and with the default settings in a new Xcode project, the compiler won't even issue a warning. So you might want to fix this by enabling the "Sign Comparison" warning (`GCC_WARN_SIGN_COMPARE`)¹ in your build settings (or by enabling all warnings).
The actual "problem" can be fixed pretty easily by casting `one` to a `long` (or any other signed type), but where do these strange results come from, and why do they differ between OS X and iOS? As you might already have guessed, the difference has to do with (modern) OS X running on 64 bit CPUs, while iOS is still a 32 bit platform. On 64 bit OS X, a `long` is eight bytes wide, while it's only four bytes on iOS. But that doesn't quite explain it – we're not really dealing with large numbers here, after all.
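To make the cast fix concrete, here is a minimal sketch in plain C. The `MAX` macro is defined as the usual ternary version (a simplification – the promotion issue is the same with Foundation's `MAX`), and the function names are made up for illustration. Pitting `int` against `unsigned int` reproduces the equal-rank situation on any platform:

```c
#include <assert.h>

/* The usual ternary MAX macro (a simplification; the promotion
   behavior is the same as with Foundation's MAX). */
#define MAX(a, b) ((a) > (b) ? (a) : (b))

/* int and unsigned int always have the same conversion rank, so this
   reproduces the iOS behavior on any platform. */
int max_without_cast(void) {
    unsigned int one = 1;
    int minusOne = -1;
    /* minusOne is converted to unsigned int (a huge value), so MAX
       picks it; converting the result back to int yields -1. */
    return (int)MAX(one, minusOne);
}

int max_with_cast(void) {
    unsigned int one = 1;
    int minusOne = -1;
    /* Casting the unsigned operand to a signed type keeps the
       comparison in signed arithmetic. */
    return MAX((int)one, minusOne);
}
```

Here `max_without_cast()` returns -1 while `max_with_cast()` returns 1.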
When you compare two integers of different types (or apply any other binary operator), the compiler implicitly converts both operands to a common type that is determined by a number of rules and the "conversion rank" of the operands:

> If the operand that has unsigned integer type has rank greater than or equal to the rank of the type of the other operand, the operand with signed integer type is converted to the type of the operand with unsigned integer type.²
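The quoted rule is easy to check directly (a small sketch; the function name is made up). `unsigned long` and `long` share the same conversion rank, so the signed operand is the one that gets converted:

```c
#include <stdbool.h>

/* unsigned long has the same conversion rank as long, so per the
   quoted rule the signed operand is converted to unsigned long. */
bool signed_operand_becomes_unsigned(void) {
    unsigned long one = 1;
    long minusOne = -1;
    /* minusOne wraps to ULONG_MAX, which compares greater than 1. */
    return minusOne > one;
}
```

This returns true on both 32 bit and 64 bit platforms, because the two types always have the same width.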
Now we have our explanation: On a 32 bit platform, `unsigned int` has the same conversion rank as `long` because they're both 32 bits wide. This means that `minusOne` is implicitly converted to an `unsigned int` before comparing it to `one`. An `unsigned int` obviously cannot represent -1, so it "underflows" to a very large positive number, which is then compared to `one`. On a 64 bit platform, on the other hand, `long` has a higher conversion rank than `unsigned int`, so `minusOne` is converted to a `long` instead, making the comparison behave as expected.
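Because the outcome depends on the width of `long`, the two platforms can be modeled with fixed-width types, which makes the sketch behave identically everywhere (the function names are made up):

```c
#include <stdint.h>
#include <stdbool.h>

/* uint32_t vs int32_t mirrors iOS: equal rank, so the signed operand
   is converted to unsigned and wraps to 4294967295. */
bool model_32bit_platform(void) {
    uint32_t one = 1;
    int32_t minusOne = -1;
    return minusOne > one;   /* true: the surprising result */
}

/* int64_t vs uint32_t mirrors 64 bit OS X: the wider signed type has
   the higher rank and can represent every uint32_t value, so the
   comparison happens in signed arithmetic. */
bool model_64bit_platform(void) {
    uint32_t one = 1;
    int64_t minusOne = -1;
    return minusOne > one;   /* false: -1 < 1, as expected */
}
```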
Of course, there are other pitfalls that involve unsigned integers – subtracting from the `count` of a string or collection is probably the most common one – but I thought this one might be less well known. At least to me, the warning about comparing signed and unsigned integers was somewhat mysterious for a long time: I knew bad things could happen, but not exactly why or when.
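The `count` pitfall looks like this in plain C, with `size_t` standing in for `NSUInteger` (the function names are made up): subtracting 1 from an unsigned zero wraps around instead of going negative.

```c
#include <stddef.h>
#include <stdbool.h>

/* BUGGY: when count == 0, count - 1 wraps around to SIZE_MAX, so this
   happily reports a "last index" for an empty collection. */
bool is_last_index_buggy(size_t count, size_t i) {
    return i == count - 1;
}

/* Fixed: guard the empty case before subtracting from an unsigned
   value. */
bool is_last_index(size_t count, size_t i) {
    return count > 0 && i == count - 1;
}
```

The same wraparound is what makes a backwards loop like `for (NSUInteger i = count - 1; i >= 0; i--)` run forever: an unsigned `i` is always `>= 0`.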
1. Unfortunately, enabling this warning will result in a lot of false positives, which is probably why it's disabled by default.
2. If I had used `unsigned long` (which is less common in Cocoa), the result would be wrong on both platforms.