The point is that character constants in C are not of type char. sizeof(char) is 1 by definition, while sizeof(int) is 2 or 4 on most machines. A constant like 'a', even though it looks like a character, actually has type int as far as the compiler is concerned, so sizeof('a') == sizeof(int).

It's only confusing if you assume that character constants are chars. It makes perfect sense if you know the rule that ``character constants are of type int'', even if that rule doesn't seem to make much sense in itself.
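As a sketch (not part of the original answer), a tiny C program makes the rule visible; on an implementation where int is 4 bytes it prints 1, 4, and 4:

    #include <stdio.h>

    int main(void)
    {
        /* sizeof yields a size_t; cast to unsigned long so the
         * %lu format specifier works even on pre-C99 compilers. */
        printf("sizeof(char) = %lu\n", (unsigned long)sizeof(char));
        printf("sizeof('a')  = %lu\n", (unsigned long)sizeof('a'));
        printf("sizeof(int)  = %lu\n", (unsigned long)sizeof(int));
        return 0;
    }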


