1 << 32?
I would argue that 2^32 is a correct answer. It comes from an understanding of binary arithmetic, an important requirement when you need to optimize certain calculations for digital computer hardware.
I would also argue that 0 is a correct answer. It comes from an awareness of the range limitations of the primitive numeric data types in your programming language of choice, an important requirement when you need to take arithmetic overflow into consideration.
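Both answers are easy to see in Java (whose int is also 32 bits wide) by going through a wider 64-bit type first; a minimal sketch:

```java
public class TwoAnswers {
    public static void main(String[] args) {
        // In 64-bit arithmetic, shifting 1 left by 32 really is 2^32.
        long wide = 1L << 32;
        System.out.println(wide); // 4294967296

        // Truncated back to a 32-bit int, the single set bit falls
        // outside the representable range, so the result is 0.
        int narrow = (int) wide;
        System.out.println(narrow); // 0
    }
}
```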
I didn't think a third correct answer was possible. Alas, I just found out today that in C#, 1 << 32 == 1. I have to admit that this was a rather unexpected discovery. Fortunately I was able to find this "bug" rather quickly, and ended up spending more time writing about it instead.
Apparently by design, C# shifts a 32-bit number using only the low five bits of the count operand. Initially I thought this was very odd, almost unjustifiable behavior, but apparently it comes from how x86 processors handle shift instructions, which mask the count the same way. An argument could be made that this behavior isn't appropriate for what is supposed to be a high-level programming language, though.
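Java specifies the very same masking rule (only the low five bits of the count are used for int shifts, the low six for long), so the surprise reproduces there verbatim:

```java
public class MaskedShift {
    public static void main(String[] args) {
        // The count is taken mod 32 for int operands: 32 & 0x1F == 0,
        // so this is effectively 1 << 0.
        System.out.println(1 << 32); // 1
        // Likewise 33 & 0x1F == 1, so this is really 1 << 1.
        System.out.println(1 << 33); // 2
        // For long operands the low six bits are kept: 64 & 0x3F == 0.
        System.out.println(1L << 64); // 1
    }
}
```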
I should also add that it's quite likely that "undefined" is technically a correct answer as well. I'm not about to look up the official C/C++ spec to confirm this, though.
I wonder if there's a language that accepts a negative count operand and just does what you'd expect (shift in the opposite direction). I don't think this scenario comes up often, but I do remember specifically wishing for this feature once or twice in my 15+ years of programming.
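Java, for one, doesn't grant that wish: a negative count goes through the same five-bit mask, so -1 becomes 31. A small sketch of how it actually plays out:

```java
public class NegativeCount {
    public static void main(String[] args) {
        // What you'd *want* 8 << -1 to mean:
        System.out.println(8 >> 1); // 4
        // What actually happens: -1 & 0x1F == 31, so this is 8 << 31,
        // which shifts the only set bit clean out of the int.
        System.out.println(8 << -1); // 0
    }
}
```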
AH *$&(#! Java behaves like C# in this case too! GOOD TO KNOW!!!