I have a C project where a global variable is defined as a 16-bit unsigned integer. This variable is also used in a file separate from where it is defined. That other file declares the variable with the extern keyword, but (incorrectly) gives its type as a 16-bit signed integer.
So what will happen here? Will the file using the 'extern' version of the variable compile thinking it is signed? Or will the compiler see the global definition and recognize that it is unsigned?
There were no compile or link warnings about this mismatch.
Obviously I will fix this (someone else's mistake, not mine), but I was curious about how the code would actually run. This code is destined for an embedded 32-bit Motorola chip.