Insomnia and integer overflow

Discussion in 'Off-Topic' started by Mark44, May 11, 2009.

  1. Mark44

    Thread Starter Well-Known Member

    Nov 26, 2007
    626
    1
    If you're implementing a sheep counter circuit for insomniacs, don't forget to declare sheepCount as a long int.
    [attached cartoon image]
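    For anyone who would rather see the bug than the cartoon, here is a minimal sketch in C (sheepCount comes from the caption; the 16-bit width is assumed just for illustration):

    Code (C):
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int16_t sheepCount = 0;            /* the cartoon's too-small counter */

        /* signed overflow is undefined behaviour in C, so stop just short
           of the wrap instead of demonstrating it the hard way */
        while (sheepCount < INT16_MAX)     /* 32767 sheep and the fun stops */
            sheepCount++;

        printf("Still awake after %d sheep; time for a wider type.\n",
               (int)sheepCount);
        return 0;
    }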
     
  2. DonQ

    Active Member

    May 6, 2009
    320
    11
    Cute, but...

    What is the size of an integer in the "C" standard?
    Is a long int guaranteed to be larger than an int?
    Where does a short int fit into this?


    Answers:

    An int must be at least 16 bits. It is allowed to be larger and often is: some compilers use 32-bit or 64-bit ints. Even wider is allowed, and the width does not have to be a power of two.

    A long int is only guaranteed to be no shorter than an int; it is not required to be longer. If int is 16 bits, a 16-bit long int is allowed, and in that case switching to long int would make no difference.

    A short int is only required to be no longer than an int. If an int is 64 bits, even a short int is allowed to be 64 bits.

    The moral of this story is: NEVER rely on the length of integer types! All signed and unsigned integer lengths are allowed to vary by compiler.
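    A quick way to see where a given compiler lands is simply to ask it; a small sketch, assuming a hosted compiler with <limits.h> and <stdio.h>:

    Code (C):
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* the answers differ from compiler to compiler, which is the point */
        printf("char : %d bits\n", CHAR_BIT);
        printf("short: %u bytes, SHRT_MAX = %d\n",  (unsigned)sizeof(short), SHRT_MAX);
        printf("int  : %u bytes, INT_MAX  = %d\n",  (unsigned)sizeof(int),   INT_MAX);
        printf("long : %u bytes, LONG_MAX = %ld\n", (unsigned)sizeof(long),  LONG_MAX);
        return 0;
    }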



    The usual way to deal with this is to have a header file, specific to the compiler being used, in which types like int16_t or int32_t are defined to map onto the proper built-in type for that compiler. When you switch compilers, all you have to do is switch to the matching header file.

    Example:

    //compiler #1: 16-bit short, 32-bit int, 64-bit long
    typedef short int int16_t;
    typedef int       int32_t;
    typedef long int  int64_t;

    (in a separate file)

    //compiler #2: 16-bit int, 32-bit long, no 64-bit type
    typedef int       int16_t;
    typedef long int  int32_t;
    /* int64_t is deliberately left undefined, so any use of it is a
       compile error instead of a silent truncation */

    Then use the intNN_t names (int16_t, int32_t, and so on) instead of the built-in types, which should never be used directly.
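    A hypothetical usage example, assuming the header above is saved as my_ints.h (that file name and napsTaken are made up for illustration):

    Code (C):
    #include "my_ints.h"      /* the per-compiler header sketched above */

    int32_t sheepCount = 0;   /* 32 bits on every supported compiler      */
    int16_t napsTaken  = 0;   /* 16 bits everywhere, never silently wider */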
     
  3. thingmaker3

    Retired Moderator

    May 16, 2005
    5,072
    6
    I have found a nice cup of chamomile tea to be most efficacious.
     
  4. Mark44

    Thread Starter Well-Known Member

    Nov 26, 2007
    626
    1
    Wow! I never expected such a detailed analysis of a cartoon!

    As far as I can tell, the C standard doesn't specify how big an int is. Here's a link to what I believe is the most current C standard:
    http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1124.pdf

    In a cursory examination I didn't see anything in it about ints being at least 16 bits. You're welcome to prove me wrong by showing me where to look, of course. It's true that specific implementations of C do in fact specify sizes for int, short int, long int, and so on, but unless I'm wrong, the C standard doesn't mandate specific sizes.

    Over the course of its history, C has been pretty freeform on this issue, allowing implementors to tailor the size of an int to the size of a machine WORD on the target architecture. I believe that one of the first implementations of C was on one of the PDP machines (PDP-6?), which used 6-bit bytes; a 6-bit byte holds two 3-bit groups, i.e. two octal digits, and that would suggest to me a 12-bit WORD on that machine.

    When I started coding C, the 8086/8088 was pretty much the standard PC processor, and short, int, and long were 16, 16, and 32 bits, respectively. A few years later, after 32-bit machines came into vogue, the sizes of these types became 16, 32, and 32 bits.


     
  5. DonQ

    Active Member

    May 6, 2009
    320
    11
    I was going from the first edition of K&R (1978), where short, int, and long could all be the same length and int had to be at least 16 bits. By the 1988 second edition (ANSI C), it was added that long would be at least 32 bits, but everything could still be the same length. Other things may have been added since then, but it really doesn't matter, since some compilers still go by the old standards, especially in the embedded world. So for portable code, not much can be assumed.

    I even once worked on a compiler that had 8-bit short ints.

    How many bits in a "char" on a compiler that supports unicode? Not 8.

    I've been defining my own fixed length ints for each compiler since the '80s, so I stopped tracking the length of ints in the standard.

    My bottom line being: Just changing a var to long does not guarantee a fix.
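    One way to act on that (a sketch, not something from the thread): instead of trusting that long is wide enough, make the compiler prove it, so a too-narrow type breaks the build rather than the product. The one-million-sheep limit is invented for the example.

    Code (C):
    #include <limits.h>

    #if LONG_MAX < 1000000L
    #error "long cannot count a million sheep on this compiler"
    #endif

    long sheepCount = 0L;   /* used only because the check above passed */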
     
  6. Mark44

    Thread Starter Well-Known Member

    Nov 26, 2007
    626
    1
    C'mon man, lighten up. It's a freaking cartoon!
     
  7. thingmaker3

    Retired Moderator

    May 16, 2005
    5,072
    6
    He obviously needs a nice cup of chamomile tea. Or a good night's sleep. :D
     
  8. DonQ

    Active Member

    May 6, 2009
    320
    11
  9. Mark44

    Thread Starter Well-Known Member

    Nov 26, 2007
    626
    1
    From the Web page cited in the link above, with emphasis added.
    I don't see that this contradicts what I said in post #4.
    Mark
     
  10. DonQ

    Active Member

    May 6, 2009
    320
    11
    But it does contradict what was in #1.
    In the big picture, "long int" is not guaranteed to be longer than an int, and there are many compilers where it is not. Using "int" and "long int" is part of the definition of writing non-portable code.

    I get a little "preachy" because of how many times I've seen this sort of coding cause problems, sometimes serious ones. For twiddling in the basement it may be OK, but as a foundation for a career, not so much. I code things where large, expensive machinery and life-and-limb matters are at stake, besides my reputation. I know that a lot of "professional programmers" code like this; it's just that I choose not to.




    In your reference, the int types seem to be defined by the following (though even this has a slight error):
    Code (C):
    — minimum value for an object of type int           INT_MIN  -32767   // −(2^15 − 1)
    — maximum value for an object of type int           INT_MAX  +32767   //  2^15 − 1
    — maximum value for an object of type unsigned int  UINT_MAX  65535   //  2^16 − 1
    This is actually saying that int is exactly 16 bits, instead of at least 16 bits as it has been since K&R 1978, and it contradicts what many current compilers do. Your reference also makes long exactly 32 bits, whereas K&R 1988 upgraded it from "no shorter than int" to "at least 32 bits".

    Problem is, this is not a perfect world, and not all compilers conform to this standard. In fact, not all "C" compilers comply with any standard (e.g. MicroE "C", Rabbit's "Dynamic C", and numerous free versions, among others). The myriad of different standards and non-standards available for compilers to use (or not use) is exactly why long int is not a proper fix.

    The advice of your referenced article (and my post) is about being able to write code that will run on all of them, including changes to come in the future, not just the ones that currently have the implementation-dependent features we happen to want. The header file I suggested is for compilers that do not ship <stdint.h>, the only place where the bit lengths of the integer types are actually guaranteed.
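    For what it's worth, on a compiler that does ship <stdint.h> (C99 and later) the guesswork disappears; a small sketch, with <inttypes.h> supplying the matching printf formats:

    Code (C):
    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        int32_t  sheepCount = INT32_MAX;   /* exactly 32 bits, or it doesn't compile */
        uint16_t shortNap   = UINT16_MAX;  /* exactly 16 bits, unsigned              */

        printf("sheepCount tops out at %" PRId32 "\n", sheepCount);
        printf("shortNap   tops out at %" PRIu16 "\n", shortNap);
        return 0;
    }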

    Just seemed to me that the advice of using "long int" to fix this problem was only part of the story.
     
  11. thingmaker3

    Retired Moderator

    May 16, 2005
    5,072
    6
    Does cartooning count as twiddling or foundation?

    Two floors above me is a cubicle outside of which hangs a two page essay on why Dilbert's boss is right on a certain issue in a certain day's strip.:rolleyes:
     