Typedef enums and integers

Discussion in 'Programmer's Corner' started by chrisw1990, Jan 19, 2013.

  1. chrisw1990

    Thread Starter Active Member

    Oct 22, 2011
    543
    41
    So... my understanding of enums is that they are chars by default. I was wondering, plain and simple, whether you can use integers instead of chars in an enum, and obviously how I would do it.

    Cheers
    Chris
     
  2. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,392
    1,605
    An enumeration specifies a list of symbols and corresponding int-sized values, not char-sized ones. (That's per the ANSI standard; a given compiler may not follow the standard.)
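
    A minimal sketch of what that means in practice (standard C; the names here are made up for illustration):

        #include <stdio.h>

        enum color { RED, GREEN = 5, BLUE };   /* RED == 0, BLUE == 6 */

        int main(void)
        {
            /* Enumeration constants have type int, so sizeof RED
               reports the same size as sizeof(int). */
            printf("sizeof RED = %zu, sizeof(int) = %zu\n",
                   sizeof RED, sizeof(int));
            printf("BLUE = %d\n", BLUE);
            return 0;
        }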

    That said, I have no idea what your question means.
     
    Last edited: Jan 20, 2013
  3. WBahn

    Moderator

    Mar 31, 2012
    17,751
    4,797
    Uhmmm... I didn't know there was an ISO standard that applied to ALL programming languages, and since the OP didn't specify a language....???

    Maybe the use of the terms typedef, enum, and char is sufficient for it to only possibly be C, but I wouldn't care to bet on that.

    @OP: Please confirm what language you are talking about, and then explain why you believe the enumerated type definitions are chars.
     
  4. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    On an 8-bit chip you typically want to use char; int is used only exceptionally.

    On a 16-bit chip you'd typically want to use int; char is used only exceptionally.

    You need to understand exactly why you want to use char or int.
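
    A hedged illustration of making that choice explicit with the <stdint.h> fixed-width types (the variable names are hypothetical):

        #include <stdint.h>

        uint8_t  rx_index;    /* cheap on an 8-bit core                */
        uint16_t sample_cnt;  /* natural word size on a 16-bit core    */
        uint32_t tick_ms;     /* native width on a 32-bit core (PIC32) */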
     
  5. chrisw1990

    Thread Starter Active Member

    Oct 22, 2011
    543
    41
    Ahh, my apologies... I am using C32 (C language) on a PIC32.
    Essentially, from what I gather, enums have up to 256 entries because they are chars. Well, how do you use integer values inside one? (Essentially I have a load of values in 4-byte hex, and I'd like to implement an integer-type enum to hold them.)
    This is probably even more mystifying than the original question...
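
    For what it's worth, a minimal sketch of the direct answer, assuming C32 where int is 32 bits: enumerator values are ints, so 4-byte hex constants work as long as each one is representable as a signed 32-bit int. The names and values below are invented for illustration:

        typedef enum {
            CMD_RESET  = 0x1000ABCD,
            CMD_START  = 0x1000ABCE,
            CMD_STATUS = 0x7FFF0001   /* still fits in a signed 32-bit int */
        } command_t;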
     
  6. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    Normally you'd use the native bit width. Otherwise the compiler has to do padding and clear out the upper bytes/words, which takes extra time.

    One reason to go smaller might be that you are storing data and the peripheral only deals with 8-bit data, or you want to compress a bitmap into less than 32 bits, things like that.

    Sometimes you use 32-bit variables but then deal with the words or bytes manually (see the sketch below). Do you want to do this via an enum or via type definitions?

    You should post the code in question and explain what you would like to do.
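
    Here is a sketch of one common (implementation-defined) way to get at the bytes of a 32-bit value manually, as mentioned above; the byte order depends on the target's endianness, and the type names are made up:

        #include <stdint.h>

        typedef union {
            uint32_t word;      /* the whole 32-bit value      */
            uint8_t  byte[4];   /* the same storage, byte-wise */
        } word_bytes_t;

        uint8_t low_byte(uint32_t value)
        {
            word_bytes_t w;
            w.word = value;
            return w.byte[0];   /* 0x78 for 0x12345678 on a little-endian PIC32 */
        }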
     
  7. chrisw1990

    Thread Starter Active Member

    Oct 22, 2011
    543
    41
    i havent got any code yet, and i might be able to work round it, i just wanted to find out if you could do it.. if you can see where im comin from:)
     
  8. WBahn

    Moderator

    Mar 31, 2012
    17,751
    4,797
    It sounds like this is probably going to be a compiler-specific issue. The C32 compiler may only be able to use chars for enum typedefs, or there may be ways to get it to use integers. It's possible that if you typedef an enum with more than 256 elements that it will use 16-bit integers automatically. I have no idea. You need to explore the documentation for your compiler.

    I just did a Google search for "C32 compiler enum" and, without even following any of the links and only looking at the first page, I see a reference to a command line switch enum_is_int, although this is for an ARM C++ compiler. Still, it shows that at least some compilers offer options in this regard. But the very first hit, which is a link to a pdf for MPLAB XC32 (I don't know if this is your actual compiler or not), has the phrase "Allocate to an enum type only as many bytes as it needs for the ...... ", so I would guess there is some useful information to be had there.

    I downloaded that file and searched for "enum" and the fifth hit was in a table on Code Generation Convention Options for the option "-fshort-enums" where it says, "Allocate to an enum type only as many bytes as it needs for the declared range of possible values."

    A few hits later, in Appendix A, it says, "If the enumeration values are all non-negative, the type is unsigned int, else it is int. The -fshort-enums command line option can change this."

    The documentation for your tool is your friend.
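
    A quick way to see the effect of that option, assuming a GCC-family compiler such as XC32 (the file and type names here are hypothetical):

        /* demo.c -- compare:  gcc demo.c   vs.   gcc -fshort-enums demo.c */
        #include <stdio.h>

        typedef enum { SMALL_A, SMALL_B } small_t;    /* fits in one byte */
        typedef enum { BIG = 0x12345678 } big_t;      /* needs 32 bits    */

        int main(void)
        {
            /* Without -fshort-enums both are typically sizeof(int);
               with it, small_t shrinks to 1 byte while big_t stays 4. */
            printf("small_t: %zu bytes\n", sizeof(small_t));
            printf("big_t:   %zu bytes\n", sizeof(big_t));
            return 0;
        }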
     
    chrisw1990 likes this.
  9. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,392
    1,605
    In C32, the integer types (int, signed int, unsigned int, long, signed long, and unsigned long) are all 32 bits long.
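
    That's easy to verify on the target if you want (a minimal sketch):

        #include <stdio.h>

        int main(void)
        {
            /* On C32/XC32 each of these should print 4 (bytes). */
            printf("int:           %zu\n", sizeof(int));
            printf("long:          %zu\n", sizeof(long));
            printf("unsigned long: %zu\n", sizeof(unsigned long));
            return 0;
        }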

     
    chrisw1990 likes this.