What determines the size of integer in C?

Possible Duplicate:
size of int, long, etc
Does the size of an int depend on the compiler and/or processor?

I'm not sure if similar questions have been asked before on SO (at least, I couldn't find any while searching, so I thought of asking myself).

What determines the size of int (and other data types) in C? I've read that it depends on the machine/operating system/compiler, but I haven't come across a clear or detailed enough explanation of things like which one overrides the other. Any explanation or pointers would be really helpful.


Ultimately the compiler does, but in order for compiled code to play nicely with system libraries, most compilers match the behavior of the compiler[s] used to build the target system.

So loosely speaking, the size of int is a property of the target hardware and OS (two different OSs on the same hardware may have different sizes of int, and the same OS running on two different machines may have different sizes of int; there are reasonably common examples of both).

All of this is also constrained by the rules in the C standard: int must be large enough to represent all values between -32767 and 32767, for example.


int is the "natural" size for the platform, and in practice that means one of

  • the processor's register size, or

  • a size that's backward compatible with the existing code base (e.g., 32-bit int in Win64).

  • A compiler vendor is free to choose any size of at least 16 bits, except that (for desktop platforms and up) a size that doesn't work with the OS's API will mean that few if any copies of the compiler are sold. ;-)


    The size of C data types is constrained by the C standard, usually as constraints on the minimum size. The host environment (target machine + OS) may impose further restrictions, i.e., constraints on the maximum size. Finally, the compiler is free to choose suitable values between these minimum and maximum values.

    Generally, it's considered bad practice to make assumptions about the size of C data types. Besides, it's not necessary, since C will tell you:

  • the sizeof operator tells you an object's size in bytes
  • the macro CHAR_BIT from limits.h tells you the number of bits per byte
  • hence, sizeof(foo) * CHAR_BIT tells you the size of type foo in bits, including padding.

    Anything else is just an assumption. Note that the host environment may well consist of 10,000 Chinese guys with pocket calculators and a huge blackboard, pulling size constraints out of thin air.
