# C++: Different int size on different platforms?



## 4bits (Feb 26, 2006)

Is it true that in C an int may be one size on one platform and a different size on a second platform?

Can someone explain to me why this happens? I just heard about this today and now I am very confused.

Thanks


----------



## Shadow2531 (Apr 30, 2001)

Lots of different types can be different sizes across platforms. It depends on how much memory the system uses to store an int or a wchar_t for example. Even on the same platform, it could be different between compilers.

I was told to use stdint.h's exact-width types, or the exact-width types in the Boost library, if you really need to be sure you're working with the same size of int across platforms.

I haven't really messed with either though.


----------



## InterKnight (Oct 19, 2004)

Hello there.

Another thing (from what I learned in class) to take into consideration is the processor architecture. For example, when I was in C++ classes at my college the computers used typical 32-bit processors. Our professor said that on this architecture an int would typically be 4 bytes (32 bits) in size. I am just assuming now that if you were working with a 64-bit processor the int size could be 8 bytes (64 bits).

So I guess to an extent, the processor of the machine you are working with can have something to do with it.

Hope this helps.

Take care.


----------



## lotuseclat79 (Sep 12, 2003)

4bits said:


> Is it true that in C an int may be one size on one platform and a different size on a second platform?
> Can someone explain to me why this happens? I just heard about this today and now I am very confused.
> Thanks


Hi 4bits,

The size of an int in C is an "implementation-defined" feature of the C language: the compiler chooses it according to the target-dependent features of the architecture for which it generates code.

As InterKnight explained, the size of an int may be 32-bits or 64-bits, but a compiler can be designed to emit code on a 64-bit machine that utilizes a 32-bit integer if the architecture of the machine supports a 32-bit mode of execution or the compiler implementation otherwise handles that.

Also, a 32-bit integer could fit nicely as a short int on a 64-bit machine. However, on a 64-bit machine, the compiler implementor could also decide:

- long int = 64-bit integer <- could also be implemented in software on a 32-bit machine
- int = 32-bit integer <- same as most 32-bit machines
- short int = 16-bit integer <- ditto

where it would be up to the compiler implementor to provide the functionality for those integer sizes on the 64-bit architecture, unless the processor already supports them directly - i.e. it depends on the processor architecture (instruction-set features).

The best way to see the difference for any one compiler/platform is to write a small C program that declares several integers (long int, short int, int), then prints out the address and sizeof of each one.

-- Tom


----------

