On Tuesday 19 September 2006 20:25, Mario 'BitKoenig' Holbe wrote:
> Hendrik Sattler <debian@hendrik-sattler.de> wrote:
> > Which OS combination does not define int to be 32bit on a 64bit
> > architecture?
>
> This is mainly compiler-, not primarily OS-dependent. And: all compilers
> with an ILP64 data model.
> However, the question should rather be: *why* do compilers not define
> int to be 64bit on a 64bit architecture? And the answer is simple:
> Yes, int should be 64bit on a 64bit architecture, since int is defined
> as the architecture's "natural size" data type. However, it mostly is
> not, because of the otherwise massively increasing porting effort due
> to the many programmers who never thought about it and simply assumed
> int to be 32bit.
>
> So, your metaphor implicitly leads to exactly the same answer ;)

The answer is that the LP64 scheme is used and not ILP64. There is a good
and detailed explanation available:
http://www.unix.org/whitepapers/64bit.html

HS
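For readers who don't have the data-model names memorized, a minimal C sketch
makes the difference visible (this is an illustration, not part of the original
mail): on an LP64 system such as a typical 64-bit Linux it prints int=4 long=8
ptr=8, whereas an ILP64 compiler would report 8 for all three.

    /* Print the sizes that distinguish LP64 from ILP64.
     * Expected on LP64 (most 64-bit Unix systems): int=4 long=8 ptr=8
     * Expected on ILP64:                           int=8 long=8 ptr=8 */
    #include <stdio.h>

    int main(void)
    {
        printf("int=%zu long=%zu ptr=%zu\n",
               sizeof(int), sizeof(long), sizeof(void *));
        return 0;
    }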