Compile time increases quadratically with struct size
>Submitter-Id: net
>Originator: Zoltan Hidvegi
>Confidential: no
>Synopsis: Compile time increases quadratically with struct size
>Severity: critical
>Priority: medium
>Category: c
>Class: sw-bug
>Release: 3.2.3 (Debian testing/unstable)
>Environment:
System: Linux hzoli 2.4.21-rc1-ac3 #1 Wed Apr 30 11:10:22 CDT 2003 i686 unknown unknown GNU/Linux
Architecture: i686
host: i386-pc-linux-gnu
build: i386-pc-linux-gnu
target: i386-pc-linux-gnu
configured with: ../src/configure -v --enable-languages=c,c++,java,f77,objc,ada --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --with-gxx-include-dir=/usr/include/c++/3.2 --enable-shared --with-system-zlib --enable-nls --without-included-gettext --enable-__cxa_atexit --enable-clocale=gnu --enable-java-gc=boehm --enable-objc-gc i386-linux
>Description:
The compile time increases quadratically (actually more than
quadratically) with the number of members in a struct.
>How-To-Repeat:
The following shell script generates a struct with a given number of
members and declares a global variable of that type:
------- BEGIN biggen.sh -------------
#! /bin/sh
i=0
echo 'struct foo {'
while [ "$i" -lt "$1" ]
do
    echo "int i_$((i=i+1));"
done
echo '};'
echo 'struct foo f;'
--------- END biggen.sh -------------
Run it like this:
./biggen.sh 10000 > big.c; cc -c big.c
Vary the member count and observe how the compile time scales.
E.g. on my 1.73GHz Athlon, 5000 members take 1.66s, 10000 take 8.5s, and
20000 take 35.91s, so the time more than quadruples each time the member
count doubles. Unfortunately, I need to compile a generated struct
with several hundred thousand members, which is not feasible at this rate.
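To collect several data points in one run, a small driver script like the
following can be used. This is only a sketch: it inlines the generator from
biggen.sh above rather than invoking it, and it times the compile with
date +%s, which is coarse (whole seconds) but portable:

```shell
#! /bin/sh
# Compile structs of doubling size and print the wall-clock time for each.
# Quadratic behaviour shows up as the time roughly quadrupling (or worse)
# every time the member count doubles.
for n in 1000 2000 4000
do
    i=0
    {
        echo 'struct foo {'
        while [ "$i" -lt "$n" ]
        do
            echo "int i_$((i=i+1));"
        done
        echo '};'
        echo 'struct foo f;'
    } > big.c
    start=$(date +%s)
    cc -c big.c
    end=$(date +%s)
    echo "members: $n  seconds: $((end - start))"
done
```

With the 3.2.3 compiler the per-doubling ratio should be clearly visible
once the sizes reach a few thousand members.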
>Fix: