
Re: Bug#43787: changed title, and remade the proposed change



On Wed, Sep 08, 1999 at 08:32:16AM -0400, Ben Collins wrote:
 
> I have benchmarked this, and most of the larger packages (those that
> build several megs or more of object files, which with -g on, was quite a
> lot) saw a roughly 15% increase in speed during compiles. Now this is on a
> fast Ultra30, so I'm sure other systems, like the m68k machines, will no
> doubt show better results. On top of that, the space requirements are
> reduced considerably.

Debugging information does not normally cost much processing time, so 15%
seems reasonable. In my experience, compiling with -g is largely I/O bound
because the object files get a lot bigger. This is a non-issue on systems
with a good I/O subsystem (I guess an Ultra30 is one of them) but will have a
huge impact on systems with cheap I/O.

> modify the draft into a final stage. You have already voiced your dislike
> and your objections are noted. The continual rants are not allowing any
> progress.

Please calm down, Ben. I also dislike some of Raul's comments, but this last
one has a point. It is a point I do not agree with, though: there are enough
reasons to build a package with debugging info. That's a great feature of
Unix-like systems: you can give the user a binary (for example, the one from
the package) and let them reproduce the problem with it. With the resulting
core file you can then debug the problem without having access to a machine
on which it is reproducible.

cu
    Torsten
