Re: how to make Debian less fragile (long and philosophical)
(I don't believe I'm getting into this flamefest. Ah well..)
On Tue, Aug 17, 1999 at 12:26:44PM -0400, Raul Miller was heard to say:
> > You can poo-poo point 1. For most people this isn't a real big issue.
> > Another 20-30M isn't a huge chunk on a modern hard disk. We do have some
> > users with older systems, but they can cope.
> I suppose we need real numbers here.
I was once trying to build a minimalist system on a floppy (long story) --
it turned out to be more efficient space-wise to have dynamically linked
programs and libc (all 800k+ of it) than to have statically linked binaries.
I think that most programs bloated to at least twice their size when statically
linked (unfortunately I don't have the specific numbers either :( ). This is bad
in terms of disk space for people with smaller hard drives; it's even worse in
terms of memory usage, especially with regard to shell scripts and so on. If you
statically link (e.g.) fileutils and shellutils, you'll impose a huge memory
overhead on just about every nontrivial shell script: dynamically linked
processes all share one copy of the library's code pages, but each statically
linked binary drags in its own private copy, so large bits of code have to be
redundantly loaded. Maybe all *your* computers have 128MB+ of RAM, but even now
many computers ship with 'only' 64MB, and many slightly older computers (like
this one) have 32-48MB or less. At this point I think that sacrificing that
much RAM to needless redundancy is not acceptable.
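If anyone wants to check this on their own box rather than take my word for it,
here's a rough sketch -- just a throwaway hello-world-ish program, with the
build and inspection commands in the comments. (The exact sizes will depend on
your gcc and libc versions; with libc at 800k+, even a handful of statically
linked utilities running at once can tie up a couple of megabytes of code that
one shared copy would otherwise cover.) When run, the program only dumps its
own libc mappings out of /proc/self/maps, which is a quick way to see that
every dynamically linked process maps the *same* libc file, so the kernel can
share those read-only pages, while a static binary carries its own copy.

    /* staticcheck.c -- a throwaway sketch, nothing official.
     *
     * Build it both ways and compare for yourself:
     *
     *   gcc -o hello-dyn staticcheck.c
     *   gcc -static -o hello-static staticcheck.c
     *   ls -l hello-dyn hello-static    # compare on-disk sizes
     *   ldd hello-dyn                   # shows the shared libs it pulls in
     *   ldd hello-static                # "not a dynamic executable"
     *
     * When run, it prints its own libc mappings from /proc/self/maps
     * (Linux-specific): every dynamically linked process maps the same
     * libc.so, so those read-only pages exist once in RAM no matter how
     * many processes use them; a static binary gets no such sharing.
     */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        FILE *maps = fopen("/proc/self/maps", "r");
        char line[512];

        if (!maps) {
            perror("/proc/self/maps");
            return 1;
        }
        while (fgets(line, sizeof line, maps))
            if (strstr(line, "libc"))   /* only the C library entries */
                fputs(line, stdout);
        fclose(maps);
        return 0;
    }

Run the dynamic version next to a few shells and they'll all point at the same
libc file; run the static one and there's nothing to share.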
> (1) This con is in a different class from the pros. More generally,
> premature optimization is bad.
So is optimization for the uncommon case (servers that need to
stay up at all costs) at the expense of the common case. And yes, this *is*
an optimization question -- you're requesting optimization for recoverability
at the expense of resources (disk space and especially memory).
I think that it might be a good idea to have a static-binaries package, but
trying to statically link the stock utilities is not worth it. On _this_
computer, it would be *more* trouble for me to futz around trying to use
static versions of the standard utilities to get the system back up than to
just stick in a rescue disk and fix stuff -- and most users that I've seen
are in a similar situation. Please don't optimize things for the high end
at the expense of the 'clueless' people who aren't high-powered sysadmins.
Anything that can go wrong, w
fortune: segmentation fault, core dumped