
compile from source: to what extent?

Hi all,

I don't want to get into whether 15k-line ./configure scripts are a
good idea; that's a separate discussion.

I'm unofficially maintaining a non-free package, IRAF.  It installs
/usr/include/iraf.h, which it uses both for compilation (it ships its
own compiler driver, "xc") and to resolve the path to its files
("#define HOST /usr/lib/...").

Should I use -I to force it to look at my (pristine) iraf.h?

Also, the upstream tarball includes binaries which most people use for
compiling (if they compile at all, instead of downloading the prebuilt
binaries).  I strip those binaries out.  Compilation creates an
executable "xc.e" which is eventually installed as /usr/bin/xc (which
is actually a symlink).  However, later in the build, it requires an
executable called "xc".  I've worked around the problem with
PATH=$PATH:`pwd`/tmp, where I keep a link to xc.e (and other kludges).
Should I be using PATH=`pwd`/tmp:$PATH instead?  If I don't, the build
will pick up any previously installed /usr/bin/xc rather than the one
it just built.

Obviously, compilation will fail on broken systems (say, without
/usr/bin/cc, or with a cc that is missing files).  But it is supposed
to be possible to compile a package as any normal user.  To what
extent should I hack the build system to enforce that goal?  If, say,
/usr/bin/xc were a completely different program, a normal user would
run into problems.

Can someone tell me where to draw the line?

