On Wed, Feb 25, 2009 at 08:17:39AM -0600, John Goerzen wrote:
> Kari Pahula wrote:
> > I patched ghc6's Binary module to fit an Int in 4 bytes on 64 bit
> > architectures, too.  It's a runtime error if they try to put anything
> > too large in there.  If something uses Binary and hits this, it should
> > be changed to use Int64.  Haddock version 2.4.1-4 implements this
> > change.
>
> I'm concerned about this though -- is upstream going to take this patch?

There's a ticket open in GHC's trac:
http://hackage.haskell.org/trac/ghc/ticket/3041

Upstream developers haven't commented much about it yet, but they seem to
acknowledge it and aren't opposed to using a uniform representation on
32/64 bit arches.  At least that's what I've gathered from IRC.

> I wouldn't want our ghc6 to be incompatible with the same version using
> the same module on other platforms.

It shouldn't matter for anything that's distributed as source and compiled
elsewhere.  Anything using GHC's Binary module is most likely working
closely with GHC's API anyway, and nobody expects stuff at that level to
be portable.  GHC itself uses Binary in a handful of places, mainly for
defining interfaces AFAICT.

I had to choose between two options: 64 bit serialization, with possible
overflows when reading those values back into Ints on 32 bit arches, or
32 bit serialization, with the same overflow check done on 64 bit arches
when writing Ints.  I'd be happier if this were handled by upstream too,
but this just seemed to be the fastest way to get GHC working within
Debian.  I'm hoping that this won't prove to be a bad choice on my
part. :-/  If something went wrong with this, we'd need to rebuild ghc6
and all libraries again.  Hopefully we'll get everything to be easily
binNMUable soon, even if we won't need that.
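For illustration, the "fit an Int in 4 bytes, runtime error if too large"
idea can be sketched like this.  This is a minimal sketch using the
Hackage `binary` package rather than GHC's internal Binary module (whose
API differs), and `putCheckedInt` is a hypothetical helper name, not the
actual patched code:

```haskell
import qualified Data.ByteString.Lazy as BL
import Data.Binary.Put (Put, putInt32be, runPut)
import Data.Int (Int32)

-- Serialize an Int in 4 bytes.  On a 64 bit host, values outside the
-- Int32 range are a runtime error rather than being silently truncated,
-- so both 32 and 64 bit arches produce the same on-disk representation.
putCheckedInt :: Int -> Put
putCheckedInt n
  | n >= fromIntegral (minBound :: Int32) &&
    n <= fromIntegral (maxBound :: Int32) = putInt32be (fromIntegral n)
  | otherwise = error ("putCheckedInt: " ++ show n
                       ++ " does not fit in 32 bits; use Int64 instead")

main :: IO ()
main = print (BL.length (runPut (putCheckedInt 42)))  -- 4 bytes written
```

Anything that legitimately needs more than 32 bits would be switched to
Int64 (serialized with `putInt64be`), which is exactly the "change it to
use Int64" advice above.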