
Bug#447143: java-gcj-compat-dev: UnboundLocalError: local variable 'MAX_CLASSES_PER_JAR' referenced before assignment



Package: java-gcj-compat-dev
Version: 1.0.76-6
Severity: normal
Tags: patch


Hi,

Building db 4.6.21-3 fails with an error as follows:

dh_nativejava -plibdb4.6-java-gcj -v
dh_nativejava: Compatibility levels before 4 are deprecated.
	aot-compile -L /usr/lib/gcj debian/libdb4.6-java
Traceback (most recent call last):
  File "/usr/bin/aot-compile", line 95, in ?
    compiler.compile()
  File "/usr/lib/python2.4/site-packages/aotcompile.py", line 99, in compile
    self.writeMakefile(MAKEFILE, jobs)
  File "/usr/lib/python2.4/site-packages/aotcompile.py", line 127, in writeMakefile
    values = job.ruleArguments()
  File "/usr/lib/python2.4/site-packages/aotcompile.py", line 276, in ruleArguments
    self.__makeBlocks()
  File "/usr/lib/python2.4/site-packages/aotcompile.py", line 227, in __makeBlocks
    if len(self.blocks[-1]) >= MAX_CLASSES_PER_JAR \
UnboundLocalError: local variable 'MAX_CLASSES_PER_JAR' referenced before assignment
dh_nativejava: command returned error code 256
make: *** [binary-arch] Error 1

This seems to be due to a bug in the aotcompile.py shipped in
java-gcj-compat-dev: MAX_CLASSES_PER_JAR is read in __makeBlocks
before it is bound in that scope.  I've created a patch against
aotcompile.py.in that fixes this (attached below).  Could you please
apply it?
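For context, here is a minimal sketch (not the aotcompile.py code itself) of the Python scoping rule that produces this kind of UnboundLocalError: if a name is assigned anywhere in a function body, Python treats it as local throughout that function, so reading it before the assignment fails even when a module-level binding of the same name exists.  The patch sidesteps this by binding the constants at the top of the function before any use.

```python
MAX_CLASSES_PER_JAR = 1024  # module-level constant, as in the unpatched file


def broken():
    # The assignment below (even though unreachable) makes
    # MAX_CLASSES_PER_JAR local to this function, so the read on the
    # next line raises UnboundLocalError instead of finding the
    # module-level value.
    limit = MAX_CLASSES_PER_JAR
    MAX_CLASSES_PER_JAR = 2048
    return limit


def fixed():
    # The patch's approach: bind the name locally before any read.
    MAX_CLASSES_PER_JAR = 1024
    return MAX_CLASSES_PER_JAR
```

Calling broken() raises UnboundLocalError, while fixed() returns 1024.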

Many thanks,

-nori


-- System Information:
Debian Release: 4.0
  APT prefers stable
  APT policy: (500, 'stable')
Architecture: i386 (i686)
Shell:  /bin/sh linked to /bin/bash
Kernel: Linux 2.6.18-5-686
Locale: LANG=ja_JP.eucJP, LC_CTYPE=ja_JP.eucJP (charmap=EUC-JP)
--- aotcompile.py.in.orig	2007-10-18 19:35:02.000000000 +0900
+++ aotcompile.py.in	2007-10-18 20:26:06.000000000 +0900
@@ -31,9 +31,6 @@
 GCJFLAGS = ["-g", "-O2", "-fPIC", "-findirect-dispatch", "-fjni"]
 LDFLAGS = ["-Wl,-Bsymbolic"]
 
-MAX_CLASSES_PER_JAR = 1024
-MAX_BYTES_PER_JAR = 1048576
-
 MAKEFILE = "Makefile"
 
 MAKEFILE_HEADER = '''\
@@ -197,6 +194,8 @@
         __init__ method.  The reason this is not done is because we
         need to parse every class file.  This is slow, and unnecessary
         if the job is subsetted."""
+        MAX_CLASSES_PER_JAR = 1024
+        MAX_BYTES_PER_JAR = 1048576
         names = {}
         for hash, bytes in self.classes.items():
             name = classname(bytes)
