Bug#317162: kaffe: Un/compress content-length problem with HttpURLConnection or URLConnection

Robert Warren rhwarren@uwaterloo.ca, 317162@bugs.debian.org
Wed Jul 6 18:58:01 UTC 2005


Package: kaffe
Version: 2:1.1.5-3
Severity: important

*** Please type your report below this line ***

java.net.URLConnection and/or java.net.HttpURLConnection report the
compressed content length from the HTTP headers when the web server
sends a gzip-compressed file.

However, when an InputStream for the content is created with
getInputStream(), the read() and available() methods return
uncompressed characters only up to the length of the compressed
payload. The data is thus truncated.
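The mismatch the report describes can be shown without a network: for a gzip-encoded response, the on-the-wire Content-Length is the compressed size, while the decompressed stream yields many more bytes, so reading only Content-Length bytes from the decompressed stream truncates the data. A minimal, self-contained sketch (not kaffe's actual code, just an illustration of the size gap):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipLengthDemo {
    public static void main(String[] args) throws Exception {
        // A compressible payload, as a web server would gzip-encode it.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1000; i++) sb.append("hello world ");
        byte[] original = sb.toString().getBytes("US-ASCII");

        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(original);
        }
        byte[] compressed = buf.toByteArray();

        // Content-Length on the wire would be the compressed size...
        int contentLength = compressed.length;

        // ...but the decompressed stream yields far more bytes; stopping
        // after contentLength bytes (the behaviour reported here) truncates.
        GZIPInputStream in =
            new GZIPInputStream(new ByteArrayInputStream(compressed));
        byte[] out = in.readAllBytes();

        System.out.println(out.length > contentLength); // true
        System.out.println(out.length);                 // 12000
    }
}
```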

A work-around is to prevent compression on the HTTP connection by forcing
the headers:

.setRequestProperty("Accept", "text/html, text/plain");
.setRequestProperty("Accept-Encoding","chunked;q=1.0");

But this seems to break something else with large files.
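In context, the two setRequestProperty calls above would be applied to the connection before it is opened, roughly as below (example.com is a placeholder host; no connection is actually made here). Note that per RFC 2616, "chunked" is a transfer-coding rather than a content-coding, so "identity" may be the more standard value for declining compression in Accept-Encoding:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class WorkaroundHeaders {
    public static void main(String[] args) throws Exception {
        // openConnection() does not contact the server yet.
        HttpURLConnection conn = (HttpURLConnection)
            new URL("http://example.com/").openConnection();

        // The reported workaround: ask the server not to gzip the body.
        conn.setRequestProperty("Accept", "text/html, text/plain");
        conn.setRequestProperty("Accept-Encoding", "chunked;q=1.0");

        // Request properties can be read back before connecting.
        System.out.println(conn.getRequestProperty("Accept-Encoding"));
    }
}
```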

-- System Information:
Debian Release: 3.1
  APT prefers testing
  APT policy: (500, 'testing')
Architecture: i386 (i686)
Kernel: Linux 2.6.11.8bravo
Locale: LANG=C, LC_CTYPE=C (charmap=ANSI_X3.4-1968)

Versions of packages kaffe depends on:
ii kaffe-pthreads 2:1.1.5-3 A POSIX threads enabled version of

-- no debconf information