[sane-devel] Potential buffer overflow when scanning more than 16384 bytes per line with Brother MFC-6490CW

Yuval Levy yuval.levy at ryerson.ca
Sat Jun 27 21:03:50 UTC 2015


Thank you, Allan, for the very quick reply.  I hate to be the harbinger
of bad news, but there is only bad news or worse news to choose from.
None of it is directed at you.  Here it is:

On 15-06-27 02:35 PM, m. allan noah wrote:
> Can you try using scanimage -B ?

TL;DR: it does not change a thing, and I had to use --buffer-size
because the -B switch is broken in version 1.0.23 as distributed in the
Ubuntu repositories.

I candidly admit that before Allan's hint I did not think of changing
the input buffer size manually, despite being familiar with the man
page and having read it many times over the years; just not this time,
alas.  Why should there be a need to set the input buffer size manually
when the size needed can be calculated from the other inputs?

-B itself did not work -- see the worse news below -- but when I used
--buffer-size=64 (or 256), the change of input buffer size was confirmed
by the sane_read() debug output reporting maxlen=65536 (or 262144, i.e.
64 kB and 256 kB times 1024), and the result was the same: scanner
timeout.

Worse news: I may have found an IN*SANE* inconsistency between the man
page and the actual behaviour of scanimage.

Quoting the man page:
| The -B or --buffer-size changes the input buffer size
| from 32KB to the number kB specified or 1M.

First of all, -B is not equivalent to --buffer-size.  All of the
following resulted in maxlen=0:

scanimage -B=64
scanimage -B=64k
scanimage -B=64K
scanimage -B=65536
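
A possible explanation, assuming (I have not checked the scanimage
source, so this is only a guess) that whatever follows -B is handed to
an atoi()-style parser: atoi() stops at the first character that is not
part of a number, so a string starting with '=' parses to 0, and 0 kB
would give maxlen=0.  A minimal sketch of that behaviour:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* atoi() parses leading digits only; a leading '=' means nothing
       is parsed at all and 0 is returned.                            */
    const char *args[] = { "=64", "=64k", "=64K", "=65536" };
    for (int i = 0; i < 4; i++)
        printf("atoi(\"%s\") = %d  -> buffer = %d bytes\n",
               args[i], atoi(args[i]), atoi(args[i]) * 1024);
    return 0;
}

Each of the four strings parses to 0, i.e. a zero-byte buffer, which
would match the maxlen=0 I am seeing.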

Worse, all of the following resulted in an 'argument without option'
error, making the -B switch inconsistent with other single-letter
switches such as -l, -t, -x and -y, all of which accept a space followed
by a numeric value:

scanimage -B 64
scanimage -B 64k
scanimage -B 64K
scanimage -B 65536
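
My guess (an assumption, not a reading of scanimage's actual code) is
that -B/--buffer-size is declared with an *optional* argument.  With
GNU getopt_long(), an optional argument is only recognised when it is
glued to the option (-B64 or --buffer-size=64), never when it is
separated by a space, so the "64" is left behind as a stray non-option
argument.  A small sketch of that getopt behaviour, with an option
table of my own invention:

#include <stdio.h>
#include <getopt.h>

int main(int argc, char **argv)
{
    static const struct option longopts[] = {
        { "buffer-size", optional_argument, NULL, 'B' },
        { 0, 0, 0, 0 }
    };
    int c;

    /* "B::" declares an optional argument for -B: getopt only picks it
       up when it is attached directly (-B64), never after a space.    */
    while ((c = getopt_long(argc, argv, "B::", longopts, NULL)) != -1)
        if (c == 'B')
            printf("-B seen, optarg = %s\n", optarg ? optarg : "(none)");

    /* Anything getopt did not consume stays behind as a non-option
       argument, e.g. the "64" in "-B 64".                             */
    for (int i = optind; i < argc; i++)
        printf("leftover argument: %s\n", argv[i]);
    return 0;
}

Run with "-B 64" this prints optarg = (none) followed by "leftover
argument: 64"; run with "-B64" it prints optarg = 64.  A leftover
argument that the program does not expect would plausibly trigger
exactly the 'argument without option' error above.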

Finally, --buffer-size= yields different results from -B=, and
sometimes they are not what one would intuitively expect:

scanimage --buffer-size=65536   maxlen=67108864   (way too big)
scanimage --buffer-size=64K     maxlen=65536      (as expected)
scanimage --buffer-size=64      maxlen=65536
scanimage --buffer-size=64M     maxlen=65536      (way too small)
scanimage --buffer-size=1M      maxlen=1024       (way too small)

and the worst one:

scanimage --buffer-size=64insaneblurb   maxlen=65536

It seems that any text after the number is simply ignored.
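
All of the odd maxlen values above are consistent with one guess (again
an assumption on my part, not the actual source): the value is run
through atoi(), interpreted as kB and multiplied by 1024, and whatever
trails the digits is silently dropped.  The numbers line up:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Hypothetical reconstruction: parse the leading digits, treat the
       result as kB, silently ignore any trailing text.               */
    const char *values[] = { "65536", "64K", "64", "64M", "1M",
                             "64insaneblurb" };
    for (int i = 0; i < 6; i++) {
        long kb = atoi(values[i]);        /* stops at first non-digit */
        printf("--buffer-size=%-14s -> %ld kB -> maxlen=%ld\n",
               values[i], kb, kb * 1024);
    }
    return 0;
}

This reproduces every observation: 65536 becomes 67108864, 64K/64/64M
all become 65536, 1M becomes 1024, and 64insaneblurb becomes 65536.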

So in addition to my issue not being solved, I now have a few questions
about scanimage:

(1) are there plausibility checks on the command-line switches, and if
so, why did they not catch the gibberish I entered after --buffer-size?

(2) has the difference between -B and --buffer-size been introduced by
mistake, or is it intentional?  What should be fixed, the code or the
man page?

(3) what prevents the automated calculation of the buffer size according
to the very simple formula X/25.4*D*B, where X is the scan width in
millimetres as entered with the -x parameter, D the resolution in DPI as
entered with the --resolution parameter, and B the number of bytes used
per pixel, derived from the --mode parameter?  I recall the times when
memory was at a premium and, programming in assembler on CPUs such as
the 6502 or the 8048, one had to think hard about limiting buffers to a
few bytes; but in this day and age, does a 32 kB default limit make any
sense?
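
For what it is worth, a back-of-the-envelope version of that
calculation (my own sketch; the parameter names simply mirror the
command-line options, and the bytes-per-pixel mapping is an assumption,
not the backend's actual logic):

#include <stdio.h>

/* Rough bytes-per-line estimate from the scan parameters:
   width in mm (-x), resolution in dpi (--resolution), and bytes per
   pixel implied by --mode (e.g. 1 for 8-bit gray, 3 for 24-bit color). */
static long bytes_per_line(double width_mm, double dpi, int bytes_per_pixel)
{
    return (long)(width_mm / 25.4 * dpi) * bytes_per_pixel;
}

int main(void)
{
    /* Example: a 297 mm wide scan at 1200 dpi in 24-bit colour. */
    long line = bytes_per_line(297.0, 1200.0, 3);
    printf("bytes per line: %ld\n", line);            /* ~42093 */
    printf("32 kB default buffer is %s\n",
           line > 32 * 1024 ? "too small" : "big enough");
    return 0;
}

A single line of such a scan is already larger than the 32 kB default,
so the buffer size could be derived automatically from the very
parameters scanimage already has.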

Yuv

-- 
Yuval Levy, JD, MBA, CFA
