[PATCH 05/18] Fix 64bit arch issue in synaptics eventcomm.c
Mark Kettenis
mark.kettenis at xs4all.nl
Tue Oct 12 23:38:58 PDT 2010
> Date: Mon, 11 Oct 2010 10:39:56 +0200
> From: Takashi Iwai <tiwai at suse.de>
>
> At Mon, 11 Oct 2010 10:24:44 +0200 (CEST),
> Mark Kettenis wrote:
> >
> > > Date: Mon, 11 Oct 2010 09:47:56 +0200
> > > From: Takashi Iwai <tiwai at suse.de>
> > >
> > > At Fri, 08 Oct 2010 23:09:26 +0200,
> > > Henrik Rydberg wrote:
> > > >
> > > > On 10/08/2010 10:51 PM, Mark Kettenis wrote:
> > > >
> > > > >> Date: Fri, 08 Oct 2010 21:36:02 +0200
> > > > >> From: Takashi Iwai <tiwai at suse.de>
> > > > >>
> > > > >> At Fri, 8 Oct 2010 20:48:10 +0200 (CEST),
> > > > >> Mark Kettenis wrote:
> > > > >>>
> > > > >>>> From: Takashi Iwai <tiwai at suse.de>
> > > > >>>> Date: Fri, 8 Oct 2010 19:22:29 +0200
> > > > >>>
> > > > >>> Sorry, but it isn't obvious to me what issue this fixes.
> > > > >>
> > > > >> In C, "1" is an int, not an unsigned long.
> > > > >> Thus (1 << 33) doesn't give you a shift to the 33rd bit; the behavior is undefined.
> > > > >
> > > > > But if you'd actually use 33 (or even 32) as an offset, things
> > > > > wouldn't work on a 32-bit platform, would they?
> > > >
> > > >
> > > > I believe this is what the patch is supposed to fix.
> > >
> > > Yep :)
> > >
> > >
> > > > > Anyway,
> > > > >
> > > > >> If any, this must be (1UL << 32).
> > > > >
> > > > > This idiom is much more common.  I probably wouldn't have
> > > > > questioned things if you'd written it like that.  I recommend
> > > > > you stick to this version.
> > > >
> > > >
> > > > For a counter-example, see the definition of test_bit() in the Linux kernel.
> > > >
> > > > >> Also, it'd be better if such a test macro returned 1 instead of
> > > > >> an arbitrary non-zero value, as my patch does.
> > > > >
> > > > > In C this doesn't matter.
> > > >
> > > >
> > > > It is still useful to know whether a logical expression evaluates to
> > > > (0, 1) or (0, nonzero).
> > >
> > > It does matter if the returned type is wider.
> > > Try the example code below on 32-bit and 64-bit machines:
> > >
> > > ================================================================
> > > #include <stdio.h>
> > >
> > > #define LONG_BITS (sizeof(long) * 8)
> > > #define OFF(x) ((x) % LONG_BITS)
> > > #define LONG(x) ((x) / LONG_BITS)
> > >
> > > #define TEST_BIT1(bit, array) (array[LONG(bit)] & (1 << OFF(bit)))
> > > #define TEST_BIT2(bit, array) (array[LONG(bit)] & (1UL << OFF(bit)))
> > > #define TEST_BIT3(bit, array) ((array[LONG(bit)] >> OFF(bit)) & 1)
> > >
> > > static unsigned long test[4];
> > >
> > > int main()
> > > {
> > > 	const int bit = 33;
> > > 	int result;
> > >
> > > 	/* set the bit */
> > > 	test[LONG(bit)] |= 1UL << OFF(bit);
> > >
> > > 	result = TEST_BIT1(bit, test);
> > > 	printf("TEST1 %d = %d\n", bit, result);
> > >
> > > 	result = TEST_BIT2(bit, test);
> > > 	printf("TEST2 %d = %d\n", bit, result);
> > >
> > > 	result = TEST_BIT3(bit, test);
> > > 	printf("TEST3 %d = %d\n", bit, result);
> > >
> > > 	return 0;
> > > }
> > > ================================================================
> > >
> > > On a 32-bit machine, it results in:
> > > TEST1 33 = 2
> > > TEST2 33 = 2
> > > TEST3 33 = 1
> > >
> > > while on a 64-bit machine, it results in:
> > > TEST1 33 = 0
> > > TEST2 33 = 0
> > > TEST3 33 = 1
> > >
> > > See, TEST2 still fails even though it uses 1UL.
> >
> > This example is meaningless. On every reasonable 32-bit architecture,
> > long is a 32-bit integer. So
> >
> > > /* set the bit */
> > > test[LONG(bit)] |= 1UL << OFF(bit);
> >
> > when bit is 33 (or even 32), is completely undefined on a 32-bit
> > machine. Therefore, you should never call TEST_BIT() with a first
> > argument of 32 or larger, even on a 64-bit architecture, since the
> > code wouldn't work on a 32-bit machine.
>
> Hm?  TEST_BIT() is used on a long *array* of unlimited size.
> Otherwise the macro would have been simpler :)
> Ditto for the above code. It stores the value in an array member.
> That's why LONG() and OFF() macros are used there.
Hmm, OK, that code is seriously weird.  I guess it dates from the time
when people still cared about machines with 16-bit integers.
Sorry for the noise.