Bugzilla – Bug 5201
Not all colors work on OS X
Last modified: 2007-10-18 09:51:14 UTC
Triode wrote:
> On an Intel Mac I get different pixels from the following:
>
> new:pixel(i,(j*8-k)*2, 256*256*256*255 + 256*256*255 + 256*255 + 255 )
> new:pixel(i,(j*8-k)*2+1,0xFFFFFFFF)
>
> The hex value seems to be converted to a signed value by Lua and prints as -1 on all platforms. The decimal one is always a positive number and displays on all platforms.
>
> Possibly different lengths of ints? [Mac is C2D, so could be 64-bit ints?]
>
> To confirm this, 0x7FFFFFFF will display on the Mac.

After looking at this, it appears to be a problem with the tolua++ wrapper over JiveSurface. This casts everything via a double, and the casting does not work correctly on OS X. The solution is to rewrite the JiveSurface Lua binding using the native Lua/C interface (I was planning on removing tolua++ at some stage anyway).
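For reference, a minimal sketch of what such a native binding could look like; the function name jive_surface_pixel, the argument positions, and the elided surface lookup are assumptions, not the actual JiveSurface code. The key point is reading the color as a Lua number (a double in Lua 5.1) and casting it straight to uint32_t, which is well defined for 0xFFFFFFFF, rather than letting tolua++ route it through a double-to-signed-int conversion, which is undefined for values above INT_MAX and behaves differently on OS X.

#include <stdint.h>
#include <lua.h>
#include <lauxlib.h>

/* Sketch only: the name and argument layout are assumptions. */
static int jive_surface_pixel(lua_State *L) {
	/* arg 1 is assumed to be the surface userdata (lookup elided) */
	int x = (int) luaL_checkinteger(L, 2);
	int y = (int) luaL_checkinteger(L, 3);

	/* Lua 5.1 numbers are doubles, so 0xFFFFFFFF arrives intact as
	 * 4294967295.0; casting that double to uint32_t is well defined
	 * for any in-range value, unlike the double -> signed int cast
	 * that tolua++ performs. */
	uint32_t color = (uint32_t) luaL_checknumber(L, 4);

	/* ... plot the pixel at (x, y) on the surface with `color` ... */
	(void) x; (void) y; (void) color;
	return 0;
}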
Are you planning to do this for the 7.0 release or should I create a new target, Richard?
I am not sure when I'll get around to this; I am happy for it to be untargeted at the moment.
Fixed in r718.