nVidia crash on filtered cube-maps

Started by RPTD · 2 comments, last by RPTD 11 years, 2 months ago

I've got a long-standing problem here which I have so far temporarily solved by disallowing filtering on cube maps. That is not a viable final solution, though. In short, whenever I try to use filtered cube maps on nVidia hardware like this:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_CUBE_MAP, texture );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glEnable( GL_TEXTURE_CUBE_MAP_SEAMLESS ); // global context state, not a texture parameter; crashes with or without this
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE );

the rendering crashes inside the nVidia driver blob:

Program received signal SIGSEGV, Segmentation fault.
0x00007ffff6811e50 in ?? () from /lib/x86_64-linux-gnu/libc.so.6
 
(gdb) bt
#0 0x00007ffff6811e50 in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#1 0x00007ffff23eb027 in ?? () from /usr/lib/nvidia-current-updates/libnvidia-glcore.so.304.51
#2 0x00007ffff23ec18a in ?? () from /usr/lib/nvidia-current-updates/libnvidia-glcore.so.304.51
#3 0x00007ffff23ec7a8 in ?? () from /usr/lib/nvidia-current-updates/libnvidia-glcore.so.304.51
#4 0x00007ffff2531e1b in ?? () from /usr/lib/nvidia-current-updates/libnvidia-glcore.so.304.51
#5 0x00007ffff2204d69 in ?? () from /usr/lib/nvidia-current-updates/libnvidia-glcore.so.304.51
(...)

When I use no filtering instead, like this:

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_CUBE_MAP, texture );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_NEAREST ); // <== !
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_NEAREST ); // <== !
glEnable( GL_TEXTURE_CUBE_MAP_SEAMLESS ); // global context state, not a texture parameter; crashes with or without this
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE );

then no crash happens (but obviously no filtering either). On ATI/AMD no such crash occurs and everything renders fine.
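For the time being I work around it with a crude vendor check. This is only a rough sketch for this post (the variable names are made up; only the filter choice matters):

const char *vendor = ( const char* )glGetString( GL_VENDOR ); // e.g. "NVIDIA Corporation"
const int avoidFiltering = vendor && strstr( vendor, "NVIDIA" ); // strstr from <string.h>
const GLint filter = avoidFiltering ? GL_NEAREST : GL_LINEAR;
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, filter );
glTexParameteri( GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, filter );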

Does anybody have an idea what nVidia is messing up in the driver here? Any work-arounds? Any special tokens to set?

Life's like a Hydra... cut off one problem just to have two more popping out.
Leader and Coder: Project Epsylon | Drag[en]gine Game Engine


(Sorry for using a new post instead of an edit, but the forum editor here mangles edited posts by removing all line breaks, making editing impossible.)

I forgot to mention that the cube map is of type GL_RGBA16F, hence floating point. Regular depth and RGB textures do not crash while filtering.
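For completeness, the cube map is created roughly like this (the size variable and the NULL data pointer are placeholders for this post; the real code uploads actual data):

glBindTexture( GL_TEXTURE_CUBE_MAP, texture );
for( int face=0; face<6; face++ ){
    // allocate one RGBA16F face; GL_TEXTURE_CUBE_MAP_POSITIVE_X + face
    // walks +X, -X, +Y, -Y, +Z, -Z in order
    glTexImage2D( GL_TEXTURE_CUBE_MAP_POSITIVE_X + face, 0, GL_RGBA16F,
        size, size, 0, GL_RGBA, GL_FLOAT, NULL );
}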

Life's like a Hydra... cut off one problem just to have two more popping out.
Leader and Coder: Project Epsylon | Drag[en]gine Game Engine

The most obvious explanation is that your hardware doesn't support linear filtering of RGBA16F cube maps. If so, that should just generate a GL error (or, best case, drop you back to software emulation) rather than crashing. Nonetheless, if that is the cause (you didn't say which specific NV hardware you're talking about, so it's hard to say), the driver mess-up is the crash itself, not the lack of support, which would be determined by your hardware and not by anything in software.
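If your driver exposed ARB_internalformat_query2 (core in GL 4.3, so quite possibly not available on your hardware) you could ask it directly whether filtering that format is supported; untested sketch:

GLint filterSupport = GL_NONE;
glGetInternalformativ( GL_TEXTURE_CUBE_MAP, GL_RGBA16F,
    GL_FILTER, 1, &filterSupport );
// returns GL_FULL_SUPPORT, GL_CAVEAT_SUPPORT or GL_NONE
if( filterSupport == GL_NONE ){
    // fall back to GL_NEAREST filtering for this format
}

Failing that, the best you can do is check glGetError() after the glTexParameteri calls and after the first draw, and fall back if anything is reported.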

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

The hardware is an nVidia GeForce 9500M GS.

Life's like a Hydra... cut off one problem just to have two more popping out.
Leader and Coder: Project Epsylon | Drag[en]gine Game Engine
