author | Eduardo Lima Mitev <[email protected]> | 2014-12-15 17:04:52 +0100
---|---|---
committer | Iago Toral Quiroga <[email protected]> | 2015-01-13 12:19:32 +0100
commit | b8b1d83c71fd148d2fd84afdc20c0aa367114f92 (patch) | |
tree | 4af0443197d5b632896795c4bfd0227b921678d2 /src/gallium/state_trackers | |
parent | aa727c1dd9e92dfafcc1ed39a9c65478ae40ce39 (diff) | |
mesa: Initializes the stencil value masks to 0xFF instead of ~0u
Section 4.1.4 'Stencil Test' of the OpenGL ES 3.0 specification says:
"In the initial state, [...] the front and back stencil mask are both set
to the value 2^s − 1, where s is greater than or equal to the number of
bits in the deepest stencil buffer supported by the GL implementation."
Since the maximum supported precision for stencil buffers is 8 bits, mask
values should be initialized to 2^8 - 1 = 0xFF.
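As an illustration only, here is a minimal sketch of what such an initialization could look like. The struct and function names below (stencil_attrib, init_stencil_state, ValueMask) are assumptions modeled on Mesa's per-face stencil state, not the literal diff of this commit:

```c
/* Hypothetical sketch, not the actual Mesa change: initialize the
 * front and back stencil value masks to 2^8 - 1 = 0xFF, matching the
 * GLES 3.0 "2^s - 1" wording for an 8-bit stencil buffer. */
#include <GLES3/gl3.h>

struct stencil_attrib {
   GLuint ValueMask[2];   /* [0] = front face, [1] = back face */
};

static void
init_stencil_state(struct stencil_attrib *stencil)
{
   /* Previously ~0u (all ones); 0xFF keeps the value representable
    * as a non-negative signed integer when queried via glGet*. */
   stencil->ValueMask[0] = 0xFF;
   stencil->ValueMask[1] = 0xFF;
}
```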
Currently, these masks are initialized to the maximum unsigned integer value (~0u),
because in OpenGL 3.0 and earlier the initial mask values were specified as:
"In the initial state, stenciling is disabled, the front and back
stencil reference value are both zero, the front and back stencil
comparison functions are both ALWAYS, and the front and back
stencil mask are both all ones."
The problem is that this causes the mask values to overflow to -1 when converted
to a signed integer by the glGet* APIs.
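To make the overflow concrete, here is a small sketch (not part of the commit) that queries the mask through the signed glGetIntegerv path; it assumes a current OpenGL ES 3.0 context and omits error handling:

```c
/* Sketch: why an internal mask of ~0u reads back as -1 via glGet*.
 * Assumes a current OpenGL ES 3.0 context. */
#include <GLES3/gl3.h>
#include <stdio.h>

static void
print_stencil_value_mask(void)
{
   GLint mask = 0;

   /* If the internal state were ~0u (0xFFFFFFFF), converting it to
    * GLint here yields -1; with an initial value of 0xFF the query
    * returns 255, which is what the dEQP getfloat/getinteger
    * state-query tests expect. */
   glGetIntegerv(GL_STENCIL_VALUE_MASK, &mask);
   printf("GL_STENCIL_VALUE_MASK = %d\n", mask);
}
```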
Fixes 6 failing dEQP tests:
* dEQP-GLES3.functional.state_query.integers.stencil_value_mask_getfloat
* dEQP-GLES3.functional.state_query.integers.stencil_back_value_mask_getfloat
* dEQP-GLES3.functional.state_query.integers.stencil_value_mask_separate_getfloat
* dEQP-GLES3.functional.state_query.integers.stencil_value_mask_separate_both_getfloat
* dEQP-GLES3.functional.state_query.integers.stencil_back_value_mask_separate_getfloat
* dEQP-GLES3.functional.state_query.integers.stencil_back_value_mask_separate_both_getfloat
Reviewed-by: Ian Romanick <[email protected]>
Diffstat (limited to 'src/gallium/state_trackers')
0 files changed, 0 insertions, 0 deletions