author     Brian Paul <[email protected]>	2016-03-18 09:55:57 -0600
committer  Brian Paul <[email protected]>	2016-03-21 11:59:25 -0600
commit     dc9ecf58c0c5c8a97cd41362e78c2fcd9f6e3b80 (patch)
tree       0f507a7a25a2d7b20d2c33432cb73af1c0f88dfe /src/gallium/drivers/svga/svga_screen.c
parent     b56b853ab3937d6144597f490bb38e2532d0cee2 (diff)
svga: use shader sampler view declarations
Previously, we looked at the bound textures (via the pipe_sampler_views)
to determine texture dimensions (1D/2D/3D/etc.) and datatype (float vs.
int). But this could fail under out-of-memory conditions: if we failed
to allocate a texture and didn't create a pipe_sampler_view, we'd
default to texture type 0 (PIPE_BUFFER). This led to device errors
because of inconsistent shader code.
This change relies on all TGSI shaders having an SVIEW declaration for
each SAMP declaration. The previous patch series does that.
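For illustration, a minimal TGSI fragment shader with the paired declarations this change relies on might look like the following (a hypothetical example, not taken from this commit): each SAMP declaration is accompanied by an SVIEW declaration carrying the texture target (here 2D) and return type (here FLOAT), so the driver no longer needs to inspect the bound pipe_sampler_view to recover that information.

```
FRAG
DCL IN[0], GENERIC[0], PERSPECTIVE
DCL OUT[0], COLOR
DCL SAMP[0]
DCL SVIEW[0], 2D, FLOAT
DCL TEMP[0]
TEX TEMP[0], IN[0], SAMP[0], 2D
MOV OUT[0], TEMP[0]
END
```

With the SVIEW declaration present, the target and datatype are available from the shader token stream itself, even when texture allocation failed and no sampler view was ever created.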
Reviewed-by: Charmaine Lee <[email protected]>
Diffstat (limited to 'src/gallium/drivers/svga/svga_screen.c')
0 files changed, 0 insertions, 0 deletions